Ex-Facebook employee-turned-whistleblower Frances Haugen’s testimony before the Senate Commerce Committee’s subcommittee on Tuesday presented a picture of a company driven by a singleminded desire for growth and unwilling to implement solutions to known problems for fear of impacting its bottom line.
Under questioning from senators, Ms Haugen revealed that the world’s largest social network has deliberately made choices that prioritise viral content, even at the risk of real-world harm. For their part, senators are now vowing to take action against the company on a range of issues that have befuddled Congress since Facebook exploded into the public consciousness nearly two decades ago.
Here are some of the top moments and takeaways from Tuesday’s hearing.
Facebook repeatedly gets compared to ‘Big Tobacco’
Perhaps no industry in modern American history has a worse reputation for lying than the big four manufacturers of cigarettes – Philip Morris (now known as Altria), RJ Reynolds, Brown and Williamson, and Lorillard – who in 1998 reached a $206bn settlement with attorneys general from 46 of 50 US states who’d sued to recover Medicaid costs for treating smoking-related illnesses and to stop a slew of deceptive and unethical marketing practices.
Two years before that settlement, a tobacco executive called Jeffrey Wigand appeared on CBS News’ 60 Minutes to reveal that his employer (Brown and Williamson) had intentionally manipulated its tobacco with chemicals to make its cigarettes more addictive.
As he opened the hearing featuring Ms Haugen – who also appeared on 60 Minutes to discuss what she’d told officials about Facebook’s conduct – Senator Richard Blumenthal made the comparison explicit. In his opening statement, the Connecticut Democrat recalled his role in the tobacco litigation as a state attorney general and said Facebook is facing a “big tobacco, jaw-dropping moment of truth”.
For her part, Ms Haugen also compared her former employer to a tobacco company while telling senators that Facebook hides data to keep the public from being able to understand the harm it causes.
“Facebook hides behind walls that keeps researchers and regulators from understanding the true dynamics of their system. Facebook will tell you privacy means they can’t give you data. This is not true,” Ms Haugen said.
“When tobacco companies claimed that filtered cigarettes were safer for consumers, scientists could independently invalidate these marketing messages and confirm that in fact they pose a greater threat to human health. The public cannot do the same with Facebook, we are given no other option than to take their marketing messages on blind faith.”
Facebook executives discourage building tools that could have detected more threats because there aren’t enough employees to deal with them
Ms Haugen told senators that Facebook’s staffing problems stem from a cycle of scandal that makes talented people less willing to work there.
“Facebook is stuck in a cycle where it struggles to hire, that causes it to understaff projects, which causes scandals, which then makes it harder to hire,” she said.
Ms Haugen later added that she saw a “pattern of behavior” in which projects were so understaffed that “there was a kind of an implicit discouragement from having better detection systems” for problems.
“My last team at Facebook was on the counter espionage team within the threat intelligence org, and at any given time our team could only handle a third of the cases that we knew about,” she explained. “And we knew that if we built even a basic detector, we would likely have many more cases.”
Facebook executives misled the public about why it disabled features that could have prevented the 6 January attack on the Capitol
In her interview with 60 Minutes, Ms Haugen said Facebook was presenting a “false choice” when it claimed that free speech concerns led it to turn off, after the 2020 election, the civic safeguards it had implemented during the campaign (safeguards it switched back on only after the 6 January insurrection).
“They’ve said the safeguards that were in place before the election implicated free speech, but the choices that were happening on the platform were really about how reactive and twitchy … how viral was the platform, and Facebook changed those safety defaults in the run up to the election because they knew they were dangerous,” she said. “Because they wanted that growth back … after the election, they returned to their original defaults”.
The fact that Facebook had to turn the safeguards back on after the Capitol was ransacked, Ms Haugen said, was “deeply problematic”.
Mark Zuckerberg rejected changes that could have prevented violence
Under questioning from Commerce Committee Chair Maria Cantwell, Ms Haugen told senators that Facebook CEO Mark Zuckerberg was told of changes that could be made to the platform that may have reduced the reach of viral content that was fueling violence in places like Myanmar, but the CEO rejected them because they would have made the platform “less viral”.
“Mark Zuckerberg was directly presented with a list of soft interventions … about making slightly different choices to make the platform less viral, less twitchy,” she said, based on a Facebook metric called “meaningful social interaction”.
“Mark was presented with these options, and chose to not remove downstream MSI in April of 2020, even just isolated in at-risk countries – that’s countries at risk of violence – if it had any impact on the overall MSI metric,” she explained.
Ms Haugen fears what could happen if Facebook’s power is not checked by regulation
In her opening statement, Ms Haugen said Facebook executives want senators “to believe that you must choose between a Facebook full of divisive and extreme content or losing one of the most important values our country was founded upon, free speech”.
But she again called the choice between regulation and openness a “false choice”, and called on Congress to take action before violence and extremism become even more prevalent.
“My fear is that without action, divisive and extremist behaviours we see today are only the beginning. What we saw in Myanmar and are now seeing in Ethiopia are only the opening chapters of a story so terrifying, no one wants to read the end of it,” she said.