Haugen placed the blame for violence in Myanmar and Ethiopia, as well as the January 6 riot at the US Capitol, squarely at Facebook's feet during a Monday hearing before Parliament, arguing that such atrocities were merely "part of the opening chapters of a novel that is going to be horrific to read."
"Engagement-based ranking prioritizes and amplifies divisive, polarizing content," she said, claiming that Facebook could ease up on the division if it were willing to sacrifice a few dollars here and there. However, "Facebook is unwilling to give up those slivers for our safety," she declared – and the results of continuing down the platform's current path would be disastrous.
Denying that her goal is to force further censorship on the already tightly controlled platform, Haugen argued that serving up content from family and friends and requiring users to cut and paste “divisive content” instead of sharing it with a single click could cut back on the spread of “hateful” material.
“We are literally subsidizing hate,” Haugen insisted – equating promoting engagement with promoting hate – because “anger and hate are the easiest way to grow on Facebook.” Continuing, she claimed it was “substantially cheaper to run an angry hateful divisive ad than it is to run a compassionate, empathetic ad.”
Asked whether the platform was “making hate worse,” she agreed it was “unquestionably” doing just that – though neither she nor her interlocutor paused to define “hate” or provide context as to what “making it worse” might look like.
The inquiry took a notably moralizing pitch, with MP John Nicolson (SNP) taking the soapbox to declare, “Facebook is failing to prevent harm to children, it’s failing to prevent the spread of disinformation, it’s failing to prevent hate speech. It does have the power to deal with these issues, it’s just choosing not to, which makes me wonder if Facebook is just fundamentally evil.”
Rather than chase users away with censorship and further intrusion into their privacy, Haugen insisted, government regulation could “force Facebook back into a place where it was more pleasant to be on Facebook and that could be good for long-term growth of the company.”
While she acknowledged that about 60% of new accounts being opened on the platform were not made by real people, suggesting not only their fellow users but investors in the platform itself were being duped, she failed to explain how screening for such phony accounts could avoid intruding on the privacy of “real” users. Such duping, after all, is going on at no small scale, to hear her tell it – CEO Mark Zuckerberg “has unilateral control of 3 billion people,” she told Parliament.
Facebook employees, Haugen continued, were good, conscientious people being corrupted by a system built on bad incentives, one in which every penny of profit must be prioritized over the well-being of users, for whom a single wrong move could mean spiraling down a rabbit hole of extremism.
Facebook didn’t “intend” to send people down such holes or otherwise radicalize them, Haugen stressed, arguing the complexity of the algorithms was responsible for new users being sucked into extremist vortexes. However, she admitted the platform pushed users toward the most extreme version of their interests, since controversy gets clicks.
Haugen insisted the platform was “very cautious” about adding new forms of “hate speech” to its compendium of offense, calling for an improvement on “content-based solutions” that would take into account national variations of phrases in the same language, such as differences in slang between Scotland and the US.
However, she admitted Facebook focused most of its misinformation-fighting efforts on so-called “tier zero” countries like the US, Brazil, and India, leaving other countries like Pakistan, Myanmar and Ethiopia to their own devices – to what she suggested were disastrous and in some cases genocidal results. This is also after the company caught Israeli influence operators meddling with elections across Africa, Latin America and Asia, raising the question of whether certain countries were being permitted to get away with meddling more than others.
Ultimately, Haugen seemed to blame Facebook users for what happened to them, noting that the ads that got the most engagement – and were thus cheapest – were the most likely to be shared by users precisely because they appealed to feelings of hatred, anger and divisiveness. All Facebook had to do was align itself with the “public good” and not lie to the public, she argued. Barring that, “better oversight” was required, she said.
Haugen praised the UK for its efforts to rein in Facebook’s abuses and hinted that she was “a little excited” about the company’s move into “augmented reality.”
“The danger of Facebook is not individuals saying bad things,” she claimed, “it is about the systems of amplification that disproportionately give individuals saying extreme polarizing things the largest megaphone in the room.”
Journalists who’d spent years trying to convince the world that Facebook was a menace to democracy felt vindicated, posting links to both Haugen’s testimony and the flood of documents she released.
However, not everyone was taken in by Haugen’s apparent contrition on behalf of her former employer. Journalist Glenn Greenwald reminded her growing fan base that her well-heeled whistleblowing campaign was financed by Russiagate-loving billionaire Pierre Omidyar.
Omidyar funded Greenwald’s former employer, the Intercept, supposedly to make NSA whistleblower Edward Snowden’s documents available to the world. More than eight years later, just 5% of that material has been seen by the public.