Howard Law

Whistleblower rips the lid off Facebook’s Online Harms

Somebody had to say the emperor had no clothes. In the end it was Frances Haugen, a 37-year-old career data scientist from Iowa who worked for Facebook’s version of Internal Affairs, the Civic Integrity Unit.

What she found at Facebook should not have surprised anyone: her understaffed unit was hopelessly incapable of dealing with the gusher of dangerous content posted to Facebook and Instagram, including revenge porn, incitement to murder and political violence, political polarization, and vaccine disinformation. Meanwhile a white-listing protocol exempted celebrities and elected politicians from Facebook’s few rules on harmful posts.

Senior Facebook leadership, including CEO Mark Zuckerberg, deflected and suppressed the Integrity Unit’s reports, even keeping their celebrated Oversight Board in the dark. Eventually Zuckerberg disbanded Haugen’s unit and reassigned its self-policing responsibilities to operational divisions of the company.

Haugen was the source of thousands of internal documents proving her point, information that, once leaked, formed the basis of investigative news reports in the Wall Street Journal and CBS’s 60 Minutes, and has now become the key allegations in a series of complaints she has filed with the US Securities and Exchange Commission. In the name of protecting investors in a publicly traded company, the SEC and Congressional sub-committees are about to give Facebook a political migraine that just won’t quit.

For political news junkies and journalists, the significant chapter in this story is the 2018 tweak that Zuckerberg made to Facebook’s News Feed algorithm.

Zuckerberg was worried about stagnating growth in audience time spent on the platform, which directly impacted advertising revenue for the $1 trillion company. The algorithm tweak pushed down less sensational news content and pushed up clickbait-style “engagement” content that earned likes, comments and shares. Whatever got an emotional response, says Haugen.

As political actors and news publishers discovered soon after, the Zuckerberg tweak fed an appetite for uploading polarizing content, especially on political issues and campaigns. According to Haugen, political parties warned Facebook that the new algorithm was driving them to emphasize divisive and polarizing content in order to compete for voter attention on Facebook.

Haugen was also mortified by Facebook’s role in developing countries, where the incitement of political violence and genocide, and the open recruitment of assassins by drug cartels, were running unchecked, either because of inadequate staffing or because the moderators did not understand the language of the posts.

Another of her complaints exposes Facebook’s whitelisting practices, under which posts flagged as problematic by its AI-driven monitoring program earned celebrities and incumbent politicians preferential treatment on take-downs or warnings.

There is no shortage of outrageous whitelisting anecdotes to choose from. There was the deliberate delay in removing soccer star Neymar’s revenge porn posting in retaliation against an accuser, causing her nude images to be shared 54 million times. Zuckerberg personally gave race-baiting US President Donald Trump a free pass on inciting murder with his 2020 post “When the Looting Starts, the Shooting Starts.”

And Haugen found a way into almost every American parental conversation, revealing Zuckerberg’s plan to mitigate an aging Facebook audience by onboarding young girls to its Instagram app. The Civic Integrity Unit warned Zuckerberg that the celebrity-driven culture of perfected body images was fueling teen eating disorders and depression. He did nothing.

For Canadians who suffered through the C-10 Netflix Bill debate this spring, note the key take-away from the whistleblower’s revelations: it’s Facebook’s algorithm that decides what content gets pushed or buried on its platforms, for better or worse. The much-denounced provision in C-10 requiring Netflix to tweak its algorithm in favour of Canadian content seems benign by comparison.

The whistleblower’s story has legs and, in Canada, we can expect it to find its way into the discussion of the Liberal government’s pending legislation to regulate Online Harms, promised by Christmas.

Haugen’s complaints illustrate in a most public way the prevalence of severe online harms and Facebook’s lousy job of dealing with them.

But they also raise the important public policy questions of how to regulate a wide-open global communications platform full of the best and worst of what humanity has to offer.

More on that in our next blog.
