Published Date: 02-17-21

Whatever the fallout of the political violence that has already dogged 2021, one thing is certain: Facebook will be under more scrutiny than ever for its role in fomenting it.

The platform has shown in recent years that it is willing to “do something” to moderate hateful and illegal content – but it continues to dodge the kind of systemic fixes that would clear up the problem for good. Instead, it persists in affixing Band-Aids to a gusher and wallowing in a throw-up-your-hands mentality epitomized by recent comments from Facebook Vice-President of Global Affairs and Communications Nick Clegg, who wrote: “With so much content posted every day, rooting out the hate is like looking for a needle in a haystack.”

Clegg’s subtle feint toward the overwhelming scale of it all feels like victimhood – a discomforting line of thought from the same company that built the mammoth global haystack in the first place. In any case, his imagery doesn’t go far enough. “A more honest metaphor,” writes The New Yorker’s Andrew Marantz, “would posit a powerful set of magnets at the center of the haystack – Facebook’s algorithms, which attract and elevate whatever content is most highly charged. If there are needles anywhere nearby – and, on the Internet, there always are – the magnets will pull them in. Remove as many as you want today; more will reappear tomorrow. This is how the system is designed to work.”

These magnets do not make distinctions between truth and falsehood, between socially beneficial and harmful behavior. They do not care if content was originally created by someone else and disseminated without permission. They do not care if that content is bait for malware, or if it is misleading, hateful, or illegal. They seek only to push out what is engaging and thereby keep users glued to the service so that they may be profiled and monetized in a process that author Shoshana Zuboff has dubbed “surveillance capitalism.”
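To see why such ranking is blind to truth, consider a toy sketch of an engagement-ranked feed. Everything here is a hypothetical illustration: the field names, weights, and scoring function are assumptions made for the sake of argument, not Facebook’s actual, proprietary algorithm. The point is structural: nothing in the ranking logic ever examines what the content says.

    from dataclasses import dataclass

    @dataclass
    class Post:
        text: str
        reactions: int
        comments: int
        shares: int

    def engagement_score(post: Post) -> float:
        # Hypothetical weights: a simple weighted sum of interaction counts.
        # Note that nothing here inspects the text for truth, legality,
        # or provenance; it measures only how much reaction a post provokes.
        return 1.0 * post.reactions + 2.0 * post.comments + 3.0 * post.shares

    def rank_feed(posts):
        # The most "highly charged" posts rise to the top, whatever they say.
        return sorted(posts, key=engagement_score, reverse=True)

    feed = rank_feed([
        Post("Measured local news report", reactions=40, comments=5, shares=2),
        Post("Outrage-bait falsehood", reactions=900, comments=400, shares=350),
    ])
    print(feed[0].text)  # prints "Outrage-bait falsehood"

In this sketch, whichever post provokes the stronger reaction outranks the other; the scorer is indifferent to which one is true.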

But Facebook does not just watch us – it influences what we do and steers us in particular directions. It shapes our reality. It stirs our emotions and, inevitably, our actions – a vicious, self-perpetuating cycle that has proven many times over to have grave real-world consequences for society and democracy.

By hiding behind Section 230 of the Communications Decency Act, Facebook abdicates all accountability for facilitating misinformation and the illegal and hateful behavior of too many of its users. Recently, it has even distanced itself from the consequences of its own content moderation policies – by establishing a controversial Oversight Board.

Early in the process of developing the Oversight Board, Facebook tried to analogize it to the movie rating system. But as film and television producer Gale Anne Hurd explained at the time, the comparison is fatally flawed. Ratings systems – whether for movies, television shows, music, or video games – are applied before the content is released; by contrast, Oversight Board decisions could come, if at all, weeks or even months after a damaging post has already accrued millions of views. Moreover, almost every commercially released film bears a rating, while the Oversight Board will be highly selective in the number of reviews it conducts – a tiny sampling of Facebook’s “haystack” of user-generated content.

The Oversight Board will conduct an after-the-fact review of a handful of decisions by Facebook staff to take content down. It will apparently conduct no review of content that Facebook staff leaves up. The Oversight Board may treat a handful of symptoms while the disease continues to rage. And even these feeble steps come only in response to government or shareholder pressure.

Facebook must stop dodging. It needs to stop offering empty apologies. It needs to quit the ad hoc crisis management that follows each publicized incident and instead proactively address the underlying causes of those incidents. It needs to stop insisting, in those ever-present ads in Washington publications, that it “welcomes” changes to Section 230 without offering serious and effective ideas on what those changes should be and how they would make its platforms more responsible and accountable.

Rather than massaging its public relations, Facebook needs to fix its problems and, in the process, be more transparent. Above all, it needs to start owning the negative consequences of its business model.

Facebook pledges to “bring the world closer together” but until it reverses the polarity of its magnets, it will only keep pulling the world apart.