Published Date: 08/24/22

Meta gave its Oversight Board whiplash this spring when it abruptly withdrew a request for advice on content moderation following Russia’s invasion of Ukraine. Now we’re left to wonder, along with The Verge, “If the Oversight Board’s only role is to handle the easy questions, why bother with it at all?”

Meta can’t get content moderation right even on easy questions like piracy, as creative communities have long known. It’s ironic, because Zuck enjoys being compared to the Eye of Sauron – you know, the incorporeal manifestation of evil that relentlessly surveils Middle-earth in J. R. R. Tolkien’s The Lord of the Rings. If his android stare ever falls upon illegal or otherwise objectionable content, does Zuck fail to notice? Worse, might he actually like what he sees?

To show the enormity of Meta’s content moderation oversights, we’ve prepared an interactive quiz. Guess whether each of the following items describes posts that Meta leaves up or takes down – regardless of what the terms of service may say. If you, too, are outraged by the answers, then let’s agree it’s high time to destroy the One Ring – er, that is, to make certain that Meta has truly independent, effective oversight.

Meta’s Content Moderation Oversights – An Interactive Quiz

1. Terrorist propaganda

The correct answer is (A) Leaves up – it’s “engaging.”
Between April and December 2021, Politico found numerous examples of beheadings, hate speech, and similar content posted by supporters of the Taliban and the Islamic State group. They weren’t flying under the radar, either. The violence appeared on public Facebook pages, where it often bore promotional tags like “insightful” or “engaging.”

2. Ads promoting underwear for diverse female bodies

The correct answer is (B) Takes down – despite small business owners’ pleas.
While horrifying content slips past Facebook’s algorithms, they’re quite efficient at blocking ads from the UK company Pantee. The owners would like to depict their products and challenge unrealistic stereotypes of women’s bodies. Since the algorithm won’t allow it, they’ve been driven to showcase their designs on oranges, grapefruits, and watermelons. Meanwhile, their small business is suffering.

3. Holocaust deniers

The correct answer is (A) Leaves up – even after FINALLY deciding not to allow them anymore.
Facebook used to permit this type of hate speech, proudly touting its, um, leniency as an example of Zuck’s commitment to “giving people a voice.” Meta banned Holocaust denial in October 2020, but it continued to abound in 2021, earning Facebook a grade of “D” from the Anti-Defamation League.

4. Child predator rings

The correct answer is (A) Leaves up – and recommends.
A researcher from the University of Pittsburgh flagged numerous Facebook groups devoted to grooming children aged 9–13. While Facebook failed to act, its algorithm dredged up similar groups, asking if the researcher might like to join.

5. Death threats against a head of state

The correct answer is (A) Leaves up – until losing 80 million users.
Generally, people frown on calls for violence. Nevertheless, Meta decided to allow some death threats against Vladimir Putin and Russian soldiers after the invasion of Ukraine, according to leaked internal e-mails. In retaliation, Russia banned Instagram (Facebook was already banned), prompting a Meta executive to tweet a complaint on behalf of 80 million Russian users. Was he as worried about Russians’ isolation as about Meta’s bottom line? All we know is that by the next business day, Meta was claiming it had always banned threats against heads of state.

6. Upskirting videos

The correct answer is (A) Leaves up – they don’t violate any “specific Community Standards.”
At least that’s what Facebook told BBC journalists when they reported thousands of criminal voyeurism posts. After further correspondence, Facebook issued yet another sorry-not-sorry promise to do better.

7. Hate speech against ethnic minorities

The correct answer is (A) Leaves up. That’s why Rohingya refugees are suing.
Displaced Rohingya Muslims from Myanmar filed a lawsuit in San Francisco, asking to sue Facebook under Burmese law, which doesn’t shield Facebook from liability for user-generated content the way American law does. UN investigators have already found Facebook culpable for amplifying hate speech. Since thousands of Rohingya have died in the genocide, don’t you think survivors deserve at least the $150 million they’re requesting?

8. Cryptocurrency scam ads featuring celebrities

The correct answer is (A) Leaves up. That’s why billionaire Andrew Forrest and the Australian government are suing.
Individuals normally can’t bring criminal charges against companies, but Australia’s Attorney General granted special permission to mining executive Andrew Forrest, who had asked Facebook for years to stop his likeness from appearing in scam ads targeting fellow Australians. The next month, the Australian government initiated similar proceedings, noting that one scam victim lost $480,000.

Scorecard

So, how often could you distinguish the content Meta leaves up from the content it takes down? How well do you understand the Enemy?

0–2 Correct: Fool of a Took!

Like Pippin, you’re clueless about the evil in the world. Are you not paying attention? Have you never left the Shire? Well, even Pippin had a heroic destiny. After learning a thing or two, you could prove helpful on the quest against Meta’s oversights.

3–5 Correct: Playing at Riddles

With around fifty percent correct, you’re taking shots in the dark, much like Gollum when Bilbo asked, “What have I got in my pocket?” But oversight of the world’s largest social media platform is too important to leave to chance. After reviewing The Facebook Timeline of Scandal and Strife, please reapply to the Fellowship. We need all the help we can get.

6–8 Correct: Council of the Wise

Like Gandalf and Galadriel, you know Sauron’s mind and history well. Perhaps you avidly follow the news. Perhaps you have studied Meta yourself. The quest needs wise heads like yours. Just promise not to pull a Saruman, abusing your knowledge to take the Enemy’s place.

Thank you for playing the Meta Oversights quiz! Remember, we deliberately picked topics that should represent easy moderation calls. Meta clearly can’t be trusted with hard decisions, and sadly, the Oversight(s) Board is just a charade.

Our list of Meta’s failures is far from complete. We haven’t even mentioned Instagram’s pro-anorexia accounts or Facebook’s human trafficking pages, for example. We could write quiz questions forever, unfortunately.

An internet platform with this track record hasn’t earned its 1990s-era liability shields for user-generated content. It’s time for the U.S. government to take them away.