By Ruth Vitale

In November 2018, reeling from growing public outcry over toxic and harmful content on its platform, Facebook pledged to create an independent Oversight Board to assist with its most important content moderation decisions.

More than 18 months later, the Board has materialized, with 20 inaugural members in what will ultimately be a 40-person group tasked with nothing less than helping to “answer some of the most difficult questions around freedom of expression online: what to take down, what to leave up, and why.”

The Board is intended to deliberate over Facebook’s most challenging content decisions, cases in which leaving something up or taking it down can be the difference between a safer and a more dangerous internet. We now know that harmful content on Facebook has a staggering capacity to destroy. The Board will be moderating content that has been shown to stoke genocide, threaten the sanctity of elections, and even undermine democratic government.

Why didn’t Facebook create an Oversight Board years ago? Probably because it has had no legal obligation to do so – thanks to the broad “safe harbors” granted by laws written in the late ‘90s, before there was a Facebook. Those laws continue to shield Facebook from liability for any content on its platform, so its only incentive has been to sit back and collect profits from every post.

Times have changed, of course, and the toxicity of the online environment is now evident to everyone. But if Facebook had been paying attention, it would have seen an early indicator of the rot at the heart of its business model: online copyright infringement.

Piracy on Facebook was the proverbial canary in the coal mine, illustrating exactly what can happen when the rules that govern the physical world no longer apply on the internet. Since the mid-to-late ‘90s, when illegal MP3 downloads gutted record sales and nearly destroyed the music industry, digital piracy has been a way of life, normalized by platforms that look the other way, claiming they have neither the power nor the obligation to make it stop. As piracy has moved from downloads to streaming, that complacency has helped create a $29 billion annual loss for the U.S. economy, along with between 230,000 and 560,000 lost jobs. Additionally, 30 percent of piracy websites are deliberately laced with malware. Their operators bait millions of users with illegal content so they can perpetrate identity theft, steal bank and credit card information, and inflict other harms.

Worse still, these devastating losses represent only a fraction of the societal and economic harm caused by irresponsible digital platforms like Facebook. They feed xenophobia, racism, and hate speech to the world. They have distributed global livestreams of mass shootings. They pump out a steady stream of anti-Semitism, public unrest, sex trafficking, child pornography, and other horrors.

Now these problems are all going to land in the lap of the new Facebook Oversight Board. Its members would be well-advised to study how the out-of-control piracy of past years paved the way for the unrelenting parade of problems they must confront today. That is why it is important to assess the backgrounds of the Board members as they are appointed.

At first blush, the 20 inaugural Board members bring diverse backgrounds, cultures, opinions, and beliefs to the discussion. Member Tawakkol Karman, for instance, is the first Arab woman to win a Nobel Prize, honored for her nonviolent work for peace and women’s rights in Yemen. Member Michael McConnell is a lawyer whose expertise lies, in part, at the intersection of religion and the law, and member Helle Thorning-Schmidt is the former Prime Minister of Denmark.

There are lots of lawyers and politicians. But very few appear to have any creative experience.

The Board must have members with firsthand knowledge of how creative work gets made. Silicon Valley has long had a blind spot about the blood, sweat, and tears that go into the creative process. Currently, only one Board member comes anywhere close to fitting the bill: Alan Rusbridger, the former Editor-in-Chief of The Guardian newspaper, who is also an occasional screenwriter and children’s book author and might have some ideas about the importance of creative people in the digital age.

Aside from Rusbridger, only two other current Board members have biographies suggesting expertise in intellectual property, content regulation, or intermediary liability – and only as a legal matter, not as a business or cultural one.

One is “a lawyer specializing in technology, intellectual property, media and public policy” who is dedicated to preserving an “open and free internet.” We hope that phrase is not typical Silicon Valley code for platforms shirking their responsibilities. The notion that almost any form of moderating content is akin to government regulation of speech has been a huge stumbling block to reducing online abuse.

Another member, whose background suggests familiarity with the creative economy, has interesting ideas about the ways social media companies govern our lives, but some of his other writings betray an outright hostility to copyright protections.

The Oversight Board, according to its own co-chairs, will focus on some of Facebook’s most challenging issues. Some of them will require deep legal, moral, ethical, and philosophical analysis and will probably keep Board members up at night. Other issues should be more straightforward: cases where the nature of the harm is clear and the argument for a stronger takedown policy is truly compelling.

With creative industries now suffering catastrophic job losses and piracy on the rise in the era of COVID-19, the more than 5.7 million Americans who work in the creative communities desperately need a Board that understands and protects their rights and their livelihoods. We will be watching its decisions, and its future appointments, very carefully.