By Ruth Vitale

Here’s a direct plea to Google and Facebook.

For too long, you’ve dragged your feet on harmful online conduct. It’s obvious that you don’t feel there are sufficient incentives (financial, legal, or PR) to make the systemic changes necessary to stop the spew of hate speech, fake news, foreign election meddling, radical terrorism, illegal drugs, gun sales, piracy, human trafficking, child sex abuse, and other crimes and misbehaviors on your mammoth global platforms.

Public shame doesn’t seem to work. In fact, while all of this ugliness continues to thrive on your platforms, you still portray yourselves as altruistic companies with noble missions to “connect the world” and “give people a voice.” You don’t want to acknowledge the fact that your user-generated content business models feed your rapacious appetite to capture and ransom consumer attention and data – and that you show little regard or responsibility for the bad behavior your platforms empower.

Current law doesn’t seem to work. For decades, Section 230 of the Communications Decency Act (CDA) has spared you from being treated as the publishers of content from your users – which is exactly what you are. Meanwhile, the Digital Millennium Copyright Act (DMCA) lets you dump the huge burden on creatives to chase after the pervasive piracy on your platforms, forcing them to file takedown notices ad nauseam.

Together, these two “safe harbor” laws form a virtually impenetrable legal shield from accountability for what appears on your platforms, no matter how unsavory or criminal. Congress is taking a hard look at the flaws in these laws, but that will take time. 

Meanwhile, when called upon to explain yourselves, you do say that you’d like to do more to address abusive activity on your platforms. Yet you’ve been overlooking a critical tool over which you have complete control.

At a recent congressional hearing on “Fostering a Healthier Internet to Protect Consumers,” Dr. Hany Farid, an outspoken critic of your industry’s shocking inability to curb child abuse, gave you the answer: “You should just enforce your terms of service.” Indeed, “[t]he terms of service of most of the major platforms,” he explained, “are actually pretty good, it’s just that [the platforms] don’t do very much to enforce them in a clear, consistent, and transparent way.”

Dr. Farid’s argument about enforcing terms of service was a key point of consensus at the hearing, including among some of the strongest apologists for the platforms. Several members of the Congressional committee echoed it. Even Corynne McSherry, legal director for the Electronic Frontier Foundation, a notorious defender of Google (and taker of its money), agreed with Dr. Farid, saying “that it would be nice if [the internet companies] would start by clearly enforcing their actual terms of service.”

Let’s look at some of those terms of service. And let’s start with you, YouTube. 

Your terms of service couldn’t be clearer – they require that the user commit that any content they upload to the site “will not contain third party copyrighted material, or material that is subject to other third party proprietary rights.” Furthermore, they require that none of the more than 300 hours of video uploaded to your site every minute be “contrary to the YouTube Community Guidelines” – guidelines, we might add, that forbid “harmful or dangerous content,” “hateful content,” content involving “child endangerment,” and “videos that someone else owns the copyright to,” among other harms.

“YouTube may at any time,” your terms of service continue, “without prior notice and in its sole discretion, remove such Content and/or terminate a user’s account for submitting such material in violation of these Terms of Service.”

That all seems pretty clear. That’s why it’s shocking, YouTube, that your site remains an active hub of woeful activity such as piracy, illegal drug sales, and “a lot of hate.” As Dr. Farid said at the hearing, “If a reasonable person can find this content, surely [YouTube’s parent company] Google, with its resources, can find it as well.” So, why aren’t you and Google doing more to put a stop to this, especially since your terms of service explicitly reserve your right and imply an intention to do so?

And, what about you, Facebook? Although you are taking some steps to better moderate content, they are clearly not enough, given the sheer amount of harm being perpetrated on your networks.

Your CEO, Mark Zuckerberg, has been making the rounds on Capitol Hill, peddling claims about his company’s role in shaping society, and history, for the better. He gave a speech about Facebook’s recent decision to stop fact-checking political advertisements, wrapping Facebook in the cloak of free speech, and invoking civil rights heroes such as Martin Luther King, Jr. and Frederick Douglass.

Not surprisingly, Zuck got a lot of flak for those misplaced comparisons. No reasonable person would conflate the motives of Dr. King and Mr. Douglass with those of Facebook, a massively profitable social media goliath. Zuck’s not a hero, he’s a profiteer.

So, Facebook, prove us wrong. Try more effective enforcement of your existing terms of service to help promote a healthier ecosystem. Instead of hiding behind the First Amendment (which you, as a private actor, are not bound by) and the reputations of great civil rights leaders, show that you will act on violations of your terms of service when clearly illegal conduct occurs on your platforms.

Facebook, like YouTube, tells all of its users that they must not violate its community standards, and that those who do will see their content removed or have their account terminated. Despite these clear warnings, you’ve been accused of underreporting complaints. Cyber pirates are using your Watch Party tool to host illegal movie marathons for thousands of viewers, and Facebook Groups are widely used for sharing stolen movies. Even more troubling, Facebook Messenger is responsible for nearly 12 million of the 18.4 million worldwide reports of child sexual abuse material, and hate speech continues to spread dangerously across your platform.

We also ask you not to hide behind the argument that “we’re too big” and “it’s too difficult.” Every week, you find new ways to innovate by assessing all the content on your platforms and how it’s being used, and you are amassing unprecedented advertising profits on that innovation. We have no doubt that you can apply that same innovation to identify and remove content that violates your terms of service… and to revoke the accounts of the violators.

Enforce your terms of service. Members of Congress think you should. So do some of your most avid public defenders. You have the right and the power and the ability to do so.  

Show us you have a sense of #PlatformAccountability. Just do it.