Published Date: 08-25-21
In April, Facebook was outed by Business Insider for leaking data from 533 million profiles to a hacking forum.
In crisis mode, the company sprang into action. Facebook immediately assumed complete responsibility for the disaster and executed a strong and effective action plan to mitigate damage and keep its users safe.
😂
We almost had you there, didn’t we?
No, Facebook obviously didn’t do any of that. Instead, having served up hundreds of millions of its trusting users’ phone numbers, email addresses, birthdates, and other personal information to scammers, phishers, and spammers on a silver platter, the platform resorted to its usual methods of evasion and distraction. It’s a PR crisis playbook we like to call “B.A.D.”:
- Blame: Facebook published a blog post that framed the incident not as a “leak,” but as “scraping” of publicly available data by bad actors – an unfortunate event that could have been avoided if only users would be better about making sure “that their settings align with what they want to be sharing publicly.” (Good luck with that.) Takeaway: Facebook is “focused on protecting people’s data” and the leak is your fault, loser… I mean, user!
- Avoid: Facebook announced that to make sure people were informed about the leak, it would go ahead and inform… no one. Instead, it left more than 530 million users spread across 106 countries to fend for themselves. All those robocalls you started getting because of the leaked information? Facebook says, “You’re welcome.”
- Deny: In response to the Irish Data Protection Commission (IDPC) investigation of the leak, Facebook claimed that the data was originally scraped before the EU’s General Data Protection Regulation (GDPR) took effect in May 2018. As such, the platform was under no duty to report the leak to the Commission. Ducking responsibility… it’s the Facebook way. We’ll see how that flies in the coming class action lawsuit.
After years of scandals involving privacy, piracy, and countless other harms, we have become all too familiar with these tactics. In fact, their cynical predictability has gotten a bit boring.
But this time around Facebook added an interesting new chapter to its playbook – one that could fundamentally change our perception of its PR efforts going forward: Normalize.
In an email sent in error to a Belgian journalist following news of the leak, a member of Facebook’s communications team wrote that not only does the company “expect more scraping incidents,” but it thinks it is “important to both frame this as a broad industry issue and *normalize* the fact that this activity happens regularly.” [Italics added by us.]
That’s right. Why fix the problem that allowed hackers to access personal data from hundreds of millions of profiles? Why worry about the next set of leaks? Instead, Facebook wants to force us all to just accept it as a regular part of doing business with a social media giant whose business model relies on privacy violation and the spread of misinformation and outrage.
This got us thinking. How else has Facebook applied this tactic across its long and terrible history of scandal and strife? At what other points has it tried to “normalize” its shameful behavior in a subtle or not-so-subtle attempt to get us all to just drop it and move on already?
We did a little digging and here is what we found:
Issue: Division and hatred stoked by a business model that can, as one former Facebook employee put it, “quickly lead users down the path to conspiracy theories”.
Normalization strategy: In a March presentation instructing employees how to respond to accusations of polarization, one Facebook research “scientist” framed political polarization and social division as a normal and healthy component of societal change. “If we look back at history,” he told employees, “a lot of the major social movements and major transformations, for example, the extension of civil rights or voting rights in this country, have been the result of increasing polarization.”
Takeaway: Sure, Facebook’s algorithm may put us all into filter bubbles that make any dissenting viewpoint feel like an assault on our belief system – but don’t worry, that’s just what happens when, as Bob Dylan once wrote, the times they are a-changing. The only question is, are we actually changing for the better?
Issue: Toxic content espousing deeply troubling views, such as Holocaust denial.
Normalization strategy: In a 2018 interview with Recode, Facebook CEO Mark Zuckerberg tried to frame Facebook’s refusal to regulate some of its ugliest, most painful content as a normal thing to do for a company dedicated to giving everyone a voice – even anti-Semites. “I’m Jewish, and there’s a set of people who deny that the Holocaust happened,” he told Recode. “I find that deeply offensive. But at the end of the day, I don’t believe that our platform should take that down because I think there are things that different people get wrong.” He proceeded to double down: “Everyone gets things wrong, and if we were taking down people’s accounts when they got a few things wrong, then that would be a hard world for giving people a voice and saying that you care about that.”
Takeaway: So, like, it’s a free country, man. Racists and bigots should get to say their piece, too. But Facebook also has the right to decide what it does or does not want to allow on the platform. It’s a private company. (Read: it obviously doesn’t want to decide.)
Issue: Censorship and the rise of authoritarianism.
Normalization strategy: In February of 2021, internal company emails obtained by ProPublica revealed that in 2018, the Turkish government threatened Facebook with a total shutdown in the country if it did not block posts from a Kurdish militia group using the platform to alert Syrian Kurdish civilians of impending Turkish attacks against them. In response, Facebook COO Sheryl Sandberg told her team, “I’m fine with this.” As in, she was fine with caving to the Turkish government’s demand so her company’s platform could continue making money in Turkey without interruption.
Takeaway: Sandberg’s use of the word “fine” is the real kicker here. “Fine” to assist an autocratic nation with its suppression of dissidents? Yeah, no big deal. Her normalizing tone clearly reflects the company line: Facebook has pulled similar moves in countries such as Cuba, India, Israel, Morocco, Pakistan, and Vietnam. So much for “giving people a voice”.
So, there you have it. Three blatant examples of Facebook using carefully crafted messaging to “normalize” behavior that misinforms us, divides us, and poses a threat to democracies around the world.
Now that you know about it, watch for it. Look for Facebook to craft language that is intended to spin the company’s worst misdeeds as run-of-the-mill procedures that “happen regularly.”
There’s nothing “normal” about Facebook’s normalization – nothing at all.