The Impossible Job: Inside Facebook's Struggle to Moderate Two Billion People

Aug 28, 2018, 06:20 PM

Vice

This spring, Facebook reached out to a few dozen leading social media academics with an invitation: Would they like to have a casual dinner with Mark Zuckerberg to discuss Facebook’s problems?

According to five people who attended the series of off-the-record dinners at Zuckerberg’s home in Palo Alto, California, the conversation largely centered around the most important problem plaguing the company: content moderation.

In recent months, Facebook has been attacked from all sides: by conservatives for what they perceive as a liberal bias, by liberals for allowing white nationalism and Holocaust denial on the platform, by governments and news organizations for allowing fake news and disinformation to flourish, and by human rights organizations for its use as a platform to facilitate gender-based harassment and livestream suicide and murder. Facebook has even been blamed for contributing to genocide.

These situations have been largely framed as individual public relations fires that Facebook has tried to put out one at a time. But the need for content moderation is better looked at as a systemic issue in Facebook’s business model. Zuckerberg has said that he wants Facebook to be one global community, a radical ideal given the vast diversity of communities and cultural mores around the globe. Facebook believes highly nuanced content moderation can resolve this tension, but it's an unfathomably complex logistical problem that has no obvious solution, that fundamentally threatens Facebook’s business, and that has largely shifted the role of free speech arbitration from governments to a private platform.

The dinners demonstrated a commitment from Zuckerberg to solve the hard problems that Facebook has created for itself through its relentless quest for growth. But several people who attended the dinners said they believed they were starting the conversation on fundamentally different ground: Zuckerberg believes that Facebook’s problems can be solved. Many experts do not.

The early, utopian promise of the internet was as a decentralized network that would connect people under hundreds of millions of websites and communities, each in charge of creating their own rules. But as the internet has evolved, it has become increasingly corporatized, with companies like Facebook, YouTube, Instagram, Reddit, Tumblr, and Twitter replacing individually-owned websites and forums as the primary speech outlets for billions of people around the world.

As these platforms have grown in size and influence, they’ve hired content moderators to police their websites—first to remove illegal content such as child pornography, and then to enforce rules barring content that could drive users away or create a PR nightmare.

“The fundamental reason for content moderation—its root reason for existing—goes quite simply to the issue of brand protection and liability mitigation for the platform,” Sarah T. Roberts, an assistant professor at UCLA who studies commercial content moderation, told Motherboard. “It is ultimately and fundamentally in the service of the platforms themselves. It’s the gatekeeping mechanisms the platforms use to control the nature of the user-generated content that flows over their branded spaces.”

To understand why Facebook, Twitter, YouTube, and Reddit have rules at all, it’s worth considering that, as these platforms have become stricter, “free speech”-focused clones with few or no rules at all have arisen, largely floundered, and are generally seen as cesspools filled with hateful rhetoric and teeming with Nazis.

And so there are rules.

Twitter has “the Twitter Rules,” Reddit has a “Content Policy,” YouTube has “Community Guidelines,” and Facebook has “Community Standards.” There are hundreds of rules on Facebook, all of which fall into different subsections of its policies, and most of which have exceptions and grey areas. These include, for example, policies on hate speech, targeted violence, bullying, and porn, as well as…