ProPublica has published documents used by Facebook to train its moderators – or "content reviewers," as the company calls them – on what posts violate the site's policies on prohibited speech.
The documents reveal that the social network has prioritized the speech of the most privileged groups while censoring activists and ethnic minorities. These decisions are made, ProPublica reasons, in the interests of Facebook's business prospects over the expression of its users: "[T]hey serve the business interests of the global company, which relies on national governments not to block its service to their citizens."
Facebook's rules for acceptable speech were built into a quiz for moderators, which ProPublica recreated on its site. Some questions are unambiguous about whom Facebook protects: one asks "Which of these subsets do we protect?" with the multiple-choice answers of black children, female drivers, and white men. The answer at the bottom of the slide: white men.
"White men" are protected because at Facebook they fall under a "protected category." Facebook will delete any content perceived as an attack or threat on these categories, which are "based on race, sex, gender identity, religious affiliation, national origin, ethnicity, sexual orientation and serious disability/disease."
This structure is part of a one-size-fits-all approach to expression, according to Monika Bickert, head of global policy management at Facebook, who told ProPublica, "We want to make sure that people are able to communicate in a borderless way." While catchy, this philosophy does not account for discrimination and marginalized communities in the real world, said Danielle Citron, a law professor and information privacy expert at the University of Maryland. "Sadly, [the rules are] incorporating this color-blindness idea which is not in the spirit of why we have equal protection," she said.
Facebook's rules "protect the people who least need it and take it away from those who really need it," Citron said.
Dave Willner, who wrote a comprehensive list of content guidelines for Facebook, said that the site's rules are "fundamentally not rights-oriented" and admitted that they were "mildly upsetting." However, he argued that because of the high volume of content moderators must scan every day, the guidelines must be "more utilitarian than we are used to in our justice system."
These rules, however, are not hard and fast: As The Wall Street Journal revealed last year, Facebook creator Mark Zuckerberg personally intervened to keep the site from deleting Donald Trump's campaign proposal to ban Muslims, even though such a statement violates Facebook's terms.
Facebook has been routinely criticized by black activists and Palestine-based media outlets, whose posts are frequently deleted and whose accounts are locked, while far-right-sympathizing groups go untouched.
"As a black person, we're always having these discussions about mass incarceration, and then here's this fiber-optic space where you can express yourself," Stacey Patton, a journalism professor at historically black Morgan State University in Baltimore, told ProPublica. "Then you say something that some anonymous person doesn't like and then you're in 'jail.'"