Protecting us from what? Why do Facebook’s community guidelines enforce myopic standards

Calling someone racist.  Sharing the cover of an LGBTQ novel entitled Faggots, linking to news stories about “the plight of the Kurds in Syria,” posting an image of a woman in a fully opaque but “flesh-colored” bodysuit, sharing a link to a New Yorker article on the poet Sappho.

All pretty terrible, right?

How about posting ads for adoptable animals, asking for a “white male ban” after the Parkland shooting, sharing an image of a woman in booty shorts, referring to oneself by a slur, and notoriously: writing that “men are trash.”

These are a small sampling of the offenses that have landed users in what has come to be known as “Facebook jail,” where accounts on the site are banned from posting, commenting and using the Messenger app. These bans can last from 24 hours to 30 days, or even be permanent. Users may not be told what rule they violated, and there is no transparent process for appealing these decisions.  

And the situation just got much, much worse.

In April 2018, Facebook, the massive social media platform boasting over 2 billion users, made its Community Guidelines public for the first time. Violations are broken down into six categories: “Violence and Criminal Behavior,” “Safety,” “Objectionable Content,” “Integrity and Authenticity,” “Respecting Intellectual Property,” and “Content-Related Requests.”

In November 2018, founder Mark Zuckerberg published a long post on governance, and a report was released on how the guidelines are enforced. This peek behind the curtain revealed that yes, Facebook monitoring is as bad as you thought.

Community standards were also updated in November, expressly banning conversations about “sexual preferences” and “suggestive content.” Users can face a ban for describing sexual roles, referencing masturbation, using sexual slang, identifying partner preferences or even making vague statements like “looking for a good time tonight.”

Facebook has a long history of curtailing expression, especially when it comes to sexuality. “We were really aggressive about saying we are a no-nudity platform,” the site’s first general counsel, Chris Kelly, told ProPublica.

Created in stark contrast to the personalizable format of MySpace, Facebook has always reflected its buttoned-up Ivy League origins and its cultivation of “community.”

On Facebook, the most likely targets of sexual content bans are groups already on the margins of culture. Sex workers, members of kink communities, and LGBTQ people are disproportionately impacted by these rules. Given that only “female nipples” are outlawed, women are far more likely to be banned for nudity. Ironically, because reporting violations is done anonymously and without consequence for the reporter, the guidelines actually encourage bullying and trolling.

 

The situation is made worse by the fact that Facebook can’t even enforce these prudish standards effectively. In his Nov. 2018 report on governance, founder Mark Zuckerberg admitted a 1-in-10 rate of false positive flags. That is, for every 10 bans placed on users, at least 1 was in response to a post that didn’t actually violate any rules. While the guidelines claim exceptions for posts conveying educational value or artistic depictions of nudity, users are routinely banned for these “violations,” sometimes permanently and without recourse. Popular pages have been unpublished without warning or explanation, including those of educators, activists, and, ironically, even a chapter of the ACLU for a post about censorship.

It’s not just free expression of sexuality at stake. Facebook also has a dangerous policy toward hate speech, punishing users without regard to context. Internal documents uncovered by ProPublica reveal that while disparaging remarks about subsets of populations, like “female drivers” or “Black children,” are not in violation of Facebook rules, a comment about a “protected category” like “white men” would result in a ban. Ironic use of slurs is treated the same as overtly racist or sexist language, but Holocaust deniers are allowed.

In his report, Zuckerberg also described intentionally curtailing the reach of controversial posts, which he refers to as “borderline,” to keep them from going viral. And while he stated a continued priority of removing fake accounts, Facebook still struggles to manage the proliferation of fake news in users’ feeds. Whether by design or not, the result of this policing system is a preference for specific political views: the values of powerful governments are privileged over those of grassroots groups, and those of cultural elites over minority voices.

While the community guidelines claim to be about making Facebook safe for everyone, enforcement does not appear to be based on the needs or experiences of users. In the November 2018 report, Guy Rosen, Facebook’s vice president of product management, described how enforcement of violations has “improved” over the years: Facebook is able to take down more and more objectionable content through use of automated systems.

What he fails to point out is that while users are more likely to report violent content as objectionable, the site is more likely to take action against posts with nudity. The members of the community are apparently not the ones who have a problem with the suggestive material from which they are being protected.

Facebook has also failed to address some of the most pernicious forms of sexual harassment on the site. Women who consensually post images of themselves may be subject to a ban, depending on the capricious whims of the algorithm, but men who send unsolicited dick pics via Messenger face only muting, blocking, or a sternly worded rebuke from the recipient.

In the era of FOSTA-SESTA, the recent legislation that has inflamed sexual censorship across the internet, we can expect more of this. Facebook is one of many social media platforms cracking down on sexual content while failing to address the hate groups that make use of its site.

Users can adapt through preemptive obedience and self-censorship or by exploring new options for connection. It might be time to go back to carrying business cards.

TWITTER: @TIMAREE_LEIGH

Timaree Schmit is basically an episode of Adam Ruins Everything, but in the shape of a person. She has a PhD in Human Sexuality Education and years of experience in community organizing, performance art, and finding the extra weird pockets of Philly.