Does the #FBrape campaign challenge our freedom of speech? Are feminists censoring the internet? Soraya Chemaly, one of the founders of the campaign, gives her insight into the issue.
Image: Feminist Fatale by Steve Rhodes CC BY-NC-SA 2.0
Last week I, along with Jaclyn Friedman of Women, Action and the Media and Laura Bates of Everyday Sexism, led a movement challenging Facebook's content moderation policies. Facebook responded by acknowledging that it had failed to deal adequately with misogynistic content depicting violence against women, and outlined the steps it would take to change a cultural tolerance for such violence. The campaign, which involved raising awareness and asking advertisers to boycott the company until it acted in accordance with its own terms and guidelines, is notable as a rare public acknowledgement that misogyny and sexism are real and harmful. Corporations like Facebook have a responsibility to treat hate based on gender in the same manner as other forms of hate speech.
Many people are saying this is a case of feminists censoring the Internet. I'd like to address this head-on and explain why it is not the case. As a feminist and a writer, I understand free speech and hold it dear, but two issues are being conflated in the concern that #FBrape, the name of the campaign in social media, will reduce speech. The first is: how does Facebook regulate speech on its service? The second is: SHOULD Facebook be regulating speech?
Our initiative dealt with the first question: how is Facebook regulating speech? Facebook clearly regulates speech already - it has a moderation policy and a detailed reporting and review process. The problem is that it was not applying these processes in a way that treated girls and women fairly and equally. That was the issue addressed in our campaign.
Pages with names like "Raping Your Girlfriend," along with text and images from popular rape memes depicting about-to-be-raped, incapacitated girls, were easily found. Pictures and videos of girls and women frightened, humiliated, bruised, beaten, raped, gang raped, bathed in blood and, in a recently publicized case, beheaded were "liked" by tens of thousands. In a milder example that went viral through our campaign, Facebook declined to remove an image of a woman, mouth covered in tape, with a caption that read, "Don't tap her and rap her. Tape her and rape her." Facebook's response to users who reported it read, "We reviewed the photo you submitted, but found it did not violate our community standards."
Content like this defied reason and Facebook's own terms, which prohibit posts that "attack others based on their race, ethnicity, national origin, religion, sex, gender, sexual orientation, disability or medical condition."
Facebook "censors" content every day already. The company had in place the formal language of a reasonable content policy geared toward ensuring users' safety, but it was not implementing that policy effectively. This failure disproportionately affected girls and women. That is why we demanded that the company reassess its definition and interpretation of "hate speech" and train moderators to recognize that violence against women is a real problem and, when graphically represented in the ways we found, hateful and threatening.
This content isn't "offensive". The photographs and videos we found depict gross human rights violations for the cruel use, entertainment and profit of others. The offense is that these depictions are considered funny or controversial.
Facebook is not "the internet." We chose it because it had content and community guidelines. The company, with more than a billion users, is an influential force. It is both a mirror and a microcosm of a global culture. As such, it is no more or less sexist or misogynistic than any other company or aspect of media. However, by creating a review process it became an arbiter of norms and provided a way to challenge those that encourage and perpetuate gross and easily demonstrable prejudices against girls and women. We are hopeful that this is a first step in making safer spaces both online and off.
The question of WHETHER Facebook should have a content moderation and review process is an entirely separate one.
Soraya Chemaly is a cultural critic and feminist activist. Her work and writing focus on the role of gender in culture, media, politics, religion and more, with an emphasis on the role that sexualized violence plays in sex-based prejudices and gender inequality.
ORGZine: the Digital Rights magazine written for and by Open Rights Group supporters and engaged experts expressing their personal views
Jun 17, 2013 at 07:45 PM
But isn't that the question, WHETHER Facebook should censor or not? You seem to say "this is not a censorship issue, because Facebook censors anyway, and we just want to exercise our right to get on the bandwagon." You thus fail to address the free speech issue altogether. Also, have you thought about the negative consequences of censorship [of human rights abuses]? Should we pretend we live in a perfect world, and shove all things uncomfortable under the carpet? Then how will things get better, how will you raise awareness of the ACTUAL issue - which is the actual abuse, not the joking about it? I think your arguments are not straight and that you're shooting yourself in the foot. Just look at the amazing awareness that's been raised with all this fuss - NONE of which would have happened if Facebook already efficiently censored everything you find offensive - and all the while domestic abuse would continue unabated, not to mention more behind closed doors, shielded from the "perfect" world of Facebook.