As noted by Jill, thanks are owed to Women, Action, and the Media; the Everyday Sexism Project; and Soraya Chemaly, as Facebook has agreed to remove the kind of content that celebrates violence against women and has been heretofore brushed off as “crude humor.”
In recent days, it has become clear that our systems to identify and remove hate speech have failed to work as effectively as we would like, particularly around issues of gender-based hate. In some cases, content is not being removed as quickly as we want. In other cases, content that should be removed has not been or has been evaluated using outdated criteria. We have been working over the past several months to improve our systems to respond to reports of violations, but the guidelines used by these systems have failed to capture all the content that violates our standards. We need to do better — and we will.
The full statement is much longer than this excerpt and should be read in full, if you have the time, for a clearer look at Facebook’s stated position on hateful content and content moderation. The steps they plan to take, effective immediately, include reviewing and updating their operations team’s guidelines for hate speech, educating their teams to review and evaluate hateful content, increasing the accountability of creators of “cruel or insensitive” content (that does not qualify as hate speech), and establishing lines of communication with and between women’s groups and national and international working groups to better understand and address the impact of online hate speech.
As an interesting resource, last February Gawker posted guidelines provided by Facebook to its content moderation teams for identifying objectionable content. Leaked by a disgruntled employee, the “Abuse Standards Violations” include prohibitions of “naked ‘private parts’ including female nipple bulges” that have gotten breastfeeding pictures removed and posting privileges revoked.
The “Hate Content” standards specifically prohibit racial slurs, “photos comparing two people side by side,” and photos of unconscious people with things drawn on their faces, and their “Graphic Content” standards specifically prohibit photos of mutilated people or animals, photos of ear wax (I’m quite serious), and “crushed heads” if the innards show; but while said standards also ban “content showing Poster’s delight in … violence against humans or animals for sadistic purposes” and “violent speech,” apparently photos depicting abused women and speech extolling what has been and should be done to them still make the cut. (Forbidden, however: “Blatant (obvious) depiction of camel toes and moose knuckles.”)
Contractors all over the world (the leaker worked in Morocco for a California-based company) work for $1 an hour plus commissions to review reported content.
Obviously, that documentation gets far more specific than the standard Facebook Community Standards, and updates to that documentation have been marked “Proprietary and Confidential.” It will be interesting to see how much transparency is involved in Facebook’s review and implementation of policy.
Congratulations again to WAM! and partners.