For some time now, a growing cadre of feminists, activists and women’s groups have been screaming into the digital well that is Facebook Inc. (NASDAQ:FB), trying to get the world’s largest social network to be more proactive in removing content that they say endorses violence against women and girls.
Graphic memes featuring images of battered women appear with regularity on pages with titles such as “Fly Kicking Sluts in the Uterus” or “Violently Raping Your Friend Just for Laughs.” It’s an ongoing problem for Facebook, one that gained wider attention late last year with the launch of the now-defunct Rapebook, a page set up to draw attention to -- and report -- pages that feature misogynistic content. Meanwhile, a Change.org petition demanding that Facebook remove pages that promote sexual violence has attracted more than 200,000 signatures.
Despite such protests, however, critics say Facebook still refuses to take the issue seriously. The company’s standard response has been that many of the objectionable images in question fall under the realm of humor -- distasteful humor, admittedly, but not hate speech. As its own Facebook Community Standards page states, the social network “does not permit hate speech, but distinguishes between serious and humorous speech.”
Activist and feminist groups are fed up with that position. On Tuesday, they published an open letter to Facebook, asking -- no, demanding -- that it take immediate action on “pages and images that explicitly condone or encourage rape or domestic violence or suggest that they are something to laugh or boast about.”
The letter was conceived by Jaclyn Friedman, executive director of WAM (Women, Action, & the Media), Laura Bates, founder of the Everyday Sexism Project, and Soraya Chemaly, a feminist writer and activist. It was posted on WAM’s website and undersigned by more than 50 organizations, including Equality Now, Hollaback and Ms. Magazine.
But the letter is more than just a list of demands. It’s the beginning of an effort that is part awareness campaign, part advertiser boycott. In a phone interview, Friedman said that decision came after it became apparent that hitting Facebook where it hurts was the only way to get its attention.
“We thought about who it is they really care about,” she said. “They clearly don’t care about their users, so we thought, ‘Well, maybe they care about their advertisers.’”
As part of the campaign, WAM has set up an Examples page featuring screenshots of Facebook ads appearing alongside objectionable content. In one shot, an American Express Co. (NYSE:AXP) ad can be seen next to a meme featuring a gagged woman and the phrase “Rape Her and Tape Her.” From the WAM website, Twitter users can tweet messages directly to Facebook advertisers under the hashtag #FBrape, which now comprises thousands of tweets by users asking companies to pull their ads from Facebook.
“We’re just trying to hold their toes to the fire until they pay attention,” Friedman said.
And some are paying attention. As of Friday afternoon, WAM said that seven companies -- including the Nissan UK unit of Nissan Motor Co. Ltd. (OTCMKTS:NSANY) and the privately held WestHost -- had agreed to pull their ads from the site. Others, such as the Zipcar unit of Avis Budget Group Inc. (NASDAQ:CAR), say they are investigating the matter.
Various settings allow Facebook advertisers to target specific demographics, but it’s unclear how granular those settings are. In response to the #FBrape campaign, some advertisers have said they simply have no control over which pages their ads appear on, as Facebook’s targeting mechanisms focus on “likes” and user demographics, not specific pages.
@everydaysexism you are right. The ad shows based on a users likes. If a user likes a shoe brand, it might be why it shows up.
-- Zappos (@ZapposStyle) May 24, 2013
From a user’s perspective, the mechanism for reporting objectionable Facebook content is easy. A drop-down arrow appears on every photo and page on the website, allowing users to submit a report with a single click. Facebook administrators investigate reports and decide if action is warranted.
But the results of that system are often confounding and inconsistent. Friedman said photos of women breastfeeding are routinely yanked for violating Facebook’s nudity guidelines, while photos of battered women and girls are left intact under the guise of humor. What she finds most frustrating is what she sees as Facebook’s refusal to recognize such content as gender-based hate speech, even as other types of hate speech on the site -- racial-based hate speech, for instance -- are promptly dealt with.
“It’s my understanding that they’ve been quite good at responding to hate speech in general,” she said. “They just don’t consider women a group worthy of those protections.”
Facebook says that’s just not true. A site representative said it doesn’t tolerate hate speech or content that incites violence against any group -- including women -- and pointed out that all the pages mentioned in WAM’s open letter have since been taken down for violating Facebook’s terms.
Friedman countered that the pages were removed only after the #FBrape campaign thrust them into the spotlight, and she said that other pages have remained intact even after they were reported. One such page is Offensive Trolls, which Friedman reported after it posted a graphic photo of a battered woman. The post included a rant in which the user bragged of hitting the woman with a truck and “choke-slamming” her.
Friedman said that, after she reported the page, Facebook responded saying it found that it did not violate the site’s hate-speech standards. The page was still up as of Friday afternoon.
Facebook said it does not comment on why specific pages are -- or are not -- taken down. But, in a statement, its representative reiterated the need for a distinction between hate speech and humor: “[A]s you may expect in any diverse community of more than a billion people, we occasionally see people post distasteful or disturbing content, or make crude attempts at humor. While it may be vulgar and offensive, distasteful content on its own does not violate our policies.”
Friedman said she understands that humor is subjective and that no topic -- including rape -- is completely off-limits. But she added that the difference between a joke and a threat is found by looking at where that humor is aimed.
“If you’re making fun of a rapist or trying to draw attention to an issue, that’s one thing,” she said. “But making jokes at the expense of a rape victim -- that’s just not acceptable.”