YouTube and Twitter have struggled to keep up with users posting video of the beheading of James Foley, the American journalist murdered by members of the Islamic State group (ISIS). Their crackdown, which included suspending accounts and frantically deleting new videos as they surfaced, shows that social media networks have once again become the preferred means of communication for terror organizations.
The Foley family has asked people not to click on or share any links to the video of the 40-year-old’s final moments, but questions remain about how each company enforces its rules and policies, especially when it comes to regulating newsworthy images that come out of war zones. Twitter removed images of James Foley at the request of his family.
We have been and are actively suspending accounts as we discover them related to this graphic imagery. Thank you https://t.co/jaYQBKVbBF
— dick costolo (@dickc) August 20, 2014
Both YouTube and Twitter have policies that prevent users from posting graphic images of violence, with YouTube generally giving news organizations greater leeway than individual users. The problem, according to Jillian York, the Director for International Freedom of Expression at the Electronic Frontier Foundation, is that the guidelines aren’t enforced equally.
“This issue I have is the inconsistency of their rules,” she told the International Business Times. “There’s been lots of graphic, violent content coming out of Syria over the past few years. YouTube tends to allow content that has educational or documentary qualities, but I think they’re treating it differently because he’s American.”
York said that she understood why the Foley video was removed, adding that it’s difficult to picture a perfect solution – especially when so many Internet users seem to have no qualms about passing graphic content on to their own followers.
'An Ethical Question, Not a Legal One'
Zaid Benjamin, a correspondent for Radio Sawa and the first journalist to notice the video online, was among those who had his Twitter account suspended, although he didn’t link to the video. Benjamin told Foreign Policy magazine that Twitter provided no explanation as to why his account was suspended, and admitted that tweeting out still images from the video cost him 30,000 followers.
“Most news organizations would think about this as an ethical question, not a legal one,” York said. “But clearly in this day and age many media companies see this differently if these things are on YouTube… My normal line of thinking is that private companies should not regulate free speech at all because there’s so much collateral damage.”
US lawmakers have previously called on social networks to take down accounts belonging to US-designated terrorist groups. In 2012, during the last flare-up between Israel and Hamas, seven Republicans from the House of Representatives demanded that Twitter remove accounts belonging to Hamas, Hezbollah, and Somalia’s al-Shabaab.
“Allowing foreign terrorist organizations like Hamas to operate on Twitter is enabling the enemy,” Rep. Ted Poe (R-Texas) said in a statement to the Hill at the time. “Anti-American foreign terrorist groups around the world are doing the same thing every day. The FBI and Twitter must recognize sooner rather than later that social media is a tool for the terrorists.”
No Legal Requirement
Social networks like Twitter may exercise editorial control over posts like Benjamin’s based on their developing ethical considerations, but they have no legal requirement to do so, according to Eugene Volokh, a law professor at the University of California, Los Angeles, and creator of “The Volokh Conspiracy” law blog. Volokh said no American law requires social networks to remove posts – even those as extreme as the one depicting Foley’s execution.
“American law currently does not even support requiring social networks to remove material, even if [Twitter] sees them as threatening, encouraging violence, or voicing support for terrorists. Just like a telephone company is not required to cut off service because somebody is using that service to post bad things,” Volokh said.
Just as Google is not required by American law to block access to terrorist websites, Volokh said that in his opinion there was a “good reason” these companies were free to allow graphic material like the video featuring Foley.
“You don’t want these services to be in a position where employees feel like they are in danger of going to prison over a message, because then they’ll over-censor,” Volokh said. “The moment that some activist, or politician says ‘I think this material is inciting violence and I’m going to complain to the prosecutor.’ [Employees] might feel that this is too much of a risk, and as a result take down material that is not aiding terrorists because they don’t want to have to go to court."
The execution video has value to proponents of free speech because it helped illustrate the true nature of terrorist organizations, Volokh said.
“It shows what we’re fighting against and why we’re fighting it,” Volokh said. “On one hand, it might be a recruiting tool for the bad guys; on the other, it might be an educational tool for the good guys.”
Tom Halleck contributed additional reporting to this story.
Correction (8/21): An initial version of this story stated that Jillian York "agreed" with Twitter's decision to remove the video. She instead said she understands why the video was removed, adding that content should never be removed by private companies.