Charlottesville
A protester wears a sign reading "Fight white supremacy" at a protest against white nationalists in New York City, the day after the attack on counter-protesters at the "Unite the Right" rally organized by white nationalists in Charlottesville, Virginia, Aug. 13, 2017. Reuters/Joe Penney

Following the Charlottesville attack, which highlighted the surge of violence among white nationalist groups in America, tech companies are weighing the ramifications of letting hate groups use their platforms and services. After GoDaddy de-registered the domain of the racist site the Daily Stormer, and gaming chat app Discord shut down accounts associated with the attack, social networking sites Facebook and Reddit have started targeting hate groups as well.

The social networking companies confirmed to CNET on Tuesday they would ban groups that violate their hate-speech policies. The Charlottesville attack drew widespread condemnation; however, some far-right and neo-Nazi groups celebrated it on social platforms. Those posts prompted the social networking companies to take stricter action against users inciting violence on their platforms.

The subreddit r/physical_removal came under scrutiny from Reddit, which flagged both its moderators and the people posting in its threads.

While both Facebook and Reddit have a history of racist content being posted on their platforms, the recent attack has pushed both to step up vigilance against these groups. Facebook removed the Charlottesville "Unite the Right" event page and all links to an article on the racist website the Daily Stormer attacking Charlottesville victim Heather Heyer, except those shared in order to condemn the article.

While the Daily Stormer itself has gone offline following domain de-registrations by Google and GoDaddy, Facebook has banned a number of far-right groups, including:

  • Red Winged Knight
  • Awakening Red Pill
  • Genuine Donald Trump
  • White Nationalists United
  • Right Wing Death Squad
  • Awakened Masses
  • Vanguard America
  • Physical Removal

A Facebook representative told CNET the company will actively target hate groups and attempt to stop them from organizing on the platform. Echoing Facebook's sentiments, Reddit said in a statement to the publication: "We are very clear in our site terms of service that posting content that incites violence will get users banned from Reddit. We have banned /r/Physical_Removal due to violations of the terms of our content policy."

While the banning of such groups has drawn criticism from some who see it as an attack on free speech, incidents such as the Charlottesville attack have made it necessary for social networks to take action against hateful or racist content.

While hate groups may simply migrate to other platforms, removing them from public ones such as Facebook makes them harder to find and access.

For instance, the Daily Stormer has moved to the Dark Web following its domain de-registration, according to Reuters. This means most people will not be able to reach it on the mainstream internet, though it will remain accessible to its dedicated users.

Such measures do not eliminate these groups, but they reduce public exposure to violent ideologies and push discussions like the one that preceded the Charlottesville attack out of public view.