Discord, a popular chat platform for gamers, is shutting down servers and accounts associated with the "alt-right". The company announced the actions on Twitter following the protests and resulting violence in Charlottesville, Virginia.
According to Discord, users on the chat service reported messages from alt-right groups such as AltRight.com for violating Discord's terms of service, since Discord itself cannot read the private messages on its servers. “When hatred like this violates our community standards we act swiftly to take servers down and ban individual users,” the company said in a statement. “The public server linked to AltRight.com that violated those terms was shut down along with several other public groups and accounts fostering bad actors on Discord. We will continue to be aggressive to ensure that Discord exists for the community we set out to support — gamers.”
"Love. Not hate." pic.twitter.com/5xFpvHTuI2 — Discord (@discordapp), August 14, 2017
Following the ban, members of those servers were predictably outraged, with many claiming censorship and political bias. Others called for Discord to ban the servers of counter-protesters as well; the company promptly replied that those servers should be reported and would also be investigated.
The ban of alt-right accounts on Discord follows earlier news of GoDaddy and Google banning white supremacist site The Daily Stormer for posting a disparaging article about Heather Heyer, a 32-year-old woman killed in Charlottesville during the protests.
While many would view this series of events as an extreme reaction by companies, there has been a lot of pressure on social media networks in particular to clamp down on views deemed "hateful" or "extreme".
Facebook, Twitter, and YouTube have faced backlash for allowing extremist Islamic groups such as ISIS to use their platforms as a means to spread their ideology. Chat platforms such as Discord and even domain hosting companies such as GoDaddy are now trying to figure out how to deal with views considered extreme.
Last week, Google had to defend itself over a 10-page "manifesto" penned by former Google engineer James Damore that criticized Google's diversity initiatives and claimed conservative voices were suppressed and discriminated against. Although Damore was summarily dismissed from Google as a result of his manifesto, there were some within Google itself who quietly agreed with him but did not voice their opinions for fear of being fired. Furthermore, others commented that Damore's firing was exactly the kind of behavior that Damore was decrying: the suppression or elimination of views that contradict the majority.
In a previous article, I wrote:
A larger issue still looms over what actually defines "extremist ideology". What's extreme for one person may be just fine to another person. Who gets to decide what is extreme? Could this lead to widespread censorship on every major social media platform?
I believe this sentiment still rings true. While many (if not all) companies have terms of service with zero-tolerance policies regarding hate speech against protected classes such as race, gender, and sexual orientation, there still seems to be a lingering question as to who gets to decide what is hateful and whether such views should be allowed. While I believe a great majority of people would agree that white supremacist/neo-Nazi ideologies are hateful, should their voices be suppressed simply because they are hateful? The answer may seem cut and dried to most folks and yet complex to others.