Extremists Purge Hateful Messages After Christchurch Massacre
A network of far-right extremists is self-censoring, and in at least one instance mass-deleting, content from several key online communities following the devastating terror attacks on mosques in Christchurch, New Zealand.
A review of 12 far-right servers on Discord, a chat application favored by video gaming communities and more recently the extreme right, reveals that at the same time users were celebrating the horrific attacks of March 15, administrators and moderators of these online spaces deleted large amounts of content and instituted bans on posts glorifying the perpetrator.
The Southern Poverty Law Center and outside researchers affiliated with the non-profit media organization Unicorn Riot conducted the review in the weeks since the March attacks. The review brought to light that while social media networks have been slow to remove hate content from their platforms, some extremist communities are taking down or banning content due to legal concerns. Portions of those deleted conversations are available.
“Attention all users. Considering the circumstances we find ourselves in it is very likely that this man was in any number of /k/ servers,” user “Maj. A------,” an administrator of a 4chan-affiliated server titled “The Pathetic Life of an Average /K/ommando,” wrote. “Considering this it is very likely we could all be, in the event the man was in the server, considered accomplices and held for a federal investigation. Seeing as that is the case, any mentioning of the recent habbening from now on is strictly verboten.”
“Maj. A------’s” fears seem legitimate. Federal prosecutors in Harrisburg, Pennsylvania, recently charged Corbin Kauffman, a 30-year-old resident of Lehighton, Pennsylvania, with interstate transmission of threats to injure another person for content that he posted on Minds.com, a fringe social media site.
If the suspect in the New Zealand attacks was a member of “The Pathetic Life of an Average /K/ommando,” its users may also have cause for concern. David Hyman, a law professor at Georgetown University, told Newsweek in 2018 that anonymous online users can have their identities revealed if a judge deems it relevant to a case. “Private and privileged are not the same thing,” Hyman told Newsweek.
The 12 servers examined in this investigation, chosen for how their moderators and users discussed and responded to the tragedy in Christchurch, are part of a larger network of 50 servers being reviewed comprehensively by the researchers. This smaller cluster of chat servers posted an estimated 38,932 messages in the first 24 hours following the terror attack that left 50 dead and 50 wounded.
An even more complex picture of these obscure and chaotic online spaces emerges when considering this: As moderators of these extremist spaces undertook an unprecedented level of voluntary censorship, their users and the broader far-right were engaged in a haphazard campaign to immortalize the perpetrator and his manifesto.
The Southern Poverty Law Center and the researchers reviewed copies of the Christchurch suspect’s manifesto and the Facebook Live video of the attack filmed by the shooter, which tech companies across the globe were scrambling to remove.
Users in these servers were also creating memes and coordinating the creation of other content, including YouTube playlists celebrating the alleged killer. Some pledged to follow in the Christchurch attacker’s footsteps.
“Wow. Just finished reading the manifesto. Truly powerful,” “Sulferix” wrote in Outer Heaven, one of the servers reviewed for this piece. “I will be starting my own contribution to the fight soon, in every way that I can. I will start a group. I will train. I will be part of this if it f------ kills me. I hope I’m not the only one.”
Statements from moderators and users show they fear Discord will remove them from its platform, and they fear prosecution for hateful and violent remarks.
The resilience of these communities on Discord, combined with self-censorship, illustrates how far Silicon Valley’s policymakers and content moderators are lagging behind far-right extremists on their platforms.
One moderator told his followers about a delay that offenders can exploit to avoid content moderation or bans from the platform.
“If someone reports the server, it takes 24 hours for discord to look into the report,” “Captain Kirk JT,” owner and administrator of the server “Outer Heaven,” wrote. “If the messages are gone by the time they look, its as if it never happened, and the report is dropped.”
Discord’s terms of service, last updated on Oct. 19, 2018, state, “The company reserves the right to remove and permanently delete your content from the service with or without notice for any reason or no reason.” According to Discord’s community guidelines, flagged content is reviewed “as it comes in as quickly as we can.”
The dizzying speed with which far-right extremists archive and redistribute propaganda signals an awareness of attempts to limit the spread of media associated with terrorism.
Technology companies are being pressured by governments and civil society organizations into enforcing their corporate policies governing harmful content. Human rights and technology professionals have long been concerned with archiving vulnerable online content. The far-right, which until recently was able to create and disseminate hateful propaganda with little worry of moderation, is demonstrating similar concerns. As content moderation improves, white supremacists and their sympathizers are changing tactics and refocusing their efforts to create archives of their online propaganda.
For instance, the Daily Stormer published its 88th weekly edition of white supremacist content from the site in PDF, ePub and Build file format on April 28, 2019, in response to its vitriolic content being taken offline repeatedly by web hosting providers.
“We began publishing a redistributable and archival weekly magazine for two reasons,” read the April 28, 2019, edition’s introduction. “The first was that we wanted to give people the ability to spread our publication as samizdat to evade this global censorship regime. Secondly was that it is easily archivable, and there’s a nonzero chance that the publisher and staff of this website will be murdered by global Jewry before all is said and done and we want a survivable record for history’s sake of what we actually said and did.”
As violence tied to far-right extremist communities, particularly those online, intensifies, participants are becoming more aware of potential legal and reputational liability. The project of removing the violent and terroristic content that defines many of these online spaces is in direct conflict with many users’ attempts to glorify those who commit extremist violence and the materials that inspired them.
As these Discord servers illustrate, while extremist communities are resilient and committed to spreading violent ideologies, meaningful content moderation can change the paradigm.
Unicorn Riot (UR), an independent media organization, has published chat logs from Outer Heaven, the Pathetic Life of an Average /K/ommando, and several other related Discord servers, commonly known as chat rooms. The release is part of UR's efforts to expose far-right online spaces promoting the New Zealand shooting.
Photo credit: iStockphoto