What do incels, white supremacists and mass murderers have in common, besides a sense of entitlement? This: they use the internet as a personal diary, detailing their most horrifying and disgusting opinions, wishes and desires.
While most of the internet is moderated, whether strictly or lightly, unmoderated sites do exist. That kind of space allows not only for racist and misogynistic threads, but for the posting of manifestos days, or even minutes, before tragic and heinous acts are committed.
Sites like 8chan are notorious for this. A little more than a month ago, the El Paso shooter became just one in a line of users to post a manifesto on the site minutes before killing innocent people.
Tragic as it is, this isn't the first time we've seen this. At least three prominent mass shootings have been linked to the site this year alone. Since its launch in 2013, 8chan has given bottom-of-the-barrel people a space to spew nonsense and hate and to recruit impressionable kids and teens. But after roughly six years of hosting hate, things might finally be coming to a halt for the site.
In the wake of the El Paso shooter posting his manifesto there, 8chan was finally kicked off the clearnet when its hardware provider, Voxility, discontinued service to the site.
While this is a positive move in some regards, the case of 8chan raises serious questions. What allows these toxic spaces to persist online? And does removing or moderating such sites really accomplish anything?
For starters, online anonymity and the complexities of First Amendment rights are two of the main reasons sites like 8chan prevail and stay up for so long.
Online anonymity lets 8chan users, and online trolls generally, feel safe enough to be bold, while First Amendment rights surround them like a security blanket they can grip whenever they face backlash or are questioned about what they say.
The First Amendment doesn't protect everything every online troll says, but it protects the vast majority of it. Because of that, it's easy to wonder whether sites like 8chan will ever really go away, and whether this is a problem that can be solved without infringing on First Amendment rights.
While I don't have the answer, I believe that when it comes to unmoderated online spaces that threaten human life, taking down sites like 8chan is not a violation of First Amendment rights; it's a matter of public safety.
Toxic spaces exist online largely because they are allowed to. And with all the faceless and Pepe the Frog profiles spewing nothing but hate, we have to wonder whether this problem is going anywhere, or whether we're only pushing these users to burrow deeper into the internet.
Social Security numbers, drugs, weapons and sex trafficking can all be found on the dark web. What's to say these users won't simply go there, where it will be harder to find them or moderate their hate?
For hate groups, it's mostly about attention and camaraderie. With that in mind, it's easier to understand that moderating a space online, while necessary, isn't going to make incels or white supremacist groups disappear.
But just because they're not going away doesn't mean we aren't taking a step in the right direction. Hopefully, legal action can follow the moderation of 8chan and sites like it, so that there are consequences for the owners and users who talk so freely about doing things that are illegal and, above all, wrong and horrific.