In many ways, political servers on group messaging platform Discord are just like any other group chat. On one server, 4,000 active users share memes, play games and join voice calls together. It is a standard friend group. That is, until someone brings up politics.
As teenagers increasingly rely on online spaces for community and connection, extremist groups are sensing an opportunity. These spaces boomed during the pandemic; according to the New York Times, Discord grew to 140 million active users, most of them ages 12-23. On its 6.7 million “servers,” or large-scale group chats, these teens organize around interests ranging from mainstream music to the most obscure topics.
The most popular interest by far, though, is video games. And while gaming platforms have taken some steps in recent months to curb hate speech in the medium, it’s still widespread. In a 2021 survey by the Anti-Defamation League, 10% of gamers ages 13 to 17 had interacted with white supremacist ideology, and 60% had experienced harassment. Tech magazine Wired found that the large, organized groups are overwhelmingly alt-right and have been allowed to grow with little oversight on popular platforms like Steam and Discord. As the activist group Tech Against Terrorism explains, members typically introduce themselves through a slightly edgy joke or off-color comment. This can be especially prevalent on popular sites like Roblox, where 54% of the platform’s 43.2 million active users are under the age of nine. Most of the time, this goes nowhere. But if a player shows interest, they are invited to join private servers and chats.
“In students’ minds, there really was never a time before the internet,” Director of Network and Support Services Chad Griffith said. “Just like with any tool that can be used for good, it can also be used for bad. And I think that when teens are young, they’re impressionable, they have a strong desire to fit in, to belong, to be a part of something. And if those needs aren’t met, there’s ways on the internet to find those avenues.”
One Discord moderator with no Trinity affiliation, who asked that their username not be published, spends six to eight hours a day running a political Discord server of about 2,000 people, most of whom are ages 13-24. That tracks with a study by the Institute for Strategic Dialogue, which found that the average user age in extreme right-wing Discord communities was 15.
“More than a server about politics, it’s a server where you can make friends,” the moderator told me.
But past the server’s welcoming statement, threads titled “conspiracies” and “debates-and-politics” contain a strange mix of Christian theology, bad Donald Trump photoshops and virulent antisemitism. It’s often hard to tell what’s serious and what’s a joke. Next to a meme from Shrek might be a debunked study on African American skulls, and calls for violence against trans people might come from a user with a Sonic avatar. Many users feel disillusioned with mainstream politics and often hold contradictory views.
“Am I actually a national socialist? No,” one user told me. “Am I whatever the hell normies think is a Nazi? Maybe.”
The connecting theme was anger. Though Discord has recently made meaningful efforts to contain hate speech on its platform, users have still found ways around the Terms of Service (one user advised me that as long as I didn’t type phrases like “The Jewish Question” outright, I wouldn’t be banned). Often, this hate speech spreads through “raids,” in which users coordinate to flood another server with hateful images and links, either to win new followers or to shut the server down.
“It’s very tough sometimes,” the moderator said. “We have raids on a daily basis, sometimes a single raid, sometimes up to 6 in a day. We have had raids of any kind, from single raiders, to hundreds of bots, to spam links and invites to the users.”
The majority of these servers are unorganized in both structure and rhetoric, but far-right streamers are a common topic. While major streaming platforms like Twitch and YouTube deplatformed most extremist streamers after the Jan. 6 riots, a constantly growing network of sites has emerged to fill the gaps. Sites like Odysee and BitChute provide commentators with audiences of thousands, as well as a fundraising opportunity; the Global Disinformation Index recently found over 200 incidents of white supremacist groups using online streaming platforms to raise money, with some making over $100,000 a month.
Though sites often ban them quickly, it’s nearly impossible to remove these streamers entirely. For as long as white supremacist and Holocaust denier Nick Fuentes has been streaming, he’s been on the move. Fuentes and his Generation Z following have been banned from YouTube, Twitch, Spotify, Twitter, PayPal, Venmo and even infamously unmoderated sites like Gettr and DLive. Even some members of his own party have split with him over his self-described attempts to “be on the right, dragging [moderate Republicans] kicking and screaming into the future.”
Despite this, Fuentes remains. Republican congressman Paul Gosar and pundit Michelle Malkin have given their support, and he currently hosts a show on Alex Jones’s network. In February, Fuentes will host the America First Political Action Conference in Orlando. And when a user gave me a link to an unlisted white supremacist streaming site, there Fuentes was, broadcasting to 12,500 viewers.
That’s not to say that forcing provocateurs off of highly visible sites is ineffective. Cutting off their reach to new audiences and attention is crucial to scattering the movement. But as Axios reports, the new alt-right isn’t just a set of leaders, but an online ecosystem. Alt-right groups are creating their own niche echo chambers, spending millions of dollars on insular online communities. Once you’re in, so the plan goes, you’ll never need anywhere else.
That’s an enticing pitch, especially when it comes to teenagers. White supremacist groups have always provided simple, fake answers as a solution to personal problems. It’s hard to improve, but easy to blame.
And that’s why it’s more important than ever to reach out to students. More than 2,000 schools already use the Anti-Defamation League’s materials on recognizing and reacting to hate speech online, and the National Education Association provides resources for educators who want to provide more diverse viewpoints on real-world topics.
“If you block one site, then another one pops up, so it’s just a never ending thing,” Griffith said. “I feel like the only real way to combat that is with education, teaching students and teaching faculty and signs to look out for and things like that.”
Griffith hopes professional development for both teachers and administrators will help bring topics like these into the classroom and make students more aware.
However, the most effective approach is often the most personal. Parents are the first and last safeguards against radicalization for teenagers. Numerous resources exist to help parents recognize the signs, including the Southern Poverty Law Center’s seven-minute guide to understanding and addressing a child’s radicalization.
It’s difficult to draw a clear line between what’s extremism and what’s just talking, and even harder to tell who’s a true believer. But just because the intent changes doesn’t mean the words do, and as teens are drawn further and further into these virtual groups, they’re isolating themselves in a world of hatred, violence and anger. Part of the danger of the rabbit hole is that you can’t see a way out. We need to throw teens a lifeline.