The internet attracts and spreads extremist views almost as if that's what it was built for. That kind of content is hard to escape, especially on social media. But it has also become a serious problem in a less obvious place: the gaming community.
Experts have been sounding the alarm for a while. "Games are becoming increasingly social ... those social hooks provide the structures and the infrastructure for extremists to organize, mobilize and spread their hateful and extreme ideologies," said Alex Newhouse, deputy director of the Center on Terrorism, Extremism and Counterterrorism, in a 2022 Game Developers Conference talk.
You might expect extremists to target the audiences of more mature, violent games like Call of Duty. But this kind of extremism has also cropped up on innocent-seeming platforms like Roblox, where users have been found setting up Nazi-themed villages.
Similarly, extremists have used the "sandbox game" Animal Crossing: New Horizons to express their views. The game is meant for players to develop their own islands and interact with animal characters and one another to build towns – about as low-stakes as gaming gets. Or it was: one player complained on Twitter that they'd accidentally invited extremists onto their island in the game, and the visitors' avatars were wearing KKK robes and a swastika T-shirt.
In a new report from New York University, researchers argue that weak moderation has made multiplayer games, and adjacent platforms like Discord and Twitch, a breeding ground for extremism. NYU surveyed 1,128 gamers in the US, UK, France, Germany and Korea and found that 51% reported encountering some form of extremist statement or narrative while playing multiplayer games in the past year.
“It may well be a small number of actors, but they’re very influential and can have huge impacts on the gamer culture and the experiences of people in real world events,” the report’s author, Mariana Olaizola Rosenblat, said.
Many gaming companies say they're cracking down on this type of content. Some – including Discord, Twitch, Roblox and Activision Blizzard, the maker of Call of Duty – use automated tools to detect and remove objectionable material. Activision alone has banned 500,000 Call of Duty accounts for rule violations in recent years.
Although extremists are a small minority of the gaming community, the study argues that their hate speech and extreme views have a far-reaching effect. By building virtual communities, they can create echo chambers of dangerous ideas that sway younger, more impressionable users. And the gaming industry has faced far less pressure over moderation than social media platforms have.
So the researchers offer some suggestions for how platforms can curb these trends: by bringing in more human moderators, for example, or by leaning on rapidly improving artificial intelligence tools, platforms may be able to catch more bad actors and harmful content before they spread.