Last December, the United Nations warned of an overlooked but critical “emerging terrorist threat”: extremists radicalizing members of online gaming communities.
Despite ample interest in saving gamers from such exploitation, experts say that a lack of research funding on the topic has put the gaming industry behind social networks when it comes to counterterrorism efforts. That’s starting to change, though. Within the past week, researchers told Ars that the US Department of Homeland Security has, for the first time, awarded funding—nearly $700,000—to a research group working directly with major gaming companies to develop effective counterterrorism methods and protect vulnerable gamers.
The new project will span two years. It’s spearheaded by the Middlebury Institute of International Studies at Monterey, which hosts the Center on Terrorism, Extremism, and Counterterrorism (CTEC). Vice reported that other partners include a nonprofit called Take This—which focuses on gaming’s impacts on mental health—and a tech company called Logically—which Vice says works “to solve the problem of bad online behavior at scale.”
The researchers have summarized their overarching goals for the DHS project as “the development of a set of best practices and centralized resources for monitoring and evaluation of extremist activities as well as a series of training workshops for the monitoring, detection, and prevention of extremist exploitation in gaming spaces for community managers, multiplayer designers, lore developers, mechanics designers, and trust and safety professionals.”
Take This research director Rachel Kowert told Ars that the primary objective of the project is to develop gaming industry-focused resources. Her group’s ambitious plan is to reach out to big companies first, then engage smaller companies and indie developers for maximum impact.
Alex Newhouse, deputy director of CTEC, told Ars that the project will start by targeting big gaming companies that “essentially act like social platforms,” including Roblox, Activision Blizzard, and Bungie.
Although project funding was just approved, Newhouse said that CTEC’s work has already begun. For six months, the group has been working with Roblox, and Newhouse said it is also in “very preliminary” talks with the Entertainment Software Association about ways to expand the project.
Borrowing social media counterterrorism methods
Newhouse said that DHS and the FBI have become increasingly interested in research like CTEC’s to combat domestic terrorism—but, to his knowledge, no federal organization had funded such data collection until now. Although his project is only funded for two years, Newhouse wants to push the gaming industry, within five years, to implement the same standards for combating extremism that social networking platforms already have.
“I want game developers, especially big ones like Roblox and Microsoft, to have dedicated in-house counterextremism teams,” Newhouse told Ars. “These days, we need to push to be that sophisticated on the games industry side as well.”
Newhouse plans to rely on his experience helping tech giants like Google and Facebook organize counterterrorism teams. He says that CTEC’s biggest priority is convincing the gaming industry to invest in proactively moderating extremist content by “implementing increasingly sophisticated proactive detection and moderation systems” that social networks also use.
Historically, Newhouse said that gaming companies have relied mostly on players to report extremist content for moderation. That’s not a good enough strategy, he said, because radicalization often works by pumping up a gamer’s self-esteem, and people who are manipulated to view this sort of online engagement as positive often don’t self-report these radicalizing events. By relying strictly on user reports, gaming companies are “not going to actually detect anything on the initial recruitment and radicalization level,” he said.
Daniel Kelley, an associate director for the Anti-Defamation League’s Center for Technology and Society, told Ars that online gaming companies are approximately 10 years behind social media companies in flagging this issue as critical.
Limited funding for online gaming counterextremism efforts
Kowert, of Take This, first got interested in the link between online gaming communities and real-world violent extremism after she encountered a 2019 nationally representative survey from ADL. It found that nearly one in four respondents “were exposed to extremist white supremacist ideology in online games.” Newhouse said that estimate is “probably too conservative at this point.”
Still, ADL said, “the evidence of the widespread extremist recruiting or organizing in online game environments (such as in Fortnite or other popular titles) remains anecdotal at best, and more research is required before any broad-based claims can be made.”
Today, the research base remains limited, but it’s become apparent that the issue is not just impacting adults. When ADL expanded its survey in 2021 to reach almost 100 million respondents, the survey included young gamers aged 13-17 for the first time. ADL found that 10 percent of young gamers were “exposed to white supremacist ideologies in the context of online multiplayer games.”
Kowert immediately responded to the 2019 ADL report by pivoting her research and teaming up with Newhouse. She told Ars that the reason there’s so little research is because there’s so little funding.
Kelley told Ars that while it’s good to see the research finally receive funding, ADL recommends that the government inject far more money into nipping the issue in the bud. “This is not a time to be supporting stuff with drop-in-the-bucket funds,” Kelley said. “There’s a lot more that the Department of Justice needs to do to fund these kinds of efforts.”
Gaming industry remains unaware
Kowert told Ars that gaming companies have “legitimately” remained “unaware of the scope of the issue” of extremism on their platforms, mostly because they think of themselves as gaming platforms first and social platforms second. Newhouse agreed.
“It is very, very clear in our conversations with the video game industry that they are not fully cognizant of the budding problem that they have on their hands,” Newhouse told Ars.
According to Kelley, it’s not just the counterterrorism efforts of social media that gaming networks need to embrace; gaming companies could also become safer if there were regulations like those forcing social media companies to publish transparency reports. The only gaming company Kelley has ever seen publish a transparency report was a small company called Wildlife Studios, which released its first report this year.
“2022 is the first time we’re getting any kind of transparency reporting from any game company,” Kelley told Ars. “And it’s not from any of the majors. It’s not from EA. It’s not from Riot. It’s not from Blizzard.”
None of the major online gaming companies mentioned here immediately responded to Ars’ request for comment. Kelley said that Roblox is the only major gaming company with a public online extremism policy.
Perhaps part of the reason gaming companies overlook the issue, Kowert says, is the significant body of research showing that the content of online video games does not directly affect gamers’ susceptibility to extremism.
The American Psychological Association told Ars that its 2020 report saying that video games do not incite violent behavior is still its most current statement. But Kowert says that focusing discussions on video game content “is hindering the conversation.” There needs to be more focus on how gamers are reached socially by extremists during gameplay.
Kelley says that CTEC’s research is an important first step toward more government involvement in this issue, but that even getting the gaming industry up to social media’s standards is perhaps a low bar.
“I think there’s still a long way that the social media industry has to go before having really robust transparency,” Kelley said.
ADL recommends that online gaming companies go even further than social platforms when it comes to transparency. ADL wants to see gaming companies conducting audits and including metrics on “in-game extremism and toxicity in the Entertainment Software Rating Board’s rating systems of games.”
More transparency is exactly what researchers focused on extremism in online gaming communities need, Newhouse said, because research is also limited by what information is publicly available. However, gaming companies don’t always enthusiastically cooperate with researchers. When Newhouse contacts gaming companies, he said, sharing data isn’t their instinct, and, generally, they have to be spooked into cooperating on efforts to protect users.
“In all honesty, we usually have to scare companies into listening to us,” Newhouse told Ars.