Key Takeaways
- Online gaming communities often fail to make players feel safe and included because of toxic text and voice chat.
- AI chat moderation flags potentially toxic behavior so that it can be reviewed and dealt with by a team of human moderators.
- AI voice chat moderation is already in use in popular multiplayer games like Call of Duty, Among Us, and GTA Online.
Spending time in a gaming lobby or chat can feel terrible. Some communities have become toxic, driving players away from the game before they even get into it. Some publishers think AI could be the solution to making their games approachable for all players.
Gaming Has a Toxicity Problem
I’m from the old-school gaming crowd, so most of my formative years were spent playing single-player games or couch-competitive on a console. Trash-talk was the order of the day, but you knew your limits when you were on the couch next to your friends, and no one ever took the trash-talk personally. The internet changed all of that.
While you can play local multiplayer games with Steam, you’ll usually play online in a lobby. If you play online multiplayer shooters like Valorant or Call of Duty, you’ll quickly run into the toxicity problem that many of these games struggle with. Data from 2022 suggested that Call of Duty had one of the most toxic fanbases of any game, which makes a poor first impression on new players. The franchise also has a notorious history of harmful behavior, including the infamous “swatting” trend from a few years ago.
Activision, the publisher of Call of Duty, wasn’t happy with this and decided it was time to change things.
Players have long sought non-toxic games. However, most gaming companies don’t know how to tackle toxicity among their player bases. Call of Duty recently implemented a system that uses AI to catch toxic behavior in lobbies and deal with the offending players. The results were surprisingly effective.
How AI Deals With Toxic Players
So you’re in a gaming lobby, and some guy gets all up in your face with trash talk. It eventually descends into disgusting language, then slurs. You report the guy, but you don’t even know if anything will be done about it.
This is the reality of a lot of online gaming spaces these days. According to Activision, over four million accounts had been hit with enforcement actions for toxic behavior as of January 2024, yet the behavior persists.
Activision’s old approach wasn’t working, so they decided to change gears.
The company previously partnered with Community Sift for text-based moderation, which seemed to be working well. However, in console-based lobbies, most communication is done via voice, and voice-based moderation is necessary. This is where the company’s AI-based voice chat moderator comes in.
The moderator, known as ToxMod, is from the company Modulate.ai and has a lot of power within the Call of Duty ecosystem. ToxMod uses nuances of voice, speech patterns, stresses, and other elements to determine whether the user is saying something toxic to other lobby members. The moderator can issue warnings and flag player accounts for enforcement and banning by humans.
Which Games Currently Run the AI Chat Moderator?
Titles like Among Us and Grand Theft Auto Online have both implemented the AI chat moderator. GTA Online players first experienced the beta testing for ToxMod in December 2023, but many users were concerned about their privacy. Rockstar, the game’s publisher, told players that they were just testing the system and would consider implementing enforcement in 2024. To date, however, there haven’t been many updates about AI moderation.
Among Us also implemented the system in its VR release in April 2024. These high-profile adoptions have made ToxMod a go-to tool for catching questionable speech and keeping gaming lobbies welcoming. It’s a step forward in helping multiplayer online gaming shed its toxic reputation and become approachable to a wider audience.
What Gets Flagged and What’s Okay
According to Modulate, ToxMod uses emotional cues in a speaker’s voice to determine whether something offensive was said. It also weighs how other players in the chat respond to help pinpoint offensive statements.
The system is designed for humans to have the final say since ToxMod sends detailed logs of the audio conversation to the human moderator, who can then issue a ban if they find the conversation in breach of the code of conduct. This is a flagging tool, designed to help locate the problem, rather than a fully automated moderation system.
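Conceptually, that human-in-the-loop flow can be sketched in a few lines of code. This is a toy illustration only, not Modulate’s actual API; every class name, score, and threshold here is invented for the example:

```python
from dataclasses import dataclass, field

@dataclass
class Flag:
    """A voice clip the AI has flagged for human review (toy model)."""
    player_id: str
    transcript: str
    toxicity_score: float  # 0.0-1.0, hypothetically produced by the voice model

@dataclass
class ReviewQueue:
    """The AI only flags; a human moderator makes the final call."""
    pending: list = field(default_factory=list)

    def ai_flag(self, flag: Flag, threshold: float = 0.8) -> None:
        # The model queues likely violations; it never bans on its own.
        if flag.toxicity_score >= threshold:
            self.pending.append(flag)

    def human_review(self, is_violation) -> list:
        # A human inspects each flagged clip and decides on enforcement.
        banned = [f.player_id for f in self.pending if is_violation(f)]
        self.pending.clear()
        return banned

queue = ReviewQueue()
queue.ai_flag(Flag("player42", "abusive slur", 0.93))
queue.ai_flag(Flag("player7", "nice shot!", 0.05))  # never reaches a human
banned = queue.human_review(lambda f: f.toxicity_score > 0.9)
print(banned)  # ['player42']
```

The point of the sketch is the division of labor: the model’s only job is triage, so a false positive costs a moderator a few minutes of review time rather than costing a player an unfair ban.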
So why is ToxMod such a big deal if players can already submit reports about toxic players themselves? Activision’s problem is that opening the report screen takes players out of the game, and once that happens, they’re less likely to keep playing. ToxMod’s automatic reporting keeps players immersed by filing the report for them.
Many of us are used to simply muting problem players and carrying on, whereas ToxMod ensures that a report is made so that the instigator can be dealt with.
How Has ToxMod Helped Call of Duty?
While it’s easy to speculate that an AI voice moderator might make games like Call of Duty less toxic, having hard facts is a lot better. In this case, Activision estimates that there was up to 50% less toxic voice chat in North America for the titles Call of Duty: Modern Warfare III and Call of Duty: Warzone. The same source suggests an 8% reduction in repeat offenders and a 25% overall reduction in toxicity on the platform.
From the numbers, it’s evident that ToxMod has positively impacted the chat ecosystem. Still, this should be taken with a grain of salt. AI voice moderation tends to have issues with some regional accents, so having a human issuing the bans is vital to avoid any problems.
Even so, some gamers are unhappy they were sanctioned for voice chat. As great as AI voice moderation is, mistakes can still be made. We think it’s a great year to check out the new Call of Duty, especially with the reduced toxicity.
Building a Gaming Future We Want
Artificial intelligence has many uses, from powerful image generation on your PC to serving as a personal fitness trainer. It’s only natural that someone would find a way to moderate offensive voice chat in multiplayer games and make them more welcoming.
I’d like to see a multiplayer gaming community where people are supportive and don’t abuse fellow players out of prejudice. There’s a time and place for trash-talking, but there’s a line where it moves into being mean and abusive.
If AI voice chat moderators are the way to get us to a place where gaming is fun again, I’m all for seeing it implemented.