ToxMod can help protect gamers from toxic behavior by scanning voice chat.

Modulate’s ToxMod uses AI to scan game voice chat for toxic speech

Today, Modulate launched ToxMod, a new service that uses AI to scan voice chat in video games for toxic speech and other bad behavior. It flags everything from racist language to predatory behavior, taking into account the context and tone in which words are spoken to tell game developers what needs their attention.

Modulate says ToxMod is the world’s first voice-native moderation service, enabling game companies to monitor voice chat in real time and detect hateful speech promptly. It complements other voice technologies at the Cambridge, Massachusetts-based company, which uses machine learning to create customizable “voice skins” for games. These let players modify their voices to sound funny or to disguise their identities.


Dean Takahashi

Dean Takahashi is editorial director for GamesBeat. He has been a tech journalist since 1988, and he has covered games as a beat since 1996. He was lead writer for GamesBeat at VentureBeat from 2008 to April 2025. Prior to that, he wrote for the San Jose Mercury News, the Red Herring, the Wall Street Journal, the Los Angeles Times, and the Dallas Times-Herald. He is the author of two books, "Opening the Xbox" and "The Xbox 360 Uncloaked." He organizes the annual GamesBeat Next, GamesBeat Summit and GamesBeat Insider Series: Hollywood and Games conferences and is a frequent speaker at gaming and tech events. He lives in the San Francisco Bay Area.