Today, Modulate launched ToxMod, a new service that uses AI to scan voice chat in video games for toxic speech and other bad behavior. It flags everything from racist speech to predatory behavior, taking into account how people say things to tell game developers what needs their attention.
Modulate says the service is the world’s first voice-native moderation tool, monitoring speech in real time so game companies can catch hateful language as it happens. It complements other voice-related technology at the Cambridge, Massachusetts-based company, which uses machine learning to create customizable “voice skins” for games. These let players modify their voices to sound funny or to disguise their identities.