Online multiplayer games, like GTA Online, are infamous for the toxicity that some of their in-game servers, especially public ones, can contain. Profanity and negativity can litter both in-game chat and voice communications. It isn't unusual for players to flood the communications channels with spam. Now, with the help of AI, it appears that Rockstar is attempting to crack down more heavily on this behavior.

Unlike Activision Blizzard, which publicly announced its adoption of AI for moderation, Rockstar has "quietly" partnered with Modulate, the company behind its newest moderation system, to "detect toxic speech" in the game.

The "AI-powered voice chat moderation tool" is called ToxMod, and it will help Rockstar filter bad behavior in GTA Online, including, but not limited to, hate speech, harassment, and discrimination, all in real time.
ToxMod was added exclusively to the PC version of GTA Online as part of its October 3 update, as noted by Tez2.

Without any clear information about how ToxMod will work in GTA Online, all we can do is speculate. But Tez2 did stir some controversy by mentioning that the AI content moderation software can ban users from using voice chat. It isn't yet clear whether it can, or will, ban users from accessing GTA Online entirely.

While profanity and toxicity don't always go hand in hand, GTA Online does attract a certain type of player.