Call of Duty getting "global real-time voice chat moderation" with Modern Warfare 3 in November

US-only Warzone and MW2 beta starts today.

Activision has announced new measures to combat toxicity within Call of Duty, confirming it'll be introducing what it calls "global real-time voice chat moderation" alongside the launch of Modern Warfare 3 on 10th November, with a US beta trial of the feature starting today.

Call of Duty's new voice chat moderation system will employ AI-powered technology from Modulate to identify and flag toxic speech in real time, including hate speech, discrimination, and harassment.

An initial beta for Call of Duty's new voice moderation system is being rolled out across Warzone and Modern Warfare 2 starting today, 30th August, in North America, and a global release will coincide with Modern Warfare 3's arrival in November. Activision notes the tools will only support English at first, with additional languages coming "at a later date".

Call of Duty's voice chat moderation gets a global roll-out with Modern Warfare 3 in November.

In a Q&A accompanying today's announcement, Activision explains the AI-powered system will only be responsible for identifying and reporting perceived offences for further review - attaching a behaviour category and a severity rating to each submission - and that the publisher itself will determine how each violation is enforced based on its Security and Enforcement Policy.

It adds that "trash-talk" will be acceptable as long as it doesn't fall within the definition of harmful language outlined in its Code of Conduct, and notes that the only way to opt out of the new moderation system is to disable in-game voice chat.
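Activision hasn't shared technical details beyond that description, but the reported flow - the AI flags a clip, attaches a behaviour category and severity rating, and enforcement is decided separately under the publisher's policy - can be sketched roughly. Every name, category, and threshold below is a hypothetical illustration, not Modulate's or Activision's actual API:

```python
# A rough sketch of the moderation flow as described in Activision's Q&A:
# the AI identifies and reports a perceived offence with a behaviour
# category and severity rating; enforcement is decided separately by the
# publisher. All names and values here are hypothetical illustrations.
from dataclasses import dataclass
from enum import Enum


class BehaviourCategory(Enum):
    HATE_SPEECH = "hate speech"
    DISCRIMINATION = "discrimination"
    HARASSMENT = "harassment"


@dataclass
class FlaggedSubmission:
    """One AI-flagged voice chat excerpt submitted for further review."""
    account_id: str
    category: BehaviourCategory
    severity: int  # assumed 1 (mild) to 5 (severe); the real scale is unpublished


def enforce(submission: FlaggedSubmission) -> str:
    """Publisher-side decision per the Security and Enforcement Policy.

    The thresholds below are invented purely for illustration.
    """
    if submission.severity >= 4:
        return "voice/text chat restriction"
    return "warning"


# Example: a flagged clip routed through review.
clip = FlaggedSubmission("player#1234", BehaviourCategory.HARASSMENT, severity=5)
print(enforce(clip))  # -> "voice/text chat restriction"
```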

Activision says Call of Duty's existing anti-toxicity moderation policies have so far resulted in voice and/or text chat restrictions on more than 1m accounts since the launch of Modern Warfare 2, and that 20 percent of players did not reoffend after receiving a first warning.

Today's announcement follows the introduction of similar moderation measures elsewhere in the industry. Microsoft, for instance, launched a player-driven voice chat recording and reporting tool for Xbox in July, which is currently in testing with Insiders, while League of Legends studio Riot has been experimenting with voice chat moderation for some time.
