Google's AI moderation shows results in Counter-Strike: Global Offensive

Toxic avenger

Counter-Strike: Global Offensive has become a slightly less accurate name, as 20,000 accounts have been banned from FaceIt's platform since late August by Minerva, its recently unleashed AI admin.

After months of machine learning trials, the system, built with Google tech, led to a 20% reduction in toxic messages between August and September, according to FaceIt's blog.

I've changed my mind. Algorithms are good now.

FaceIt are an independent platform offering matchmaking, tournaments, ladders, and anti-cheat services for several competitive games, including CS:GO, Rocket League and Dota 2. This is separate from Valve's own Counter-Strike moderation. FaceIt have developed Minerva in partnership with Google and Jigsaw, with the stated goal of combating toxicity in online games in a way that can scale up to their huge playerbases. It's currently on a humble version 0.1, but assuming these numbers are accurate, that's a strong start, and potentially a significant improvement for players.

Jigsaw recently went into some detail on how Minerva works, based on the operation of its predecessor, Perspective. Flagged messages are rated on their similarity to abuse the AI has recorded in the past. Further numbers are then crunched to determine how severe the suspected violation is, and how frequent such messages were from that player. If it decides a warning or ban is necessary, it does so at the end of the match.
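In rough pseudocode terms, that pipeline could look something like the sketch below. To be clear, this is an illustrative guess at the shape of the system based on Jigsaw's description, not FaceIt's actual code: the function names, the toy word-matching "model", and the thresholds are all invented for the example.

```python
# Illustrative sketch of a Minerva-style flow: score each flagged message
# against known abuse, track severity and frequency per player, then decide
# on a warning or ban at the end of the match. All names and numbers are
# assumptions for the sake of the example, not FaceIt's implementation.

from dataclasses import dataclass, field

@dataclass
class PlayerRecord:
    recent_flags: list = field(default_factory=list)  # severity scores this match

def toxicity_score(message: str) -> float:
    """Stand-in for the ML model: rate similarity to recorded abuse, 0.0-1.0."""
    abusive_terms = {"idiot", "trash", "uninstall"}  # toy lexicon, not the real model
    words = message.lower().split()
    hits = sum(1 for w in words if w in abusive_terms)
    return min(1.0, hits / max(len(words), 1) * 3)

def review_message(player: PlayerRecord, message: str, flag_threshold: float = 0.5) -> None:
    """Record a message against the player if it clears the flagging threshold."""
    score = toxicity_score(message)
    if score >= flag_threshold:
        player.recent_flags.append(score)

def end_of_match_action(player: PlayerRecord) -> str:
    """Decide only once the match ends, so feedback is immediate and clear."""
    if not player.recent_flags:
        return "none"
    severity = max(player.recent_flags)    # how bad was the worst message?
    frequency = len(player.recent_flags)   # how often did they do it?
    if severity > 0.9 or frequency >= 5:
        return "ban"
    return "warning"

player = PlayerRecord()
review_message(player, "nice shot!")                      # clean, not flagged
review_message(player, "uninstall the game you idiot")    # flagged as severe
print(end_of_match_action(player))                        # prints "ban"
```

The point of deferring the decision to `end_of_match_action` is the one Jigsaw emphasise: the punishment arrives the moment the match ends, rather than hours later after a human review.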

That immediacy and clarity of feedback is particularly important, since manual moderation often takes hours or days, leaving less severe offenders confused, and truly awful ones free to ruin more people's games in the meantime. Skeptical as I am of fully automated moderation, it's hard to deny the upside of griefers and other jebs being instantly splatted by robots.

Other forms of abuse and griefing are harder to identify than a convenient line of text, and that's not being ignored. "FaceIt is planning on combining different data sources like videos, voice chat and in-game events to better evaluate the behavior of a player in a match and be able to address toxicity on different levels", Jigsaw claim. Work on Minerva will continue, and if the tech is as effective as they're making out, perhaps the era of the griefer will one day be just a bad memory.

