
Counter-Strike: Global Offensive learning to take on bots

Cheats versus machines

It looks like Valve are experimenting with machine learning to deal with cheating in Counter-Strike: Global Offensive [official site]. The snippet of insight comes from a Reddit thread about spinbots in Valve's first-person shooter.

The gist is that Valve are investigating an artificial intelligence system capable of constantly retraining itself to spot cheaters, in an effort to stay ahead in the cheat creation/cheat detection arms race.

The chat focused on spinbot detection which... I'll level with you, I'm not fully au fait with CS:GO bots and hacks because I don't play enough CS:GO, BUT I believe spinbots are used to facilitate other cheats or to avoid being hit. The player model spins round so fast that it effectively has a 360-degree field of view, which can be combined with an aimbot for kills, or used to make yourself harder to hit. The player's own screen looks normal, so it's not a case of getting motion sick as you play, but other people and spectators can generally see it. Especially if hackers don't even try to hide it, like this:

Watch on YouTube

As per the thread:

"So some bad news: any hard-coded detection of spin-botting leads to an arms race with cheat developers – if they can find the edges of the heuristic you're using to detect the cheat, the problem comes back. Instead, you'd want to take a machine-learning approach, training (and continuously retraining) a classifier that can detect the differences between cheaters and normal/highly-skilled players."

The response comes courtesy of the Valve Anti-Cheat account, which has previously commented on a thread and been marked as verified by a moderator of the subreddit. Obviously, without knowing the verification process I can't state without a shadow of a doubt that it's legit, but it seems to fit with things Valve have said in the past in various capacities. For example, a recent Q&A with Gabe Newell noted that "some of us are thinking about some of the AI work that is being hyped right now. Simplistically we have lots of data and compute capability that looks like the kinds of areas where machine learning should work well."
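To make that classifier idea a bit more concrete, here's a minimal, purely illustrative sketch in Python of what detecting spinbot-style behaviour with a retrainable model could look like. The features, thresholds, and synthetic "demo" data below are my own assumptions for the sake of example, not anything Valve have described using:

```python
# Illustrative sketch only: hypothetical features and randomly generated
# placeholder data, not Valve's actual pipeline.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def extract_features(view_angles_deg, tick_rate=64):
    """Summarise one player's per-tick yaw angles (degrees) into a few features."""
    yaw = np.unwrap(np.radians(view_angles_deg))
    ang_vel = np.abs(np.diff(yaw)) * tick_rate            # radians per second
    return np.array([
        ang_vel.max(),                                    # peak rotation speed
        ang_vel.mean(),                                   # average rotation speed
        (ang_vel > 4 * np.pi).mean(),                     # share of ticks spinning faster than 2 rev/s
    ])

# Synthetic stand-ins for parsed demos: legit players turn smoothly,
# spinbots whip through full rotations every few ticks.
legit = [rng.normal(0, 30, 2000).cumsum() * 0.05 for _ in range(200)]
spin = [np.arange(2000) * rng.uniform(30, 80) for _ in range(200)]

X = np.array([extract_features(v) for v in legit + spin])
y = np.array([0] * len(legit) + [1] * len(spin))          # 1 = spinbot-like

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = GradientBoostingClassifier().fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")

# "Continuously retraining" would mean repeating the fit as fresh
# Overwatch-confirmed verdicts come in, rather than freezing the model.
```

The point of the quote is that last step: rather than a fixed speed threshold that cheat developers could tiptoe under, the model keeps being refit as new confirmed verdicts arrive.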

The Valve Anti-Cheat comment goes on to explain why this isn't just an easy fix, partly due to the resources that spinbot detection actually requires:

"The process of parsing, training, and classifying player data places serious demands on hardware, which means you want a machine other than the server doing the work. And because you don’t know ahead of time who might be using this kind of cheat, you'd have to monitor matches as they take place, from all ten players' perspectives.

"There are over a million CS:GO matches played every day, so to avoid falling behind you'd need a system capable of parsing and processing every demo of every match from every player's perspective, which currently means you'd need a datacenter capable of powering thousands of cpu cores.

"The good news is that we've started this work. An early version of the system has already been deployed and is submitting cases to Overwatch. Since the results have been promising, we're going to continue this work and expand the system over time."

Overwatch in this scenario is the system by which trusted members of the CS:GO community review reports of problem behaviour in the game and apply bans. It's the human arm of cheat/toxic behaviour detection. Since Blizzard launched a game of the same name, I tend to forget the other flavour of Overwatch and have a moment of wondering why Valve are funnelling cheaty players into a rival's game. I mean, that would be one approach, I suppose. Probably best that they try the machine learning idea first.
