Statespace: the neuroscientists who want to improve your gaming skills

Aim Lab

With the proliferation of esports, and the potentially huge sums of money involved if you make it to the top tier in some games, it’s not surprising that so many third parties are carving out a niche by offering to help improve your performance. One which caught my eye recently was Statespace [official site], which is being set up by Wayne Mackey and Jay Fuller. The pair have left their academic research positions at New York University to focus on this neuroscience-based tool for developing the skills associated with competitive gaming.

The software’s first product, Aim Lab, is only just launching into beta, and this is hardly the first email I’ve had about a project aiming to help aspiring pro players improve their gaming, BUT I had so many questions when faced with the idea of what Fuller suggested – “you could think of it like the NFL Combine or perhaps FitBit for gaming” – that I couldn’t resist finding out a bit more:

First, here’s the proposal as laid out in the Statespace summary:

“Our mission is to improve training conditions in the eSports space by introducing objective measures of skill based on decades of validated science. As avid gamers ourselves, we’ve seen players at various levels (professionals and hobbyists alike) look for ways to improve their skills, much the same way athletes from traditional sports train to gain a competitive edge.

“Our first product, Aim Lab, assesses a gamer’s neurocognitive skills (e.g., visual acuity, decision-making, reaction time, hand-eye coordination). The platform looks and feels like a classic first person shooter game, but under the hood it runs experiments to assess the player’s skills. Our A.I. identifies player weaknesses and spawns custom practice scenarios to train the player more efficiently. In this way, we can help players get better, faster by serving as a personal trainer.”

The idea is that, in being able to break down game performance and see data assessing specific skills, players would be able to see the areas where they need to improve and thus reap the benefits of those improvements in competitive games.
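To make the idea concrete, here’s a minimal sketch of how a reaction-time assessment of the kind described might break performance down and flag a weak area. This is entirely hypothetical – it is not Statespace’s actual code, and the regions, latencies, and function names are invented for illustration:

```python
import random
import statistics

# Hypothetical sketch: record a response latency (in ms) for each trial,
# grouped by where on screen the target appeared, then flag the region
# with the slowest average as the one to train.
def summarise_trials(trials):
    """trials: list of (region, latency_ms) tuples.
    Returns mean latency per region and the weakest (slowest) region."""
    by_region = {}
    for region, latency in trials:
        by_region.setdefault(region, []).append(latency)
    means = {r: statistics.mean(ls) for r, ls in by_region.items()}
    weakest = max(means, key=means.get)  # highest mean latency
    return means, weakest

# Simulated session: this imaginary player is a little slower on the left.
random.seed(1)
trials = [("left", random.gauss(320, 30)) for _ in range(50)]
trials += [("right", random.gauss(280, 30)) for _ in range(50)]
means, weakest = summarise_trials(trials)
print(f"weakest region: {weakest} ({means[weakest]:.0f} ms avg)")
```

In other words, the promise is less about the measurement itself (which is simple) and more about designing trials that isolate one attribute at a time.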

Obviously there’s a lot to unpack there – how the breakdown of skills was decided, whether it’s even possible to measure the impact of the assessments/training on the game a person wants to improve at, whether this is purely for technical skill assessment or if the team is hoping to address emotional or psychological elements of play…

With that in mind here’s our emailed Q&A:

Pip: How did you decide on the skills to focus on (there are so many different elements and skills which contribute to success in esports, plus the combinations needed vary across the games being played and the styles within those)?

Jay Fuller: We are focusing on the fundamental skills that bridge most if not all games. Not all games will require all of these skills, but they will include at least one of them. The fact that there are vast differences between games and between character archetypes within the same game fascinates us and is another reason our unique data will be insightful. To elaborate, in traditional sports, the ability to run fast would correlate well to success in many sports (e.g., football, basketball, soccer, etc). Conversely, upper body strength may be more relevant to football success than soccer. Even within football, arm strength may be of greater importance for certain positions (e.g., quarterback, lineman) than others (cornerback). Nevertheless, the ability to test players of all skills on the same metrics (as they do with the NFL Combine) is helpful for players and coaches alike to refine training.

Now onto gaming: in first-person shooter games, reaction time and fine motor control will be relevant to most titles, but excellent visual acuity may be especially relevant to a game like PUBG where you are trying to detect subtle motions in a large open environment. Finally, it is important to note that the skills I listed are examples and not set in stone. Which skills to ultimately include in the platform is an empirical question, and one that will be guided by combining our R&D expertise with feedback from beta testers.

Aim Lab

Pip: Are there specific games where you think your approach is better placed to improve performance than others?

Fuller: We are currently focused on FPS titles because they require the most mechanical skill, but we are moving into MOBA genres next, which, as you mention in [the previous question], requires a different skill set and context, yet still involves skills we measure in neuroscience (i.e., adaptability, decision-making, etc). Ultimately, comparing skills between genres of games will be a fascinating area to explore within our combined data set.

Pip: Given the specificity of some of the skills needed to excel in a game (knowing particular combos or timings or matchups or any number of other things) how would you guard against someone using your program to improve but only getting better at playing your specific training scenario?

Fuller: Consider training in stick-and-ball sports. In football, players often lift weights to train, they don’t just play football all day long. While it’s true that this means players may end up getting better at exercises like the bench-press (which has its own skill to it), they still end up getting objectively stronger because they are targeting that attribute, and that strength leads to better performance on the field.

In eSports, strength may not matter as much, but attributes like perception, decision-making and hand-eye coordination certainly do: light and sound come into the brain, this information is processed, and a flurry of buttons are pressed in a fast and coordinated sequence. If there is a weakness anywhere in that loop, then performance will be affected. Esports players need methods to identify their weaknesses and train them. As neuroscientists, we will provide measurement techniques (experiments) that assess skill, and then we will process that data to provide optimal feedback in order to maximize learning. Think of us as what physiologists are to traditional sports: sports scientists.

We are creating a specialized environment in which we can measure and train skills that are not only game agnostic, but agnostic across real life tasks. For example, as humans, we have unique visual biases in which we may see better to the left than the right. This exists whether you are driving a car, playing baseball, or playing Overwatch.

Aim Lab

Pip: You mention the NFL Combine but that is a specific, official NFL event with direct consequences for drafting and salary and so on. Given Statespace and Aim Lab would be third-party software how are you approaching the esports space? Is it a case of approaching amateurs who want to become pros first and trying to grow from there or are you in talks with organisations/specific games, or are you trying something else?

Fuller: We’re working to help players of all levels improve their technical skills. Our first offering, Aim Lab, will be rigorous enough to be used by pro, semi-pro, and college teams, but approachable enough to be used by amateurs. Of course, there are many more amateurs, so our primary outreach efforts have focused there. We are also in conversations with several teams and colleges to help address their unique data needs.

As for drafting, salary negotiation, recruitment: while we’re initially focused on training, we hope our data will be additive to the existing recruitment process for teams. Data on player skill can help players and teams of all levels make better decisions. Numbers won’t tell the whole story on the potential of a particular player, but they will be another factor for decision makers in the eSports industry to use.

Pip: The other thing is that you’ve spoken about assessing skills and working on ways players might develop them, with the intention of improving in competitive games as a result of those changes, but how will you measure that impact?

Fuller: We will be correlating our data with the data made available to us from games. That metric won’t tell the whole story, as there are a lot of factors that go into game performance beyond technical skill (strategy, teamwork, emotional influences), but it will be one way to evaluate the impact of Aim Lab. Qualitative data will also be used. Feedback from players and coaches will be critical to our research and development team to make the best product possible. We’re really looking forward to feedback from the gaming community.

Aim Lab

Pip: I’m asking that previous question because I want to know whether there is any specific research you’ve done (or which you can link me to) which demonstrates that those improvements in the skills you’ve isolated are transferring to games?

Fuller: We have studied many aspects of neuroscience: motor control of reaching and pointing movements, eye movements, memory, working memory, perception, decision-making, etc. Name an area of cognitive neuroscience and someone on our team has studied it or is well-read on the topic. The cognitive neuroscience experiments we run in the lab are in fact just custom video games built to test specific aspects of the brain. During these experiments, when you give participants richer feedback on their performance, they learn quicker. If that sounds unsurprising, it should be. The hard part is getting good data to present back to the participant. In order to do so, you have to set up the experiment in a way that measures the attribute of interest as accurately as possible. As neuroscientists, that’s our area of expertise. With Aim Lab, we’re bringing that expertise directly to gamers.

Pip: Relatedly, I’m interested to know whether/how you’ll be tracking the effects of using Aim Lab on playing those competitive games. (I ask because relying on players’ self-reporting will be so vulnerable to things like confirmation bias and because some changes are potentially hard to quantify).

Fuller: Yes, we’ll be correlating data from Aim Lab to data available from common competitive games. You’re correct in that some changes will be difficult to identify because success in gaming is a combination of so many factors, some of which are measurable, and some which are not. However, accurate and rich data will help players, coaches and the media to evaluate player performance with more objectivity.

Perhaps looking to an example from traditional sports would be helpful for context here. In baseball, the introduction of radar guns in the ’60s allowed for the measurement of pitch speed. Being able to objectively tell the difference between a 95 mph pitch and 85 mph pitch allows players to train/track their ability and coaches and media to objectively measure talent. Now, just because a player can throw 95 mph doesn’t mean he’ll have a better career than the guy who throws 85 mph. That said, pitching speed is known to contribute to success, so people in the industry like to have that number to factor into their decision making. Similarly, with Aim Lab, we want to empower players, coaches, and media with more objective information.
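The kind of correlation Fuller describes is, at its simplest, a Pearson correlation between a training metric and an in-game stat. Here’s an illustrative sketch with made-up numbers – the data, metric names, and the idea that these particular stats would be used are assumptions on my part, not Statespace’s methodology:

```python
import statistics

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical data: a training accuracy score vs. in-game kill/death ratio
# for seven imaginary players.
aim_scores = [55, 60, 62, 70, 75, 80, 85]
kd_ratios  = [0.8, 0.9, 1.0, 1.1, 1.3, 1.4, 1.6]
r = pearson_r(aim_scores, kd_ratios)
print(f"r = {r:.2f}")  # strongly positive on this toy data
```

Of course, as Fuller concedes above, correlation with one in-game stat is only one lens on a performance shaped by strategy, teamwork and the rest.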

Pip: Finally, are there any elements in your software which focus on emotional self-regulation or related areas? I appreciate you seem to be targeting technical skill and performance but esports success is frequently also about functioning as part of a team and not, for example, tilting when things go awry.

Fuller: We are focusing on technical skills for now as those skills are more feasible to objectively measure as a first step. We want Aim Lab to be another training tool in a player’s arsenal. It won’t replace playing the actual game, it won’t replace the coaches who help with strategy and management, it won’t replace psychologists who help with the emotional aspects like tilting – Aim Lab metrics and training will be additive to all of those existing player support systems. It’s the ultimate tool for exercising and evaluating game relevant technical skills.

Pip: Thank you for your time.

The venture has $500,000 in pre-seed financing from a start-up studio called Expa (a project founded by Garrett Camp, co-founder of Uber and StumbleUpon, and invested in by a number of other business/tech figures like Virgin’s Richard Branson and Foursquare’s Naveen Selvadurai) as well as some angel investors, so it sounds like Silicon Valley is keeping an eye on the idea.

I’ve also been digging around on the Statespace website since it went live, and there’s an interesting passage in an article about Plunkbat (Plunkbat being The Artist Usually Known As PlayerUnknown’s Battlegrounds) and peripheral vision:

“We found that your peripheral vision is important for taking in the gist of a scene and that you can remove the central portion of an image, where your visual acuity is best, and still do just fine at identifying the scene,” said Adam Larson, a K-State Master’s student in psychology. If we take a closer look and try to gain a better understanding of our vision, we can then translate that into how it affects our games and how to use it to our advantage.

We essentially have two important types of vision: peripheral and central. Central is exactly what you think it is: what our eyes are centering on and the main focus of what we are looking at. In a game like this, we scan around the map and buildings at all times to look for any indication of someone being around or right in front of us. Peripheral is the part of our vision that occurs outside the very center of what we are looking at. Peripheral vision is often used to get the gist of a situation or area. For instance, we will know that we are in the military base just by being in it: our central vision can be scanning around for enemies and items while our peripheral instantly lets us know where we are.

Now for the interesting part: our peripheral vision is actually better at detecting motion than our central vision. This is super useful in a game like PUBG, since any sort of movement is a huge indicator that death is seconds away.

Statespace and Aim Lab are intended for a Q2 2018 release via Steam (so late spring/early summer for the non-businessy Northern Hemisphere-ers among us), but the beta started on 13 September 2017. There’s a signup sheet here if you want to give it a whirl.


  1. Seafoam says:

    Oooh! Maybe there is some other long term motive behind this which also features shooting guns, reflexes and peripheral vision…
    Tinfoil hat on.

    (Of course I’m joking, but think about how cool that would be.)

  2. Zetetic says:

    It’s funny how the labelling works in areas like ‘decision-making’. It’s interesting whether people call themselves ‘psychologists’ (which would seem to make the most sense), ‘neurologists’ (which appeals to the idea that meat is more sciencey than behaviour and cognition) or ‘behavioural economists’ (which, I think, appeals to the idea that they’re more serious and practical).

    • Siimon says:

      People don’t just call themselves those titles willy-nilly. A doctor who specializes in neurology is called a neurologist. You and I can’t say “we’re doing sciency brain stuffs, so we’re neurologists!”, because we’re not Doctors with a specialization in neurology.

    • riadsala says:

      I agree entirely. [I’m never really sure what field I actually work in!]

  3. K_Sezegedin says:

    Seems kinda dumb since, unlike athletes, gamers aspiring to get better can just play the games they want to get good at ad nauseam and at any time.

    Athletes in competitive sports simply can’t do that, as you don’t just log in to a game whenever you want.

    Still, it looks like there’s a lot of thought behind it, and if more money keeps coming into e-sports, these ideas’ll probably be a thing.

    • Babypaladin says:

      A basketball player can just go to any court nearby and start shooting hoops. A swimmer can just go to a pool and start training. What’s the difference?

      • Alberto says:

        A runner can’t just go running as much as he wants, he’ll reach the end of the world and fall down, eventually. That’s why stadiums have round tracks, as a measure of security.

        • syndrome says:

          I thought the tracks were round so that the audience could gamble while tracking the runners without moving. This is also why Large Hadron Collider was made.

          Poor Hadrons…

      • K_Sezegedin says:

        What I meant is the competitive team athlete doesn’t just drop into the very environment they’re trying to improve their skills for, as in, a competitive match against determined opponents.

        Whereas for a gamer that environment exists pretty much on demand, and seems like it should be the best training scenario available.

    • hungrycookpot says:

      Not really the point. A professional athlete still needs to go to the gym for specific exercises, e.g. a football player can play football all he wants, but he’s still going to get tossed by another player who also spent time lifting weights and is much stronger.

      I think the idea is to use this to identify deficiencies that you might not recognize in your in-game play. E.g. you do really well in Plunkbat, but when you get killed, 70% of the time the shot comes from the left; this can tell you that you have a weak/blind spot on your left. Or you always get shot by someone you didn’t see while jumping like an idiot; maybe this tells you that you jump around like an idiot too much and you need to slow down and open your eyes more.

  4. Herzog says:

    Back in the day we played Reflex on Missionred to hit all the flick rails afterwards. Sadly, the website is down now. Only found a clone over here: link to

  5. MrBehemoth says:

    Hmmm. This was an interesting read. And that’s coming from someone who is really not into esports. In fact, I feel like esports are the diametric opposite of what games mean to me personally.

    But, this got me thinking about other uses for similar software, which Pip kind of touched on. What if something like this could make me a better driver? Not a competitive or expert driver, just better at handling every day traffic efficiently and safely. Or, what if it taught me to handle conflicts and work with social dynamics? What if it helped people to develop better cognitive or emotional skills?

    All these things would be great. (Of course, the cynical side of me knows that there is far more money to be made from dudes who want to become even more hardcore at shooting dudes.)

    • His Dudeness says:

      RPS comments need likes for that reason.

    • syndrome says:

      So, prior to this article, you never thought about games as if they were instruments for cognitive fine tuning?

      Why do you play if not to become better at something in particular? And if you do become better at something, it has something to do with your brain… So it has to be neuropsychology at some level.

      How could you not think about that? Or did you think simulations were for extreme nerds who have no taste in games?

      Even role-play games test your mentality in certain situations, or at least your ability to optimize a functional strategy. Every puzzle does it, and every UI has its learning curve (not to mention control interfaces). What games do is flex the so-called neuroplasticity, while naturally rewarding the player with complex neurochemicals that spur emotions and senses of achievement and closure, or the lack thereof.

      Things that we deem utility functions in life, even though most of us play for decadent reasons.

      • MrBehemoth says:

        If you’re asking me the reason I play games, then no, I have never played a game with the intention of developing skills beyond the application of the game. I value the ability of games to develop neuroplasticity and to fine-tune motor-control, but neither of these things are the reason I play games.

        For example, I loved the way that Portal developed my spatial awareness and physical acuity in the process of teaching me to play Portal – but I didn’t play Portal so that my spatial awareness and physical acuity improved to make me better at, e.g., driving; I played it because I wanted to experience an immersive, funny narrative and feel clever by solving puzzles.

        To use the sports metaphor, these games are a workout as opposed to a match. Personally, I wouldn’t play in the match because I wanted to get a workout. And besides, there’s a big difference between a game that teaches versus a training application.

  6. NetsukeMonkey says:

    I actually used Euro Truck Simulator 2 to help me pass my driving test and would recommend it to anyone. I’d practise Mirror, Signal, Manoeuvre, and it was great for training my situational awareness. It also taught me that drinking whisky and singing loudly to tunes on Radio 6 was great fun but only ever in the world of simulation =p

    • syndrome says:

      I bought a G25 to be able to practise hand independence in Test Drive Unlimited. Even though I knew how to drive (though I’m not a driver), I had trouble with gear switching, and I managed to train myself to perfection.

  7. Shinard says:

    Hmmm. It sounds cool, but I’m a little worried that there are no references for the claims he’s making. When Pip asked, he didn’t provide any papers or reviews on the subject, and looking at the website I couldn’t find any references – it’s hardly unheard of to do independent research and not publish because you’re developing a patent, but still. It’s hardly unheard of to exaggerate the results of your independent research to market your product, either. There might well be flaws in their research methodology, but there’s no way we can find that out. It really does sound cool, but unless we can see the evidence behind it I’d take it with a pinch of salt.

    • Philippa Warr says:

      This is the same response as I typed below but it bears repeating: Yeah, that’s something which is a concern for me. That’s why I’m particularly interested in whether the beta throws up any concrete data or information about whether this works in practice because my supporting research question didn’t get an answer citing any.

  8. Pintaius says:

    Don’t want to be too sceptical, but the response to Pip’s question about (published) evidence sounds quite evasive, when it seems to me that they should have taken the opportunity to openly promote their past research. In fact the question is truly interesting and spot-on: whether “those improvements in the skills you’ve isolated are transferring to games”. I’m not a specialist in the area by far, but unless they have better evidence themselves, a recent review of “games as training” research showed moderate improvement in visual/auditory awareness after training and only small improvements for spatial/motor skills. Interestingly, no improvement at all for things such as decision-making or working memory, which are mentioned repeatedly here…

    Source: link to

    • Philippa Warr says:

      Yeah, that’s something which is a concern for me. That’s why I’m particularly interested in whether the beta throws up any concrete data or information about whether this works in practice because my supporting research question didn’t get an answer citing any.

  9. HZCH says:

    I started to read this piece, then I launched the trailer, and between the (artificial?) tearing and the seizure-friendly flashes, I’ll have to pass my turn (and miss something interesting)…