Fuuuuuture: Control Son Of Nor With Your Thoughts

By Nathan Grayson on May 14th, 2013 at 1:00 pm.

Control entire games with only one eyebrow!

Keyboard-and-mouse versus controller? Pfft, only dinosaurs still have that argument. And motion control? That requires primitive hardware like living room boxes and flesh-infested bodies. No, today I’m bringing you a vision of our collective brain-in-a-gelatinous-tube future – the next mighty step in progress’ inevitable march. Son of Nor, you see, is set to allow for partial mind control. Using a wearable cyber crown called the Emotiv EPOC, you’ll be able to cast spells by thinking really, really hard until something cool happens. Or at least, that’s the impression Still Alive Studios’ video gives, and it’s quite an impressive thing indeed.

So yes, keyboard-and-mouse for movement and sheer, unfiltered thought – the essence of human existence, the backbone of all sciences and arts since the dawn of time – for magic powers. Sounds sensible enough.

Of course, it’s perfectly understandable if you’re still harboring some skepticism. I know I am. This, apparently, is how the EPOC device works:

“Based on the latest developments in neuro-technology, Emotiv presents a revolutionary personal interface for human computer interaction. Emotiv EPOC is a high resolution, multi-channel, wireless neuroheadset. The EPOC uses a set of 14 sensors plus 2 references to tune into electric signals produced by the brain to detect the user’s thoughts, feelings and expressions in real time.”

There are different suites calibrated for specific sorts of detection, too – for instance, facial expressions, emotions, or conscious thoughts. The latter sounds most similar to Son of Nor’s razzle-dazzle-spontaneously-combuzzle tech. “Users can manipulate virtual objects with only the power of their thought,” reads Emotiv’s website. “For the first time, the fantasy of magic and supernatural power can be experienced.” And so can passive voice!

I highly doubt Son of Nor will launch with a perfect implementation, but the very fact that games are starting to dabble in this stuff at all is basically insane. Back in my day, the best heavy lifting brains could do was advanced physics or teaching Furbies how to curse. Now: boulders. Take that, physicists.

Son of Nor’s still accepting monetary sacrifices at Kickstarter’s monolithic altar, and it’s about 1/3 of the way to its $150,000 goal. Has this one caught your (mind’s) eye?





  1. MajorManiac says:

    But when will I be able to control my mind with a keyboard and mouse?

    • Poppis says:

      The mind doesn’t support mouse+keyboard. You have to use a controller.


  2. Porkolt says:

    I’m pretty sure a device similar to this one already exists – I’ve used it.

    The one I used seemed significantly less advanced, though. It only had 3 sensors or something.

    All it seemed to do was measure slight facial twitches. Getting it to do even the simplest thing was a bitch, not an experience I look forward to repeating in order to get the hang of using it.

    • Autogyro says:

      Yes, the problem with these ‘mind-readers’ is that they cannot actually read your mind. Facial muscles create electrical currents a thousand times stronger than any neural activity. There is no possible way to actually filter out the noise introduced by muscle movement.

      The video provided shows very clearly how these systems work. “.. and now we can lift it up without hands”, raises eyebrows sharply, “… and now lift it up again”, raises eyebrows sharply again, “… and shoot!”, clenches jaw muscle.

      “…with your mind!”

      The ideas behind these mind-readers are nice, but the only thing that is possible (and what these devices actually are) is facial-muscle (and neck-muscle) detection.

      • Dieterling says:

        While it seems that he was using signals from muscles to control the game, it is absolutely possible to make a headset that reads cognitive signals and use software processing to separate out the signals from other sources. If we couldn’t, EEGs would be useless as a diagnostic and research tool in medicine and neuroscience. Wikipedia even has a nice section about this in their EEG article: https://en.wikipedia.org/wiki/Electroencephalography#Artifacts

        I personally think that he was using facial muscles for the video demo because they’re much easier to reliably detect, but the 2010 TED video (https://www.youtube.com/watch?v=fVhggGSjXVg&feature=player_embedded) demonstrates that they can indeed detect brain signals with some degree of reliability with this device, as they show that their software has distinct control layouts for detecting muscle activity and cognitive activity.

        • Autogyro says:

          There is a large difference between a controlled environment in which the subject is not allowed to move and someone sitting behind their desk. The difference between these two setups is precisely why in a clinical or scientific study you can extract data, whereas in a more practical environment you can do absolutely nothing.

          The Wikipedia article is very short and mentions only the applications and research into signal decomposition. It does not, however, detail how well it works and in what contexts. The signal-to-noise ratio can be kept high in studies by discarding all the data that was destroyed by noise, by keeping noise to a minimum by not allowing the subject to move, and by removing very predictable noise, such as the blink of an eye or eye movement. What you cannot do, however, is negate the noise that using your neck muscles, or any of the larger facial muscles, introduces into the signal. All you can do is discard that data and try again.

          When it is virtually impossible in a clinical setting with very expensive equipment, it is safe to say that all these systems do is look at muscular activity. Even in the TED talk, nothing shows that they actually used any neural signals; just by tensing the muscles in your face you produce more noise than the system could ever handle.
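The “discard that data and try again” approach described in this thread is, in fact, how many offline EEG pipelines handle gross muscle artifacts. A minimal sketch of amplitude-based epoch rejection — the threshold and the toy data below are purely illustrative, not taken from any real headset:

```python
import numpy as np

def reject_noisy_epochs(epochs, peak_to_peak_uv=150.0):
    """Drop epochs whose peak-to-peak amplitude exceeds a threshold.

    epochs: array of shape (n_epochs, n_samples), values in microvolts.
    Scalp EEG is on the order of tens of uV, while muscle/movement
    artifacts are typically hundreds of uV, so a simple amplitude cut
    removes the worst offenders (at the cost of throwing data away).
    """
    ptp = epochs.max(axis=1) - epochs.min(axis=1)
    keep = ptp < peak_to_peak_uv
    return epochs[keep], keep

# Toy data: 3 clean epochs (~20 uV noise) and one with a 500 uV "jaw clench".
rng = np.random.default_rng(0)
clean = rng.normal(0, 10, size=(3, 256))
noisy = rng.normal(0, 10, size=(1, 256))
noisy[0, 100] += 500.0

kept, mask = reject_noisy_epochs(np.vstack([clean, noisy]))
print(kept.shape[0], "of 4 epochs kept")  # the jaw-clench epoch is discarded
```

Real pipelines tune the threshold per subject and per channel; the point is simply that a jaw clench is so much larger than scalp EEG that even a crude amplitude cut catches it.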

          • Dieterling says:

            The “Controlled environment” used in diagnostic EEGs is just getting the patient to sit down and putting the sensors on them. They do not need to be perfectly still, like in an MRI. The tolerance of EEGs to the wearer’s movement is actually listed in the Wikipedia page I linked you to as one of the biggest advantages of EEGs:

            “EEG is relatively tolerant of subject movement, unlike most other neuroimaging techniques. There even exist methods for minimizing, and even eliminating movement artefacts in EEG data [14]”

            On top of this, the headset isn’t being used to collect data as precisely as a clinical EEG; it’s detecting fairly broad categories of signals that the wearer is deliberately trying to make over several seconds. You don’t need the accuracy of a clinical EEG to detect these signals because you’re not using the signal to discover life-or-death information, so you can make do with a noisier signal and slower responses (5, 10, 20 or even 50 milliseconds isn’t going to matter that much in a singleplayer game) and hence make it really cheap.

            If EEG data became completely useless when the patient moves muscles, its primary clinical use probably wouldn’t be the monitoring of brain activity during seizures: https://en.wikipedia.org/wiki/Electroencephalography#Clinical_use

            Also, the Wikipedia article is 19 pages long and has 71 citations – what do you mean, it’s short?
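Detecting “fairly broad categories of signals … over several seconds”, as argued in this thread, is roughly band-power detection: compare the power in a frequency band over a multi-second window against a calibration baseline. A hedged sketch — the 128 Hz rate matches EPOC-class headsets, but the threshold and the synthetic signals are invented for illustration:

```python
import numpy as np

def band_power(signal, fs, lo, hi):
    """Mean spectral power of `signal` between lo and hi Hz (via FFT)."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    band = (freqs >= lo) & (freqs <= hi)
    return psd[band].mean()

fs = 128                     # EPOC-class headsets sample around 128 Hz
t = np.arange(0, 4, 1 / fs)  # a 4-second window -- deliberately slow

rng = np.random.default_rng(1)
rest = rng.normal(0, 1, t.size)                   # background noise only
focused = rest + 3 * np.sin(2 * np.pi * 10 * t)   # strong 10 Hz rhythm added

# Calibrate a baseline from the rest recording, then test the focused one
# against the 8-12 Hz (alpha) band.
threshold = 5 * band_power(rest, fs, 8, 12)
print(band_power(focused, fs, 8, 12) > threshold)  # True: signal detected
```

Nothing here needs clinical-grade precision: a deliberate, sustained rhythm held for seconds towers over the baseline, which is exactly the “noisier signal, slower responses” trade-off described above.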

          • maweirhas says:

            You folks missed this, clearly.

        • DrZhark says:

          EEGs detect many different signals, and there is a great deal of background noise. With an EEG we can’t discern between ‘red’ and ‘blue’ or ‘on’ and ‘off’ thoughts. We use EEG to find epileptic wave patterns, which are distinctive spikes of electrical activity on a very confusing, noisy wave pattern. Besides that and sleep studies, the real value of EEGs is rather limited.

          My source is not Wikipedia; I’m a physician. I think what Autogyro says is accurate.

      • realitysconcierge says:

        I was really excited until I read your comment :-( lol

      • Dozer says:

        When I see stuff like this I ask myself, “What would Dr Gregory House do?” The answer, obviously, is to drill into the skull and plant electrodes directly into the brain. Preferably with the patient conscious and in the back of a moving train or something.

    • staberas says:

      I have an OCZ NIA, and it’s true: alpha and beta are ridiculously difficult to manipulate. Only facial muscles/eye movements are reliable, and even then there is a small latency due to the signal decoding.

      Update: I found Triathlon, an open-source project that combines the NIA with machine learning, and it kinda worked when I was playing Warframe…

  3. AzzerUK says:

    Emotiv + Oculus Rift = Lawnmower Man? Count me in!

  4. MuscleHorse says:

    “Let’s start with a little telekinesis.”


  5. baby snot says:

    Can we have a caption contest for the banner image, or whatever it’s called?

  6. Baka says:

    Wow, high resolution AND multi-channel?

  7. RProxyOnly says:

    Mind control my arse.

    It works on changes in the brain/body’s electrical field; it has no idea what the brain is asking it to do, it simply reacts to the peaks and troughs of electrical activity. I’m sure you could easily get the same results by manipulating a battery near it.

    Anyway, it still looks shite, sorry.. stupid and shite.

    • Grargh says:

      Bad day? Nobody has any idea what the brain is actually doing, apart from sending electrical activity here and there. You control your whole body with your mind by nothing* but electrical pulses sent to your muscles; there isn’t any more magic to it than with a hypothetical brain-computer interface.

      *edit: Of course, a lot of chemical stuff is also happening, but that’s not the kind of direct control we’re talking about I guess.

      • RProxyOnly says:

        Don’t be stupid.

        Your limbs DON’T ‘just’ react to impulses. The impulses form instruction sets that are TRANSLATED by the body/limbs.. Your body knows VERY WELL what’s being asked of it.. which ISN’T just the same as reacting to dumb random impulses.

        There is a MASSIVE difference between what your body does and what this ‘dumb terminal’ does.

        • ComfortFit says:

          “Don’t be stupid.” says the man clearly uneducated on how the body works.

          Cells make up your body, and this is a key point, cells do not have any way of “translating” anything other than their own DNA. They communicate primarily through chemotaxis (an immediate response to a chemical present, usually using cellular membrane surface molecules to detect them) and, in the case of muscle and nervous tissue, electrical impulses. If you want to test this yourself, grab a fork and stick it in the nearest electrical socket. Notice, if you will, that you lose motor functions as your muscles begin to involuntarily contract. It is chiefly due to the electricity surging past them, causing them to fire off and flex their stuff.

        • Dieterling says:

          Your muscles do not “translate” any signals sent by the nerves; they react to the change in potential, and that’s about it. Your muscles do not have any processing power of their own; they don’t have an “instruction set” they can search through and execute commands from. Different movements are just nerve signals of different frequency: nerves can discharge their potential every few milliseconds, and stronger movements are simply more discharges per second than weaker movements.

        • Grargh says:

          A computer getting signals from a brain can actually do a lot more with that information than a muscle, which is just a bundle of strings that contract when stimulated. All the processing done for movement takes place in the brain – in fact, if you wire up an artificial limb to some redundant nerve endings, in time the brain will learn to use it by sending the proper electrical signals. That’s what should happen if you train yourself to use such a “mind control” input device. Why it doesn’t work that way yet is explained in some comment above.

          • Brun says:

            All the processing done for movement takes place in the brain

            Mostly, yes. Things like pain reflexes are “processed” in your motor neurons and spinal cord, since the motions are relatively simple and “processing” them in the extremities is faster than sending and receiving signals from the brain while risking further injury.

  8. Christo4 says:

    Aww, you people and your criticism… You just jelly you can’t move a death star with your mind.

  9. ChrisPolus says:

    Hey RPSsers!

    I’m Chris, the producer of Son of Nor. I think Baby Snot had THE IDEA!

    CAPTION CONTEST! Go to Reddit, participate and win a DIGITAL COLLECTORS CO-OP EDITION should we succeed on Kickstarter: EDIT – new URL

    Good hunting!

  10. Ross Angus says:

    He should have ended the video with “… and if you don’t believe me, that this is actually working, then I will destroy you with my mind.” Then just stared into the lens, for ten seconds, in silence.

  11. DuncanIdah0 says:

    I once tried using voice commands in WoW to cast spells with my druid. Although saying “starfire” was fun and cool to show people, it was very impractical, and that was with a few voice commands that were recognized well and consistently…

    With a system like this, more difficult to operate and prone to misdetection, I don’t think it will even reach the level of a gimmick.

  12. strangeloup says:

    I’ve no strong feelings about the game itself one way or another, but if you’re prepared to spend $300 on a magic brain hat then I’ve got some snake oil you might be interested in. The Oculus Rift is similarly non-cheap but actually demonstrably works. They both make you look a bit daft.

    Also I had a really hard time understanding that guy’s accent, for some reason.

  13. dahools says:

    But can he do it with his little webcam turned off, the one that watched every movement of his arms and face? If this is real and not the fakest thing I have ever seen, that was the worst video ever made to prove it; there could have been a tiara on his head, or a baked potato, and he could still have done the same thing with that camera attached to his monitor!

    Or am I just having a bad day?

  14. Hoardian says:

    Is this compatible with my tinfoil hat?

  15. Clavus says:

    I guess that it’s possible if they can read brainwaves of sorts. Not that they can interpret what those brain waves actually mean, since everyone’s brain is slightly different (apparently), but that’s probably where that ‘spell learning’ comes in. It just records whatever your brain is doing while you “learn” the spell. Then afterwards it keeps checking whether the same brain-signal pattern repeats itself, and if so, it casts the spell. So I could link a fire spell to thinking about a bunny, for example. They don’t explicitly say this in the video (I hate it when they’re hazy on technical details), but this is how I expect it works.

    The success of this device will completely depend on its comfort, ease of use, latency, accuracy and application. All of which need to be great.
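Clavus’s guess (record the signal pattern during “spell learning”, then cast when it repeats) is essentially template matching. A minimal sketch using normalized correlation; every signal, name, and threshold below is synthetic and purely illustrative:

```python
import numpy as np

def similarity(template, window):
    """Normalized correlation between the learned pattern and a live window."""
    t = (template - template.mean()) / template.std()
    w = (window - window.mean()) / window.std()
    return float(np.dot(t, w) / len(t))

rng = np.random.default_rng(2)
spell = np.sin(np.linspace(0, 6 * np.pi, 128))  # the "think of a bunny" pattern

# Calibration: store one noisy recording of the pattern as the template.
template = spell + rng.normal(0, 0.3, spell.size)

# Playtime: score each incoming window against the stored template.
cast_attempt = spell + rng.normal(0, 0.3, spell.size)  # thinking of the bunny
idle = rng.normal(0, 1, spell.size)                    # thinking of nothing

THRESHOLD = 0.6  # illustrative; a real system would calibrate this per user
print(similarity(template, cast_attempt) > THRESHOLD)  # fire the spell
print(similarity(template, idle) > THRESHOLD)          # stay idle
```

This also explains why per-user “learning” would be necessary: the template is whatever that one brain produced during calibration, so nothing about it transfers between players.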

  16. SuicideKing says:

    You folks missed this, clearly.

  17. Cara Ellison says:

    Where is this guy’s soup

  18. nevarran says:

    Just chilling out, playing some mp with friends.
    Damn it, the dog’s chewing on my couch again. “Stop it, fool!” Damn it, I just shot a team-mate, fuck! No, damn it, it happened again! For fuck’s… Damn you, mind, stop it!

  19. Strangerator says:

    I support any effort to reduce the number of hotkeys I have to manage, although it’ll probably be a while before the technology is up to gaming standards. Imagine, if you will, a world where you don’t have to move your left hand from WASD to press the number keys. Truly, the future is bright.
