Best Of Gamescom: IL-2 Sturmovik And Oculus Rift

I didn’t believe in the Oculus Rift. Not in the way that I don’t believe in Derek Acorah, the phantom of credibility, but in the way that I don’t believe in matter transporters or eating only one biscuit. The Rift had seemed like an impossible dream, a product of improbable technology and the overly forgiving impressions of excited humans. Now I believe and all it took was a flight over Stalingrad in a Sturmovik.

Almost everyone I knew who had tried the Oculus Rift told me that it was, if not the future, at least a very compelling part of a possible future. They waved their hands as if they were trying to sell me an overpriced heap of a boat, standing, sipping their fizz and muttering to each other like members of an oddly exclusive cult.

“Total immersion,” one of them would occasionally rhubarb to the others, “deliciously complete and absolutely total immersion.” They sounded like they had molasses for gums and could have been talking about bathing in llama milk for all that it mattered to me. If I asked them to explain what made the immersion so absolutely total, they’d splutter into their drinks. “It’s impossible to put it across in words. You have to be there.” Ha! You lousy voyagers of the virtual, I’m not buying what you’re selling.

Except, of course, now that I’ve tried it for myself, I very much AM buying it as soon as it’s available. The Rift works astonishingly well, even when its eye-cups are projecting murky, low-res imagery directly into the brain. That said, for all my newfound excitement, I do have my doubts as to which games it will suit and how effective it’ll be in the long-term.

Like so much else, the following words are mostly about cockpits.

My first experience with the Rift was at the Hawken booth at this year’s Gamescom. I didn’t jump into the actual game, just looked around the garage. I adopted a very British approach to virtual reality, barely shifting my head at all but letting out the occasional ‘cor’ or ‘blimey’ when the in-game view tracked even the slightest movement of my head. I didn’t want to make a fuss, so when the nice man in charge of proceedings asked if I felt like looking over my shoulder, I went along with it.

I’d been fully aware of the people around me, the thousands of them pressed into the convention centre, I’d been aware of the booth’s flimsy walls and the monitor, whose lightshow was now imprinted on my eyeballs. When I adjusted myself in the seat to look behind me, I saw the back of the garage, where bots were being hosed down or steaming quietly. Probably. I don’t remember the details because the sudden realisation that the game world was now all around me knocked me for six.

As I recovered, out at the boundary, I slipped the Rift off and smiled sheepishly. I’d made a strange excited noise at the end there and wanted to be sure that it hadn’t completely punctured my inscrutable veneer. No chance. I was as scrutable as a Roger Scruton primer and everybody in the room was smiling at me as if I’d just performed some kind of magic trick. Developers, in my few short experiences, are delighted by the Oculus Rift, as if it were their own creation. Their excitement at seeing their work delivered in this way is infectious.

Hawken was the appetiser before the main course. It’s fitting that IL-2 Sturmovik, a new expression of a revered classic, should be my first real introduction to the true future-tech experience.

I sat and I listened, impressed by the scope and the ambition of the team, as Sturmovik was described, with slides, videos and photographs. I’ll briefly cover some of the things I was shown. Bear in mind that I didn’t know about the Rift integration until I was taken into the next room. I thought that the slideshow and talk were the extent of the presentation. Now, it’s little more than a footnote.

Sturmovik can, the developer confidently states, be used as a training program for pilots. It is an authentic simulation, going so far as to use recently declassified files in order to recreate the machines that fought in the skies above Stalingrad. Ten planes have been modelled, with soft body physics allowing them to break apart realistically as the stresses of flight, combat and impact take their toll. The development team build them as if they were engineers, constructing each part and then attaching it as if making a real vehicle.

The terrain covered is enormous – I’ve written 118×103 (?) in my notepad – with the city itself a grey puddle in a much wider landscape. Across that vast area, missions are generated dynamically, with unique objectives every time, in both single and multiplayer. Servers support up to one hundred players at a time but even the fight against AI is promising – no special physics are gifted to the computer that would allow it to cheat the system. It flies as a player does, which allows it to make the same mistakes and to battle against the unpredictability of the simulation.

All of that sounds good but it’s no more than talk until we get to see more of the game. What is more than talk is how impressive Sturmovik already looks. Individual buildings are recognisable, recreated from period photography. The city is dense and believable, and vehicles are extremely detailed, including the planes themselves, and their cockpits.

Yes, the cockpits. I was still thinking about the brief Hawken experience as I watched and interrupted the presentation: “Are you thinking about Oculus Rift integration?”

For the second time that day, everybody in the room was smiling.

The setup was in the next room, a cupboard really, with the Rift set to one side and a screen set up with the game running. Chief Producer Albert Zhiltcov entered the room with me, practically dancing with enthusiasm, and told me to put on the headset.

I felt like a professional now, a veteran following one previous field mission. They wouldn’t have to tell me, panto-style, to look behind myself this time. I was ready to play it cool and to have a little more fun.

Zhiltcov told me that he’d take the stick himself, which was something of a relief. I like flight sims but it takes me forty minutes to be comfortable with the controls before I’m even ready to take off. I’m not the most confident pilot.

To be a passenger, flying over Stalingrad, is perhaps the most peaceful experience I’ve ever had in a game. There was no combat – presumably because the game is still in the early stages of development – just the scenery and the plane. The instruments are right there, close enough to touch. The stick moves in sync with the pilot’s instructions and the glass coverings over dials and read-outs catch the sun as we turn, tipping a wing toward the sky. I look up and to the right, following the line of the wing as it touches a cloud. The clouds are three dimensional shapes – they’re probably volumetric, but I don’t really know what that means – and the metal makes a vague impression on the shape as it passes through. It’s extraordinary to see. It would be even if it were on a screen but on the Rift, where it seems like a physical thing in close proximity, it’s awe-inspiring.

Not explosions, not the intensity of combat or a parkour chase – clouds. The sky, the land beneath. When I turn in the other direction I’m looking right at Zhiltcov, who I no longer acknowledge except as a voice, responding to my requests even as he points out every interesting sight between our plane and the horizon. As we turn, the view to the left is of the ground, a bleak landscape, made more so by the low resolution grime of the image.

It doesn’t bother me, that technical imperfection. It’d be like complaining about the resolution of stained glass. A truly spectacular sight does not necessarily rely on fidelity and cleanliness to convey its effect. I’m more impressed by this than the sights from my window on the flight to Cologne. The view is less constricted, the altitude is lower and the pilot is willing to take risks.

When our wingmen come into view, they are specks in the distance. We accelerate, the sound of the engine more present than before I donned the Rift. When we catch them, I look up at the belly of one fighter, craning my neck to keep it in view as we overtake. The other flies alongside us. I can see the pilot. And then, without warning, Zhiltcov hits the ejector button.

The sound cuts out, replaced by the (hopefully not) ragged flap of canvas and the gentle rush of wind. When I look up, I can see the parachute above me, the cables that hold me in place. Below, the world seems like waves, gently swaying and ready to welcome me back. I scan the skies, hoping to see my plane as it prepares to end, but I’ve already missed it, or perhaps looked in the wrong direction.

I removed the Rift before I hit the ground and talked with Zhiltcov about the technology, how difficult it has been to implement (not very) and how excited he is by the possibilities. We talk about that more than we talk about the game, which is a shame in some ways, but unavoidable. It was the highlight of my Gamescom, by some distance, and I’ve been trying to put it into words ever since.

When the player is in a moving object but his/her body is not actually in motion within that body, the Rift appears to translate brilliantly. Using it in a first-person shooter, where the motion of the avatar body would be necessarily refuted by the player’s own means of control over it, seems odd. The Rift simulates the movement of a head, nothing else, so it is an ideal fit for games with cockpits. I’m not joking even slightly when I say that I want Euro Truck Simulator 2 implementation right now.

It’s the ideal tool for tourism and, used as I used it, with another player at the controls, it seems a wonderful way to share games. It’s a passenger device as much as a pilot device. I want to look at panoramic photographs using it and to wander Street View. More than anything, I want to fly again. It helps that Sturmovik is shaping up to be such an ambitious game, but I think I’d be almost as happy in another flight sim, something older and more rusted.

Whether it’s trains, planes or automobiles that they spend their time in, the Oculus Rift is destined to be a favourite tool of simulation advocates. Next, I want to take it into space.


  1. Busty says:

    I’m amazed you didn’t get motion sickness.

    • golem09 says:

      1. He already had experience with it.
      2. Anything in a cockpit causes nowhere near as much motion sickness as a game where you move your body directly.

    • Tiax says:

      Plus, not everyone gets motion sickness. I know I never got any with my Oculus.

      • Premium User Badge

        keithzg says:

        Yeah, I played a few 1hr sessions of Half-Life 2, and even with all that mouse movement (necessary to spin around and all that) I only got a slight feeling of . . . oddness afterwards. But by contrast my one housemate tried the Museum Of The Microstar demo and with no mouse movement at all was nonetheless starting to feel nauseous within a minute.

    • Imbecile says:

      For me the motion sickness was related to the software I was using at the time. I was fine with the rollercoaster and Team Fortress 2, but the Villa really made me feel ill. The effects seem to vary a lot, also apparently you get used to it as others have said.

      • InnerPartisan says:

        Imbecile: Do you ever get motion sickness when you play any games “normally”?

        I don’t* – but I’m still worried that it might ruin the experience for me, after hearing so many reports about it. I’m just so damn hyped for the Rift, even though I haven’t tried it myself yet.

        *: I do get other forms of motion sickness. Reading in a car, for example, makes me extremely sick.

        • Imbecile says:

          No, I have to admit I don’t, and I was made pretty nauseous by the Villa demo. I think it might have been something to do with the speed the screen updated. Not sure – it’s definitely worth giving a go, but I wouldn’t buy anything until you’ve tried it, especially if you are prone to motion sickness.

    • Dozer says:

      I was getting motion sickness just watching the videos…

  2. dcgc says:


  3. Premium User Badge

    Aerothorn says:

    I was sadly a bit underwhelmed by my Rift experience. No matter how I tried to seat it, it was always a bit blurry, like the angle was a little off. It’s possible I have some sort of eye condition that prevents me from experiencing the 3D effect properly.

    But more than that, the low resolution was really noticeable. Don’t get me wrong; the Rift is a fantastic achievement that does pretty much exactly what it promises to do. But I couldn’t see this actually replacing a monitor for standard play, because with the screen so close to your eyes the pixelation was extreme. The experience is totally worth it, but it would have to be a game in which graphical detail wasn’t that important.

    • johnkillzyou says:

      The consumer version will have a higher resolution though, so that might change. And they’re planning to have some form of calibration for a person’s eyes. Dev kits seem to lack this at the moment though.

      • Stochastic says:

        Will the consumer version be higher than 1920 x 1080? Ideally, it seems like they need some kind of variable DPI screen, with the greatest pixel density in the central foveal region and lower dpi in the periphery. I imagine that would be a technical nightmare to implement, though.

        • johnkillzyou says:

          I highly doubt it. The technology is good enough for HD, maybe 720 or hopefully 1080. Going any higher will just make the cost highly prohibitive.

        • xao says:

          Nope, and I suspect it won’t even be close. The bandwidth required to pump 1080+ resolution graphics with zero noticeable lag is… intimidating. It’s technically possible, but in combination with the screens themselves, there’s no way the Rift could come close to its $300 price point.

          • analydilatedcorporatestyle says:

            I disagree. With ultra high definition screens for smartphones that exceed 1080p now being released, in a year or two they will be at a price point where a sub-£300 Ultra HD (UHD) Rift is highly likely.

            Having the hardware to run two UHD screens at a decent frame rate will be the expensive bit!

          • arccos says:

            Just for clarity’s sake, the OR uses a single physical screen. That’s one of the technical magicks that keeps the price down and the images in sync. It’s the software that generates the two distinct images, which the eye cups stretch out to fill your vision.

            But yes, pumping out two high-res 3d views can be pretty taxing on current systems.

          • Premium User Badge

            phuzz says:

            It’s basically the same problem as driving two 1080 desktop screens and that’s already possible. Sure, you’ll need either a powerful single card, or multiple graphics cards to play a current game at full resolution and high settings, but people can do that (or even 3 or 6 screens!) now.
            For example.

          • InnerPartisan says:

            The bandwidth required to pump 1080+ resolution graphics with zero noticeable lag is… intimidating

            How is that in any way relevant? There are screens with much higher resolutions, after all. It’s not like the Rift uses Bluetooth or anything – it’s got a *cable*.

            Right now, the price is the main factor limiting the resolution – but I can’t see a reason why we wouldn’t see some kind of “Ultra HD”-Rift in a few years time.

          • analydilatedcorporatestyle says:

            Didn’t realise it was a single screen, still the new generation of smartphone screens will go hand in hand with Rift’s development!

        • Jim Dandy says:

          I’ve been thinking about the fovea and display resolution too. Can we exploit the fact that we construct our perceptions from a patch of full-colour, hi-res vision that’s the size of a thumbnail held at arm’s length?

          There’s a television (I’d look it up if wp8 didn’t make opening another tab so painful, I think it’s a Samsung) that has RGB LEDs behind the bezel. The LEDs respond to the colours on the screen – if you’re looking at footage of a beach, for example, the LEDs behind the top of the screen would go blue, the middle ones blue-green, and the bottom ones yellow. If you hang the telly on a white wall and concentrate on the on-screen image, your brain assumes the colours projected on the surrounding wall are part of the image and fills in extemporised detail to suit.

          Logically, we only need to render something like a 64×64 or 128x128px patch of an image, with wondrous brainular trickery filling in the rest from very basic tonal cues. Rendering anything beyond that is literally a waste of cycles. The tricky part would be tracking the saccades – the rapid movements of the fovea as it scans the image. You’d need a “saccade-buffer”. The questions are whether the buffer would be computationally cheaper than just displaying the whole image, and whether the movement of the fovea can be tracked tightly enough.

          Tracking should be relatively easy. There are plenty of analogue signals available: pupil motion, eye muscles, neuro-electric. The sharper the tracking, the smaller the buffer. Is there some degree of predictability to saccades? Is an upper-left usually followed by a lower-right? Does the fovea react consistently to MOBs? What kind of refresh-rate and frame-count do you need to smoothly follow the saccades?

          If this stuff could be worked out it could have a huge impact on the design of VR displays. Fewer pixels of higher quality. Theoretically, you could render a >4k image on a 640×480 display if you only had to render a tenth of those pixels in a given frame. Even subtracting the overheads of buffering and tracking the saccades, you should have several orders of magnitude more headroom to process the pixels you are rendering.

          • Stochastic says:

            Interesting. I hadn’t thought about saccade tracking. I’m not sure if it would be precise and fast enough today, but I guess it would work if you could get it to act in under 16 ms (the time it takes to render 1 frame at 60 FPS). I haven’t used VR myself, so I’m curious as to how much most people move their eyes around the display. I suppose if you’re playing something like a shooter you can’t just rely on head tracking, so the entire screen might need a high DPI to appear sharp (although as you note, not necessarily all parts of the screen simultaneously).

          • Jim Dandy says:

            So, according to this:
            link to
            …there are large 20-200ms saccades as well as constant ‘micro-saccades’ of around 20ms. 60fps should be enough. Interestingly, we appear to neurologically mask out the signal from the macula prior to each saccade in a constant ‘blank and refresh’ cycle. Would we need to v-sync with our optic nerve?

            Could you send 64x64px down a lightweight fibre-optic loom to a contact lens, projecting directly on to the macula? To track the saccades, maybe a couple of contact electrodes to read eye-muscle triggers, or an IR or optical tracking system. Squeeze in the necessary code somewhere between your head-tracking and z-buffer systems.

            Also interesting: apparently birds use saccades to help their eyeballs breathe. Freaky!

          • kaffis says:

            Saccades are an area of study that hasn’t been done in too much depth yet, as we’re only recently really getting the ability to measure and track them *for study* reliably.

            Trying to track them to constrain real-time display is probably impractical, as such. One of two things would prohibit that in our near future: either the unpredictability we’d discover they have would preclude targeted pre-rendering (since you have to start rendering the proper section of the image many milliseconds *before* the saccade’s motion stops in order to have it ready by the time your brain starts paying attention to the signals from your fovea again) or the computational power required to track and extrapolate the saccade pattern would approach that of the computational savings… Especially since we still need to have a moderately accurate, if much lower resolution, display for the rest of our field of view, as well.

            Furthermore, it doesn’t really help display tech be “virtually” higher resolution, since you still have to have the maximum pixel density throughout the area of the display, because the display itself isn’t tracking (which was how I initially read your proposition; needless to say, it sounded even more improbable at that first glance)…

      • mwoody says:

        The dev kits have three pairs of cups for users with 20/20, slight nearsightedness, or high nearsightedness. You can also adjust the lens distance from your eyes, and in software, you can measure and set your IPD (“inter-pupillary-distance”, the space between your pupils) and height to further match your point of view.

        The main problem is that the game has to support these options. Unity-based games have these inherently, and pull automatically from your profile. UDK-based games do not, and are basically terrible right now (hard to install, hard to configure, tons of limitations like not being able to modify the pitch/yaw/roll of the viewer independently of the rift controls). Source-based games are better but not as “push a button and go” as Unity; you often need to use command line or console commands to adjust settings.

        I really think supporting the Rift so readily was the smartest business move Unity has ever made.

        • Mctittles says:

          I don’t have much experience with optometry so I could be way off, but I’m curious. I wonder if the Oculus Rift could be used to give people with certain eye conditions (like lazy eye) the ability to see a 3d world normal for once.

          • Axess Denyd says:

            At a guess, I would say probably sometimes.

            I have that problem, and they actually can correct it somewhat by putting a prism in my glasses (not noticeable unless the glasses are being worn, but it basically makes it so the image shifts in each eye until they line up). I don’t get the FULL effect though since I didn’t have one until I was in my early 20s and so my brain just didn’t fully learn to process 3D in that manner. It gets tired and after a while I start to suffer from what they call “horror fusionis”, where my brain actually PREVENTS the eyes from lining up, even with the prism in place.

        • InnerPartisan says:

          I wonder why there would be special lenses for nearsightedness (myopia) – shouldn’t there rather be some for farsightedness (hyperopia)? Being myopic, after all, means that you can see objects right in front of your face perfectly fine (technically speaking, even slightly better than normal-sighted people).
          I mean, I always take my glasses off when looking through binoculars, for example. Why should that be any different for the Rift, with the focal point being right in front of the eyes?

          • Mctittles says:

            Trying to wrap my head around this, and I think the main problem might be if your eyes are very dissimilar, since the rift will calculate distance spread between eyes as if both are equal.

            Also when in fake 3d you are not focusing on the screen right in front of your eyes, but further in space to things that don’t exist. The illusion of things further down is made by separating the two images to where your eyes would be when focusing on something far away and with nearsightedness your focus point would be different. Binoculars work in an actual 3d world and don’t try to simulate it so it’s kind of different.

          • Axess Denyd says:

            It depends how nearsighted you are. I can see something 2″ in front of my face in perfect focus. I cannot see something 1.9″ or 2.1″ away in focus.

        • goettel says:

          Actually, the three lenses are identical, it’s the depth of the plastic cups that vary, providing somewhat variable focus. For me, that’s a problem: my left eye is -2,75, my right -4,25, meaning I can’t get a sharp image in both eyes. It’s usable, and I love the Rift, but I am looking forward to a better solution in the consumer version(s).

  4. Splynter says:

    This is what I was waiting for when I first saw the Rift announced. I can’t wait to get in dogfights where I can look around me without having to fiddle with a POV hat.

    • fish99 says:

      Well, TrackIR has been able to do that for years and is supported by most flight/racing sims. Not suggesting for one second it’s as natural as using VR though, since you have to turn your head but keep your eyes on the screen. It does support 6DOF though (so for instance you can lean left to see buttons obscured below the flight stick) which I think I’m right in saying the Rift currently doesn’t.

      • Pengwertle says:

        I’m pretty sure he is talking about the TrackIR.

        • Thiefsie says:

          I see what you did there.

        • fish99 says:

          If he’s talking about a POV hat he’s clearly not using TrackIR, which is a head tracking camera.

          • Splynter says:

            Nope, not talking about TrackIR. I know it’s been around for years, and while I’ve always thought that what it does is pretty cool, it’s never been enough to get me looking seriously into purchasing. The Rift, on the other hand, does most of what TrackIR does on top of being a VR experience, which pushes it over the edge for me. Those extra degrees of freedom would be nice, but IIRC the Rift is supposed to be getting more tracking capabilities with later models.

  5. Koozer says:

    The Oculus Rift stand featured lots of people sitting near-motionless on circular seating with black blocks on their head, with various attendants checking on them occasionally. It looked like a scene of people having their brains removed by aliens.

    Also, queues to the other side of the hall :(

  6. LXM says:

    Every time I play Euro Truck 2 I desperately want to try it with the Rift. That and War Thunder, which I expect would be a similar experience to IL-2

  7. Reapy says:

    I’m super excited about the Rift, and it really comes from experiencing IL-2 with a TrackIR-like setup (FaceTrackNoIR) – it was hugely immersive, so I imagine that with the Rift it will be awesome. A shame about the resolution, but I think the commercial version should have higher pixel density screens on it?

    I’m honestly ready to fight motion sickness to get used to the device because I think it will be THAT important in the future of experiencing virtual worlds… imagine exploring a MMO with that, imagine walking around guild wars 2’s cities for the first time with the rift on.

    But yeah, I think that early on the strength will be ‘cockpits’: mechs, planes, spaceships, racing games, hell, maybe just a slow ‘train ride’ simulator or anything like that. So much potential in bringing worlds to life with VR; it is really exciting that it is finally getting here.

    • Kerbal_Rocketry says:

      RSC (the makers of RailWorks 3/Train Simulator) seem to be planning to put Oculus Rift support into Train Simulator. But the low graphics quality might be a killer.

      I think that flight sims will have the most powerful effect, as you are using the correct control method (a joystick) and so it’s more immersive, whereas using a keyboard reminds you that you are sitting at a computer.


      • fish99 says:

        Yes flight and racing sims are the perfect application for something like the Rift, because you don’t have two control schemes fighting against each other like you would in an FPS. It’s just flight stick to fly and VR helmet to look around, so it’s really close to the experience of flying a real aircraft.

        FPS becomes more problematic. You’re letting the head turn independently of the body, and then you get into a dilemma of whether the mouse can also turn the view, and which device does aiming. I believe TF2 on the Rift has about 7 control schemes you can pick from, so it seems like a problem people are still figuring out, and there’s always a learning experience and some disorientation for the user since it’s not a natural control scheme.

        A data glove would help fix this so you can hold a dummy gun in your hand and aim naturally, while your other hand did movement on an analogue stick. There’s always going to be some disorientation with VR though because you have to give the user a way of turning in the simulation that doesn’t involve them actually turning their body (or else they’ll end up with wires wrapped around them, not to mention dizzy).

        • Sheng-ji says:

          Surely in an FPS, you would still turn with the pad or mouse but moving your head would let you look around you without having to spin your avatar – like a tank turret, you are running one way and looking/shooting in another, the turn of your head moves the gun to the way you are looking but the fine aim is still done the traditional way

          • Reapy says:

            I assume something very similar to the TrackIR Arma interface would be a good starting point for FPS. It is slightly disorienting when looking away from your movement path, but honestly the old MechWarrior games had this with torso twisting, so I assume it won’t be too bad, though I imagine it would need some indicator for how off-centre you are looking, as sort of training wheels when starting to use it.

            Thing is I could see snap to center as acceptable in a trackir set up, but disorienting in a vr one… lots of interesting problems to think about.

          • fish99 says:

            Wouldn’t that be kinda weird, so you look left with the Rift, see an enemy but can’t shoot them without turning your virtual body with the mouse. So you turn the mouse only to find you’re looking 90 degree left of where the enemy is now because you also needed to turn the Rift back to facing forward.

            It’s not a natural control scheme, it’s something you would have to learn. And that is just one of the 7 control methods they’ve implemented in TF2, some of the others are very different. Like I said, it’s like they’re still figuring out how VR should work in an FPS. If you think about running around with a real gun, there’s 3 things you have to control, your body, your head and your arm.

    • Busty says:

      ‘I’m honestly ready to fight motion sickness to get used to the device ‘

      The motion sickness is really profound and horrible. I must assume that they’ve fixed the issue now, since these press reports aren’t mentioning it. But if it still exists and it’s just being ignored or carefully managed in press testings, then you can expect a backlash to rival the Xbox One announcement if/when it finally comes out.

      • Koshinator says:

        There are ways to overcome the nausea you can get when first using the rift, and there is a period of adjustment you really need to take to get used to the experience. You can’t just put the rift on and jump straight into a game of TF2 and expect to adapt instantly. That said, another major reason for the nausea is that there is no positional tracking of the rift at the moment, so while orientation and rotation are tracked, moving forward or side to side results in no corresponding action in the game world. This can be a trigger for you to lose your equilibrium (and perhaps your lunch after a while).
        The consumer version will have positional tracking, and currently a good (if amusing) fix for this is to provide positional tracking by taping a Hydra controller to your head – this really cuts down on the nausea people feel (at least it does for me). I hardly ever get nausea now from my rift, whereas I was getting it all the time when I first got my device. I think the consumer version of the rift will have far fewer problems with this than the dev kits, but ultimately it will be the software that has the most responsibility for providing a vomitless VR experience.

        • liquidsoap89 says:

          “The consumer version will have positional tracking”
          Wait what? Is this confirmed? If so that’s super awesome.

          • Reapy says:

            Yes they have said that in most updates I have seen, which is why I’m waiting for the real release vs the dev one.

            One thing the devs mentioned in a post somewhere was that they make bad test subjects due to high use of the device; they grow immune to feeling sick, though it takes a while. I am somewhat used to that, since Jedi Outcast made me want to hurl all over myself, but I kept it up, basically feeling like a drug addict, and eventually got used to it.

            The only other game that did that to me in about 30 years of gaming was Half-Life 2, so bad that I couldn’t get through the boat levels until many years later, when I could run it at about 80 fps.

            Anyway, if the OR is immersive enough with the right game, training myself I will be.

      • Clavus says:

        The motion sickness is triggered in specific conditions, and it has varying effects on different people. Nobody is covering that up. Most of the time ‘cockpit’ games are used for Rift demos since those are the easiest on the brain and are less likely to cause nausea.

        The motion sickness isn’t as bad as long as you take your time in the beginning to adjust to it. Most people that had a bad experience tried to ‘push through’, which you shouldn’t be doing at all. Stop as soon as you feel slightly dizzy and take a break. After a few days you’ll be able to play longer sessions easily.

      • goettel says:

        I see few articles about sailing talking about sea-sickness, but many people still experience it.

        The point is that many people are willing to brave the adverse effects of a desired experience, be it VR, sailing or eating a hot chili.

        And yes, I do experience nausea, and sometimes even an associated slightly panicky feeling, in VR – probably not so different from someone with a fear of flying during a turbulent flight. But it won’t stop me.

    • Premium User Badge

      Aerothorn says:

      It’s worth noting that I did not experience one twinge of motion sickness, though clearly others have. Possibly the same people who were made sick by Descent? Not sure.

      • goettel says:

        Descent didn’t bother me at all, but I do get nauseous in VR sometimes.

        I’m not so sure that it’s down to individuals more than technology. There are many technical factors that could lead to nausea: not being able to achieve focus for both eyes with the three available cup sizes, too low an FPS, enabling (or disabling) vsync, using a cloned desktop for the Rift instead of an extended desktop (I read somewhere this might lead to sync issues, but I don’t know why it would), the position you use it in (tip: don’t slouch down on the couch while using the Rift, try to align your body with your virtual self), ghosting and misalignment of post-processing, etc. etc.

  8. apa says:

    How does one ‘rhubarb’ to someone?

    • BTAxis says:

      It’s a word that represents that wash of voices that you get when a lot of people are talking simultaneously. The story goes that to get that particular effect, sound effect makers will get some people, stick them in a room and make them all say “rhubarb” repeatedly.

      • Stochastic says:

        I don’t know if there’s any truth in this or if it’s just a case of fanciful folk etymology, but I find it amusing either way. I’d love to hear a room filled with people mumbling “rhubarb.”

        EDIT: Ha, if Wikipedia is to be believed it looks like there may be some truth in this! link to

        • jash says:

          This reminds me of a prank we used to play back in high school. Load up the back of someone’s pickup (back when riding back there was legal) and hit up all the local drive-through fast food places. When they asked us for our order, 12 teenagers would overwhelm the person at the other end with absolute gibberish all at once.

      • apa says:

        I thought you were joking but then I saw Stochastic’s Wikipedia link :) So, this comment thread is a ‘rhubarb’ to the other comments actually saying something about the article…

  9. sabasNL says:

    I’m not sold until the Rift can support HD, and more games (that actually do work with it) support it.

    • El_MUERkO says:

      They’ve said they’re aiming for a 1080p panel in the retail version. For the experience to be optimal you want a framerate of 60+, but because you’re displaying slightly different images to each eye and creating a concave effect, which the lenses then alter to give you a sense of edge-to-edge imagery, your computer is doing more complicated rendering than you see on a flat screen.

      Simply put, you can get retina displays with a higher resolution than the Rift’s, but if you put them in a Rift you’d need a ten grand water-cooled mega computer to run it and not melt your eyes.

      That being said, if Elite Dangerous supports it then I want one! :D

    • goettel says:

      That’s entirely reasonable, which is why the Oculus site basically advises consumers (instead of devs) not to buy the devkit. I’m blown away by mine, and I’ll have a co-op VR ‘seat’ available when the consumer version drops.

  10. Sathure says:

    I’m picking up the commercial Rift once it’s available strictly for Star Citizen.

    If you haven’t yet and have the capability, go try the Hangar module with 3D Vision. That sold me right off the bat. My god so much bloody detail. I can only imagine it with the rift, head tracking while flying in the cockpit. It’s going to be like your literally there.

    • Stochastic says:

      “It’s going to be like your literally there.”

      I have a feeling generations to come will look back on such proclamations the same way we look back on the story/myth about a 19th-century audience being overwhelmed by a Lumiere brothers’ film showing a train pulling into a station. According to the story, the image appeared so lifelike to the audience’s untrained eyes that it instilled fear among them (link to ).
      Regardless of the story’s veracity, it’s remarkable how our standards for verisimilitude have changed over the years. Maybe one day people will think that our first reactions to VR headsets are quaint.

      • InnerPartisan says:

        No need to go back so far. It’s likely that everyone who’s been playing video games for more than, say, ten years has experienced that effect themselves.
        I remember well the reviews for Eurofighter 2000 (and other flight sims) back in ’96, describing the game as “photo-realistic” and “unbelievably life-like”.

  11. wodin says:

    Track IR is fine… but I have no interest in the Rift at all. I have a sneaky feeling it will be a craze for a bit but eventually hardly get used. No doubt some health warning about your eyesight will come along as well.

    When the day comes that you walk into a room, hit the holo switch and then bam, you’re in Middle-earth or somewhere, then yes, I’m up for that. This helmet though… I need to see the keyboard when flying flight sims, for starters. As for FPS, when some sort of hand and arm accessory comes out, and maybe even gun-shaped controllers so you actually feel something in your hand, then we’ll be getting somewhere. So no, the Rift does nothing for me… great fun and probably great to mess with at a party using some rollercoaster tech demo.

    • Andy_Panthro says:

      The expense and the motion sickness put me right off it. I’m sure the consumer version will come down in cost (but it will still be expensive), but I don’t see it being that much of a boost to gaming to rush out and buy it. Especially since I don’t play enough FPS or similar sorts of games to really get much out of it.

      I’m sure it would be nice for something like Gone Home or Amnesia, but I feel immersed enough in FPS games without requiring 3D of any sort.

      • Reapy says:

        Could be very interesting for strategy titles. Playing Fire Emblem on the 3DS was interesting; the levels appeared like small miniature battlefields and gave quite an interesting look. I am a huge fan of mini terrain (though honestly I don’t play mini war games, I just like making terrain), but an OR would really bring a tabletop or boardgame-style game to life. Total War would probably look amazing from a bird’s-eye view, floating above the battlefield.

        The possibilities are honestly endless with this tech working. If I could get in on this company in any way I would buy, buy, buy.

        • SominiTheCommenter says:

          To bring a boardgame-style game to life you need a boardgame, not a rift.

      • fish99 says:

        How do you know until you try it? For the record, I find 3D (Nvidia 3D Vision) in games way more immersive. The Rift would be even more immersive.

    • P.M. Gleason says:

      If you use a keyboard for a flight sim, you aren’t flying a flight sim.
      I would argue that most sticks are terrible, too, due to deadzones. I always stick with the Cessna Skyhawk-style yokes from CH or Saitek.

      I was an early adopter and am still a current user of Nvidia 3D Vision, and I’ve never had motion issues with it (though I did have to turn the effect way down when I first got it), but I used to get motion problems from FPS games to the point where I hated them. I remember playing Doom from a 3.5″ floppy and puking half an hour later.

      As for a flight sim, yeah, it’s extremely likely you’ll be thrown around a gyroscopic loop if you have full range of vision like that. Wait until they simulate g force, then you’ll really be filling up the old drunkard’s bags.

      • SuicideKing says:

        I believe what s/he means is that the keyboard is used in addition to the joystick. And honestly, I find that more comfortable than having both hands on a joystick.

        And flight sims (or any PC game for that matter) do have a lot of controls on the keyboard.

        • Dozer says:

          This is a real-life problem, which is why HOTAS (Hands On Throttle And Stick) was invented: it allows pilots to operate the important stuff by touch while keeping their eyes outside.

          There are three normal ways of interacting with a simulator:

          1) Joystick controls which you can operate by touch
          2) Keyboard controls which you need to look at unless you’re very familiar with your keyboard
          3) Mouse controls where you use the cursor to click on a control drawn onscreen.

          But also super secret bonus method:

          4) Build custom interface hardware using cardboard, plywood, cheap switches from eBay, and a programmable interface board. All of which are very very cheap nowadays.

          This is the first one I made: link to

          Not designed for blindfold/OR operation, as it’s a grid of switches basically, but OR-friendly stuff could be made easily I’m sure. This one simulates the autopilot interface for one specific aircraft in X-Plane, and communicates with X-Plane directly by MAGIC, but these things can also pretend to be joysticks and keyboards and interface with the computer that way.

          Also this stuff is very fun to make :)
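
          As a rough illustration of why those cheap interface boards can handle a whole panel of switches, here’s a toy simulation (in plain Python, since the real thing would be firmware) of a matrix scan – the usual trick where each row is driven in turn and the columns are read, so R + C wires cover R×C switches. The function name and grid size are my own assumptions, not from any particular board:

          ```python
          # Toy model of scanning a switch grid like the panel described
          # above: firmware 'drives' one row at a time and samples each
          # column line, so 4 row pins + 4 column pins cover 16 switches.
          # On a real interface board the inner check would poll GPIO pins;
          # here a set of closed contacts stands in for the hardware.

          def scan_matrix(closed_switches, rows=4, cols=4):
              """closed_switches: set of (row, col) contacts currently
              pressed. Returns pressed switches in scan order, as firmware
              would report them before mapping each one to a keypress or
              joystick button."""
              pressed = []
              for r in range(rows):            # drive row r low
                  for c in range(cols):        # read each column line
                      if (r, c) in closed_switches:
                          pressed.append((r, c))
              return pressed
          ```

          The mapping from a scanned (row, col) pair to a simulated keypress or joystick button is then just a lookup table, which is the part these boards let you program.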

        • P.M. Gleason says:

          A decent stick (not a “joystick”) will eliminate the need for a keyboard in most games, even flight sims. Like Dozer said, a HOTAS is a good place to start. The downside of the one I use (a Saitek yoke) is that I also need pedals, but for me, that isn’t a downside because I’m in flight training irl.
          You can emulate pedal functionality with sticks that have a twisting base. But the biggest problem with sticks is that they have a deadzone. When you’re in a real aircraft, that isn’t really an issue. There is no deadzone on a stick or a yoke irl.

          When you actually go for your pilot license, depending on what you’re doing, you’ll end up getting something called an instrument rating. This means that you can fly a plane without ever looking out the window or getting an external reference, looking only at the controls in the cockpit. Think about that next time you’re fumbling on a keyboard and realize that you aren’t playing a flight sim if you’re using a keyboard.

          This is a good place to start if you don’t mind having something that isn’t a stick:

          link to

          It’s money, but it is less costly in the long run than buying a string of really shitty sticks. Stick with Saitek or CH.

  12. bamjo says:

    How do you control a plane in a game like IL2 without being able to see the keyboard? I’m assuming that using the Rift replaces all the view key functions, but even so there are pages of keybindings for controls. Are you pretty much required to use a HOTAS setup with a full key profile? Are there even enough buttons on a joystick/hotas to do this? And you can’t see those buttons either. It seems like you will have to memorize the controls by feel, increasing the already daunting learning curve for a flight sim.

    Or maybe I have a fundamental misunderstanding about how the Oculus Rift works. It seems like this demonstration was tailored to avoid issues like this by making the participant a passive passenger. I’d like to see how the practical aspects of actually playing a game with this thing on your face are addressed.

    • Premium User Badge

      Aerothorn says:

      This is actually an excellent question. My Rift demo was also a passenger experience, and I did not have the foresight to ask the Rift guy this very question.

    • Solanaceae says:

      I play IL2 a lot and honestly it’s not as big a deal as it might seem to memorize many different key positions. If you can type decently you already know most of them, so it’s really just a matter of learning which keys do what, and the slight adjustment of pressing keys while not having your hands in the “standard” typing position, which, again, is a lot easier than it might look on paper.

      • SooSiaal says:

        I play a lot of flying and racing sims using a joystick/wheel, and I can find all the buttons blindly. The keyboard will be a bit trickier, I guess; controllers are really a must with the Rift, I think.

    • johnkillzyou says:

      It seems best to use a HOTAS approach, but otherwise the numpad is your friend. It’s easy to feel your way through that if you want to easily interact with your keyboard.

    • Arbodnangle Scrulp says:

      I expect devices like the Logitech G13 will become popular.

      link to

    • Martel says:

      Mavis Beacon

    • SystemiK says:

      I have a similar (although much more important) concern with the Rift.

      How do I find my beer mug on the desk without tipping it over? If I have to slide the headset up onto my forehead every time I wish to take a sip of my delicious adult beverage it’s probably gonna be a deal breaker…

      Seriously though, if I were on the design team I would absolutely put a cheap forward facing cam on the front and tie it to a hotkey, pressing the hotkey would shift your view in the Rift to the cam view and you could momentarily see your hands, desk, keyboard, phone…whatever it is that you needed to look at momentarily.

      I’d imagine they have already considered this option, though I’ve never heard any mention of it in the Rift coverage.

    • dagudman says:

      If you are a PC user then you should have an idea where the keys are on the keyboard. Otherwise you can buy a controller. In fact I tyoed all this with my eyes closed.

      • Wut The Melon says:

        Was that an intentional typo? : P. Though I do agree, I rarely if ever look at my hands when playing a video game with M/KB. I suppose being able to find the keys blindly would be a requirement when using the OR.

    • analydilatedcorporatestyle says:

      Maybe you could fix your keyboard to the desk, calibrate its position in three dimensions (can the Rift triangulate position in the real world with its standard sensors?) and then actually map keys that would be graphically represented as buttons by software. You would probably have to keep a good gap between the mapped keys to account for error/inconsistency, but I could see it working!
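
      For what it’s worth, here’s a toy Python sketch of that calibration idea – the layout, key pitch and tolerance values are all made-up assumptions, just to show how the “good gap between mapped keys” lookup might work: given a calibrated keyboard origin, lay the keys out on a grid and report which key a tracked fingertip is nearest, or nothing if it falls outside the tolerance.

      ```python
      # Toy sketch of the calibrated-keyboard idea: given a keyboard origin
      # measured in room coordinates, lay the keys out on a staggered grid
      # and look up which key a tracked fingertip is hovering over.
      # All names and numbers here are illustrative, not a real Rift API.

      KEY_PITCH = 0.019  # metres between key centres on a typical keyboard
      ROWS = ["QWERTYUIOP", "ASDFGHJKL", "ZXCVBNM"]

      def key_positions(origin):
          """Map each key to an (x, y, z) position, offsetting successive
          rows by half a key, as on a staggered physical keyboard."""
          ox, oy, oz = origin
          positions = {}
          for r, row in enumerate(ROWS):
              for c, key in enumerate(row):
                  x = ox + c * KEY_PITCH + r * KEY_PITCH * 0.5  # row stagger
                  y = oy - r * KEY_PITCH                        # rows recede
                  positions[key] = (x, y, oz)
          return positions

      def nearest_key(fingertip, positions, tolerance=0.015):
          """Return the key closest to the fingertip, or None if nothing
          is within tolerance -- the 'good gap' accounting for error."""
          best, best_d = None, tolerance
          for key, (x, y, z) in positions.items():
              d = ((fingertip[0] - x) ** 2 +
                   (fingertip[1] - y) ** 2 +
                   (fingertip[2] - z) ** 2) ** 0.5
              if d < best_d:
                  best, best_d = key, d
          return best
      ```

      With the origin at the Q key, a fingertip right on the origin resolves to Q, while one half a metre away resolves to nothing – which is roughly the error-rejection behaviour described above.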

    • goettel says:

      I’ve ordered a bunch of these link to to create a more tactile keyboard, so I can blindly align my hands to the keyboard by touch and find important (for me) keys like WASD, M, TAB, 1, 5, 9, F1, F5, F9 etc.

      The Razer Hydra is a nice option for ‘blind’ control too, but it has some limitations, i.e. lag and precision, which its successor (STEM) will address. I’m waiting for a good dataglove offering. The day you can just use the dials and switches in your virtual cockpit with controllers like those isn’t far off.

  13. Hypocee says:

    I had my first Rift experience at PAX Prime last weekend; it was almost immediately nauseating, though I think that was for one specific reason. One of the projects at the one-day, literally underground Seattle Indie eXpo was Energy Hook, a Crackdown/Jet Grind Radio/Tony Hawk style game with wallrunning and a grappling hook system from the designer of Spider-Man 2’s web physics. I wanted to try out the game anyway, so why not jump on the Rift? The line for the HD prototypes had remained around the block the whole show and I didn’t care enough to go through it.

    From third or first-person perspectives, running, wallrunning and jumping were all fine. However, swinging was immediately and terribly nauseating – because, I thought and think, the camera is the same as the screen version and so is always helpfully slewing toward your current direction of motion. I terminated use of the Rift, finished messing with the game (which seems great even in alpha state) and had a bit of a chat with the developer. I think panning – specifically panning – the camera without input from the player or world state is absolutely forbidden in Rift design, and said so at the time, but of course I was doing so on the basis of a couple minutes’ experience so I couldn’t exactly carry a lot of authority.

    I remain confident in the product and don’t believe there’s any conspiracy to cover up motion sickness effects – largely because of Owen Hill’s extensive discussion of the general absence, extent and triggers he experienced in his FPS hands-on in PC Gamer UK Podcast 74, following even more extensive worrying about it on the previous episode.

  14. racccoon says:

    They’ve probably got shares in an optometrist company. Blind leading the blind.
    Nice game though. Please don’t fall for this; just put it on a screen that’s not going to make us more blind than we’re already going.

  15. Faxanadu says:

    Oh my god, oh my god, oh my god!


    Getting rid of staring at a square tile! Can you guys IMAGINE the IMMERSION? Oh boy oh boy oh boy!

    I’m much too fussed to give an insightful reply.

    I wish I knew when I should start throwing money at this, though.

    • fish99 says:

      For the record the Rift doesn’t completely fill your view, i.e. you can still see the edge of the screen, and blackness beyond.

  16. Morte66 says:

    TBH I’m mostly interested in the rift for watching movies lying on my back in bed.

  17. Rich says:

    Devil take these useless eyes!

  19. daphne says:

    I live in a second-and-a-half-world country, and part of me is terrified that I won’t be able to obtain a Rift when it comes out, for whatever reason. :(

  20. bar10dr says:

    It’s so… strange to read a gamer review where it’s so obvious the author is truly blown away; it doesn’t happen very often any more.

    I think what needs to happen is variable pixel density on high resolution panels coupled with eye tracking.

    The thing is, the eye focuses on a very tiny spot; as long as you keep that spot in really high def, you could make the surrounding pixels larger. Of course the panel itself would have to be really high res, think 4K, but you could double up pixels in the rendering engine the further away from the eye’s focus the pixel is.

    I’d imagine eye focus direction calculation within such a small range would be much easier than real-world calculations, because you don’t really need that high a fidelity to work out where on the screen it’s focusing; you could do it within a mm instead of a tenth of a mm.
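
    That “double up pixels away from the focus” idea can be sketched very simply – this is a hypothetical illustration with made-up radii, not how any real headset does it: pick a sampling scale per screen tile based on its distance from the tracked gaze point, rendering at full resolution near the fovea and progressively coarser further out.

    ```python
    # Illustrative sketch of gaze-dependent pixel density: tiles near
    # the gaze point render at full resolution, and the effective pixel
    # size doubles in steps as tiles get further away. The fovea radius
    # and step size (in screen pixels) are arbitrary example values.

    def sample_scale(tile_center, gaze, fovea_radius=100, step=150):
        """Return 1 for full resolution near the gaze point, then 2, 4
        or 8 screen pixels per rendered sample out in the periphery."""
        dx = tile_center[0] - gaze[0]
        dy = tile_center[1] - gaze[1]
        dist = (dx * dx + dy * dy) ** 0.5
        if dist <= fovea_radius:
            return 1          # inside the fovea: render every pixel
        # each `step` pixels beyond the fovea doubles the pixel size,
        # capped at 8x so the far periphery stays recognisable
        return 2 ** min(3, 1 + int((dist - fovea_radius) // step))
    ```

    A renderer could then shade one sample per `scale`-sized block and upscale, which is exactly the saving the comment above is after: the panel stays very high res, but most of it is never shaded at full density.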

    This is the first tech in many many years I’ve really thought, wow, this is incredible.

  21. analydilatedcorporatestyle says:

    The advent of ultra high definition smartphone screens exceeding 1080p will be a boon for the Rift. In a year or two they will be at a price point where a sub-£300 Ultra HD (UHD) Rift is highly likely.

    Having the hardware to run two UHD screens at a decent frame rate will be the expensive bit!

  22. Furius says:

    oh you can’t mention Acorah without linking to this link to

  23. FullMetalMonkey says:

    I wonder if it is possible for the Rift to have an experience where the person wearing the Rift is viewing through the eyes of a game character, but all of the controls and movement etc. are handled by another player viewing the game in its regular format, i.e. third person. Similar to the experience mentioned with IL-2 Sturmovik. But imagine if it was implemented with a game like Grand Theft Auto 4 (and/or 5, if it ever comes to PC).

    I can imagine a few friends coming round for the evening and me taking them on a virtual tour of Liberty City or San Andreas. They’d be viewing the game from the eyes of the character while I control it in its regular third person.

    Would be amazing. Sitting in the back of one of Liberty City’s cabs, driving through Algonquin, watching the world pass by.

    Someone make this please!

    • goettel says:

      You already can: use Tridef for stereoscopy and head-tracking and play the game on the (split) screen while someone else wears the Rift.
