Week in Tech: 2015 Hardware Hotness

Your new gaming PC...

We wrapped up 2014 with the best hardware bits of the year. Ever the innovator, I’m thinking: how better to start 2015 than with a look at the likely highlights of the next 12 months? Empty speculation and a dash of rampant SEO cheekiness? Perish the very thought. Instead, hang your cynicism on a coat hook for half an hour and humour me. With upsides that start with faster graphics and cheaper SSDs, and might extend to some free gaming performance for your PC courtesy of Microsoft, turning your TV into a massive gaming rig for under £100/$150 and perhaps even a VR revolution, 2015 might not be so bad after all.

What, then, looks hot for 2015? We could start with the CES show, but that’s ongoing as I type. So we’ll do highlights next week. For now, there’s one nifty little item from the show floor that augurs, er, interestingly for this year. The Intel Compute Stick.

Put simply, it’s an HDMI dongle with a full-function PC inside. Yup, Windows 8.1 (mingy Bingy version), a quad-core Atom CPU, 32GB storage, 2GB RAM, N-spec wireless and yours for $149. Running games on the stick itself is likely marginal at best. But on paper, it looks like it could be quite the thing for Steam game streaming.

Frankly, the idea of instantly turning anything with an HDMI port into a PC is plain awesome. If it works. We’ll see.

On the subject of Steam streaming, what about Steam boxes in particular and Linux gaming in general? When it comes to Valve, who knows. They may suddenly drop the whole thing or just as likely finally pull the finger out and put a bazillion bucks into the project. That’s the problem with an outfit with so much money to burn. They can easily afford to get it wrong.

Win10 probably won’t rock our gaming worlds, but DX12 might

The next obvious candidate would be Windows 10. Except it’s actually the upcoming DirectX 12 multimedia API that will be the, ya know, game changer. I remain somewhat sceptical, but the claim is that DX12 is going to make games run dramatically faster. The idea is to remove overheads and have games run on the PC more as they do on single-purpose games consoles.

The technicalities involve things like properly splitting game CPU loads over multiple threads so you can actually make use of those cores sitting idle in your PC and also reducing the overhead associated with draw calls. The latter bit is obviously gibberish but it means there will be less CPU work associated with each individual object rendered in a game. That’s why Intel’s DX12 performance demo last year involved zillions of asteroids (well, 50,000) being rendered in real time.
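
To put illustrative numbers on that draw-call point, here’s a toy back-of-envelope model. The per-call overheads and thread count below are made-up assumptions for the sake of the sketch, not measured DirectX figures:

```python
# Toy model of per-frame CPU cost for submitting draw calls.
# The overhead constants are illustrative assumptions, not benchmarks.
PER_DRAW_OVERHEAD_OLD = 50e-6  # hypothetical seconds of CPU per old-style draw call
PER_DRAW_OVERHEAD_NEW = 10e-6  # hypothetical lower-overhead DX12-style draw call
THREADS = 4                    # DX12 lets several cores build command lists in parallel

def frame_cpu_ms(draws, per_draw, threads=1):
    """CPU milliseconds spent submitting `draws` draw calls per frame."""
    return draws * per_draw / threads * 1000

draws = 50_000  # the scale of Intel's asteroid demo
old = frame_cpu_ms(draws, PER_DRAW_OVERHEAD_OLD)          # single-threaded submission
new = frame_cpu_ms(draws, PER_DRAW_OVERHEAD_NEW, THREADS) # threaded, lower overhead
print(f"old: {old:.0f}ms per frame, new: {new:.0f}ms per frame")
```

Even with made-up constants, the shape of the result is the point: cut the per-call overhead and spread submission across idle cores, and a frame budget that was hopeless becomes plausible.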

The best bit, on paper, is we’ll get all this for free from a hardware perspective. It’ll be compatible with most if not all DX11 graphics cards as far as I can tell.

Whether you’ll actually get DX12 entirely for free isn’t completely clear. Obviously Windows 10 isn’t going to be free. And I don’t think an absolutely categoric announcement has been made. But all the indications are DX12 will be released as a free update for Windows 8, though probably not for Windows 7. So that’s some extra performance for absolutely nothing. Doesn’t happen often.

DK2 was a big step up, how good will the retail Rift be?

A full retail version of the Oculus Rift headset and the dawn of VR gaming is an obvious mention for 2015, too. Virtual reality has hitherto been one of those techs that always seems like it’s about 10 years away and somehow never gets any closer. Until Oculus Rift came along and made everything look a lot more plausible, that is.

Personally, I haven’t had enough experience of the Oculus Rift to make a judgement call on this. But what I will say is that the difference today is that the hardware pieces of the VR puzzle are clearly in place. Without things like cheap but super high-res screens, VR is a non-starter. Now we have them, and VR becomes predominantly a software problem.

And as Alec recently pointed out, it’s far from the only VR game in town.

It could be the proverbial next big thing or just a clunky stepping stone between conventional screens and what I assume is the end game years hence – some kind of direct link into your visual cortex. But it’s been a while since we had something that really changed the way we play games.

We don’t even seem able to shake off the bloody keyboard and mouse, even if Elite: Dangerous may have had a few of you HOTAS’ing last year. Actually, what about Elite and Oculus for 2015? Sounds pretty sexy to me. Anyway, here’s hoping 2015 is the year VR comes good.

Speaking of screens, will FreeSync and/or support for high refresh rates be broadly adopted in monitor land this year? I doubt it. It’s an awfully price-orientated market and new standards are frustratingly slow to catch on. So I don’t think HDMI 2.0 and 120Hz+ will be defaults in 2015, much less those features combined with IPS on a 40-inch 4K panel. Things will be more incremental, a year of consolidation.

I foresee 4K and 40 inches in my near future…

But that’s not all bad given how much newness we had in 2014. So all the hot new display formats, like those superwide panels, the curved stuff, the 4K shizzle and the rest, should get a little cheaper, and the arguable sweet spot of 27-inch 1440p monitors will hopefully become very mainstream indeed. With any luck, none of you buying a new monitor will have to settle for 1080p in 2015.

As for SSDs, 2015 should be fairly hawt. M.2 plus that NVMe stuff should become widespread and deliver the big jump in drive performance I’ve been waiting for over the last few years. Meanwhile, 3D flash memory should make 500GB drives cheap by year’s end. The snag is that you can’t just plug one of those new M.2 drives into any old motherboard. At the very least you’ll need an adaptor card.

But what of ye olde CPUs and graphics? AMD isn’t due to do anything remotely exciting on the CPU side and while Intel’s high-end Haswell-E was a pleasant surprise, in general it continues to sandbag. So expect little by way of PC processor fireworks in 2015. Instead, incremental improvement will remain the name of the game.

AMD 290s will get even cheaper this year

Not so graphics. 2015 should finally see both AMD and Nvidia break free from the 28nm shackles that have been holding them back for around 18 months. Unless something goes hideously wrong, AMD should be wheeling out some new Radeon R9 300 Series boards and Nvidia will unleash some die-shrunk Maxwell GPUs, including a true high-end chip to sit above the new GTX 980.

Either way, by the end of 2015, there will be GPUs on offer that make today’s quickest look pretty pedestrian and might just crack out decent 4K frame rates from a single graphics card. What they specifically won’t be, however, is cheap. Not in 2015, at any rate. But new high end cards will inevitably put the squeeze on existing range toppers. So more pixel pumping for less cash will be the result. Hurray.


  1. Wisq says:

    4K is tempting, but I’m holding back for now. The big issue is, I’m pretty sure there’s no video card combo out there that can reliably drive that many pixels at a steady and respectable framerate, which means downscaling. And downscaling is generally best done in multipliers, such as half-res. And half of 4K is 1080p.

    I’ve been a 2560×1440 IPS 27″ monitor user since they were first a thing, and I really can’t see going back to 1080p. Driving a 4K at 1440p would be 2 virtual pixels for every 3 physical pixels, which would likely be ugly as sin.

    The sad thing is, they’re coming out with 5K which would nicely half-scale to 2560×1440, but they currently need crazy two-cable solutions because even a single DisplayPort can’t drive that many pixels. I hear the next DisplayPort spec will deal with that — further cementing it as the cable format of the future and finally getting rid of terrible DVI — but for now, not an option.

    I guess I’ll wait and see what comes first — video cards that can drive 5K on a single cable, or video cards that can reliably drive 4K at 60 FPS.

    • mvar says:

      VR headset is another good reason to hold back on 4k displays. If these things deliver what they promise, for me it’ll be 1st priority along with a good GPU.

    • DanMan says:

      I’m kind of in the same boat. I don’t trust any OS enough to scale its UI well enough to make 4k reasonable at a monitor size that I could see myself using, like 27″. I don’t need a single bigger screen, I have a TV for that.

      And like you said, for games you’d need a monster of a PC to play at the same detail settings @ 2160p60 compared to 1080p60. I don’t play that much anymore to justify spending that much money.

    • TacticalNuclearPenguin says:

You’re right about downscaling, but be careful with that half-scaling idea ( like running a 4K @ 1080 ): it never works as you’d expect on paper. The only half solution would be upscaling the image rather than having the monitor adapt.

Reason for this is sub-pixel layout: even if you ask 4 small squares to act as 1 big one, they’ll still be divided into four RGB patterns that are not supposed to be stitched together. Here’s an example and here’s what different layouts can do in practice. Whatever appears sharpest in this link is your current layout.

You can test that too with your 1440p monitor: just set 1280*720 and you’ll realise that the output is not just blockier ( obviously ) but also blurrier, not as sharp as if the screen were a native 720p one.

Don’t worry though, you’re fine with your resolution, it’s the only realistic successor to 1080p for now and i’d wager it’ll still be like this for another 2 years easily. AHVA ( IPS-like ) 144Hz G-sync panels are also coming, more proof that this resolution is here to stay.

Oh, and there’s nothing wrong with DVI as long as you don’t want anything above 1440p @ 60Hz: there’s no difference at all in output and it always works without issues, even with flaky software/hardware combos. Google is full of DP problems.

But sure, moving forward DVI will rightfully be abandoned for good, i’m not denying that.

      • Wisq says:

        Yeah, my problem with DVI is not the technical specs, just that the cables and connectors are giant and clunky. And also that any exposed-pin-based connector is far more likely to see damage than modern contact-based ones.

    • Geebs says:

The good news is that we’re finally getting resolutions high enough that running an LCD at anything other than an integer multiple of native is just about acceptable; 3D games in particular look surprisingly not-terrible at lower resolution on a ‘retina’ laptop screen, and I’d imagine a 4K display would be similar.

      • TacticalNuclearPenguin says:

Yeah, approaching the limit of most people’s visual acuity helps, as long as the non-native resolution you’re using is still pretty seriously high.

        Not everyone will be so “lucky” though ( i can easily see the pixels in 95% of the displays at the “correct” viewing distance, even retina stuff ) and the thought of the absolute splendor of the real native output would personally taunt me forever and ruin my sleep.

Uhm, ok, now i’m being overly dramatic. Still, i’d say to only consider crazy resolutions for now if you want to enhance things other than just gaming; a native 1440p is already great even for those upgrading from 1080p, and the market is showing more gaming support as well with the monitors i mentioned.

      • Jeremy Laird says:

        Yep – 2,560 x 1,440 typically looks pretty decent on a 4K panel. You can really only just tell in-game that it’s not native. Much more obvious on the desktop.
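
As a quick sanity check on the scaling arithmetic in this thread, the ratios work out exactly as described (the helper function below is just for illustration):

```python
def pixels_per_virtual(native_width, target_width):
    """Physical panel pixels spanned by one rendered pixel, along one axis."""
    return native_width / target_width

# 4K panel (3840 wide): 1080p scales cleanly, 1440p does not.
assert pixels_per_virtual(3840, 1920) == 2.0  # integer: clean half-res scaling
assert pixels_per_virtual(3840, 2560) == 1.5  # 3 physical pixels per 2 virtual
# 5K panel (5120 wide): 1440p is a clean half-res target.
assert pixels_per_virtual(5120, 2560) == 2.0
```

Which is the whole case for 5K over 4K if you already run a 1440p monitor: the integer ratio survives.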

  2. Andy_Panthro says:

    The trouble with reading these articles is that I keep wanting to wait and buy the latest, greatest tech, but I actually spend most of my time playing older and indie games, so don’t really need all that power.

  3. John O says:

I toyed around with a Gewgl Cardboard. The S3’s 1280×800 resolution came out crappy, but the S4’s full HD made things much nicer. It did make me aware of how important the little things are: you’ll want a fully integrated solution, which makes those competing headsets look awful compared to the sheer coding power Facebook bought along with the Oculus devs. And it’s needed: ironing out kinks in the head tracking, setting up drivers, calibrating those things, game integration and all that. If you haven’t been blessed with 20/20 vision, you know how fiddly getting a new pair of glasses or having your vision tested can be. VR glasses are sort of like that. Good luck getting your optician to cooperate in picking out the perfect lenses for your homegrown gadget.

    When you’re putting an expensive piece of heavy equipment on your head, you want it to work. You do not want to repeat the process of fitting it twenty times because of drivers not working, the lenses being out of focus or hitting the wrong button in your improvised toolchain. That’s not going to happen.

  4. Baines says:

That HDMI Stick PC looks nice. But, uhm, what about TVs and monitors that have the HDMI port facing downwards? I’m pretty sure I couldn’t use it on my PC monitor, because even if the stick were short enough not to bump the desktop, it certainly isn’t short enough not to bump the monitor base.

    And what about side mounts? Surely that is a bit of weight (with more dangling bits as you attach an external drive or whatever else) to be hanging from a side port. And again, it will stick out.

    Do they make male-to-female HDMI cables? I guess that could be a solution, even if you lose that sleek plug-in appeal.

  5. thaquoth says:

    Dunno about the HDMI stick thing.

I mean, it’s neat, but… the geeky tinkerer in me already got a Raspberry Pi, which is certainly more tinker-y and most certainly more geeky. And I don’t see many other demographics this thing would generally appeal to.

    • frightlever says:

      It’s a Windows 8.1 PC with several times the processing power of a Raspberry Pi. The only downside that I can imagine would be lack of external peripherals, where the Pi conceivably has it beat.

      • Phasma Felis says:

        That sounds roughly as inviting as “a sports car with several times the power of a countertop blender.”

  6. airmikee says:

    I remember trying VR back in the 90’s at a cybercafe downtown, full helmet/glove/rollerball walking tread combo setup. I was so excited to try it after seeing ‘The Lawnmower Man’ and ‘Johnny Mnemonic’. Nothing I’ve seen from the Oculus Rift comes close to wiping away those bad memories. I’ll keep laughing at the Rift until it gets washed away in a sea of better ideas.

    • Scandalon says:

      Sooooo…. your experience a couple decades ago means you’ll discount the experience of all the people now saying “remember how back in the 90’s it sucked? It’s starting to not suck, and has the potential to be great!”? Sure, don’t blindly go out and buy one, but…really?

      • airmikee says:

        I know you started to see red the moment I expressed my dislike of your favorite product, but please try to read my comments before you reply to them.

        “Nothing I’ve seen from the Oculus Rift comes close to wiping away those bad memories.”

        /beginsarcasm Yep, I’m basing my opinion solely on what I saw in the 90’s, and I am NOT comparing it to what is available today. /endsarcasm

        At least the stuff in the 90’s had gloves and a walk in place treadmill to give me the illusion of actual movement and actual control. The Rift has better graphics which is the least important part of video games, but since I’m still going to be sitting at my desk, I think I’d rather look at a monitor. If I’m going to immerse my eyeballs into one thing, then I want to be fully immersed. To me, virtual reality needs to be as close to actual reality as possible, and the Rift just doesn’t even come close to approaching that level. They’re fancy glasses, and MAYBE a stepping stone to VR, but the Rift is NOT VR. /nosarcasm

        • Asurmen says:

          That’s why VR isn’t appropriate for all games if you want balls deep immersion. You might want those extra bits for an FPS, but you don’t need those extra bits for a flight sim for example.

You’ll also find graphics are a huge part of the immersion, more so than the rest of the gimmicks you mentioned. If you can’t convince the eyes, none of the other parts will do squat.

The Rift IS VR, and based on your idea that you need a treadmill and control interface (which isn’t needed for all games, as I’ve pointed out) for something to be true VR, there will never be true VR, because that’s beyond the reach of 99% of customers. How many people realistically have a room they can set aside for VR?

I’m finding it hard to figure out whether you’ve tried the Rift yourself or just seen videos of the concept. How is the Rift clearly not superior to 90s tech? What would you class as a superior idea, and would it be relatively attainable by the majority of interested people?

          I really don’t see a sea of better ideas so you’ll be laughing for some time.

        • iainl says:

          Which is why Elite, Eve Valkyrie and Assetto Corsa are the hotness in Rift-land, because they’re all games that simulate activities performed sitting down.

          • cederic says:

            If ETS2 gets ‘Rift support then I may have to buy one.
            If American Truck Simulator gets ‘Rift support then I may have to buy one and ATS.

        • remon says:

          obvious troll is obvious

        • FriendlyFire says:

          “The Rift has better graphics which is the least important part of video games”


      • Scandalon says:

Heh, perhaps I replied a bit brusquely, or misread your tone. It’s not my “favorite product” (no VR headset has yet touched my face), and I have no particular love for products or companies; it was just the (apparent) dismissive thinking. You’re right, no headset sitting at the desk will match what we imagined “VR” would be after watching Lawnmower Man or looking at the Sega 3D glasses ad in the magazine. :)

    • fish99 says:

      Ever consider that VR might be miles better now?

      • airmikee says:

        No, I hadn’t. /endsarcasm

        • fish99 says:

          Well if you’re laughing at the Rift because you tried bad VR 20 years ago that suggests to me that you haven’t.

        • Phasma Felis says:

          You clearly hadn’t, so I’m not sure what’s supposed to be sarcastic.

    • particlese says:

      Lots of excited people above! I actually got a similar reaction to yours with my DK1 from someone who was old enough and techy enough to remember early 90s VR* well. He thought the DK1 was cool and is something to keep an eye on, but he didn’t feel like it was the massive leap a lot of us have in our minds. That said, I differ in my opinion and am totally excited, especially since Oculus now seems to really be trying to make hand tracking decent before the consumer release. :)

      *If I remember correctly, he said it was at a convention, as opposed to a mall entertainment setup, in case that matters. I was only about 10 when I played Virtuality or somesuch, so I don’t remember it well.

      • DodgyG33za says:

I am also old enough and techy enough to remember VR in the 1990s. The excitement I felt when going down to use the Virtuality rig in the Trocadero in London! Even with its high latency, heavy headset and poor resolution it was amazing.

I bought the DK1 as soon as I could, and was amazed but ended up not using it much – mainly because it was very temperamental. It was exciting enough for me to buy the DK2, which is absolutely amazing when playing Elite Dangerous. The head tracking made a massive difference.

Sure, you have to tinker with it to get it working well. Sure, I had to upgrade my graphics card to a GTX970 to run at decent framerates, and buy a HOTAS so I didn’t have to take it off. And sure, you look like a complete pillock while wearing it.

        It IS amazing. Every time I put the Rift on to play E:D I get a thrill when I see my virtual cockpit. It feels so familiar and real.

        The future has arrived. At the moment the future is best suited to sit down games inside a cockpit (space, racing, flight, mechs, tanks etc) but it is only a matter of time.

        The rift is the difference between watching a game on your monitor and being IN the game. It is a game changer.

  7. celticdr says:

    No mention of the Avegant Glyph???

    I only found out about it days ago but after reading about the technology, how it works, etc… I immediately pre-ordered it. It’s due in Fall 2015 and I’ll be getting it in lieu of that 40″ mon(it)ster for all my gaming and movie watching – the Glyph will be a real game changer people.

    Before anyone mentions that it’s no replacement for the consumer Rift – I do plan on getting a Rift as well when they finally release it in 2016.

    • Hypocee says:

They just don’t want to know. I’m a CastAR fanboy and backer myself, but I’ve successfully restrained myself from throwing it in provided they don’t literally claim there’s nothing out there but the Rift. And to be fair, CastAR is sort of the only alternative right now. Sony’s thing is hypothetical and for Sony, Cardboard and Dive and the like are cute but ralphy, and Samsung’s a ways off. Meta Spaceglasses are basically the same display tech as the CastAR and have the world’s sexiest website, but they’re tied to a USD4K wearable Android system with a Kinect on your face, and I’m sceptical about the tracking.

      I’ve wanted an HMD for years and took a pretty deep look at all the players I saw at various points last year. I hope you’re aware that the Glyph, though a really sexy display and seemingly a good product, is not not not a VR headset. It is intended only for use as a portable big-screen 3D TV, and doesn’t attempt to do head tracking EDIT Whoops, OK, they’ve changed that since I last looked. Still IMU-only though.

    • frightlever says:

That thing is on pre-order for $550 (I think that was the price I saw before instinctively kicking my monitor over). Whereas the consumer OR is aiming to be half that and can also display pseudo cinema-effect multimedia.

      But go for it! Early adopters are great as far as I’m concerned.

      • Hypocee says:

        psst CastAR’s currently running $430 for the base kit and surface-free/VR clipon it is a shame they don’t offer the glasses a la carte for those who don’t want the surface or controller OK really shutting up again now

  8. Moraven says:

    Razer announced their new set of couch products

    link to razerzone.com

    Forge TV, Android TV console. $99. $149 with controller. Q1 release
    PC game streaming via Razer Cortex software.

    Turret, lap board keyboard, mouse.
    link to razerzone.com

Forge TV is a reasonable price and supports all the Chromecast features from your smartphone and tablet. Unlike the Windows 8 stick.

    Turret lap keyboard is nice other than the price… how is this $129?
Price it at $50 and I would bite. The dock alone is a great feature, as are the magnets that keep your mouse from sliding off. But not for $100 more than my current Logitech couch mouse/keyboard set.

    • frightlever says:

      Oh well if Razer has announced it then I’m sure it’ll definitely make it to market.

      • Moraven says:

MadCatz and others have Android microconsoles coming. Basically Ouya with a lot more features.

Razer’s seems the best, with the ability to stream from your PC to TV. And the price is right.

        • jezcentral says:

My reaction on first seeing the Razer Turret was that it completely invalidated the need for Valve’s controller. (It reminds me of the possibly apocryphal story of pens not working in space. NASA spent millions making a pen with an ink-feed that worked in space. The Russians just switched to using pencils. Simple idea trumps massive overthinking.)

          Still, I suppose the controller is for the mass-market, not us sensible types who have already seen the KB+M light.*

          * Yes, I know some things ARE better controlled with a controller.

  9. cylentstorm says:

    No DX12 on my creaky old Win 7, huh? Hmm…I suppose that when I decide to shell out the cash for a new GPU, I’ll have to track down one of those infamous “free” copies of Win 8.1/10/9000 or whatever all the kids run these days. GET OFF MY LAWN!!

    • Nasarius says:

      You say Windows 10 won’t be free, but there have been a few rumors about free upgrades for home users. Nothing confirmed, but not totally implausible either.

      I’ve been running the Technical Preview on my main PC for the past few weeks, and it’s basically fine. I mean it’s buggy at this stage, but they’ve clearly un-fucked the design. Windows 8.1 let you remove some of the crap, but Windows 10 removes nearly all of it.

      • airmikee says:

The rumors about Win10 being a free upgrade only apply to those already using Win8. Those of us with Win7 will be forced to purchase the OS, assuming the rumors are true.

        • Asurmen says:

That’s the thing about rumours: there’s always a counter-rumour, that it will also be free for Win 7, or at a massive discount. It sort of makes sense, because having everyone on the same OS makes everyone’s life easier (more or less to avoid another XP scenario).

  10. DanMan says:

    I don’t care about D3D12, please use OpenGL, devs, mkay? Cross-platform and stuff…

    That HDMI stick is interesting. I have a nettop here with about the same specs. :]

  11. TacticalNuclearPenguin says:

As far as i know Nvidia isn’t going to shrink anything; if they release the full-fat thing it’ll be a huge chip. Then again, if some new info cropped up that they are actually considering 20nm instead of simply skipping it for 16nm FinFET in the future, do let me know, because it looks interesting. I’d absolutely prefer to be wrong, or we’d probably be looking at one of the most avid electricity guzzlers around.

Rumors about AMD are mixed as well: maybe 20nm, maybe not, but it sounds unlikely.

Either way, i don’t know where Freesync is going, but sure as hell an IPS ( well, AHVA ) 1440p 144Hz monitor with G-sync is appearing, which would be this, though i’d speculate the price to hover around 1 grand. Let’s also hope they didn’t cut corners with 6bit+FRC and other evil things.

    • Jeremy Laird says:

      AMD and Nvidia will certainly move to a smaller process at some stage. The only question is when. They are not going to stay at 28nm forever. And it’s already been an extraordinarily long period with 28nm. I’d be happy to bet quite a bit of cash they’ll both be selling GPUs on something smaller than 28nm before the year is out.

      • TacticalNuclearPenguin says:

Oh, that’s likely; i simply had the feeling you meant something sooner than the full year, possibly because of some fresh info you might have had, and i got excited for a moment.

        Being tired of waiting is influencing my reading comprehension.

  12. particlese says:

    You know it’s time to get new hardware when you let out a YESSSSS as soon as you see “Week in Tech”.

    If I can stick it out until March 24, though, I can celebrate my current machine’s 6th birthday by building it a clearly-superior, attention-hogging sibling. Hmm, I do need time to come up with a proper name, don’t I? And…is it a girl? Heck, I don’t even know what Zubon is. Or yasai, for that matter…or thelaptop or Beef, or the cardboard one whose name I forgot and who’s living in a closet with Beef. Wait, did I name two of them Beef? I DID. And the first (Ed: no, the second, so there were three — the first ran away) was so jealous it changed its name to Boxcomp. ;_; Aw, come ‘ere, ya big old hunks o’ metal ‘n’ stuff, I still love you! Uncomfortable rectangular hugs for everyone.

    Nah, who am I kidding? I’m just going to name it after whatever I’m having for dinner in German or something.

    • TacticalNuclearPenguin says:

      You can also rebuild the old machine from the ground up and pretend it’s the same old thing provided you have at least reused the old screws.

      • bonuswavepilot says:

        ‘Tis true! The PC of Theseus. My desktop still has a floppy drive in the front, which since my last upgrade doesn’t even have a cable connected to it (no room for a controller card in the last build, and no appropriate ports on MoBo, of course).

    • airmikee says:

      I’m hoping my GTX560 can make it another couple months, it’s been sounding like it’s in its death throes lately.

    • TheApologist says:

      Naming my new PC is 100% my favourite part of the self-building process. And you never know what they’re called until assembly is complete and you hear the first whir of their little fans. Awww…look at little Britney.

  13. malkav11 says:

    I’ve yet to use any control method for anything that’s as flexible and utilitarian as keyboard and mouse. That’s not to say that others aren’t preferable for certain specific purposes (most frequently gamepad…supposedly joysticks are great for certain things but I was absolutely rubbish with a joystick in the space sims I tried (as well as Descent), likely because I’d never touched one in the first 25+ years of my existence on this earth and they are not intuitive in the least), but I honestly can’t think of a game that they would be completely incapable of controlling in a functional manner. (Not that the default settings are always playable.) Ideal, not always, but capable? Yes.

  14. Siimon says:

    >” (mingy Bingy version)”

    It should be said that the Bing version of Windows 8.1 is -exactly- the same as regular 8.1 except with Bing set as the default (changeable!) search provider. Takes 30 seconds to de-Bing the OS.

  15. fish99 says:

Just put a GTX970 in my PC and TBH I think that’s me done upgrading for a long time. It runs the latest (console) gen games, and it burns through sloppy ports like Dead Rising 3 and Watch Dogs. It’ll even run something like Lords of the Fallen in stereo 3D pretty smoothly. Just finished Far Cry 4, which was a very nice showcase for what the card can do.

    Plus my PC power consumption is still around the 200-220W mark which I always aim for.

    • Twisted89 says:

      Assume you mean idle because a 970 uses a lot more than 220 when in use.

      • fish99 says:

        That’s what I measured at the socket during gaming (so that’s full system but not including display). The only thing that pushed it up to 250W was Furmark, but in real games it was 200-220W.

        Idle was about 80W.

        The actual rating of a GTX970 is 150W TDP btw, but that’s a maximum and it won’t pull that during most games.

  16. iainl says:

    My next upgrade definitely needs to be in the control department. I tried to put off having to buy a flight stick by buying Assetto Corsa instead of Elite Dangerous, and while it’s excellent, it also really shows up my ancient wheel as a bag of junk because it’s not capable of the full 900-degree turn. You get a free choice between so-twitchy-you’ll-crash-every-third-corner, non-linear-steering-that’s-not-easily-predictable and absolutely-brilliant-right-until-you-want-to-take-a-hairpin.

  17. Stepout says:

SSD deals are getting so cheap now (and my old HDD so slow) that I finally made the switch. Bought 2 Crucial M550 256GB drives on sale at Amazon last week for $89 apiece. Since I’m late to the party I figured I’d kick my SSD experience off with RAID 0. Should be fun!

  18. MattM says:

    I really want to try the virtual theater program for the Rift. It might be a system seller for me even if I don’t end up gaming on it.