Why 2016 Will Be A Great Year For PC Gaming Hardware

2016 is going to be great for PC gaming hardware. Of that I am virtually certain. Last time around, I explained why the next 12 months in graphics chips will be cause for much rejoicing. That alone is big news when you consider graphics is arguably the single most important hardware item when it comes to progressing PC gaming. This week, I’ll tell you why the festivities will also apply to almost every other part of the PC, including CPUs, solid-state drives, screens and more. Cross my heart, hope to die, stick a SATA cable in my eye, 2016 is looking up.

NB: Usual TL;DR drill is at the bottom.

Let’s start with the biggie, the CPU or PC processor. It’s debatable just how important CPU power is to PC gaming. Some say CPUs are already good enough. Hell, I’ve even flirted with that notion myself.

Basically, it’s all a bit chicken and egg when it comes to CPUs and games. Unlike graphics, the things that bump up against CPU-performance bottlenecks tend to be essential to the game. You can toggle off much of the eye candy and retain playability, in other words. But dumb down the AI and the game itself may end up compromised.

Admittedly, quite a bit of CPU load is graphics-related, which is why the prospect of DirectX 12 reducing rendering loads is so tantalising. But the overarching point here is that game developers are in the business of actually selling games and thus will only make games that run on existing CPUs. And if they want a big audience, that means games that run on existing mainstream, as opposed to high-end, CPUs.
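If you’re curious what that DX12 relief actually looks like from the developer’s side, here’s a minimal sketch (my own illustration, not anyone’s shipping code, and it assumes a Direct3D 12 device and command queue have already been created elsewhere). The trick is that recording draw calls, previously serialised on one driver thread, can be spread across CPU cores and then submitted in one cheap call:

```cpp
// Hedged sketch only: assumes 'device' and 'queue' exist and that real
// pipeline/resource setup happens elsewhere. The shape is the point --
// several threads record command lists in parallel, then one call
// submits the lot. DX11 serialised most of this on a single thread.
#include <d3d12.h>
#include <thread>
#include <vector>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

void RecordFrame(ID3D12Device* device, ID3D12CommandQueue* queue, unsigned workers)
{
    std::vector<ComPtr<ID3D12CommandAllocator>>    allocators(workers);
    std::vector<ComPtr<ID3D12GraphicsCommandList>> lists(workers);
    std::vector<std::thread>                       threads;

    for (unsigned i = 0; i < workers; ++i) {
        device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                       IID_PPV_ARGS(&allocators[i]));
        device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                                  allocators[i].Get(), nullptr,
                                  IID_PPV_ARGS(&lists[i]));
        threads.emplace_back([&lists, i] {
            // Each worker records its slice of the frame's draw calls
            // here (SetPipelineState, DrawIndexedInstanced and friends)...
            lists[i]->Close(); // ...then seals its list for submission.
        });
    }
    for (auto& t : threads) t.join();

    // One submission for everything the workers recorded.
    std::vector<ID3D12CommandList*> raw;
    for (auto& l : lists) raw.push_back(l.Get());
    queue->ExecuteCommandLists(static_cast<UINT>(raw.size()), raw.data());
}
```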

Long story short, that big, ugly bar at the top gets the chop with DX12

Put another way, I’m relying on ye olde build-it-and-they-will-come theory here. Who knows what wonders game devs would create if everyone’s CPUs were suddenly twice as fast? Anyway, there’s absolutely no doubt that mainstream/affordable CPU performance has stagnated in the last three or four years. It’s barely budged.

We can argue the toss over the reasons for that. Has AMD failed to catch up to its age-old rival? Basically, yes. Has Intel intentionally sandbagged? Almost certainly.

AMD’s new Zen CPUs

But here’s the good news. By the end of 2016, AMD should have its new Zen-based CPUs on sale. And Zen is an all-new architecture, not a desperate rehash. Out goes the bold but borked modular architecture of Bulldozer, AMD’s main CPU tech since 2011. In comes a more traditional big-core approach that majors on per-core performance (you can read a deeper dive into Zen here).

Ultimately, we won’t know if AMD has pulled it off with Zen until the chips are out. But AMD is making all the right noises about its design priorities. Possibly even more important, Zen was conceived under the stewardship of Jim Keller, the same guy who oversaw arguably AMD’s one great design success, the Athlon 64, and who was later hot enough property to end up at Apple doing its crucial iPhone chips. Clearly, Keller knows his CPU onions.

This. Is. Zen. Probably

Of course, if Zen does tear Intel a new one, we can expect the latter to react. In all likelihood, Intel’s spies will discover how well Zen is shaping up long before launch and put the wheels in motion. Personally, I think Intel can probably handle almost anything AMD throws at it. But that’s fine. The important thing is that AMD forces Intel to up its game. Then we all get better CPUs, regardless of who we buy them from.

As things stand, Intel’s mainstream platform looks unexciting for 2016. Its existing Skylake chips will remain until near the end of the year, when a new family of Kaby Lake processors arrives. But Kaby Lake sticks with the old four-core format and very likely similar speeds and overall CPU performance.

Admittedly, if you’re made of money there will be a new $1,000 10-core processor on the high end platform. But it will take a successful Zen launch from AMD to shake up the mainstream. Here’s hoping.

Screens, screens, screens

Next up, screens. Lovely, flat, panelly screens. With one possible exception, I’m not expecting any single explosive moment – no 14/16nm GPU die shrink, no AMD Zen bomb. But there will be lots going on.

For starters, I’m hoping AMD can put the final bit of polish on its FreeSync tech, which in theory makes games look smoother and sharper. For now, FreeSync is patchy enough in practice to be pointless. But it’s surely fixable and it will likely become a near-standard feature over time, which will be great so long as you have an AMD video card.

Then there’s the latest ‘2.0’ version of HDMI becoming the norm. This matters because it allows for 4K at 60Hz refresh and means cheap 4K TVs will be viable as proper PC monitors. OK, you might prefer running at 120Hz plus. But in-game that’s academic, because you probably won’t be able to generate 120fps+ at 4K resolutions, in any case.
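For the back-of-envelope version (my arithmetic, so treat it as indicative rather than gospel): the standard 4K60 video format needs a 594MHz pixel clock once blanking intervals are included, and HDMI’s TMDS encoding spends 10 bits per 8-bit symbol across three data channels:

```latex
594\,\mathrm{MHz} \times 3\ \text{channels} \times 10\ \mathrm{bits} = 17.82\ \mathrm{Gbit/s}
```

That squeaks under HDMI 2.0’s 18Gbit/s ceiling but sails past HDMI 1.4’s 10.2Gbit/s, which is why the old ports topped out at 4K at 30Hz.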

Curved was the big idea in 2015, what about 2016?

Already, there are cheap 40-inch 4K TVs with HDMI 2.0 available for under £400 in the UK (or circa $500 in the US, I would guestimate). These screens will not be perfect as monitors. But the important bit is that a 40-inch 4K HDTV has a sensible pixel pitch for a PC monitor. Overall, it’s a lot of screen, a lot of pixels, just a lot of desktop real estate for the money.

Meanwhile, existing screen-format favourites like 27-inch 1440p panels will just keep getting more affordable. Ditto high refresh tech and (if FreeSync ups its game) screens with Nvidia’s G-Sync technology. And yes, even higher resolution screens like 5K and even 8K will pop up. But honestly, 4K will remain marginal for gaming for a while. So anything more is just silliness.

Quantum what?

Another interesting tech tweak to look out for in 2016 is quantum dot. Already popular in the HDTV market, quantum dot is not a revolutionary panel tech like, say, OLED. Instead, it’s about making LCD backlights better.

The basics involve materials that absorb light at certain frequencies and re-emit it at others. The “quantum” bit is because the semiconductor crystal on which it’s based leverages a nanoscale effect known as quantum confinement, which in turn involves electron holes, potential wells and the exciton Bohr radius. Obviously.

Some quantum dots in jars, yesterday
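If you fancy the one-equation version of why ‘quantum’ earns its keep here, the textbook particle-in-a-box model will do (a crude approximation, and my gloss rather than anything from the panel makers). Confine an electron to a crystal of size L and its allowed energies become:

```latex
E_n = \frac{n^2 h^2}{8 m L^2}, \qquad n = 1, 2, 3, \ldots
```

Shrink L and the gaps between energy levels grow, so smaller dots re-emit bluer light. That’s the whole trick: tune the dot size and you get tightly controlled reds and greens out of a blue LED backlight.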

Cut a long story short, you can stick these little suckers in your LCD backlight and improve the quality of its light fairly dramatically, and on the cheap. And that means more accurate and vivid colours. I’ve not actually seen a quantum dot display as yet, so I cannot comment on quite how dramatic the difference is. But better backlights are very important to image quality, so I am hopeful.

OLED already

As for the exception I mentioned earlier, I speak of OLED monitors. There’s not space to go into the reasons here, but these things are going to be awesome when they arrive. I’d actually largely given up on the idea of OLED monitors. But 2015 saw the first truly viable OLED HDTVs appear and it seems inevitable the PC market will now follow. I’m just not sure if that will happen in 2016.

Don’t forget the memory

Memory tech is the last big area of likely innovation next year, and specifically I’m talking solid-state drives or SSDs.

We already know that PCI Express tech combined with a new storage protocol known as NVMe is tearing up some of the bottlenecks that have been holding back SSDs in recent years. M.2 slots and U.2 sockets are becoming the norm and fully compatible drives are becoming available now, too.

Confusingly, PCI Express drives come in many forms

That alone is pretty exciting as it will mean a jump in bandwidth from about 550MB/s to roughly 1-2GB/s, depending on the specifics. But all that could seem positively prosaic if a new tech being developed by Intel and Micron turns out to be everything promised.
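The rough arithmetic behind those numbers, for the curious (my own back-of-envelope, so indicative only): SATA III signals at 6Gbit/s with 8b/10b encoding, while a PCI Express 3.0 x4 link runs four lanes at 8GT/s with the leaner 128b/130b scheme:

```latex
\begin{aligned}
\text{SATA III:}\quad & 6\ \mathrm{Gbit/s} \times \tfrac{8}{10} = 4.8\ \mathrm{Gbit/s} \approx 600\ \mathrm{MB/s} \\
\text{PCIe 3.0 x4:}\quad & 4 \times 8\ \mathrm{GT/s} \times \tfrac{128}{130} \approx 31.5\ \mathrm{Gbit/s} \approx 3.9\ \mathrm{GB/s}
\end{aligned}
```

So ~550MB/s is simply SATA’s practical ceiling, and early PCI Express drives at 1-2GB/s are only scratching the surface of the new link.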

Intel’s SSD bomb

Known as 3D Xpoint, it’s completely different from existing flash memory used in SSDs. Instead of memory cells composed of gates in which electrons are trapped, 3D Xpoint is a resistance-based technology. Each cell stores its bit of data by changing the conductive resistance of the cell material. What’s more, individual cells do not require a transistor, allowing for smaller cells and higher density. In other words, more memory for less money.

At the same time, the ‘3D’ bit refers to the stacked or multi-layer aspect of 3D Xpoint. It’s basically the same idea as 3D flash NAND and allows for even more memory per chip. But arguably the really exciting bit is that the memory cells are addressable at the individual bit level. See, I thought you’d be impressed.

Ooooooh, the selector bone’s connected to the cross point structure bone. Or something

In all seriousness, that’s a big change from NAND memory, where whole pages of memory, typically 16KB, have to be programmed even when storing just a single bit of data (and erasing happens in still bigger blocks). The net result of that clumsy structure is that NAND requires all kinds of read-modify-write nonsense to store data and also needs complex garbage collection algorithms to tidy up the mess that ensues.
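To make that concrete, here’s a toy model of the dance NAND forces on a drive (entirely hypothetical names and logic, nothing like real firmware): pages can only be programmed whole and can’t be rewritten until their block is erased, so changing one byte means copying 16KB to a fresh page and leaving garbage behind.

```cpp
// Toy model of NAND's write restrictions -- hypothetical and simplified,
// for illustration only. Pages are programmed whole and cannot be
// reprogrammed until their containing block is erased, so a one-byte
// update becomes a 16KB read-modify-write plus a stale page for garbage
// collection to reclaim later.
#include <array>
#include <cstddef>
#include <cstdint>
#include <stdexcept>
#include <vector>

constexpr std::size_t kPageSize = 16 * 1024; // a typical NAND page

struct ToyNand {
    enum class State { Free, Live, Stale };
    std::vector<std::array<std::uint8_t, kPageSize>> pages;
    std::vector<State> state;

    explicit ToyNand(std::size_t n) : pages(n), state(n, State::Free) {}

    // Changing a single byte of a live page:
    std::size_t updateByte(std::size_t page, std::size_t offset, std::uint8_t value) {
        auto copy = pages[page];       // read the whole 16KB page...
        copy[offset] = value;          // ...to change one byte...
        std::size_t fresh = findFreePage();
        pages[fresh] = copy;           // ...then program a whole fresh page.
        state[fresh] = State::Live;
        state[page]  = State::Stale;   // old page is garbage until erased
        return fresh;
    }

    std::size_t findFreePage() {
        for (std::size_t i = 0; i < state.size(); ++i)
            if (state[i] == State::Free) return i;
        // A real flash translation layer would erase a block full of
        // stale pages here; the toy just gives up.
        throw std::runtime_error("no free pages: garbage collection needed");
    }
};
```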

Well, you can kiss goodbye to all that with 3D Xpoint. And say hello to stupendous performance. 3D Xpoint is claimed to be up to 1,000 times faster and 1,000 times more robust than existing flash memory, and 10 times denser than DRAM.

In practice, I’m not close to buying the 1,000 times faster bit. But Intel demoed 3D Xpoint recently and the result was seven times faster random access performance than a conventional flash memory drive. That’s not 1,000 times faster, but it’s still one hell of a leap.

1000x this, 10x that. Believe it when you see it…

The slight confusion in all this is that Intel’s messaging re Xpoint has been mixed. There’s talk of it being used to replace RAM in server systems and noises about it not being intended to replace flash memory. So I’m trying not to get too excited until clarity is achieved.

But 3D Xpoint-based drives are apparently due in 2016 and they might just be as big a jump over existing SSDs as those SSDs were over magnetic hard drives.

So, exactly what have we learned?

So let’s recap. Last time around we had a big jump in chip manufacturing tech allowing for graphics chips perhaps twice as fast. Then there’s the possibility of AMD putting a bat up Intel’s nightdress. We might just see OLED monitors. And SSD performance could suddenly go interstellar. Not bad for a single year in PC tech, eh?

As for everything else, I’ll let Alec and the crew keep you posted on the likes of the Valve controller rollercoaster and virtual reality (VR) technology. I have my doubts re the latter purely because of the need to strap something ungainly to one’s head. But I would be very happy to be proved wrong and when you factor in the performance gains coming in 2016, it wouldn’t be a bad year at all to really push VR.

Of course, you might say all this is about time after a pretty sucky year or three in graphics, CPUs and storage. And I wouldn’t argue. But however you slice it, 2016 is looking pretty awesome.

TL;DR
– AMD’s new Zen CPUs will put a bat up Intel’s nightdress (hopefully)
– Monitors will get better and cheaper and if OLED arrives it will be awesome
– Intel and Micron’s 3D Xpoint tech could make existing SSDs look like slugs

89 Comments

  1. Lacero says:

    Can we hope for anything from Vulkan this year?

    • Sakkura says:

      The Vulkan API will launch very soon. Then it will take a while longer before games arrive with support for it. The same way DirectX 12 launched half a year ago, but so far we’re only seeing betas and early access games supporting it.

  2. mattevansc3 says:

While more related to the GPU article, Micron has gone on record this month to say that GDDR5X will be available next year and will be a drop-in replacement for GDDR5.

That means the doubled bandwidth can be applied to existing card designs and help bring the cost of 4K gaming down to mainstream/enthusiast price ranges.

    • PhoenixTank says:

      First I’d heard of GDDR5X – thanks for posting! If the rumour mill websites are to be believed, it sounds like HBM2 for the high end and GDDR5X for the low end.
      Performance boosts for everyone! I am so looking forward to 4k gaming that doesn’t require sli/crossfire!

      • Sakkura says:

        GDDR5X is for the midrange, not low-end. At the low end, GDDR5 will do just fine, and right down at the bottom it’s a question of when DDR4 takes over from DDR3. Both are pants, GPU-wise.

  3. Cederic says:

    I switched to AMD for Athlon chips, they had a price/performance ratio that made a lot of sense.

    It’d be lovely to see AMD get back to that level of competitiveness with Intel. I don’t care if they’re faster or not, just competing on performance/price is enough to force Intel to keep innovating, keep improving, keep the supply of awesomeness coming.

Probably won’t make me switch from Intel, but I’ve done it before and will buy hardware on a rational basis.

Where I’m really struggling is the shiny new storage. I was going to buy a new PC once the new tiny graphics chips were out, but it sounds like there may be benefit in hanging on a wee while longer for shinier storage. One to watch, consider, ponder, and then I’ll make an emotive decision that I need a new PC and buy the best components available at that time.

    • frenchy2k1 says:

Don’t wait unless you need the absolute best, with no concern for costs.
3DXPoint will first come to the server side.
If you read the slide carefully, it’s 10x denser than DRAM (which still makes it much less dense than 3D Flash). It’s faster, but will be much more expensive. Companies are trying to work out how to slot it in as an additional cache between RAM and Flash, the same way flash was slotted between RAM and HDDs.

  4. Elliot Lannigan says:

    “put a bat up Intel’s nightdress” …. could we please not?

    Excellent article otherwise.

    • SingularityParadigm says:

      Seriously. Both in the article and in the tl;dr that statement made me cringe. I expect better of RPS!

      • Borodin says:

Call me ignorant, but I don’t understand your objection to that. Are you saying it’s sexist? Hackneyed? Or what?

It made me laugh and I have no problem with it at all. Please explain to me why I should.

        • mattevansc3 says:

Depends on how you read “bat”. I personally read it as the flying rodent, which is similar to “putting the wind up your arse”. On the flipside, you can read it as if it’s a baseball bat. That of course brings up a lot of rape connotations. It seems to be more of an American thing, but Curt Schilling was trolled on Twitter by someone saying he was going to violate Curt’s underage daughter with a baseball bat.

I don’t believe RPS would deliberately go with the second option, but it is a common usage.

      • Unclepauly says:

        The fainting couch was getting a bit dusty I guess.

    • Pharos says:

      My god, are Fawlty Towers quotes enough to inspire pearl-clutching these days?

• Don Reba says:

      Why wouldn’t you want to put a bat up someone’s nightdress?

    • kael13 says:

      Don’t be pathetic.

    • MiniMatt says:

      It’s a phrase with some jokey connotations of sexual violence which is perhaps beginning to feel a little inappropriate.

As for its Fawlty Towers origins, I don’t think we can point to 1970s sitcoms as standard setters for 21st century speech.

      It’s not a question of perceived offence, and it’s certainly not a question of intended offence, merely that language evolves. “Sticking a bat up someone’s nightdress” – to do what? To tickle them? In the context of bettering another party (ie AMD beating Intel) it strikes me as…. well, just dated, slightly inappropriate, language.

      • Jeremy Laird says:

        There were no such connotations, jokey or otherwise. To be honest, if anything is offensive, it is the suggestion that I would joke about such matters.

        Astonishing to have to explain this, but the bat in question is a ‘rodent’ and the phrase is simply a colourful, if cliched, way of saying ‘to startle someone or thing’. I appreciate that cultural differences can lead to confusion, but given the context and the fact a simple internet search would surely have pre-empted the righteous indignation, this is surely a case of willful misinterpretation.

        • Rich says:

          Simply put: if you’re offended, you’ve completely misunderstood.

        • MiniMatt says:

I’m not going to get into an argument and break my new year resolution on the 1st day, so I apologise for the offence you perceived at my perceived offence.

I’m a reasonably educated, reasonably travelled, reasonably worldly middle-aged bloke and I’ve never once heard that phrase. Because of this I did indeed Google (around the above-mentioned Fawlty Towers quote) just to confirm it was something that is/was said, and saw nothing to indicate etymology.

          Clearly I’m not the only one who had a double take at this phrase.

          • Unclepauly says:

That’s unfortunate. As someone who has never even left his state (except once to gamble in Detroit), I’ve heard the phrase and have even seen skits where bats and dresses were involved.

        • Bobtree says:

          Bats are not rodents.

        • poohbear says:

I think your American readers would read “bat” as in “baseball bat”, hence the rape connotation. I was a bit surprised, but hey, I’ve read worse stuff.

      • pepperfez says:

        To tickle them?
        Yes, in fact.

    • int says:

      You know how to make your own antifreeze?

      Shove a frozen bat up her nightie.

  5. Simbosan says:

    Typical, I need to buy a PC but can’t wait till the end of the year

  6. edwardh says:

    There’s actually some news on the OLED front:
    “As part of LG’s efforts to create a more diversified OLED product lineup, the company has been improving burn-in issues as well as blurred images associated with images appearing on OLED monitors when used with different sizes and frequencies.”
    link to forums.anandtech.com

    It doesn’t change what was said in this article but it does mean that it’s not just guesswork any more. LG actually seems to be working on it right now.

    • frenchy2k1 says:

      Both Samsung and LG are using OLED heavily.
      Samsung uses OLED on most of their high end phones already and started using it on their tablets.
      They should also have some TVs out.

      There are challenges and prices are still high, but they are definitely banking on it.

  7. PhoenixTank says:

    What, nothing about the first GPU process shrink in about 4 years? i.e. 28nm -> 14nm.
    Skipping 20nm entirely, so no half-node stopgap this time.
    No idea what that’ll do to the prices. 28nm is well known, but AMD and Nvidia have been using complex tech to help optimise and large dies to get the most out of the process. Both teams might be on even terms this year!
    Performance/power draw improvements should be exciting

  8. ibnarabi says:

    Just FYI – Intel has adopted AMD’s FreeSync. Expect it to be everywhere next Christmas…
    link to pcworld.com

  9. TillEulenspiegel says:

    I’d bet pretty heavily against AMD being able to drastically reduce power consumption while similarly increasing performance (IPC).

    But maybe they’ll make some nicer Opteron chips, so I can build a cheap NAS with ECC RAM.

    • Sakkura says:

      I might take that bet then. They get a major boost in power efficiency just from the newer process node.

  10. GSGregory says:

DirectX 12 is like DX11 and DX10 though, and even worse in many ways. 99% of people are not going to upgrade to Win 10, and maybe 10 games max will ever use DX12.

    • UncleLou says:

      Not quite sure if you’re serious, but if yes, you REALLY should take a look at the latest Steam survey figures….

      • GSGregory says:

I never take a look at Steam surveys. I highly doubt much has changed, with a dumb amount of people still on XP. Either way, even if everyone went to Win 10 and had DX12, it would still require companies to put out games on DX12, which means limiting their customer base by a large amount. According to Wikipedia, there are a whole 20 games (including upcoming games) that have DX12 support.

        • MegaAndy says:

          link to store.steampowered.com

Windows 10 about 30%, Windows XP about 2%. Or you can just make up stats to suit yourself.

          • Hedgeclipper says:

Yep, under 30%, and they’ve been forcing W10 as hard as they can for months. At this point I don’t think they’re going to convince big numbers to change from W7 or 8 in the near future, and 30% isn’t enough of a market for AAA games, but it’s possible they’ll release with DX11 and DX12 optional.

          • fish99 says:

            Sorry but how do you work out that 30% in six months is somehow bad? It’s already way ahead of W8 and not far behind W7.

          • Sakkura says:

            30% is a heck of a lot in that time frame.

          • Hedgeclipper says:

But you had to pay for 7 and 8, while 10 is free. And not only free: they’re using Windows Update to add nagware and force downloads of it. Neither of which they had to do with 7 or 8.

          • Asurmen says:

            And those concerns are irrelevant to the discussion, which is one of simple adoption. Win 10 has been massively adopted.

          • GSGregory says:

All that really tells you is that people using Win 7, 8 and 10 are more likely to take Steam’s survey.

          • Asurmen says:

            So, the people using the gaming OSes are the ones likely to be using Steam? Well there’s a shock.

Considering the total number of Steam users plus the breadth of games available on Steam, you can use it as a benchmark for what the average gaming PC is like. It shows that Win 10 makes up a large proportion of gaming machines, with a strong upwards trend in growth.

        • UncleLou says:

          You can lead a horse to water, but you can’t make him drink…

          According to my own made-up statistics, more than 101% of users are on W10.

          • poohbear says:

A lot of us will upgrade at the last minute. I’m waiting till July 2016 to upgrade. I’m sure by then it’ll be stable and all the bugs ironed out.

  11. caff says:

    A graphics upgrade is my only priority – once I see a new card that performs well as a single card at 4k res, I’ll build a PC based around it.

    • Unclepauly says:

      They already have cards that perform as well as single cards at 4k … every single one of them in fact.

  12. Delicieuxz says:

Don’t forget Vulkan. DirectX 12 will only be for Windows 10, while Vulkan will be for everything, and accomplishes the same low-level communication between drivers and hardware to improve application performance. Vulkan already has wide support in the PC games industry, with most major 3D game engine developers having stated that their engines will have Vulkan support. Valve has even questioned why a developer would make a DirectX 12 back-end for their game when Vulkan does the same thing and will work across all OSes, not just Windows 10.

• Don Reba says:

      Same as before, DirectX 12 will likely have better drivers on Windows.

      • Hedgeclipper says:

        But only on windows 10

        • UncleLou says:

          Which, as we have established above, is no problem, because Windows 10 is at 30% and rising, while everything else is falling.

          Reading a few comments here, it’s a wonder we’re not still stuck with DirectX 2.

    • mattevansc3 says:

      That has to be taken with a pinch of salt though.

There is a huge difference between supporting and backing. Microsoft technically supports Vulkan. The vast majority of those supporting Vulkan are already supporting DX12. The only company actively backing Vulkan is Valve, and their backing extends to the Source 2 engine. Let’s be realistic about that though: it’s an engine that’s only really used for mods or small-scale projects.

The big development suites such as Unreal and Unity already support DX12.
If your lead platform is the PS4, it’s unlikely that you’ll be using Vulkan, as it’s not supported on that console.
If your lead platform is the Xbox One, you are already working with DX12.
If your lead platform is iOS or OSX, Apple want you to use Metal.
When your project will take over a year to develop, there’s little reason not to target a Win10 userbase. Win7 and Win8 have been discontinued, so their userbases will decline over time. SteamOS hasn’t had the disruptive impact it was touted to have. Linux gaming is still a niche hobby and Mac gaming is still in the Other category.

Competition is good and it’s great Vulkan is being released, but don’t expect it to make waves.

    • Baines says:

      While I’m not denying that Vulkan could be a game changer, Valve is hardly an impartial source to cite. Just look at how Valve promoted SteamOS from the start as better than Windows.

      • DanMan says:

        They never promoted it as a Windows replacement. What are you even talking about?

        • Cinek says:

They basically did. For gaming it was supposed to be the be-all-end-all.

  13. Nereus says:

    Sometimes I feel like I’m the only one that doesn’t want to replace all my plastic, metal and semiconductors with slightly better plastic, metal and semiconductors.

    • Unclepauly says:

      If you’re the only one who doesn’t want their plastic, metal, and semiconductors to play games flawlessly then yup you are alone.

  14. Kefren says:

I recently got a new laptop and was concerned at the dodgy stuff on there – services Intel refused to explain, but which seemed to be tied to DRM and the ability to spy on the PC via a backdoor and maybe close processes. It got me thinking about how all that works, and what can be done about it. Are there any CPUs to avoid if you care about privacy? I remember something about Intel wanting to broadcast PC IDs so you could be individually identified, but then it seemed to go quiet, so I don’t know what happened. I’d rather avoid all that. I once got a Cyrix PC, though, and it was problematic, so I went back to Intel. But just as I am increasingly uncomfortable with where Windows has headed (hence sticking with 7), I’m feeling the same about CPUs. I’d be interested in PC options that give power back to the person owning the PC.

    • Jeremy Laird says:

I’ve wondered about this myself. One assumes the US spooks have had Intel and AMD backdoor their chips. It seems an obvious thing to do given how dominant those two companies are in personal computing and how powerful a tool for spying that would be. And, as far as one can tell, they are not shy about such measures.

      Or maybe that’s just pure paranoia!

      • frenchy2k1 says:

        Hopefully more likely to be paranoia.
Adding backdoors into the processors would be *extremely* risky, because if any of your enemies, or even hackers, found them, there would be no patching possible; everything would be fully open.

        The acronym agencies have shown a lot more foresight than that in their handling of crypto and backdoor placing.

        For example, they have been caught intercepting shipments and adding their own backdoors to exported equipment, which emphasized that they did not have access otherwise.

        For the unique ID, it has gone quiet, but it has been in all CPUs (both AMD and Intel) since the P3 and Athlon64…

    • TacticalNuclearPenguin says:

I think the Russian government switched to their own proprietary tech, which might more or less imply that there’s no other solution, apparently.

  15. Asurmen says:

    Someone needs to start bringing out proper Quantum Dot displays and not this LED backlit nonsense. It’s about time displays had another plasma vs LED type war.

    • TacticalNuclearPenguin says:

OLEDs already emit their own light; they don’t need backlighting.

If only SED hadn’t died an early death, we wouldn’t be here hoping for LCDs to finally get the hell out, sadly.

      • Sakkura says:

As long as you’re not referring to the SED that was the ruling party of East Germany.

      • Asurmen says:

        I know they do, although that’s irrelevant to my point.

        • Unclepauly says:

          Then why was the 1st half of your post about backlighting?

          • Asurmen says:

            I wasn’t talking about OLEDs hence why I said it was irrelevant. I’m talking about quantum dots.

  16. TacticalNuclearPenguin says:

    “Admittedly, if you’re made of money there will be a new $1,000 10-core processor on the high end platform”

    I think it’s more exciting that this will mark the first 8 cores at 600 bucks, but then again it’s sort of disappointing since X99 already feels old.

    I guess Skylake-E it is then.

  17. Mkohanek says:

Too bad that consoles will continue to hold most games back at early-2000s levels of tech

  18. Voqar says:

    This is all fine and dandy, and I guess more is always better for those who want to flush money regularly on hardware, but game design is constantly held back both technically and intellectually by consoles – their technical limitations and the mental limitations of their players.

    I guess it’d be neato to have a faster SSD to get thru all the loading screens one has to endure in console-influenced/held back games.

  19. NephilimNexus says:

    But 2015 saw the first truly viable OLED HDTVs appear and it seems inevitable the PC market will now follow.

This has me hopeful for entirely non-linear reasons. As we all know, Fallout 4 is just one of many PC games that have had their FPS arbitrarily locked at 60 because they’re console ports and consoles won’t go over 60 FPS. Luckily in that game’s case it’s just a matter of a quick file edit to correct this prejudice, but not all games are so easily fixed.

    The reason that consoles don’t go over 60FPS in the first place, however, has nothing to do with their own limitations. After all, graphics is what consoles do best. The throttle is there because of the limitations of current generation television screens.

    Thus if new technology allows TV screens to break the 60FPS limit then consoles will be able to break the 60FPS limit and thus console port PC games (which, sadly, is most decent PC games these days) won’t have to deal with those artificial & arbitrary shackles either.

    • Slackar says:

As we all know, Fallout 4 is just one of many PC games that have had their FPS arbitrarily locked at 60… Luckily in that game’s case it’s just a matter of a quick file edit to correct this prejudice…
      Wait, are you saying they fixed the problem wherein going over 60fps increased the game speed and screwed up game physics?

    • xsikal says:

      “The reason that consoles don’t go over 60FPS in the first place, however, has nothing to do with their own limitations. After all, graphics is what consoles do best. The throttle is there because of the limitations of current generation television screens. ”

      Given how many console games actively struggle to get anywhere near 60FPS, I do question this.

  20. jrodman says:

    I remember the leap forward from 1978 to 1982 to 1986. It’s very hard for me to get excited about “everything will be a little more performant”.

    I am glad that computers have gotten boring, but I’m also a bit sad at the same time.

  21. lexarflash says:

The biggest thing to come out is the VR tech like the Oculus Rift and Sony VR; it’s supposed to be a “game-changer.” The computer-human interaction model hasn’t really changed during the last 20 years: it’s a user behind a screen, I/O, and feedback through a monitor. VR guarantees an immersive experience, and “augmented reality” combines both forms; it’s the top-level technology in terms of AI, human vision, and machine learning. So basically it’s a system that allows for greater feedback and user engagement. The iterations of technology have advanced far enough to allow this concept. I predict there will be some growing pains as they rough out the edges, but it has a lot of promise and investors are hopping on board. And then there is Tesla Motors, trying to do space navigation and autonomous cars. But that’s a way off. Who knows what the future will hold 20 years from now? AI more advanced than human intelligence, intelligent robots performing human labor: that’s a science fiction story not too far away.

    • Jeremy Laird says:

Interesting how much mind share Tesla gets in autonomy. It’s playing catch-up, not really leading the pack, but they are good at marketing so are seen as cutting edge.

  22. frenchy2k1 says:

    Just to leave a note on 3DXPoint.
    This will NOT be the next flash. Even the slide quoted shows it: It is 10x denser than *DRAM*, much less dense than flash.

(by the way, you can write to individual Flash cells; you need to *erase* by page, which leads to the read/modify/write during steady state).

Its first use will be servers and enterprise, with a price to match, and it will probably be used as an intermediate cache.
    Compared to 3D Flash, it needs a lot more steps (each new layer is a full stack, including lithography), while 3D flash is all made in one pass (hence the cost reduction per bit despite each cell being much bigger).

    So, don’t expect it to come down to gamers this year.

M.2 and U.2 ports (PCIe gen3 x4) will be the big thing this year, and they will already be pretty nice (up to ~3.5GB/s). Admittedly, for most people there will be very little difference between those and current SATA drives (but both are much faster than HDDs).

    • Sakkura says:

      I also immediately thought cache when they first presented 3D Xpoint, but Intel have given pretty strong indications that they’re not going to relegate it to just a caching role.

    • Jeremy Laird says:

      Yes, and the quoted slide refers to NAND for the other two metrics…

Intel has demoed a drive comparing it to an SSD with the subtitle ‘early SSD prototype’:

      link to techreport.com

Surely that is fairly unambiguous. Also, I believe Xpoint SSDs are officially on the roadmap…

      • frenchy2k1 says:

XPoint SSDs will be there from day 1, but all of them will be priced for enterprise. Intel may not want it as cache, but SAN integrators will use it as such, and no one else will be able to afford it…

  23. RizaAn says:

Watch out for Paragon, guys, it’s a PC game. You can see it on YouTube.

  24. Levatine says:

    I’m sure this is an impossible question, but 2016 might finally be the year I get a new PC.

    With this article and the previous in mind, is there an optimum time for it?