Moore’s Law and the Golden Age of PC Gaming

By Jeremy Laird on December 12th, 2013 at 9:00 pm.

Intel's Gordon, not Black Mesa's

Walker’s recent post on the prima facie petrifying plummet in PC sales got me thinking. Or rather rebooted a thought process I’ve been mulling lately. Just what is happening to the PC? You can make a strong argument, for instance, that we’re entering a golden age of PC gaming. Faster graphics, cheaper ultra-HD screens, the new consoles as thinly disguised PCs, VR technology – the next five years or so are going to be fabulous. But there are also signs the wheels are falling off the entire enterprise of the PC as a computing platform. Then there’s the ever-present threat of Moore’s Law hitting the wall. Can we say anything concrete about it all (the future of the PC, not the looming wall)? Prepare yourself for a multi-topic treatise…

Just to address that PC sales thing, my take is that the current noise involves sales of PCs we don’t care about. That’s where the suffering is. Feeble client boxes for corporate drones, cheapo home PCs sold predominantly on price, that kind of thing. Instead of buying a crap desktop, people started buying crap laptops. Now they’re buying crap tablets.

Actually, did you know 225 million tablets will be shipped out in 2013? Combined desktop and laptop PC sales are put at around 300 million. For 2017, those numbers are expected to be 400 million and 300 million respectively, with desktop PCs being the only shrinking market (by about 10 per cent over that period, if you’re wondering – as ever, all IDC numbers).

Think happy thoughts
The main thing that hits me is just how huge those figures are. I get a bit giddy thinking about the rampant consumerism, the scale of the raw materials usage, the environmental and human costs. Is it really sustainable? Must…keep…thinking…happy…thoughts…!

Anyway, I shall suppress the darkness and add a little context. The best figures I can find put the combined total of Xbox 360 and PS3 sales at around 150 million. That’s total sales to date since those consoles were launched, not annual sales.

So, all the evidence is that performance-orientated PCs for gaming are in rude health. As far as I know, GPU sales – a key metric for PC gaming – are stronger than ever. And as John said, Steam continues to be the most virulent strain in the global Petri dish of gaming – user numbers up 30 per cent in the last year.

Then there’s the other highly relevant angle as regards consoles: the arrival of new models from Sony and Microsoft. Architecturally, they’re almost pure PC, pulling the three major gaming platforms closer together than ever.

For the consoles themselves, that’s not immediately of much import. It’s mainly technological expediency. It’s the cheapest, easiest way to hit performance targets. For the PC, however, it could prove significant. At the very least, you’d think that porting games across will be much more straightforward.

High performance ports
The performance advantage of a high-end PC means running console ports was never going to be an issue for serious desktop machines, of course. But the close relationship between the latest consoles and the PC will probably help things like cheap PC laptops and tablets with integrated graphics become viable for gaming sooner rather than later.

Likewise, those new consoles will inevitably broaden the general game development horizon. Forget all the shader and core counts. Having 8GB rather than 512MB to work with will make a huge difference.

As for PC hardware itself, this is where the message gets a bit mixed. The graphics technology development cycle has gone a bit weird, partly thanks to Nvidia initially choosing not to put its most recent high-end GPU (GK110, aka Titan) into PCs. And the rate of development seems to be slowing. But very roughly, we’re still looking at refresh cycles every two years or so.

CPUs stagnate
The CPU side is a lot less edifying. AMD really has failed to deliver over the last five years and Intel has taken the opportunity to sit back and coin it in the meantime. Mainstream CPU performance (I barely count Intel’s LGA1366 and LGA2011 platforms as PC tech; they’re really server and workstation) has only crept up in that time frame. And unless AMD’s HSA thingie does something spectacular (unlikely!), I don’t see much reason for that to change in the next five years.

Still, you can make an argument for the CPU side of things being good enough. So, let’s recap. We’ve got gaming PC sales and PC gamers themselves rocking on in numbers, the important graphics part of the hardware equation pushing on, the latest console tech tending to magnify the performance of PCs and good reason to think a new wave of technical innovation on the game dev side is on the way.

Oh, and lest ye forget, with Oculus Rift and others edging closer to retail reality the next five years should also see VR technology finally become viable. Big, high def screens are only going to become more affordable, too. Ditto large solid state drives. Mix that all up and I reckon you have a recipe for a golden age of PC gaming over the next five years. And then what? And then Moore’s Law, that’s what.

Give it up for Gordon (Moore)
Just a quick note on what Moore’s Law is for the uninitiated. I’ll quote myself from a piece published elsewhere because I’m pathologically idle and it seems to do the job.

“It’s simply the observation that transistor densities in integrated circuits double every two years. Put another way, Moore’s Law says computer chips either double in complexity or halve in cost – or some mix of the two depending on what you’re after – every couple of years.”
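
To put rough numbers on that doubling, here is a minimal back-of-the-envelope sketch (the 2013 starting figure is purely illustrative, not a quote from any chip maker):

    # Back-of-the-envelope Moore's Law projection.
    # Assumption: ~1.4 billion transistors for a 2013 desktop-class chip (illustrative only).
    def project_transistors(start_year, start_count, end_year, doubling_period=2.0):
        doublings = (end_year - start_year) / doubling_period
        return start_count * 2 ** doublings

    for year in (2015, 2017, 2019, 2023):
        count = project_transistors(2013, 1.4e9, year)
        print(f"{year}: ~{count / 1e9:.1f} billion transistors, if the law holds")

Run the same sum on cost per transistor instead and you get the ‘halve in cost’ half of the observation.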

Now, Moore’s Law is a bit like peak oil. Every time somebody predicts the oil will run out or Moore’s Law only has a few more years left to run, somebody finds more oil and the wizards in the chip fabs find a way to keep the party going.

And yet the fun must end one day. Oil is probably a bad example because it can be synthesised from sunlight, water and atmospheric CO2. But I digress. The point with Moore’s Law is that we are fast approaching the limits of physics as they apply to current chips.

Prophet of doom
You can read an interview with the latest doom monger, a chap from Broadcom, here. But the basics go something like this: today we’re at about 20nm as regards production technology and transistor gate size.

When we hit 5nm, the gates will be just 10 atoms wide and that may well be the limit of functionality. Even if you could shrink a transistor down to an atom or two, that would still only represent a stay of execution. Based on transistors, there is a physical limit to computer power as we know it.
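
If you take those figures at face value, the remaining runway is easy to sketch: each full node shrink scales linear dimensions by roughly 0.7, which is one doubling of transistor density, so getting from 20nm to 5nm is only a handful of doublings. A rough worked example:

    import math

    # How many full node shrinks (each ~0.7x linear, i.e. one density doubling) remain
    # between today's ~20nm and the ~5nm treated above as the practical floor?
    start_nm, floor_nm = 20.0, 5.0
    shrinks = math.log(floor_nm / start_nm) / math.log(1 / math.sqrt(2))
    print(f"~{shrinks:.0f} node shrinks left (~{shrinks * 2:.0f} years at one every two years)")

    # The '10 atoms wide' figure implies roughly half a nanometre of gate width per atom.
    print(f"Implied spacing: ~{floor_nm / 10:.2f}nm per atom across a 5nm gate")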

The guy from Broadcom also thinks things have already fallen off when it comes to commercial viability. It’s becoming crazy expensive to tool up the latest production tech. For me, the analogues here are jet travel and human space exploration.

The fastest passenger jets are now much slower than the best from the 1970s. The absolute horizons of human space travel have shrunk, too. That’s not because we can’t fly faster or put a man on Mars. It’s because we don’t fancy it. It’s too bloody expensive and nobody cares enough.

The same may apply to computer chips in coming years even if the physical barriers don’t come into play. Of course, a huge paradigm shift – perhaps quantum computing – may pop up. And there’s plenty of scope for clever coding to extract the equivalent of at least a couple of Moore’s Law cycles out of chip performance through more efficient software. When the hardware stops getting faster, that will really focus minds!
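
For a sense of scale, ‘a couple of Moore’s Law cycles’ is roughly a 4x gain, and software alone can find that without much exotic cleverness. A deliberately toy sketch: the same question answered first by a naive O(n^2) scan, then by an O(n) rewrite:

    import time

    # The same task, written badly and then written sensibly.
    def has_duplicate_naive(values):
        return any(values[i] == values[j]
                   for i in range(len(values))
                   for j in range(i + 1, len(values)))

    def has_duplicate_fast(values):
        return len(set(values)) != len(values)

    data = list(range(5_000))          # worst case: no duplicates at all
    for fn in (has_duplicate_naive, has_duplicate_fast):
        start = time.perf_counter()
        fn(data)
        print(f"{fn.__name__}: {time.perf_counter() - start:.4f}s")

The gap there is far more than 4x, which is the point: when the hardware stops obliging, that kind of rewrite is where the next few ‘cycles’ will have to come from.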

But my feeling is that computers – at least the ones we buy and use ourselves – won’t keep getting faster at anything like the rate we’re used to. Again, the transport analogy applies. For a while, further and faster seemed inevitable. But it didn’t work out that way. The assumption of ever faster and cheaper computing may suffer a similar fate, at least for a time.

For the PC specifically, there’s also the Windows angle and Microsoft’s ever-present ability to alienate the installed base. Windows 8 is not encouraging. Given how fast Android devices are multiplying and the prospect of mobile ARM chips closing the gap with x86 processors for performance over the next five years, you’d think the Wintel alliance of x86 processors and the Windows OS needs to be on top of its game to stand a chance.

All of which makes for a bit of a ramble. But really my message is pretty upbeat. Enjoy the next five years. It’s going to be spectacular for PC gamers. Just don’t assume it’s going to last forever.


119 Comments

  1. SomeDuder says:

    So it’s not that we can’t build faster machines, it’s just that there’s no need to. I can definitely see that point in my own situation (i7-920, 16 GB RAM, GTX 3-something, bought in 2009 and plays all the games). Lacking in graphical oomph? Just replace the GPU, no need to replace your entire motherboard, etc. Something which we couldn’t imagine 10 years ago.

    Stuff like Oculus might be good fun and at least a temporary advantage over consoles, but you can bet that, if it takes off on PC, the next-next-gen consoles will have a similar gadget. Personally, I’m more interested in the science of quantum computing and holographic storage, which is still very much in the pre-prototyping phase.

    • KhanIHelpYou says:

      Except that’s not true. Unless you’ve done some serious overclocking, your rig will chug trying to play Arma 3 or Planetside 2, and complex games are only going to get more demanding. To get better performance you need a new CPU; yours runs on an LGA1366 socket, so to upgrade you have to buy a new motherboard with an LGA1150 socket. And that RAM from 2009, it’s DDR2, right? You likely need a new motherboard to get support for DDR3 to upgrade it at all, and in another year and a half, DDR4. In fact the only things you can upgrade without replacing your motherboard are your graphics card and your hard drive, but a Titan and an SSD are not going to help you when you’re helplessly CPU-locked at 2.6GHz in 2 years or so.

      If your gaming hobby consists entirely of less demanding indie games that computer will be fine for years to come, but if your tastes wander anywhere near demanding games like ARMA, PS2, BF4, The Witcher 3 or a whole heap of other games that are going to come out, a full system upgrade will be unavoidable, just like always.

      • hatseflats says:

        Except what you are saying is totally not true. Sandy Bridge brought some 25% performance increase over Nehalem, Ivy Bridge and Haswell some 5-10% each. In total, the difference is about 40% in games. Mildly overclocking an i7 from 2.8 to 3.6 is going to improve performance by about 25-30%. That means with an overclocked i7 from 4 generations ago you still get pretty close to current high-end performance. I myself am still using an overclocked i7 860, and I can’t say I’ve ever noticed a lack of processing power, not even in PS2 for the few hours (5 or so) I’ve played it, which included some pretty large scale battles.
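
        Those figures roughly check out if you compound them; a quick sketch, taking each generational gain at the midpoint of the range quoted above:

            # Per-generation IPC gains compound multiplicatively.
            gen_gains = {"Sandy Bridge": 0.25, "Ivy Bridge": 0.075, "Haswell": 0.075}
            total = 1.0
            for gain in gen_gains.values():
                total *= 1 + gain
            print(f"Cumulative gain over Nehalem: ~{(total - 1) * 100:.0f}%")   # ~44%

            overclock = 3.6 / 2.8 - 1
            print(f"Overclock 2.8 -> 3.6GHz: ~{overclock * 100:.0f}%")          # ~29%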

        • KhanIHelpYou says:

          With respect, I did say “unless overclocked,” maybe not in so few words, but some people can’t or won’t overclock their systems. You’re right though, if you overclock and keep squeezing every ounce out of the high end you can keep it for a while. You could overclock the 920 to something close to 4GHz but would need good cooling and maybe a better motherboard anyway.

          I just don’t really see how that’s any different to the situation 10 years ago. A top of the range Pentium 4 could have been oc’d to something like 4.2GHz and lasted quite a while even with the paradigm shift to multicore.

          • hatseflats says:

            Presumably, if you built the rig yourself, you bought a proper motherboard and cooler anyway, as that’s not expensive nowadays either. And the Nehalem generation already used DDR3. I upgraded my RAM to 8GB and my GPU to an upper mid-range GPU (HD7870) and I have zero performance issues in any game (including e.g. Witcher 2 maxed out).
            In the Pentium era, I had a single-core AMD CPU, which back then was better than the Pentium 4 in gaming. Within two years, when the first proper multi-core CPUs arrived, it was already getting sluggish. After 4 years, CPU performance was inadequate even for basic tasks like web browsing and office use. My i7 is also 4 years old, and cost about the same, but is still adequate for the most CPU-intensive tasks. The situation really is different compared to 8 years ago.

          • Corb says:

            If you’re serious about your PC you aren’t letting yourself be stalled by a 7-year-old 2.6GHz CPU, because you spent the money on a good processor and mobo. Also, the most important upgrades you can make for gaming are your graphics card and adding an SSD in this day and age, maybe adding in more RAM (you bought a board that supports up to 32 gigabytes, right?). And seriously, if you are using DDR2 in this day and age you are woefully behind and you don’t know how building a PC works.

            When you’re building a gaming rig it’s A) timing, you build it after the market stops releasing all the new stuff and it stagnates for a while, and B) you buy more expensive high-end parts (read: CPU and mobo) that are going to last long term. If you have 16-32 gigs of RAM you’re set for like 10 years on that, your CPU and mobo were purchased with longevity in mind and you can upgrade the graphics card/PSU to your heart’s content for a solid 5 years at least.

        • Tripkebab says:

          *cough*
          http://www.guru3d.com/news_story/core_i7_haswell_e_engineering_sample_surfaces.html

          8(16) core K edition Haswell-E CPUs on the very near horizon should give a nice boost to the otherwise stagnant CPU market.

          *Spoiler* New Mobo socket required.
          *spoiler2* Native USB3, SATA6g and DDR4 supported.

          • Jeremy Laird says:

            Not really a desktop chip, though, is it? Really a server and workstation chip.

          • Sakkura says:

            But that’s a pretty arbitrary exclusion of a selection of CPUs available to, and marketed to, consumers.

          • Jeremy Laird says:

            You may not agree with it, but it’s not remotely arbitrary.

            LGA2011 platform is engineered for workstation and server and has numerous technologies of total irrelevance to the desktop which you nevertheless end up paying for. Just because Intel puts a desktop badge on them doesn’t make them desktop class products – if the term ‘desktop PC’ is to actually mean anything, that is.

          • Tripkebab says:

            I do believe all the new Nvidia cards are rebranded business cards. Come to think of it, Windows XP and up run on the Windows NT platform, which was also built for business servers.

            Would you want to live in a world where we’re all still running Windows ME and overclocking our Voodoo 2 SLI graphics setups?

            Tis the way of the world dear boy!

        • Sakkura says:

          Those performance improvements are per clock cycle. Anyway, overclocking an old CPU is irrelevant when you can overclock a new one too (though you would have to factor in how much each CPU can be overclocked).

          But more importantly, newer CPUs support newer instruction sets. Which means the performance gap can grow DRAMATICALLY when new software begins to rely on newer instructions that can do the same work much more efficiently.
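
          A quick way to see what your own chip reports (a minimal sketch, assuming Linux, where the kernel lists the flags in /proc/cpuinfo):

              # List which of the newer SIMD/vector instruction sets this CPU advertises.
              INTERESTING = ("sse4_2", "avx", "avx2", "fma")

              flags = set()
              with open("/proc/cpuinfo") as f:
                  for line in f:
                      if line.startswith("flags"):
                          flags = set(line.split(":", 1)[1].split())
                          break

              for name in INTERESTING:
                  print(f"{name:8s} {'yes' if name in flags else 'no'}")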

      • Borsook says:

        You’re forgetting one factor – monitor. My computer is based on AMD’s APU, middle of the line, theoretically miles away from what gaming computers sport. Yet, I am able to run every single title I have at high details because I am running at 1366×768. Due to things like desk space and dislike of bigger screen I do not plan to buy anything with a higher res, so I expect my computer to be able to fuel all my gaming needs for a long, long while.

        • SanguineAngel says:

          Yeah, I am in that exact position – I am running an i7-920 with only 6 GB of RAM and have yet to encounter a game that I cannot run at max settings. I reckon the reason is certainly because I am using a monitor at 1280×1024 resolution.

      • Continuity says:

        Yeah you’re right, I have an i7 920 and at 2.6 it’s just not good enough any more. I’ve been running it at 3.5 for a couple of years and it’s just hanging in, but there is probably only another year or so before it becomes a serious bottleneck. As it is that overclock gets me an extra 20fps or so in ARMA, and that’s the difference between playable and problematic; I’m sure I could get even better fps with a newer mobo/CPU.
        Still, I built this system in mid 2009 and it’s still going strong with just one gfx card upgrade; that’s something that would have been unthinkable in the years before, so I can’t complain.

  2. bar10dr says:

    In about 30 years someone will call you in order to get an interview about this article in conjunction with a piece about technological predictions always being wrong. At that exact time the memory of reading this comment will race into your frontal lobes and you will freak out, find this comment again, and then make a blog post about it.

    • Jools says:

      The sad thing is that a lot of the limits we’re discovering nowadays aren’t engineering limits (which we have a tendency to plow through in spite of our best predictions), but scientific ones. You can only make electronics components so small, and heat dissipation becomes an issue long before you even hit those limits. Saying that we’ll all be proven hilariously wrong is a bit like suggesting that someday we’ll laugh about that silly ol’ speed of light thing. It’s possible, sure, but it’d require such an upheaval in our fundamental understanding of how things work that nobody will seriously look back and say the naysayers were misguided. It kind of sucks to know that my kids will never really experience the sort of whiplash inducing technological growth that I grew up with, but that’s just the way it is.

      • kalirion says:

        The point is that these scientific limits apply to the way we’re currently doing things. We find other ways, and different scientific limits will be involved.

        Right now we make things faster by making transistors smaller. There’s a limit to that – so we’ll stop using transistors and move to something else (quantum computing, etc.)

        Similar with the speed of light – we accelerate by pushing (road, gases, etc). There’s a limit to that, so we’ll move to something else (wormholes, etc.)

        • Shuck says:

          Either way it’s the end of Moore’s law. A radically different sort of computing technology might eventually give rise to a similar law, but in the near future there’s no replacement for conventional silicon chips, so we’re looking at something of a plateau compared to recent decades.

        • Jeremy Laird says:

          If it was just a matter of finding a way to do faster computers, I would tend to agree. But there’s the cost and broader how-much-do-we-want-it angle.

          That’s why I brought up jet travel. If someone had done an article in 1975 saying that 40 years hence, passenger jets would be stuck at 550mph and we hadn’t gone back to the moon much less Mars, a chap called bar10dr would have written a cross letter to the Times pointing out the folly of his words.

          I’m not really making predictions here, just highlighting assumptions that may no longer apply and picking out a few bits of evidence that the status quo, at the very least, appears to be changing.

          Firstly, there are both financial and physical barriers on the horizon for the existing approach to computing. Secondly, there’s no reason to assume the demand for increased compute power from consumers will keep on growing. Just like 600mph jets have turned out to be fast enough to meet demand, there are signs that we’re not far off “good enough” consumer computing on the CPU side, though admittedly not for graphics.

          All of this could be and probably is wrong. Of course, if you take on the challenge of defining precisely how it’s wrong, you’ll be proved wrong, too! It’s all probably wrong, but almost definitely not for the reasons you might think it is!

          • grimdanfango says:

            I find when it comes to the games industry, everyone seems to expect that advances in graphics at least will be driven entirely by moore’s-law-associated technological advances… as it was back in the Doom days, where the technical, mathematical, and artistic abilities of game developers far outstripped the technology they had to work with.

            It just isn’t like that any more. Even the most high-end photorealism in graphics is driven far more by the scientific and artistic abilities of the people creating the game engines and content itself. If it was simply a case of technology alone driving the quality of visuals, every game that chose to would look as photoreal as Battlefield 4/Watch Dogs/The Division, and it wouldn’t impact on other aspects of the game design to do so.

            The reality is, it takes a combination of prohibitively massive budgets, extreme talent, and long, long working hours to create such wonders. Often to the detriment of the rest of the game.

            We’ve long since passed the point where Moore’s Law counted for much in the games industry. The power available to anyone these days far outstrips the abilities of most people, and the limiting factors on those rare few with the talent to even make use of what’s available are entirely human.

            Until artists manage to catch back up with technology en-masse, and it becomes the norm to utilise the massive procedural content-generating potential of even current generation computers to surpass what artists can currently achieve using traditional modelling/texturing techniques, Moore’s Law isn’t going to be driving anything.

  3. IllI says:

    Moore’s Law is not a law but a pseudoscientific “observation” at best. Any more woo woo apart from this and radfemism you espouse here? Need to update my journal.

    • james.hancox says:

      Errm, I probably shouldn’t prod the bear, but… “radfemism”? What the hell is that meant to mean?

      • TillEulenspiegel says:

        It means “waaaah, I don’t like feminism because reasons”.

        Among people who actually know what words mean and what’s going on in the world, radfems are strongly associated with transphobia, so at least as a term it’s generally avoided by decent people these days.

        • dE says:

          That’s one meaning within gender studies. There are other meanings as well. Just because you like this one, doesn’t mean it’s the only one. Unless you’re one of those that can’t differentiate between the different meanings a word takes on, depending on the situation.

          /late edit and my last comment on this particular subject:

          I find it highly ironic that you claim to be the one “in the know”, yet are completely oblivious to one of the central pillars of gender studies: the idea that culture, upbringing, language and social capital influence people’s identity, everyday life and, not least of all, their use of language. And how it’s almost entirely a discourse about power and who gets to have a say within said discourse.
          Kinda goes to show you’re not actually that much in the know, despite claiming the privilege of interpretation – and not as enlightened as you think yourself to be.

          • tasteful says:

            lol shut up dE

            the transphobes they’re talking about self identify as radfems and radical feminism is a defined movement, not simply the most radical wing of feminism. there are many more radical branches.

      • PeopleLikeFrank says:

        Like regular feminism, but with more ninja turtles.

      • Bull0 says:

        It’s a portmanteau of the words “radical” and “feminism” and means “radical feminism”. I also think feminism is pretty rad, in the 90s, Point Break sense, anyway. Rad like a movie about bankrobbing surf dudes.

      • Stromko says:

        Some people seem to think the observation: “Gosh, it sure seems like the majority of games are sexualizing women as hard as they can, odd innit?” is somehow radical.

      • dE says:

        It refers to extremist nutjobs. In this case on the side of feminists. Those that give feminists a bad name by spouting crazy nonsense. In these discussions, its counterpart is the misogynist. Another extremist nutjob. Neither of them are particularly frequent and might be about as rare as the Yeti. But as with the Yeti, people will claim they’ve seen them. Lots. And in some odd kind of internet law, they will refuse to admit there’s anyone BUT these yetis when they choose the words and disdain with which they approach these discussions.
        The result is quite the comedy. Although in a rather traditional understanding of the word.

        • TillEulenspiegel says:

          Oh look, another person who has no idea what “radical” means in any context, let alone this one.

          • dE says:

            Congratulations, in your attempt to attack me, you’ve completely missed my entire point to begin with. Try again.

          • Jackablade says:

            It’s a synonym of “bodacious” and also “tubular”.

    • Viroso says:

      See there, the kind of person that throws “radfeminism” in a sentence like that has GOT to be the kind of person that’s pedantic enough to feel like explaining everyone why Moore’s law isn’t really a law.

    • iucounu says:

      Yeah, I don’t think anyone (least of all Moore) has ever claimed it’s anything other than an observation. But then your use of the term ‘radfemism’ very reliably marks you out as a dipshit with reading-comprehension problems, so that would explain it.

    • Gap Gen says:

      Oh no the wimmins the wimmins, come to take our games and precious bodily fluids.

    • Deadly Sinner says:

      I think he understands that, considering he spent half the article explaining that there is a hard limit on transistor count and that progress may already be slowing down. You may want to read the article next time, rather than posting your knee-jerk reactions based on the title.

    • MajorManiac says:

      I always thought Moore’s Law was about not poaching sheep.

    • Baf says:

      To be precise, there are two possible reasons to throw complaints about “radfemism” into a comment in a post that has nothing at all to do with it:
      1. You think that complaining about feminism at every opportunity will afford you some sort of victory over it. In which case, you lose: just by mentioning it, you have started people talking about it in a thread where they were not talking about it before, and most of them are mocking you.
      2. Alternately, perhaps you are a troll, and you are trying to derail the comment thread. In this case, well done! But not all that well. The structure of the comment threads here tends to isolate digressions, and so right below this branch you will see people continuing to comment on the substance of the article.

    • Niko says:

      Moore didn’t know about topological insulators.

    • phuzz says:

      Looks pretty good for an observation.

    • TheMightyEthan says:

      You misapprehend the meaning of “Law” in this context. A scientific law is simply a statement of how the world works based on past observations. It doesn’t even have to be correct in order to be a law.

      • gekitsu says:

        you, sir, make my heart tingle with all the warms.

        far too few people get that the laws of empirical science are models, derived from observation, that do their job reasonably well, and not metaphysical statements about the True Inner Mechanics of the World(tm).

  4. SuddenSight says:

    Moore’s Law is already kind of done with. The original concept of Moore’s Law stated that everything about the computer would continue to improve – price, performance, everything. But clock speeds have already stagnated (4 GHz turns out to be about the fastest a computer can go).

    The idea *was* that all the key costs of computers – number of transistors, materials per transistor, power required, achievable speeds, and more – were all based on the *size* of the transistor. If you make them smaller, the whole thing is better. But already, smaller transistors don’t help with the power usage, which in turn has limited clock speeds.

    People will talk about how the transistor cannot be made smaller, but that isn’t really the problem. The problem is smaller transistors don’t give as much benefit as they used to – it costs too much and doesn’t improve key specifications like speed and power usage.

    HOWEVER, as mentioned, this is not the first time the end-of-days was predicted. Multiple times over the last few decades people have conjectured that Moore’s Law would end, and it found a way to limp on. Someone may yet come up with a cool idea that gives us another twenty years of growth. Or maybe not. The idea had better be pretty brilliant, because the current problems are large.

    Regardless, computers will continue to improve for at least the next 10 years with or without Moore’s Law. The improvements due to smaller transistors were so good that other design features – such as multiple cores, better architectures, and cleverer algorithms – haven’t received as much attention from the hardware community. Now that the golden goose is about to lay its last egg, those contributions will once again come under scrutiny, and there is plenty of room for improvement.
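
    One well-worn caveat on the multi-core part of that is Amdahl’s law: the serial fraction of a program caps what extra cores can ever buy you. A minimal sketch:

        # Amdahl's law: speedup = 1 / ((1 - p) + p / n), where p is the parallelisable fraction.
        def amdahl_speedup(parallel_fraction, cores):
            return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

        for p in (0.5, 0.9, 0.99):
            row = ", ".join(f"{n} cores: {amdahl_speedup(p, n):.1f}x" for n in (2, 4, 8, 64))
            print(f"{int(p * 100)}% parallel -> {row}")

    Even at 90% parallel, 64 cores buy you less than a 9x speedup, which is why the cleverer-algorithms part matters as much as the core count.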

  5. mouton says:

    Duh, nothing lasts forever.

    • TheVGamer says:

      Except for the wait for that small indie game coming from a small indie dev called Valve…

    • MichaelGC says:

      By my reckoning, “it” will last approximately five years. XD

  6. denizsi says:

    The next big shift will most likely be new cooling techniques to improve the current technology, and then a mass migration to an entirely new architecture in about 15 years to benefit from a new computation tech-tree, of which quantum computing will be a part. Once that happens, the physical limits will be pushed much further out by those entirely new computation technologies, so we are way, way off from hitting them in our lifetime.

    The state of the economy is a far bigger concern, as it was with the space program: the interregnum leading up to the next shift will likely mean a relatively long financial stalemate, with some recession and regression. But it is possible that gap will be covered by mobile technologies catching up with the high-end desktop, by the breakthroughs achieved in the process reflecting back on the desktop in some form, and then perhaps by a new stimulus from advanced mobile display technologies finally making it to the desktop in return and keeping the consumer circulation alive.

    nVidia is already preparing for it with their new display chips to prevent tearing at high frequencies. That is just the tip of the iceberg. Clock-wise, we have been slowing down, but we have yet to witness the technological spill-overs from other industries to the desktop market and to cover the neglected needs of the desktop market.

    • james.hancox says:

      The “interregnum” will probably kill off quite a few companies. Technologies will continue to consolidate and merge into one, in an attempt to squeeze the maximum efficiencies out of a now strictly limited number of transistors.

      We’re already seeing it happening in the graphics market. APUs (CPUs with integrated graphics) are slowly eating the GPU market from the bottom up- the market for bargain basement graphics cards has completely imploded now that both Intel and AMD have perfectly competent graphics built in. It’s only going to continue- sharing resources between the two just lets you make the most out of a set budget of transistors, power, and cooling capabilities.

      • fabulousfurrygingerfreakbrothers says:

        It happened with sound. When I moved to making music on my PC in 2000 I had to spend £150 on a sound card. Now it’s all there on my motherboard.

        I think the CPU stagnation comes from a desire to develop APUs. Reports of Skyrim at 30fps from an i5 Haswell are pretty impressive, and there’s probably more value for Intel in researching that aspect than dedicated CPUs.

  7. james.hancox says:

    Perhaps a 10 year period in which computers don’t get faster will actually be good for video games. A period of stabilisation, a period in which developers no longer need to race to keep up with the latest tech- a period in which we can actually focus on what makes a good game, instead of just a pretty game.

    Do people stop writing novels because you can’t produce books any more cheaply than you could 30 years ago? Of course not! The video game industry is about making exciting and enjoyable experiences. We’ll find plenty more of them, even if we don’t get 50GHz processors.

    • InternetBatman says:

      You can produce books more cheaply than you could 30 years ago…..

    • Viroso says:

      Eh, I don’t think the “race for graphics” is really a thing. I mean yes, totally, they put tons of money into having good graphics. Even indie developers devote a big part of the budget to having good art for their games; it’s how a game stands out.

      But what I don’t think is all that true is the idea that developers focus on graphics and forget gameplay. That’s a notion that started long ago and still endures even though we just keep getting tons of awesome games, some with really good graphics. I say this as someone who has grown to largely ignore a lot of AAA games. I still think people are working on having awesome graphics AND awesome games, both at once.

    • Shuck says:

      Yeah, I’ve always felt that if computing power hit a plateau, that’d be good for computer game development: Not having to constantly re-build engines for ever-changing specs and instead being able to focus on making tools for cheaper content creation, having a consistent level of processing power to count on, and tricks you figure out with the hardware are still relevant for the next development cycle, etc. And good for gaming, too, of course: not having to constantly replace hardware, knowing that any new game would be playable on your machine, etc.

  8. foda500 says:

    The whole “PC sales are falling” thing is pointless as far as PC gaming is concerned, what you should look for is the amount of discrete GPUs being sold every year and from what I’ve read those seem to be on the rise.

  9. phenom_x8 says:

    Another deep and complete article about the end of Moore’s Law:
    http://herbsutter.com/welcome-to-the-jungle/

    • stahlwerk says:

      Oooh interesting! Thanks for sharing that!

    • BlueTemplar says:

      Yes thank you. It’s funny how after taking the time to read the article you linked, most of the comments here seem ignorant now…

  10. Bull0 says:

    The computing power of PCs already outstrips what companies are able to meaningfully do with them in terms of their game budgets. We’re going to end up with vast power, but little to do with it, because developing games that make good use of that hardware will be prohibitively expensive. We’re nearly there already.

    • Carra says:

      On the plus side, we’re no longer really forced to upgrade our PCs every three years if we want to play the latest games, unlike 15 years ago.

      On the down side, seeing how Intel’s new processors are 8% faster than the previous version is disappointing… And it looks like they’ll just be focusing on the mobile markets in the coming years.

      • Bull0 says:

        Totally. It’s nice not needing to rebuild my PC every couple of years… but I also miss getting the huge technical leaps forward every couple of years that we used to. No arguing that things are stagnating a bit right now.

    • Stan Lee Cube Rick says:

      Plenty of room for the bloat and spyware to grow into.

  11. Text_Fish says:

    Err. If we do hit some sort of ceiling where technology is concerned it will only further the PC’s interests, because it’ll close the performance gap between PC and Console, leaving longevity as the deciding factor and you can’t get more longevous than a PC because you’re free to upgrade individual parts as they wear out.

    • InternetBatman says:

      At the same time, hitting a ceiling would most likely give consoles an inherent efficiency advantage, which is not something I welcome. However, the inherent advantage will shrink as processor get more powerful and consoles get more capable/more features in their OS. It’s refreshing to think that right now the Xbox One OS uses more resources than Windows 8.1 and far more resources than even hogs like Ubuntu.

  12. DickSocrates says:

    Here’s how to get around Moore’s Law: When CPUs become as small and fast as is physically possible, you put another one in the case next to it.

    BOOM.

    • Geebs says:

      Or just find ways to speed up the I/O – which is why an SSD can make it feel like you bought a new computer.

    • Tei says:

      There’s limits on that too. Imagine you are poor, and in your toy room there’s only one toy; you will find it almost instantly. Now you become rich and buy many toys, and your toy room is now a huge room with many things; if you are dying and you ask your minions to find a particular toy (your rosebud), they will take a long time to find it. Mere aggregation creates more work and can be a killer in the end.
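
      In code terms the toy room looks something like this (sizes are arbitrary): an unindexed search slows down as the pile grows, while a pre-built index stays near-constant.

          import time

          toys = [f"toy_{i}" for i in range(1_000_000)]
          index = {name: pos for pos, name in enumerate(toys)}   # one-off cost to build

          target = "toy_999999"                                  # the rosebud, right at the back

          start = time.perf_counter()
          toys.index(target)                                     # linear scan of the whole room
          print(f"scan:  {time.perf_counter() - start:.4f}s")

          start = time.perf_counter()
          index[target]                                          # indexed lookup
          print(f"index: {time.perf_counter() - start:.6f}s")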

    • Amun says:

      That’s my argument to the folks who think processor power is limited! It’s only limited per unit area!

      I see a future where I have an entire basement of processor racks submerged in a non conductive fluid with the waste heat being extracted and used to heat my house. Then I would use this immense processing power to play Dwarf Fortress.

  13. Didden says:

    Thanks Jeremy

  14. CookPassBabtridge says:

    Who cares about jetpacks? Just give me my damn hoverboard

  15. geldonyetich says:

    I think that games may be improving, but it’s actually being done in spite of Moore’s Law.

    The thing is, the developers had been investing big, big bucks in trying to make really high fidelity games. Such games require big budgets, not because of the cost of the technology, but because of the cost of hiring professionals capable of rendering usable content that actually takes advantage of that technology. (Just read the credits screen for Deus Ex: Human Revolution, it’s comparable to the size of a blockbuster movie!) Only trouble is, these investments tend to blow up in their faces because they’re not recouping the money that went into creating that thing.

    A lot of that inability to recoup the price of development is due to piracy but let’s not get into that barrel of fish, and I’ll say competition is a major factor, too: when one outfit is making a technologically advanced game, that’s a hot commodity, but when everybody is making a game like that, it’s highly unlikely the demand will be sufficient to make sure everybody gets paid. MMORPGs are one such example: quite lucrative when there were only a few of them, but now competition is so fierce they have to offer the game for free just to get people to try it out. (This generally pays off because, after all, MMORPGs are addictive by design – the real trick is just to get the fish to nibble on the hook.)

    When the big companies migrated most of their resources to potentially more lucrative platforms, the indies moved right in and exploited an audience which is largely sick of clones. Here is where games actually have a chance of getting better, because the prevailing problem of investors being afraid of investing in anything that isn’t a proven clone has been strangling the game market for so long, many of us have even forgotten it is happening. Note that indies usually don’t have the money needed to develop high fidelity stuff, and this is why we’re seeing a lot of pixel art. (Do we really even need cutting edge 3D graphics? Not really, they’re great for producing breathtaking sights, but they contribute next to nothing to quality gameplay. My condolences to the type of gamer who buys hardware in order to render Crysis 4 at maximum resolution on 6 monitors at once.)

    So technology and Mr. Moore are basically being given the finger here: not many people care about transistor count anymore, we got plenty of firepower to do what needs to be doing, and the real challenge is just producing something worthwhile to do. Incidentally, this is also the reason why the PS4 and XBone are trying to push Social Media and TV (respectively) as their primary selling points: when the need for additional computation firepower is moot, there was absolutely no need to introduce a new generation of consoles, but that infusion of money is a hard habit to kick. (Then they gutted reverse compatibility deliberately, and this was essentially committing suicide seeing how the real bottleneck was in worthwhile games to play.)

    • Shuck says:

      It’s always been true that most games commercially failed. This is largely because they’re creative works – most books don’t sell enough that they were financially worthwhile for the creator, most movies lose money in theatrical release (though secondary markets of various forms of video more than doubled profits when they came along and greatly reduced the number of failures – the decline of video is causing a bit of a crisis in films right now), etc. The return-on-investment for successful games used to be a lot better, though; when the game costs half-a-billion-dollars to make and market, there can only be so much of a profit. Most indie games lose money, too – they’re just losing a lot less, individually, because of smaller teams. Collectively I wouldn’t be surprised if they weren’t losing more, though that’s really impossible to measure. The indie boom right now is being fueled in large part through the individual savings of developers. I think we’re about to see a wave of disillusioned developers who went indie, wasted a couple development cycles worth of their savings without seeing any real returns, and have given up.

  16. tormos says:

    CPUs topping out seems like it’s not necessarily a bad thing. From my understanding part of why game design is such a crapshoot is because any major release is either based on the now antiquated tech that was current when it started its 3+ year development or has been kludged to work with whatever’s on the market now, slowing down the process and making it less likely to use the available hardware to its full capacity. Maybe hardware improvements slowing to catch their breath would allow spiraling development costs to do the same? It might also de-emphasize “crunch”, which I believe is inspired in part by the fact that the last part of the cycle is the most efficient time to work.

    • tormos says:

      Tl;DR this might not be a bad thing unless you own shares in Dell

    • Shuck says:

      That may have once been part of crunch, but now it’s just due to bad management and a dysfunctional game development culture. AAA games are aimed at consoles, and console generations are much longer than game development cycles now.

  17. dogoncrook says:

    Moore’s law is probably already broken on the research end, and definitely dead for consumers. The next 10 years will be dominated by GPU-based parallel processing, and advances in programming. That’s where the real gains are. Password cracking has pretty much shown how ridiculously wasteful CPU-based processing is. I imagine while we are stuck with silicon, CPUs will eventually just become traffic cops while stacks of CUDA cores rip through data from a cloud. On the other hand it’s hard to see game developers switching over if it leaves everyone with a single graphics card behind. Ironically the market that would be best to shoot for would be gaming laptops, and computers with integrated GPUs. It’s also worth noting Nvidia’s PhysX cards failed miserably and that’s the type of architecture that, when updated, would be best suited to massive parallel processing.

    Edit: cool example for anyone interested
    http://blog.paralleluniverse.co/post/44146699200/spaceships
    Source is available on GitHub and linked in the article to play around with. It’s a good example of how we could see dramatically better performance by rethinking how we program on the hardware we already have.
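
    For a flavour of the ‘use the cores you already have’ idea, here is a minimal sketch (the prime-counting workload is a stand-in, not anything from the linked article): the same CPU-bound job farmed out across every core via a process pool.

        import math
        from concurrent.futures import ProcessPoolExecutor

        def count_primes(bounds):
            """Count primes in [lo, hi) by trial division - deliberately CPU-bound."""
            lo, hi = bounds
            return sum(1 for n in range(lo, hi)
                       if n > 1 and all(n % d for d in range(2, math.isqrt(n) + 1)))

        if __name__ == "__main__":
            chunks = [(i, i + 50_000) for i in range(0, 400_000, 50_000)]
            with ProcessPoolExecutor() as pool:        # defaults to one worker per core
                total = sum(pool.map(count_primes, chunks))
            print(f"primes below 400,000: {total}")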

  18. Consumatopia says:

    People buy consoles because it’s simpler and probably cheaper to have a sufficiently powerful console to play current games than a PC, and because consoles have much nicer games than tablets/phones.

    If you assume that Moore’s Law ends, and all devices end up with similar performance, and people keep using their devices until they break, the reasons to buy a console disappear. People will just buy a dongle that plugs into their TV that streams from the PC, tablet or phone. Heck, TVs will just be sold with WiFi streaming built in. Plug a controller into your phone or laptop, now you have a console.

    Consoles should be much more worried than PCs.

    • dogoncrook says:

      I’m not sure when, but Nvidia is moving towards cloud-based gaming, and will eventually rent out Tesla rigs for gamers, which eliminates hardware headaches, is scalable, and is only limited by bandwidth. Once Google Fiber takes off, other than privacy there won’t be much need to buy boxes at all.

      • Consumatopia says:

        Probably true. And really, who even cares whether our game playing is private? So I guess the future is screens and input devices all hooked to the cloud.

        I suppose the latency might be a bit of an issue for VR. It would be an even bigger issue for robots using GPUs for vision processing.

        • magogjack says:

          Privacy, yeah who cares. If one person looks through your window they are a pervert, if its the government or a company then no big deal right.

          Information is power, all of it. Anyone who wants it wants to control you on some level, we shouldn’t make it easy.

          • Consumatopia says:

            I’m actually totally with you on this one, but, in all seriousness, if the only reason for people to buy $1000 or even $500 machines is because they don’t want The Man to watch them play high-end video games, no one is going to buy them.

            The problem is that privacy is mostly a social, not individual good. Evgeny Morozov isn’t always the most reliable writer, but he’s right about this: http://www.technologyreview.com/featuredstory/520426/the-real-privacy-problem/ The problem with surveillance isn’t that the government will learn my precious and terrible secrets, the problem is that surveillance threatens democracy. But that just makes it like every other tragedy of the commons–and we know how those go.

            EDIT: Though, if you stop and think about it, you shouldn’t have to read Leviathan to understand why people would be more afraid of their close neighbors or even friends and family spying on them than a distant company or government. I think this is actually a serious problem for alternatives to an All-Cloud-Zero-Privacy future–most people have no patience for computers, so if they need help protecting their privacy they have to call someone for help, and they’re forced to trust that person just like most people trust Google or the NSA today.

          • magogjack says:

            You know what? You’re all right. I wish more people were like you.

        • dogoncrook says:

          For gaming, yeah, privacy isn’t a big deal. To be clear I don’t think people will buy PCs for privacy, I think the trend will be to rent gaming rigs, and instead of owning a PC for daily tasks switch to tablets and glorified terminals like the Chromebook. I would imagine purchases of those will come down to peripherals, cosmetics, and form factor, because both are already powerful enough to get the job done.

          Latency kills it right now, but fiber lines will probably be in most major cities this decade. It kinda takes a totally new generation of programmers and training to utilize any of this anyway; most today don’t seem to be able to adapt to multithreading as it is. It also doesn’t work for everything, as many processes will always have to be done sequentially. Switching to the cloud will surely have many incremental steps along the way that lessen the load client-side and make things like Titan rigs only necessary for hackers and researchers.

        • elderman says:

          I have a suspicion, which my knowledge of things like psychology and data mining isn’t comprehensive enough to confirm or allay, that in principle you could get a deep insight into a person by watching them game. In my more paranoid moments, I think it might become an important tool of control in the surveillance culture.

  19. Juan Carlo says:

    I think a bigger worry for PC Gaming is the impending crash of the steam indie game bubble. Steam’s releasing like 20 games a day now, where they used to release 3 or 4. They used to keep things balanced a bit via their arcane mysterious certification system. But now they are greenlighting 100+ games in every new batch of greenlight games. There can’t possibly be enough market share for all these games, can there be?

    There will always be the stand out successes, of course, but I suspect the smaller companies who used to get by on Steam will find it much, much harder in the future as competition heats up and everyone and their mom tries to get the same piece of the indie pie.

    • DrManhatten says:

      Yep, agreed. Steam has become pretty useless and is going the way of the Google and Apple app stores. Too much stuff, no way to sort the good from the bad; eventually people get frustrated and don’t buy anything.

      • The Random One says:

        Steam hasn’t become useless. Steam has always been useless; it’s just that as of late it’s been so hard for it to pretend it isn’t, that even trying to put up a show of its usefulness only underlines how useless it is.

      • Consumatopia says:

        Steam works great as a games download service. It works better than ever now that they’re lowering barriers to more games.

        Steam has always been a terrible way to find out which games you should play.

        • lautalocos says:

          that’s where YouTube channels come in. And maybe user reviews, but they are just that: user reviews.

          • Grape Flavor says:

            For all the negative stuff that has been said about game critics, I find user reviews to be far more useless. How many times have we seen the user score for a fundamentally decent game dragged down to absurd levels by hordes of 0-voting buffoons, most of whom proudly admit to never even having played the game and are just there to push the vendetta of the week.

            No, the best way to find out what to play are good old fashioned critics. I’m not talking IGN necessarily, but find a few websites you like and read what they have to say. And as deeply unfashionable as Metacritic seems to be around these parts, I find it invaluable for getting a feel for what the overall consensus on a game is. If you rely too much on one source you’re not going to know when the review you read is just an outlier and almost everyone else had the opposite opinion.

    • Titanium Dragon says:

      Who cares?

      The reality is that the good companies will survive and the bad ones will die. Metacritic and other reviews are now do-or-die; I simply don’t buy a game with a MC rating below 75 unless it has been specifically recommended to me or otherwise deeply intrigues me – and it is hard to get me to even look at your game if it has a MC below 75.

      The indie market was never the be-all, end-all anyway – it is the AAA games which mostly are awesome, with some exceptions.

    • Phoibos Delphi says:

      I don’t think something like an “Indie” or “Steam bubble” really is a problem. There won’t be an audience for every game released on Steam. There isn’t an audience for every musician on YouTube. And still, there is new music. And you yourself have to find out if there is something out there you would like to listen to.
      Not every indie game on Steam will be a new Minecraft, Cave Story or Fez. But every indie game on Steam is an effort by one or more people. Just getting the game you worked so hard on up on Steam is a success for most of the small teams. I also assume that a major percentage of those games is not the main source of income for the creators; most of them will have day-time jobs, just as most of the YouTube musicians do. Why deny them this recognition? Just for you to have a “clean” Steam?
      So I don’t see a problem there, and as somebody already pointed out: Steam is a place to buy games. Get your recommendations at RPS or any other site. I don’t expect my supermarket to have a guy standing at the cheese fridge telling me which cheese I would like to eat either. I read about it on the Goudaboard or cheddarfreaks.co.uk!

  20. DrManhatten says:

    As the author briefly pointed out, it is not the technology or the physics that are going to call an end to Moore’s Law or progress in hardware computing, it is the financials, the bean counters. Chip manufacturing has become bloody expensive: to stay on top you have to invest billions each year. 20 years ago there were 10-20 real chip manufacturers; nowadays it is less than 10. Soon it will be less than 5. That whole frantic, crappy mobile tablet/smartphone thing is just accelerating this process, as unfortunately it is eating into the margins of “real” compute platforms. But you need the biggest user base to be able to afford to shell out this ridiculous amount of R&D cost to get to the next process node. And it is not just the CPU side that is suffering from this syndrome, it is the GPU as well. And it is going to get worse rather quickly.
    The golden age of PC gaming? I don’t see it. Maybe it was there for a short glimpse, but it is going to be swept or swallowed away by this whole mobile platform thing whether we like it or not; that’s where the money is flowing right now.

    But fear not, they will hit the power wall in a couple of years’ time too.

  21. Apocalypse says:

    If the constant shrinking stops, it also means the fabs can be used for far longer periods of time, and thus become less expensive investments in the long run, as you don’t have to build new ones on very short cycles. I guess even with a physical limit on shrinks we may still see that fall in price and a new rise of multi-CPU systems. Hello HSA, async and asymmetric compute.

  22. lautalocos says:

    nobody will care about computer sales once we live in the matrix. or at least, the virtual world. which is actually the same as the matrix.

    yeah, that was kind of redundant

  23. Klatu says:

    This article, and the comments below, are the reasons I literally love RPS. From PC sales to Moore’s ‘Law’ to Steam being shit and/or a bit not shit to qubit processors. All with very little rancour. More of this sort of thing.

  24. Nate says:

    Mr. Laird, I find it odd that you feel that there is plenty of room for improvement with graphics, but that CPUs may just be good enough already.

    Faster GPUs will let us run at higher resolutions with higher levels of AA, while drawing increasingly detailed foliage (art budget permitting). Faster CPUs, on the other hand, will allow for more interesting AI, deeper levels of simulation, more procedural generation– in short, all of the things that I want to see more often in games. And judging from the posts I see, I believe that RPS readers want the same.

    If your CPU isn’t being stressed by current games, may I suggest that it’s not because the developers did everything they wanted with your CPU, but because they were developing for users with much slower computers?

    • drinniol says:

      You may suggest that, but you’d probably be wrong. Look at Limit Theory.

    • Jeremy Laird says:

      Ah, but most of the really interesting non-graphics things you might want to do in future games tend to lend themselves to parallel rather than sequential processing. Do you see where this line of argument is heading…?

      • Nate says:

        I get your drift :) But parallel processing is really hard to do well, and it does impose some limits on the way things are handled that aren’t there with serial processing. Certainly, if game developers want to stretch non-graphical boundaries, they’re going to have to figure out how to take advantage of more and more cores.
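
        To make the parallel-versus-serial point concrete, here is a minimal sketch (in C++, with a made-up Entity type, update rule and chunking scheme, purely for illustration) of the sort of per-entity work that splits cleanly across cores:

```cpp
#include <algorithm>
#include <cstddef>
#include <functional>
#include <thread>
#include <vector>

// Hypothetical game entity: position and velocity on one axis.
struct Entity {
    float position = 0.0f;
    float velocity = 1.0f;
};

// Update a contiguous slice of entities, independent of every other slice.
// This only parallelises safely because no entity touches shared state.
void update_slice(std::vector<Entity>& entities,
                  std::size_t begin, std::size_t end, float dt) {
    for (std::size_t i = begin; i < end; ++i) {
        entities[i].position += entities[i].velocity * dt;
    }
}

int main() {
    std::vector<Entity> entities(100000);
    const float dt = 1.0f / 60.0f;  // one 60fps simulation step

    // One chunk of the entity list per hardware thread.
    const std::size_t workers = std::max(1u, std::thread::hardware_concurrency());
    const std::size_t chunk = (entities.size() + workers - 1) / workers;

    std::vector<std::thread> threads;
    for (std::size_t w = 0; w < workers; ++w) {
        const std::size_t begin = w * chunk;
        const std::size_t end = std::min(begin + chunk, entities.size());
        if (begin >= end) break;
        threads.emplace_back(update_slice, std::ref(entities), begin, end, dt);
    }
    for (auto& t : threads) t.join();  // wait for the frame's simulation step
    return 0;
}
```

        The catch is exactly the one raised above: this only scales because no chunk reads or writes another chunk’s data, and real game systems (AI that queries other agents, physics with collisions) rarely decompose that neatly.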

  25. HisDivineOrder says:

    Doomsayers have always been claiming, “Moore’s Law will end!” and it always gets pushed back. I just don’t buy it anymore. I think by the time x86 runs out its lifespan, you’ll have ARM CPUs or some other technology (quantum, perhaps) able to run x86 programs in compatibility mode at mostly the same performance as the final high-end x86 parts (if any are ever final). Otherwise, x86 will linger for a long time.

    Meanwhile, I think the real doomsday for the local application will be when ISPs start unchaining our connections and letting them fly faster than fast. Because that’s when applications will start running from the cloud. That’s the real way to bring down the cost of computing and the size of devices. Suddenly, you’re just buying terminals into superspeed computers that run our games and stream them to us.

    OnLive wasn’t wrong about what the future is. They were just making tablet PCs before the iPad. They didn’t do it well or right, but their way will bear out eventually. That’s what their company is going to make a ton of money doing: taking other companies (Steam, Sony’s Gaikai, MS’s streaming, etc) to court to get settlements for a lifetime of royalty payments. They’ll be the Rambus of game streaming, and game streaming has already shown up as application streaming. These are only the early days. It’s going to accelerate from here.

    Still, Steam will win the hearts and minds of users. I suspect Steam’s home network streaming will be used as a bridge to get people comfortable with the idea. Expect them to build support for it into their apps for Android and, Apple willing, iOS. Expect them to build more and more focus on it. Eventually they’ll offer you streamed demos. Perhaps even streamed rentals of games. In time, they’ll start selling games based on it, perhaps offering a carrot of some kind like being able to buy limited time with a game to beat it and move on, saying it’s an acceptable way to avoid paying full price.

    It’ll happen the way everything with Steam has happened. One day, you’ll think, “It’s absurd to have to use an online service to play a single player game.” But they’ll promise automatic patches and updates. “But I don’t want that. I can update games just fine.” Soon, though, you’re relying on that and you’re thinking, “What was it like before auto-patching?” Then they’ll promise you achievements. “Why would I want achievements?” Eventually, you’re getting achievements and corresponding badges and coupons and scores and you wonder, “Wow, how did I go on without these things?”

    So will it be with game streaming. They’re going to make it so you don’t even remember a time before it. That’s why you can’t count out their efforts on the Steam Machine. That’s just them setting the stage for selling you a really cheap front end to streamed game services for the people out there who MUST game through a dedicated game box.

    For the people who’d prefer to have a multi-use device, well they already own PC gaming, don’t they?

    • drinniol says:

      I dunno, I have no problem with having no achievements and badges. The problem with game streaming is that the infrastructure has a long way to go. Like, a long, long, long (oh, say, the amount of time it took Voyager to leave the solar system) way to go.

      Bandwidth is limited: even if your ISP removed speed caps you’re only getting ~20Mbit/s unless you have fibre to the premises, less if anyone else is on the same exchange, and less still if the server you are getting your games from limits outgoing traffic. The real killer will be latency; there’s a rough worked example below.

      Edit- Unless you mean streaming from a local PC only.
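
      For a sense of scale, here is a rough back-of-envelope sketch of the bandwidth and latency a streamed game would need; the resolution, frame rate, compression ratio and timing figures below are illustrative assumptions, not measurements:

```cpp
#include <cstdio>

int main() {
    // Illustrative assumptions for a streamed 1080p60 game feed.
    const double width = 1920.0, height = 1080.0;   // pixels
    const double fps = 60.0;                         // frames per second
    const double bits_per_pixel = 24.0;              // uncompressed colour depth
    const double compression_ratio = 200.0;          // assumed video compression

    const double raw_mbps = width * height * fps * bits_per_pixel / 1e6;
    const double compressed_mbps = raw_mbps / compression_ratio;

    // A 60fps game renders a frame every ~16.7ms; streaming stacks encode,
    // network round trip and decode time on top of the game's own latency.
    const double frame_budget_ms = 1000.0 / fps;
    const double encode_ms = 5.0, network_rtt_ms = 30.0, decode_ms = 5.0;  // assumed
    const double added_latency_ms = encode_ms + network_rtt_ms + decode_ms;

    std::printf("Raw video: %.0f Mbit/s, compressed: ~%.0f Mbit/s\n",
                raw_mbps, compressed_mbps);
    std::printf("Frame budget: %.1f ms, extra streaming latency: ~%.0f ms\n",
                frame_budget_ms, added_latency_ms);
    return 0;
}
```

      Under those assumptions the compressed stream lands around 15 Mbit/s, uncomfortably close to the ~20Mbit/s ceiling mentioned above, and the extra ~40ms of encode/network/decode time is the latency problem in a nutshell.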

  26. Titanium Dragon says:

    The actual cause of this is quite trivial:

    Tablets still suck; PCs don’t. I have a four-year-old PC, and I know people who make do with five- and six-year-old PCs. I can STILL run every game on high settings, four years down the road from when I built my rig. That would have been absurd back in the day, but now it is the new normal.

    The reason PC sales aren’t going up is that PCs have hit the point of being “good enough” at a lot of things. Continuous improvement can make them better and better still, and I think that is STILL valuable from a corporate standpoint, but the reality is that I only need to replace my PC when I can’t run what I want on it reliably. That used to happen every 3-4 years; today it takes much longer. And the longer you can use your computer, the less often you need to replace it.

    Tablets, conversely, suck, as do mobile phones. People replace them far more often because the technology is so bad, but as it gets better and better, people will need to replace them less and less often. What is happening now is that everyone is buying up tablets. What is going to happen next is that tablets will get “good enough” and tablet sales will crater. We’ve already seen iPad sales hit a barrier, and it is not unlikely that other tablets will suffer the same fate.

    Once that happens, the halcyon days of selling new chips to people every few years will be over. Computers are transitioning from an emerging technology to an appliance. There will come a day when you replace your computer only when your old one wears out. We aren’t quite there yet – we’re still pushing upwards – but that day will soon arrive. And to a lesser extent, it already has.

    Everyone who is throwing everything at tablets simply does not understand this. Tablets are just as vulnerable to the same market forces as PCs are – more so, really, because they have PCs to leapfrog off. Once tablets hit “good enough”, their sales will crater even harder than PC sales have, because it will happen much more rapidly, and PCs are far more useful than tablets are.

    As for the idea of cloud everything: people love it, but it doesn’t work in reality. Cloud everything SOUNDS glorious, but it is very expensive and inconvenient, it is vastly wasteful of bandwidth (which is always going to be a major constraining factor), and it means that if there is a problem elsewhere, it screws YOU over personally. That’s aside from the major security concerns that cloud everything raises. The cloud is very useful, but it is also a dead end in this sense: it will be great for some things, but you can’t substitute cloud technology for people actually owning the hardware on their end, which is much more practical and much cheaper.

    The idea that people will do cloud gaming in the future is comical. Sure, it is technically possible, but it is incredibly infeasible economically for the people at the providing end. It requires everything to be subscription-based and it creates enormous hardware overhead for game producers; increasing the capital cost of games FURTHER is an untenable solution for the market. No, that is not going to happen, no matter what suckers try to sell you, because it just doesn’t work. Look at the SimCity debacle for what happens when you don’t have enough server hardware, then multiply that by ten thousand to get an idea of what a fiasco this would be.

  27. SuicideKing says:

    Yeah, Intel will probably hit 5nm around 2019/2020…so it’s more likely we have six more years, unless something goes very wrong.

    However, all signs point to 8-core mainstream CPUs once Skylake hits. There’s a fair chance Core and Atom pretty much merge architecturally in 2016 with Skymont, which should pave the way for higher core counts without costing Intel (and us) too much.

    Once the new current-gen (TM) consoles are targeted exclusively, games should get more threaded, which will likely put some emphasis back on core counts.

    AMD’s pretty much sitting out of the CPU wars in 2014 (and I think 2015 too, not sure)…HSA, if anything, will merely lower the barrier to entry for PC gaming.

    GPUs are still on 28nm and TSMC’s 20nm is delayed (and/or in too high demand), so Maxwell’s essentially delayed till 2H 2014. But yeah, I think GPU tech will still follow Moore’s Law a year or two longer than CPUs, since they’re behind the curve.

    As for ARM catching up to x86 in performance, it’s difficult, at least without losing any efficiency advantages. Apple’s coming really close, though, and I’m sure Qualcomm will strike back soon enough. For Intel it’s just a matter of time (a year more) before they’re ahead of ARM in both efficiency and raw performance.

    Also, Nvidia counts 600+ million gaming PCs. No idea how they’ve reached that number, but seeing that the total number of PCs in use is an estimated 1.2 billion, I think they’re overestimating/inflating it.

    There’s also going to be a lot of CPU/Motherboard sales in the next year or two, as people like me finally decide to retire their Core 2 era CPUs.

  28. frightlever says:

    “Then there’s the ever-present threat of Moore’s Law hitting the wall.”

    The threat? More like the promise. We had decades of PAL/NTSC television, then there was DVD and we will probably have decades of HD television and HD video on demand. I don’t see 4k TVs taking off in a big way as the gains are marginal. They’ll be standard eventually but it’ll be years away.

    And yet the quality of the best television and movies that we get is better every year, while the core technology to watch it largely stays the same.

    If we do hit a wall on PC performance it means that every year the number of top-end, game-capable PCs will increase, as the technology becomes cheaper to produce. Eventually an entry level PC will be capable of photo-realistic gaming and the market will have been blown open.

    Hell, the technology might even be built INTO your TV. The equivalent to a SMART TV now could be a STEAM TV tomorrow. Can’t wait.

    • Rapzid says:

      4k is marginal? It’s quite incredible IMHO. I can tell the difference from 25+ feet away.

  29. DThor says:

    It’s the age of “good enough”, and I don’t mean that in an old-man, keeerack-SPIT, bitchy sort of way – just the opposite. The tech has gotten to (well, is getting to) the point that your average mainstream tech user simply doesn’t need a desktop for what they use the tech for. This is awesome. It also means that, for the first time since I stuck that 5 1/4″ floppy into the work PC after hours to watch an Infocom screen light up the room, I really haven’t needed to upgrade my desktop, graphics card aside, for four years. I’ve got 12GB of fast memory and four 3.2GHz cores, and I’m still playing new-release games on awesome settings with no waiting. Previously, upgrading just the graphics card was a waste of time; the surrounding tech had aged so badly that it was largely pointless.
    Desktops aren’t going away; professionals need them to code and make content, and enthusiasts (gamers/nerds) will grip their mouse/keyboard combos until you pry them from their cold dead hands, etc. But sales will keep plummeting. This will eventually translate to higher costs, but you’ll still buy one anyway, right?

  30. Checksumfail says:

    ITT: No one even considers that quantum computers already exist and that it’s only a matter of time before they violently overtake traditional PCs.

  31. JRHaggs says:

    One thing to consider when looking at GPU sales is the effect of bitcoin. High-end GPUs are flying off the shelves for use in bitcoin mines. It may be possible to control for that in assessing PC gaming sales, but it’s worth being careful when correlating PC gaming enthusiasm with GPU sales.

    • Jeremy Laird says:

      Interesting point. However, two things:

      1. I thought the ASICs had taken over from GPUs for bitcoin mining. It’s futile to mine with a GPU now (not true for other cryptocurrencies, admittedly).

      2. Isn’t AMD miles better at hashing than Nvidia for some obscure architectural reason? I think everyone who buys GPUs to mine buys AMD. And yet Nvidia still outsells AMD. So mining-related sales can’t be that big a slice of the market.

      • SuicideKing says:

        Both points are true, and now litecoins are the new fad, so.

  32. afarrell says:

    “Enjoy the next five years. It’s going to be spectacular for PC gamers. Just don’t assume it’s going to last forever.”

    The fact that the “it” in the last sentence doesn’t mean “the power available to us” but rather “the rate of change of the power available to us” is kind of what’s wrong with one facet of PC gamers, to be honest.