Week in Tech: The PC is Doomed, Long Live the PC!

By Jeremy Laird on November 21st, 2013 at 7:00 pm.

Transparent aluminum?!

Or maybe it’s the other way round. Anyway, the Xbox One only has 16 ROPs. I know, 16 ROPs. The humiliation. The humanity! Nvidia’s GeForce 6800 had 16 ROPs in 2004. No idea what I’m on about? It’s cheap point scoring from a smug PC evangelist, of course, but also just a single entry in a long list of reasons why the PC is looking pretty clever now the new consoles are roaming the wild. On the other hand, I’ve had a grope around the latest factoids and rumours relating to PC processors for the next year or so and the shape of things to come feels awfully familiar. Maybe the prophets of doom are right, after all…

Give those consoles a kicking
Firstly, then, those consoles. The usual suspects have done the blow torch, tweezers and digicam job on the new boxes, and the result is that we now know all the dirty secrets. And Xbox One’s graphics only look worse on closer inspection.

Xbox One’s 768 AMD GCN graphics cores to the PS4’s 1,152 (and, of course, up to 2,816 on a single card for a PC) have been discussed for a while. It’s actually been known for almost as long that Xbox One’s GPU only has 16 ROPs to the PS4’s 32 (and as many as 64 per GPU on the PC).

How do you like them ROPs, Xbone?

ROPs ultimately define how many pixels a GPU can punch out per cycle, so arguably they’re what matters most in the current will-it-won’t-it controversy over Xbox One’s ability to achieve true 1080p visuals. Yeah, apparently 1080p is a big deal in console land.
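For the back-of-an-envelope crowd, here’s what those ROP counts mean in raw fill-rate terms. This is a rough sketch: the ~853MHz Xbox One and ~800MHz PS4 GPU clocks are the widely reported figures rather than official specs, and peak fill rate ignores bandwidth and everything else that constrains a real frame.

```python
# Peak pixel fill rate = ROPs x GPU clock. Clock figures are the
# widely reported (not officially confirmed) numbers for each console.
def peak_fill_gpix(rops, clock_mhz):
    """Peak fill rate in gigapixels per second."""
    return rops * clock_mhz * 1e6 / 1e9

xbone = peak_fill_gpix(16, 853)        # ~13.6 Gpix/s
ps4 = peak_fill_gpix(32, 800)          # ~25.6 Gpix/s
bare_1080p60 = 1920 * 1080 * 60 / 1e9  # ~0.12 Gpix/s, one touch per pixel

print(xbone, ps4, bare_1080p60)
```

Even 16 ROPs dwarfs the bare 1080p60 number, of course; the squeeze comes from overdraw, alpha blending and multiple post-processing passes, each of which touches pixels all over again, which is why the ROP deficit keeps coming up in the 1080p argument.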

But like I said, that’s just point scoring. It’s the broader picture that’s more interesting. Things like a piss-poor launch catalogue of games, which matters since the new boxes don’t play old games. Then there are bizarre ideologically-driven limitations, like the PS4’s inability – for now, at least – to play locally stored video files. And the promising features that turn out not to work well enough to actually be usable. Like voice control and TV pass-through on the Xbox One. Oh well, yet another voice control system that sucks.

Hello, computer. Er, computer?

And how about game load times reportedly measuring in aeons or the slightly sinister all-seeing, always-on Kinect eye? In the spirit of full disclosure, I’ve not been hands-on with either of the new boxes. Sadly, all my displays require dual-link DVI output, so simply not poss. But overwhelmingly, my spidey sense tells me that to invest in either of these new consoles at this early stage would result in disappointment on a fairly epic scale.

Inevitably I’m preaching to the converted here, from a biased platform and with the knowledge that the Xbox One and PS4 will look very, very different in two years’ time, much less five years down the road. These consoles will serve up some special moments in computer gaming, of that there can be little doubt. The consoles have plenty to teach the PC when it comes to couch potato gaming, too, though arguably the PC is finally catching up in the living room stakes as the likes of SteamOS and Nvidia Shield come on stream.

But I’m a bit of a purist in most things. I like my cars without driver aids. I’m thinking about eschewing newer smartphones in favour of an iPhone 4S for its form factor, construction and core functionality. And the same applies to the PC in general and games in particular. The PC sidesteps ideologically imposed limitations and compromises. You feel so much more in control of your destiny, so much less a slavish, anonymous consumer.

Show me the pixels!

Meanwhile, most PC hardware just keeps getting faster, keeps on innovating. Whether it’s detailed stuff like Nvidia G-Sync and the promise of super-slick frame rates, AMD’s new Radeon R9 290 GPU that brings massively more performance to the £300 price point than any previous GPU, or the tantalising prospect of 4K gaming, the PC today is a very different beast to that of even 18 months ago.

Or how about emerging VR tech? The latest full-HD demos of the Oculus Rift sound orders of magnitude more exciting and, well, game changing than the gaudy chintz of the new consoles. And it’s interesting – though hardly shocking – that the Oculus Rift’s creator doesn’t fancy the new consoles much as platforms for driving HD-and-beyond virtual reality headsets.

Oh no, not four cores again
If that’s why the PC rocks, I’ve also got one hand on the rug and feel a tug coming on. The latest scuttlebutt says Intel’s upcoming Broadwell family of mainstream CPUs (successor to the already-underwhelming Haswell generation) will be four-core all over again, despite yet another shrink to 14nm. They’re out in 2014 and will stick around at least a year, so we’re talking late 2015 at the earliest for any significant upgrade on the CPU side from Intel.

OK, high-end Broadwell chips will apparently get Iris Pro graphics. But you still need a proper GPU, so that’s more or less moot.

‘Orrible ‘aswell was bad enough, Broadwell looks worse

Meanwhile, AMD’s latest official roadmap sees existing Piledriver-based FX CPUs stick around in their current four-to-eight core / two-to-four module format for all of 2014. No change at all. We’ll get new Steamroller cores in the Kaveri APU with two-to-four cores / one-to-two modules, but the roadmap explicitly describes the old Bulldozer chips as maintaining performance dominance, so they’ll be as good as it gets from AMD next year.

There’s the good-enough argument for CPUs, of course. They’ve been fast enough for nearly everything for years. But this technological stasis on the CPU side still makes me a bit nervous about the future for the desktop PC. It also seems to me that there’s an easy win here for Intel in terms of PR. It says it wants to revitalise the PC. Actually upgrading its desktop CPUs significantly for the first time in years wouldn’t be a bad start.


88 Comments

  1. Sakkura says:

    “the roadmap explicitly describes the old Bulldozer chips as maintaining performance dominance”
    AMD just went full retard.

    • Artist says:

      How else should a cornered mouse without cheese react?!

    • PoulWrist says:

      They might be pinning a lot of hopes on the success of Mantle, which apparently reduces the need for a fast CPU greatly.

      • Cinek says:

        We’ll see. Too early to tell, and it seems like the next generation of DirectX will offer an alternative to Mantle. So I’m not worried. (Even more so as I happen to own a Radeon GPU anyway)

    • waltC says:

      Nah…;) AM3+ simply won’t do it for Steamroller. IMO it’s gonna need a new socket. Much as AMD likes to preserve value by sticking with existing sockets for as long as possible and then some, I think ‘roller will need something a wee bit beyond AM3+’s capability. Intel, now, they change sockets like I change underwear…;) (OK, I do change underwear more often.)

  2. Solidstate89 says:

    Meanwhile, AMD’s latest official roadmap sees existing Piledriver-based FX CPUs stick around in their current four-to-eight core / two-to-four module format for all of 2014. No change at all. We’ll get new Steamroller cores in the Kaveri APU with two-to-four cores / one-to-two modules, but the roadmap explicitly describes the old Bulldozer chips as maintaining performance dominance, so they’ll be as good as it gets from AMD next year.

    Glad to know I took the right approach in not bothering to wait any longer to see their Steamroller FX offerings before building a new desktop. I ended up just going with Haswell because of AMD’s delays and seemingly uncaring approach to their high-end CPU range.

  3. geldonyetich says:

    From the sounds of things, the XBone’s processing power is the least of its concerns. Not when they’re pushing “enhances the television-watching experience” as the primary feature of the console. PS4 isn’t much better, with its “includes social media features” headline feature. I’m not sure what market the executives currently at the top of these organizations came from, but are they completely unable to comprehend that gaming consoles are supposed to be about the games?

    On the PC front, I’m currently cruising along with an AMD FX-8120 CPU that has the true performance aficionados grimacing. Don’t I know that Intel is the way things are done these days? Well, there’s been precisely one game that’s ever required more CPU firepower, that being Planetside 2 prior to being optimized, so do enjoy having spent four times more on your cutting-edge chip that does absolutely nothing of value for you, Mr. Aficionado. (Unless maybe you have a lot of video to render.) Seeing that AMD is still working on AM3+ socket chips makes me think I might even get a second AMD CPU upgrade out of this motherboard before it’s time to move on to the next one.

    • Penguin_Factory says:

      Well, the non-game features can be easily ignored. It’s not like they’re removing the ability to play video games to facilitate that stuff.

      • chargen says:

        Certainly not! They’re just reserving some of the cores, memory, and GPU for these capabilities and the neato instant app switching, giving you even less power than the modest amount these boxes seem to have in the first place.

        This time next year there should be steam boxes that are faster than these things at the same price or less, if one were so inclined.

    • Mercykiller101 says:

      Obviously, one horribly optimized game is all it takes to render your buying sense superior to everyone who buys Intel.

      • geldonyetich says:

        Funny how you read the sole game that would motivate a gamer to buy Intel as somehow being presented by me as a point in favor of buying AMD.

        Now, if there existed some examples of a genuine CPU bottleneck in gaming, especially in a case where it was not a “horribly optimized” game, that would be a decent refutation of the point I’m making here.

        TL;DR, that point is essentially, “Sure, Intel has more powerful chips, but they charge way more for them, and when does the CPU actually bottleneck in gaming?”
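        For what it’s worth, the standard way reviewers answer that question is the resolution test: run the same scene at a low and a high resolution, and if the frame rate barely moves, the CPU is the limit, because the CPU does roughly the same work per frame regardless of pixel count. A minimal sketch, with an illustrative 10% threshold rather than any canonical one:

        ```python
        # Crude CPU-vs-GPU bottleneck check from two benchmark runs of the
        # same scene. The 10% tolerance is illustrative, not a standard value.
        def likely_bottleneck(fps_low_res, fps_high_res, tolerance=0.10):
            if fps_high_res >= fps_low_res * (1 - tolerance):
                return "CPU-bound: resolution barely matters"
            return "GPU-bound: frame rate scales with pixel count"

        print(likely_bottleneck(fps_low_res=62, fps_high_res=58))   # CPU-bound
        print(likely_bottleneck(fps_low_res=120, fps_high_res=55))  # GPU-bound
        ```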

        • Mercykiller101 says:

          If you’re running an entry-level graphics card, the CPU will never bottleneck your game. If you’re running a better-performing card, stay far away from AMD CPUs. And I’m not sure where you do your shopping, but AMD CPUs aren’t that much cheaper than Intel’s mainstream lineup.

        • hamburger_cheesedoodle says:

          I’m here to confirm that Planetside is not the only game that bottlenecks on the CPU. Both MechWarrior Online and Natural Selection 2 were bottlenecking for me, though they’ve both received significant optimization passes since I upgraded.

        • Thunder says:

          Simulation games like the DCS flight simulators or ARMA 2 / ARMA 3 demand a lot of CPU, and that’s because there is a lot to compute: bullet trajectories, flight physics, the AI of hundreds of soldiers. You are really lost in these games with an AMD CPU.

          • SkittleDiddler says:

            “You are really lost in these games with an AMD CPU.”

            Nice fanboy conjecture there. Have you even bothered to play any of those games with an AMD CPU?

          • Sharlie Shaplin says:

            They have no personal knowledge, they just repeat stuff they heard.

          • stupid_mcgee says:

            It blows my mind that people even still think like this. I hope it’s just trolling, but:

            The main way AMD and Intel differ is in their ideological stance on pipelines.

            AMD chips tend to have more cores and a shorter pipeline. They can crunch a bunch of simple data very quickly. Most games rely on simple and quick bits of code. Hence why AMD traditionally excelled in gaming.

            Intel’s pipelines tend to actually be faster, but they have fewer of them. This helps peak performance on really tough jobs like video transcoding, 3D modeling, and even applications like Adobe’s Photoshop and Illustrator. There are several reasons why Apple went with Intel, and the performance difference with these kinds of traditional workhorse applications is one of them.

            People like to talk about Apple as a status or “cool” thing, but before that they were noted for being workhorse machines, heavily prevalent in the graphic design, computerized music, and video editing markets. If you worked in one of those fields, you used a Mac. Period. There was incredibly little exception, in professional studio environments, until the mid-to-late 2000s.

            Nowadays, though, these things really make very little difference unless you’re intensively doing these tasks. If you’re compiling a huge, 1,000-page book with a lot of vector and raster graphics as well as type, then you will see performance gain on an Intel-based Mac of the same cores and clocks over those of the about-the-same-time HP chips, or even versus a similar spec’d Windows PC (it involves the improved way OS X handles rasterization and file-to-print). If you’re compiling a feature-length movie, you will see performance gains with Intel over AMD. If you’re doing a lot of minor computations, you will see gain on an AMD.

            So, for big and complicated tasks there are notable differences, but not for common tasks like browsing, running office suite programs, gaming and watching movies. And even on big tasks, we’re talking about very minor gains. When you adjust that to minor tasks, the benefits wind up being incredibly marginal. Nowadays, all of the latest chips and such are so fast that it makes very little difference to the consumer what they have.

        • Baines says:

          CPU can be the bottleneck in emulation, particularly with more demanding projects. Emulation can also be somewhat resistant to spreading work across multiple cores, at least if you care about precision timing.
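          A minimal sketch of why that is, using rough NES numbers (the PPU runs at three times the CPU clock, around 29,781 CPU cycles per frame); the classes are hypothetical stand-ins, since a real emulator mutates shared registers and VRAM on every step:

          ```python
          class Chip:
              """Hypothetical stand-in for an emulated component."""
              def __init__(self, name):
                  self.name = name
                  self.cycles = 0

              def step(self):
                  self.cycles += 1  # a real emulator mutates shared state here

          def run_frame(cpu, ppu, cpu_cycles_per_frame=29_781):
              # A CPU write on cycle N can change what the PPU outputs on
              # cycle N, so the chips must interleave in exact order.
              # Splitting them across cores means synchronising every few
              # emulated cycles, which usually costs more than it saves.
              for _ in range(cpu_cycles_per_frame):
                  cpu.step()
                  for _ in range(3):  # NES PPU runs at 3x the CPU clock
                      ppu.step()

          run_frame(Chip("6502"), Chip("PPU"))
          ```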

          (Emulation, to me, is a viable concern when my actual systems are aging and breaking down. Being able to run games beyond their original hardware limitations and with additional features is a bonus.)

          I seem to recall that Anandtech’s testing ran across a sim racing game where the bottleneck was not the GPU.

        • maaarrrk says:

          Good point, the game types listed are very specific.
          Also, many comparison sites use PCs that don’t have GFX cards, which is imo unrealistic.
          Hell, I’m still running an old X2 3000+ and, with a half-decent GFX card, haven’t had any problems playing modern games on high settings.

          • Bigmouth Strikes Again says:

            What modern games? Minecraft? Or is it at 1024×768?

    • Moraven says:

      The people who want to play on a coach and pay half the cost for the same output a PC can do.

      • PopeRatzo says:

        I can’t think of any coaches I would want to play on.

        • cpt_freakout says:

          We all know Madden’s belly could fit several of us, playing on it like pros.

    • Grey Poupon says:

      Mantle will reduce the CPU bottleneck in supported games in the future anyway, at least if you have a GCN card. Seems like pretty much every big developer is going to support it, but for now that’s just talk. If EA has taught us anything, it’s that talk is cheap and everything else is way too expensive.

    • Shuck says:

      I understand that almost half the people who used the Xbox 360 did so for something other than playing games. Which means there’s a significant audience for non-game uses of “game consoles.” Consoles are also traditionally marketed by the improved graphics on display, however the difference between this console generation and the last, graphically, is the most subtle ever. Development costs have made AAA games more conservative than ever in terms of gameplay as well. So it’s hard to sell them based on either graphical improvements or gameplay. That pushes console manufacturers towards alternate selling points.

    • ordteapot says:

      You may not need a faster CPU for gaming, but it’s disingenuous to imply the choice doesn’t noticeably affect game performance. And I think you’ve got your cost efficiency backwards: AMD chips tend to have better price/performance for general tasks but worse gaming performance than similarly priced Intel chips. If you’re going to be encoding video, you’d be better off getting an AMD chip than an Intel chip at the same price point.

      Techreport.com AMD FX-8350 Review

  4. Razgovory says:

    This is always so unseemly.

  5. Kinth says:

    I actually find it kind of funny that people keep declaring the recent PC resurgence dead (some going as far as saying the PC is dead altogether) just because new consoles are starting to release.

    What I have personally seen is that the fairly lacklustre showing from these consoles has been driving people to PC. Even on the more powerful machine (the PS4), AC4 is only 900p, and AC4 isn’t exactly what many would call next gen. These games are having a lot of frame rate trouble as well. Dead Rising drops as low as 10fps in parts and is also not rendered in 1080p. Resolution isn’t everything, but when most console gamers are gaming on ever-larger TVs it’s going to start being a major problem.

    Even console gamers expected a baseline of 1080p, and this, coupled with the low frame rates, has been driving most people I know to PC.
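    The raw pixel counts behind those grumbles, for anyone who wants the arithmetic (standard 16:9 modes):

    ```python
    # Pixels per frame at the common 16:9 resolutions.
    modes = {"720p": (1280, 720), "900p": (1600, 900), "1080p": (1920, 1080)}
    pixels = {name: w * h for name, (w, h) in modes.items()}

    for name, count in pixels.items():
        print(f"{name}: {count:,} pixels, {count / pixels['1080p']:.0%} of 1080p")
    # 900p is only ~69% of 1080p, and 720p just ~44%.
    ```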

    • PoulWrist says:

      One thing in the consoles’ defence is that games like AC12, or however many there are of them now, Battlefield 4 (1600×900, 60fps on PS4; 1280×720, 60fps on XB1) and Dead Rising 3 (720p on Xbox, capped at 30fps) haven’t been developed natively for those consoles by people experienced and expert in the hardware. They are last-gen efforts sort of strapped onto the new boxes to give them a bit more value at day one.

    • HidingCat says:

    I remember a couple of kids pouring scorn on me for not getting a PlayStation 1, saying that the PS1 had way better hardware than a PC. This was 1995.

      I’ve been gaming long enough to see these things come up time and time again. Every new generation of consoles heralds the doomsayers, while the PC just keeps at it. As long as games are being developed for it, it will always be in the game.

  6. kael13 says:

    What would more CPU power bring to the PC in terms of gaming? I, too, hope for more cores for consumers, but it really seems to have little to no effect compared to getting a beefier GPU.

    I desperately want to upgrade my first-generation i7 next year (mostly to take advantage of SATA/PCI Express in a sassy new SSD) but I’d like to know what kind of improvement I could expect. 30-40% perhaps?

    On second thoughts, it could be logical to conclude that, as the new consoles are 8-core machines (correct?), future games will be more highly-threaded.

    • Person of Interest says:

      You can expect 30-100% improvement in benchmarks by upgrading your 1st-gen i7 to the latest and greatest.

      i7 920 vs. i7 4770k: http://anandtech.com/bench/product/47?vs=836

      The conundrum is: “plenty fast” + 30-100% = “plenty fast”. There’s no need to upgrade unless there are specific tasks you can’t do unless you upgrade, such as: 6th-gen console emulation; TrueCrypt w/ AES on an SSD (AES-NI instruction set); or saving a few watts on your power bill.

      You can get most of the SSD benefit on SATA II, so don’t let your 5-year-old motherboard discourage you from taking that upgrade. Here’s a Crucial m4 benchmarked at SATA II vs. SATA III: http://anandtech.com/bench/product/356?vs=355 Notice that, while the raw disk benchmarks show a big difference, the actual “do real stuff with your computer” benchmarks (PCMark Vantage) are very close.
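      The round numbers behind that, as a sketch (the SSD figures are illustrative 2013-era ballparks, not any particular drive):

      ```python
      # SATA uses 8b/10b encoding, so usable bandwidth is 80% of line rate.
      sata2_mb_s = 3_000 * 0.8 / 8   # ~300 MB/s ceiling
      sata3_mb_s = 6_000 * 0.8 / 8   # ~600 MB/s ceiling

      seq_read_mb_s = 500  # sequential reads: these do hit the SATA II wall
      rand_4k_mb_s = 25    # low-queue-depth random 4K reads: these never do

      print(min(seq_read_mb_s, sata2_mb_s))  # 300.0 - sequential loses ~40%
      print(rand_4k_mb_s < sata2_mb_s)       # True - everyday I/O sits far below the bus
      ```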

    • melnificent says:

      There is the spec-sheet-only site cpuboss, which will compare any PC processor it can to any other.

      A gen 1 i7 to gen 4 looks like this… http://cpuboss.com/cpus/Intel-Core-i7-920-vs-Intel-Core-i7-4770

      And if you wanted to see just how far back you can go then you can do things like this too. http://cpuboss.com/cpus/Intel-Pentium-II-266PE-vs-Intel-Core-i7-4770K

      • kael13 says:

        Awesome website, thanks. I have a 960, but the performance delta to a 4770K is still almost 50%, so I will most certainly upgrade next year!

        • melnificent says:

          There is also GPUboss.com, which does the same thing for graphics cards :)

  7. suat88 says:

    Good news….Google is paying 75$/hour! Just work for few hours & spend more time with friends and family. On sunday I bought themselves a Alfa Romeo from having made $5637 this month. its the best-job Ive ever had.It sounds unbelievable but you wont forgive yourself if you don’t check it out http://goo.gl/f6e95V

    • RDG says:

      Let me get this straight, this comment gets published yet my enormous wall of informative awesomeness is awaiting moderation?

      Oh well, at least now I have an excuse to use a Star Trek quote.

      Damn it Jim, moderate already.

  8. Snids says:

    I don’t understand if you’re being ironic or not.

    • PopeRatzo says:

      This is one of those articles where it doesn’t matter if it’s ironic or not.

  9. CookPassBabtridge says:

    If Intel and AMD apparently see no market in more powerful chips, why did NVidia and AMD’s graphics wings just go through a massive cock waving session then? Pure competition factor? If Intel bring out The Beefington 9000X Black Edition 10.6GHz, will that automatically mean AMD do the same? If so, SOMEONE HURRY THE FORK UP AND DO IT

  10. Penguin_Factory says:

    You raise some good points here about the superiority of PCs in a lot of ways (this is coming from someone who likes to have every platform under the sun but plays most often on PC), but I think the major appeal of consoles that often gets left out of discussions like this is ease of use.

    I consider myself fairly tech savvy and I still quite often find myself spending hours or even days wrestling with technical problems, just to get a game working (I’m currently going through this with both Battlefield 4 and Crusader Kings II). I can totally understand why a lot of people just want to be able to buy their hardware and not have to worry about any of that nonsense.

  11. Moraven says:

    NVIDIA had G-Sync demos at BlizzCon with side-by-side comparisons (on two identical machines) of a pendulum swinging back and forth. With V-Sync on you could visibly see the stutter; the G-Sync machine was smooth.
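    The stutter has a simple arithmetic cause. On a fixed 60Hz panel with double-buffered V-Sync, a frame that misses the 16.7ms refresh deadline waits for the next refresh, so displayed frame times snap to multiples of the refresh interval; G-Sync refreshes the panel when the frame is ready instead. A minimal sketch:

    ```python
    import math

    REFRESH_MS = 1000 / 60  # 16.67ms per refresh on a 60Hz panel

    def displayed_ms(render_ms):
        """Double-buffered V-Sync rounds render time up to the next refresh."""
        return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

    for render in (15.0, 17.0, 20.0):
        print(f"{render}ms rendered -> {displayed_ms(render):.1f}ms displayed")
    # 15.0ms fits in one refresh (16.7ms), but 17.0ms slips to 33.3ms:
    # a 2ms miss doubles the displayed frame time, which reads as stutter.
    ```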

    Problem is, I have no need to invest in a new monitor or graphics card for another 3 years. And I prefer AMD.

  12. RIDEBIRD says:

    The “HEY GUYS THE PC IS FINE I SWEAR ITS FINE HEEEEHHHH GUYYYYSSSSSS!!!!!!!!!!!!!!!” thing RPS has been going on a bit about in the last couple of months is so, so, so beneath this site and the writing on it.

    Yes, there are new consoles out. There is absolutely no point in being all YEAH WE DONT CARE LOOK AT THESE HUGE ARTICLES ABOUT HOW MUCH WE *DONT* CARE. Please just stop it. No, we don’t really care.

    The new consoles are fine machines for those that prefer that way of gaming. That is okay. That is fine. This changes nothing for the PC except that we can expect more graphicser console ports.

    I will never buy a console again, and I don’t really pay much mind to next gen other than it’s exciting to see what the games and the consoles can do. Basically, there’s room for both PC and console in this harsh world, and there is no point in making a fuss about which is better.

    • Snids says:

      Agreed.

    • Don Reba says:

      There was a “we don’t care about PS4” article, so there has to be a “we don’t care about Xbone” article for symmetry, lest they be accused of favouritism.

      • cpt_freakout says:

        Also a “we don’t care about the 2DS” article full of Fire Emblem references.

  13. Didden says:

    We shouldn’t be too happy that the consoles are underpowered compared to modern PC graphics cards, as sadly, that means many games will be designed to work only at their level. Because, you know, it’s the PC, and publishers just love the PC.

  14. Baboonanza says:

    I love having twice as many ROPs as much as the next guy but the biggest reason for being a PC gamer for me is the indie/strategy game scene. Without owning a PC I wouldn’t have been able to play Crusader Kings, Hotline Miami, FTL or any of the other amazing PC exclusives (though they never get called that) released in the last couple of years.

    • Viroso says:

      Yeah the funny thing about PC is that despite all of the potential power, the most interesting games on it can be run on a potato or at least a potato with an iPhone stuck through it.

      People who believe in the myth that PC gaming is all about expensive, utterly powerful machines (to run god knows what games) end up missing out on a bunch of great games that they could be playing on their PCs, simply because they don’t pay attention to what’s being released or think that anything 3D requires THE machine, possibly unaware that their CPU is actually a good enough APU.

      • Saltyaubergine says:

        Totally agree. I recently built a 6800K-based PC. It is running at 4700MHz, dead silent and unobtrusive. It runs the games I want to play, though I might upgrade to Kaveri at some point, especially if DDR4 becomes available. Then all of a sudden you’ve got something pretty close to a PS4 in terms of architecture, just more awesome. I think a PS4 port with Mantle support might run pretty well on such a machine.

  15. db1331 says:

    One of the things that made my eyes roll when the new consoles were first revealed was all the devs talking about how excited they were to start making games on the new platforms because of all the cool stuff they could now do: the same stuff they could have been doing on PC for years, but were completely ignoring to make console games.

    • DanMan says:

      Yeah. A lot of them keep doing that to this very day. Mostly in-house companies though – who’da thunk. Cross-platform devs know better than that.

    • Kubrick Stare Nun says:

      ♪♫ It’s all about the money money money ♫♪

  16. Viroso says:

    Is it crazy to expect to use something like DDR4 and find an affordable price for 1TB of SSD by 2015? I got a decent but cheap motherboard that got me stuck in… I dunno, 2011 or 2010. I only have two RAM slots, which is actually enough, and an FM1 socket. I was thinking I’d just save money to upgrade to the entirely new technology of the distant future of 2015, since all the stuff I’m packing right now suits me fine and will continue to do so for at least one year going by my backlog alone.

    • Cam says:

      I would say that if you’re fine right now, then it’s always better to wait until you just can’t help the desire to upgrade. 2015 will be a good year to build a complete high-end system from scratch; you’ll have all sorts of new technology to play with, as well as graphics cards more suited to running at 4K than the ones we have now. You’ll also get much better prices on SSDs than we have at the moment.

    • Williz says:

      Only if DDR4 gets to the same demand levels that DDR2 got to in its heyday (I don’t think it will by 2015, but we’ll see). As for the SSD, I see that as possible, as I saw a 1TB drive go up for sale at around £400 the other day.

  17. Didden says:

    Also, Jeremy, would you consider doing an article on good old Moore’s Law and how it could affect PCs in the future? After 14nm we are set to get 10nm, which seems doable, but beyond that we’re getting into squeaky-bum time in terms of 7nm, 5nm and beyond. Intel’s R&D budget has risen dramatically in the last few years (and we’re talking billions and billions more) and each fab to build these smaller chips seems to double in cost each time. It’s not that far away, but the thought of overclocking my CPU when its transistors are literally made up of a few atoms and weird stuff like quantum tunnelling kicks in is worth pondering.
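    Some rough numbers behind the ‘few atoms’ worry, as a sketch: ideal density scales with the inverse square of the feature size, and silicon’s lattice constant is about 0.543nm. Real nodes shrink less neatly than this (and node names stopped tracking physical gate length years ago), so treat it as an upper bound:

    ```python
    SI_LATTICE_NM = 0.543  # silicon lattice constant

    for node in (14, 10, 7, 5):
        density_vs_14nm = (14 / node) ** 2  # ideal area scaling
        cells_wide = node / SI_LATTICE_NM   # unit cells across one feature
        print(f"{node}nm: ~{density_vs_14nm:.1f}x the density of 14nm, "
              f"a feature ~{cells_wide:.0f} silicon unit cells wide")
    # At 5nm a feature spans roughly nine unit cells - tens of atoms -
    # which is where quantum tunnelling stops being a footnote.
    ```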

    • melnificent says:

      That’s why you hold back on increasing the core count. Shrink to 10nm, then increase the core count by 2 every two years and introduce new instructions etc. in the alternate years.

      • Didden says:

        If only that increased performance. The 8-core AMD stuff is a good example of why that isn’t a path to victory, although going sideways does seem to be the only way at present. It’s just a case of how to adapt the software to make realistic use of it. Not an easy thing to do.

      • Williz says:

        MOAR CORES

    • Jeremy Laird says:

      I did something along those lines here:

      http://www.techradar.com/news/computing/why-the-death-of-moore-s-law-wouldn-t-be-such-a-bad-thing-1175977

      Moore’s law is a bit like global oil reserves. People keep on predicting doom. And then expectations have to be reset. I remember back when people used to talk about Intel knowing how to get to 130nm, but it was a blank sheet of paper after that. Now we’ve got 14nm coming next year.

      It will end eventually – at least in terms of conventional computing, once you get down to a gate made of an atom or three, it’s a done deal. But then other paradigms like quantum computing might reset the clock.

      Or like I say in that article, like air transport, it may be possible but simply not necessary or affordable to go faster / smaller.

      • Didden says:

        Oh crap – you’re basically saying that in 12 years’ time our PCs will be some form of Ryanair! :)

        For some reason, I still have in my head a thing on Tomorrow’s World about using light waves in chips. But granted, that was the 80s. Although it was interesting to see them increase networking speeds using such ideas.

        Personally, I think Moore’s Law is already slowing down with the current chip generation. Intel are in no hurry to get their new chips out there, because the longer they can stretch these things out, frankly the better it is for them, given the known issues past 10nm – which isn’t that far away. Neither, it seems, are AMD, so we’re already looking at a longer gap with this generation.

        Same goes for memory, with the reluctance to switch to DDR4, which should have been here by now but has been held back more for economic reasons. All the memory manufacturers can make it, they have products demoing DDR4, but no one is in a hurry to produce it because of the economics of scaling it and lack of demand (again mainly down to Intel not using it yet). Intel might say they can get all the way to 5nm, but then again, they do have shareholders to keep happy, so it’s not like they would say it’s not possible. Going to be an interesting 12 years.

        • Didden says:

          http://www.bbc.co.uk/news/technology-24579776

          And Jeremy, I like your optimism. We’re going to need it :)

        • Jeremy Laird says:

          I’d have to say you are incorrect. Intel is very much in a hurry to use its process tech to get an edge in mobile. So they want to keep shrinking transistors as fast as possible to beat the ARM army!

          • Williz says:

            Come on, it was right there… ARMy. I know it’s early, but get your pun hat on.
            Either way, Intel probably won’t beat ARM.

          • Didden says:

            They are going to be in for a hard job to surpass the ubiquity of ARM processors, that’s for sure. While it might make sense for them to try to get smaller to increase their foothold in that market, there is also no doubt that with such a dominant desktop position, and the cost of the fabs increasing, it makes much more sense for them to stretch the desktop processor release cycle out, even if it’s by 5-10%, to maximise profits on their existing fabs. With AMD basically uncompetitive, they can afford to do so, while releasing the odd high-end piece to keep the elite mob happy.

    • Don Reba says:

      There are other ways of increasing transistor density beyond making them smaller. Coming up are things like layered 3D chips and transistors with more than two states. Google recently did a series of blog posts on this: http://googleresearch.blogspot.com.es/2013/11/moores-law-part-2-more-moore-and-more.html

      • stupid_mcgee says:

        Yeah, this has me really excited. This will be the next big step up. Not 320 cores all running at 3GHz.

  18. rustybroomhandle says:

    Long live the Pina Colada!

  19. mygaffer says:

    “But this technological stasis on the CPU side still makes me a bit nervous about the future for the desktop PC.”

    This was always going to happen. CPU progress has largely been driven by shrinking transistors and we are reaching a point where it is getting harder and harder to do that.
    Not only that but a lot of users don’t need more computing power. For email, for web surfing, for HD video, for office work, even 4 year old computers still do these tasks fine.
    PC sales are going to settle into a much lower yearly sales rate with much slower growth. This will be the new norm.

    • Jeremy Laird says:

      Umm, Intel has been humming along, releasing a new process every two years. 65nm begat 45nm, then 32nm and 22nm, with 14nm next year. Process tech absolutely has not been holding back CPU development.

  20. DanMan says:

    Intel is probably waiting for AMD to finally catch up. If Intel released a massively better CPU (and theirs are already better anyway), AMD would take a nosedive, creating a monopoly for Intel in the desktop space. They don’t want that to happen because government institutions around the world could get involved.

    I still have my i5 760 and I can’t really complain. I’ll probably wait until NV releases their Maxwell GPU next year and then do a full upgrade. Might even end up putting that into a SteamBox instead of my tower.

  21. Asdfreak says:

    Call me crazy, but I am absolutely convinced that both Intel and maybe even AMD already have much better chips, but don’t release them because some smartass marketing and business people are telling them they can make a bigger profit by slowing down development artificially.
    Also, Intel is very likely already much better, but they simply don’t release that stuff because it would crush AMD and get them a monopoly that would enrage governments around the world. They seem happy to just release stuff that is slightly better than AMD’s and laugh about them.

    • Jeremy Laird says:

      Intel could indeed very, very easily release chips much faster than it offers today. If it was actually trying to achieve the highest possible clocks with its current tech, running chips as close to the wire as it did with Netburst, for instance, I suspect 5GHz quad-core processors would be doable.

      It could also very, very easily be selling eight-core CPUs for the desktop right now clocked about as high as current four-core models. But there’s no competition, so it doesn’t need to.

      AMD on the other hand would love to have far, far better desktop CPUs than it currently offers.

  22. Universal Quitter says:

    “Intel’s upcoming Broadwell family of mainstream CPUs (successor to the already-underwhelming Haswell generation) will be four-core all over again, despite yet another shrink to 14nm”

    Unless I’m ignorant of what you meant by “four-core,” wouldn’t focusing on making individual cores smaller, cooler, more efficient, and faster be the smartest way to go about things? Multi-core support is spotty throughout the software industry, and usually for good reasons, like debugging becoming a complete mess. Bugs are prevalent enough as things are, I think. We don’t need to make any more excuses for that.

    Sometimes it feels like computer technology advances at such a quick, steady rate, humans rarely get a chance to really refine certain aspects of it.

    • Jeremy Laird says:

      Really no need for smaller cores at 14nm on the desktop. Intel’s die sizes and power consumption – especially if we’re just talking about the CPU portion – have shrunk dramatically in the last three or four years.

      Agree that cores that do more work per clock would be great, but that’s very easily said, very hard to achieve. The GHz wars are over. Clockspeed is not the answer. Meanwhile, all the low-hanging fruit and most of the stuff in the higher branches has been picked re IPC. Intel cores are miles better than AMD cores, for instance. Can’t really see Intel delivering a really big boost in IPC. Also, game devs aren’t going to code for that, since the single-core IPC on the new consoles is absolutely minging.

      All in all, more cores is the only reasonable option for really big gains in the short term.

  23. somnolentsurfer says:

    How many ROPs does Iris Pro have? (How is laptop gaming going to hold up against the new consoles?)

    • melnificent says:

      The Iris Pro has 2 ROPs and 40 shader units, with a theoretical peak of just 16 GFLOPS.

      It’s definitely the weak point, which is probably why Intel is improving the on-die GPU.


  25. SuicideKing says:

    @Jeremy, a few points:

    1. Comparing ROPs across architectures might not be the best idea. 16 ROPs on the GeForce 6800 are very different to 16 GCN ROPs.

    2. Haswell-E is likely out in the second half of 2014, and all the rumours so far indicate 6- and 8-core configs instead of the current 4- and 6-core configs.

    3. Broadwell may or may not be LGA at all, though recent speculation suggests some chips, like Broadwell-K, may be. Also, the cache in Iris Pro parts acts as an L4 cache for the CPU if you aren’t using the iGPU, and both AnandTech and Tech Report posted benchmarks that showed a pretty big improvement thanks to Crystalwell’s L4.

    4. Mantle and G-Sync. These NEED to be cross-vendor.

    • Williz says:

      I agree with 4, but I honestly don’t see that happening, despite it not being good for the customer, as they’ll have to choose an Nvidia-compatible monitor…

    • Jeremy Laird says:

      Chap, think you must have missed the bits where I repeatedly described the ROPs thing as ‘cheap point scoring’. Wasn’t meant to be a serious comparison and was clearly flagged!

      Haswell-E is in my view a server and workstation chip on a server and workstation platform with pricing to match. Get back to me when Intel does a desktop chip beyond four cores!

      • SuicideKing says:

        Ah. Aha. I didn’t miss it, I just took it in another sense. Sorry!

        Though among the consoles themselves, 16 vs 32 is a perfectly valid comparison.

        Yup, Haswell-E will be a workstation chip (Xeon is for the servers) but prices will be between $500 and $1,000… you could, if you wanted (rather, had the money), make a bleeding-edge gaming rig with an 8-core Haswell.

        But yeah, I’m with you on the “we need 8 cores under $350” thing… I think it’ll come with Skylake. I hope, rather. :|

  26. HisDivineOrder says:

    Rumors also suggest that the LGA2011 successors based on Haswell (re: Haswell-E) will finally be octacore. If that’s the case, it’s probably a decent deal for someone who’ll get four-ish years out of the upgrade to buy into that. Look how long SB-E lasted at the top, and consider that Intel is actually slowing down on performance improvements for its CPU products.

    They’re so focused on performance per watt that buying a CPU once every four years might not be such a bad thing. And considering the cost like that, suddenly the LGA2011 gets more affordable on a “per year” cost basis.

    Even if you ignore that, you’ve got hexacores dropping for less money, too, pushed down by the octacores.

    Bonus: when Intel says hexacore or octacore, they mean six or eight actual cores. They don’t count pseudo/fake cores like AMD does when it calls its CPUs hexacore or octacore. Counting hyperthreading, Intel is releasing chips that are 12- and 16-thread capable.

  27. sabasNL says:

    Consoles are easier to use with a television, relatively cheaper, and optimized for playing games.
    How many multiplatform games have been badly ported, unoptimized, or horrible on PC? I can name a whole list.

    But, in every other aspect, PCs are superior. They are built to be able to do anything. And when it comes to staying up to date, you can simply upgrade your PC. You don’t have to buy a whole new one.

    I’m a PC gamer that spends time on his PS3 as well. My PC is absolutely my favourite, but sometimes some games work better on (or are only released for) consoles. I don’t think the Steam Machine is going to change that.

  28. stupid_mcgee says:

    “The consoles have plenty to teach the PC when it comes to couch potato gaming, too, though arguably the PC is finally catching up in the living room stakes as the likes of SteamOS and Nvidia Shield come on stream.”

    This. I’ve been trying to explain this to some of my friends who have said that they don’t understand what the point of the Steam OS is. It’s a way for Steam to try and grow their market share and to try and muscle into the console space without having to create dedicated hardware to directly compete against the consoles. It means I can still sit in my chair and enjoy Payday 2, and when friends come over I can switch on the Steam OS PC (that I built) and play Pro Evo with friends in the living room.

    While the strategy may seem weird, it’s quite ingenious, actually, and I can see it really advancing the PC gaming format out from its traditional desktop “single person in an office chair” role.

    Valve is certainly a company that needs money to operate (which is the most banal argument ever), but they have always been, primarily, about advancing the PC as a gaming platform. Yes, it’s self-serving, but they do it in such an open way that it really does enrich the platform. I would argue that Steam has done more to advance PC gaming than anything else, and may very well be why we’re seeing so much more love now than we used to.

    The idea that anyone can build a cheap streaming PC and install the Steam OS is awesome. The fact that any wholesaler can build a bunch of streaming PCs and slap Steam OS on there is exciting. And it’s even more exciting that, because it’s open source, you will see people iterate and build their own versions.

    This is a benevolent business model. Valve puts this out there in hopes that it will spread and present a paradigm shift for living room gaming. Meanwhile, they hope that this will also funnel more users to Steam, all without resorting to exclusives or any of the other closed-garden tricks that so many others have used. And people really wonder why Valve has such a devoted following?

  29. Shooop says:

    Without anything to run on these parts, what’s the point?
    You don’t need a 16-core CPU and a video card powerful enough to map the entire human DNA structure in an hour if all you’re playing is some rejected Atari 2600 game.

    And no one who is developing primarily for the consoles right now is going to stop. No one. Because they like making money, and to make money you have to sell to as many people as you possibly can. It doesn’t matter how much more restricted the platform is, or how much harder it is to do what they want. The sight of that number before the word “sales” is more than enough to ease their disappointment at not being able to do everything they wanted.

    The PC only makes a big comeback once the old consoles start collecting dust, and it only lasts until the new ones arrive and all the devs start circle-jerking all over them. The fact that the only thing we saw during what should have been a resurgence was some nostalgia trips just goes to show how far the PC has actually fallen in the gaming world’s favor. More devs stuck with the geriatric PS3 and Xbox 360 than had a go at the PC. And now the few that did are about to pretend it doesn’t exist for a few more years.

    So that leaves us with The Witcher 3, maybe Star Citizen and more crappy dime-a-dozen Atari 2600 “retro” cash-ins on nostalgia. Maybe someone will start a Kickstarter to bring back CRT monitors. It’d probably be 1000% funded in a week.