AMD’s 16-core CPU and other exciting stuff

Yes, yes, a 16-core CPU is likely-going-on-definitely irrelevant for PC gaming. But it’s an exciting notion in simple technological terms and it represents something that certainly is important for gaming, namely that the PC as a platform has woken up again. Along with that mega-CPU from AMD, we also have the imminent prospect of new graphics card families from both AMD and Nvidia, new CPUs from Intel in response to AMD’s Ryzen assault, an intriguing new APU, again from AMD, that could just make for some nice cheapo laptops with genuine gaming chops and, well, plenty more. And the annual Computex tech shindig hasn’t even kicked off yet…

AMD’s saucy new 16-core Threadripper (yes, really) processor, then. What’s it all about? Socking it to Intel, quite simply. I can’t imagine that AMD is going to sell very many of these new 16-core beasts to desktop PC enthusiasts. But beating Intel on core count and (possibly but not quite certainly) for outright performance in certain metrics has significant PR value.

For the record, this 16-core / 32-thread chip will apparently be branded Ryzen 9 and the full range will also include 14-core, 12-core and 10-core models. It’s also mated to its own new socket, much like Intel’s high-end desktop CPUs, so it’s not cross-compatible with existing Ryzen CPUs and their AM4 socket.

Clockspeeds are expected to be surprisingly high, with a base clock of 3.5GHz and a 3.9GHz Turbo for the top 16-core Ryzen 9 Threadripper. Thus, you can have your single-threaded performance cake and eat the multi-core goodness at the same time. In other words, if you can afford what will undoubtedly be a very expensive chip, it’ll likely be decent for gaming and killer for almost everything else. One chip to rule them all, perhaps.

Threadripper will likely be a pair of AMD’s eight-core Ryzen CPU dies crammed into a single package…

For real-world gamers, the existing Ryzen 5 and Ryzen 7 models still make much more sense. But Ryzen 9’s impact will be felt in terms of how it pushes the market on in general, as regards both performance and pricing. Even if you never buy a Ryzen 9, it’ll probably make whatever you do buy a bit faster or a bit cheaper.

Indeed, it’s no surprise to learn that Intel is planning a riposte. The web has been alive with reports of a new high-end 12-core Intel CPU for the desktop, branded, predictably enough, Core i9, plus a pair of slightly oddball quad-core CPUs that eschew integrated graphics and slot into Intel’s new high-end LGA2066 socket.

But for me, it’s what Intel does with its mainstream LGA1151 socket, the one that accepts chips like the £200/$200-odd Core i5-7600K, that’s of more interest. The expectation here is for the debut of a six-core processor as part of the upcoming Coffee Lake family later this year. That’s a big deal given that mainstream Intel desktop CPUs have topped out at four cores ever since they were split off into a separate socket eight long years ago.

Even if you’re not convinced that six cores will have much impact on game performance, inserting a new six-core model as the de facto top rung of the mainstream Intel CPU line should push the four-core models down the stack a little and make them more affordable.

AMD’s mainstream AM4 socket that appeared with the Ryzen 7 and Ryzen 5 CPUs isn’t compatible with the new Threadripper megachips

Another fun-sounding new CPU, or rather APU, due soon is Raven Ridge. It’s basically an AMD Ryzen CPU in dual- to quad-core trim with a slice of AMD’s next-generation Vega graphics, all squeezed into a single chip. Nothing is certain, but leaked AMD docs indicate it will have 11 so-called Vega graphics cores, which should work out at about 700 pixel-prettifying shaders, or roughly the same as the original Xbox One console.
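
For the back-of-envelope maths, assuming AMD sticks with the usual 64 shaders per graphics core: 11 cores × 64 shaders apiece = 704 shaders, which does indeed land in the same ballpark as the original Xbox One’s 768.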

Imagine all of that in a genuinely cheap, thin and light laptop PC and I suggest you are now quite interested. At the very least, Raven Ridge should massively lift the bar for entry-level integrated graphics performance. It might just be the first APU that’s genuinely gaming capable. Dare we hope for, say, a $400 / £400 laptop with decent gaming chops? Fingers crossed.

On the graphics side of the equation AMD still has an awful lot of catching up to do. At the high end, it’s a full generation behind Nvidia’s Pascal graphics cards, such as the GeForce GTX 1080.

AMD has had no response to Nvidia’s high-end Pascal cards, like the GTX 1080 Ti

In fact, Nvidia even beat AMD to the punch with its recent announcement of the first GPU from its next-generation Volta family of graphics tech. It’s a chip aimed at high-end compute and machine learning rather than gaming. But as a member of the new Volta family, it’s fully two generations ahead of AMD’s last high-end GPU, the Radeon Fury.

In fairness, it was only a week later that AMD rolled out its own new high-end GPU for compute and, in turn, the first member of its long-awaited Vega series of graphics cards. The Radeon Vega Frontier Edition, as it’s known, gives a sneak peek of what to expect from AMD’s next high-end gaming cards. And what we can apparently expect is a GPU that depends on high clocks to achieve its performance edge.

If that’s accurate, then it may explain why Vega is so late to launch (yes, late, you naughty people in the comments below). Perhaps AMD is struggling to get the clocks up to sufficiently competitive speeds. We’ll find out the truth soon enough, as I expect a fairly substantial Vega reveal from AMD in the coming weeks.

Anyway, the sum total of all this is that by year end the PC landscape will likely look very different. At any given price point, both CPU and GPU performance will have stepped up fairly dramatically on the desktop, and there’s also a half-decent chance that a new generation of cheap, gameable laptops will have arrived.

With all that in mind, my broad advice re upgrading your PC or buying a new rig is currently a very firm hold. Recent years have seen fairly long periods of stagnation in key PC hardware, which meant you may as well buy sooner rather than later. But right now looks very much like a moment where holding off for three to six months could pay significant long-term dividends. Watch this space.

24 Comments

  1. nitric22 says:

    This sounds great for just causing the next price drop of current-gen chips and cards. I’ve never had the budget to go full-on-top-o’-the-line. And I’m always content to turn the dial down a few notches from ULTRA to regular old High, Medium-Well, W̶e̶l̶l̶ ̶D̶o̶n̶e̶ (scratch that, I want a little pink in my graphics). So with my current i5 and 750 Ti, I’d already planned to go another year and change. But perhaps this will become a Christmas ’17 to truly remember?

  2. vorador says:

    On the positive side, it will definitely put the screws on Intel, which has been lazy as hell lately in the CPU department, with new models offering barely a 5-10% performance improvement over previous ones.

    While it likely won’t beat Intel for gaming, for tasks that benefit from multithreading this thing is a beast. Even the biggest Xeon has “just” 12 cores and 24 threads, and its price is/was ridiculous.

    And Ryzen’s server-grade equivalent tops out at 32 cores and 64 threads: a chip named “Epyc” (yes, really) the size of a fist.

    • Don Reba says:

      If CPU cores are what you want, Intel’s Xeon Phi has you covered.

      • tormeh says:

        Technically yes, but if I remember correctly, the cores in Xeon Phi are essentially slightly modified 386 cores. In other words, the Xeon Phi has absolutely useless single-core performance. While there definitely are applications that benefit from this approach, you would be really screwed for everything else. The Phi booting Windows could take a while. In fact, you should definitely consider taking a nap while it happens. Or even work a couple of days; I’m unsure what order of magnitude this thing is on. The Xeon Phi is supposed to compete with GPUs, not CPUs. That it can do all the instructions a standard x86 CPU can do is a unique selling point, but you shouldn’t take that as an invitation to use it as a CPU.

        • Don Reba says:

          They are slightly modified Xeon cores, but you’re right that they’re not really comparable. The Phi runs its own OS; it’s not just a chip. We evaluated them for the LHCb compute farm, but they turned out to be far from price-competitive with GPUs.

    • TillEulenspiegel says:

      I swear people are spoiled by the huge jump that occurred around Core 2 and have no memory of the miserable Pentium 4 years. Intel isn’t “lazy”; the problem is that CPU development isn’t magic. The only straightforward thing is using smaller processes for better efficiency, everything else is either a tradeoff or expensive or a genuine innovation. It’s astonishing we’ve come this far; ~15 years ago there was real fear about reaching physical limits due to heat.

      Yeah, you can shove more cores on a single die, but that doesn’t make it a useful consumer product. You add a lot of watts to crush some benchmarks, but it has zero relevance to the vast majority of “enthusiast” gaming.

      Of course, it’ll be great for the multi-threaded successor to Dwarf Fortress that I keep hoping somebody makes.

      • Dinger says:

        I certainly remember the miserable Pentium 4 years, and I also remember the rapid growth that led up to them: the 1990s opened with the 80486 at 33 MHz and ended with Coppermine PIIIs and AMD Athlons breaking the GHz barrier. So, yeah, the nineties were a decade that saw a 30-fold increase in desktop clockspeeds. The “Megahertz Race” also set the table for the disastrous Pentium 4 era, where Intel ran headfirst into a wall of power and thermal issues, and everything stalled out under 4 GHz.

        Now, ten years after P4 finally died, we’re getting 16 cores running just under 4 GHz.

        In short, no, I don’t agree. The Core 2 was an enormous step forward, certainly, but the P4 doldrums before it were largely down to how long it takes design and administrative inertia to allow a radical change in the way chips are designed. Now the doldrums are down to the fact that the new possibilities offered by hardware don’t make a compelling case for businesses to shorten their hardware replacement cycles.

      • phuzz says:

        Even when Intel don’t have to compete with AMD, they’re still competing with their own old chips. If this year’s CPU is only 5% better than last year’s, with no new bells or whistles, then why bother to upgrade?
        What this recent crop of AMD processors will hopefully do is prompt Intel to drop their prices somewhat to compete.

  3. theallmightybob says:

    Found a bit of a typo: you put “On the graphics side of the equation AMD still has an awful lot of catching up to do. At the high end, it’s a full generation behind AMD’s Pascal graphics cards, such as the GeForce GTX 1080.” Then in the caption you have it right, listed as Nvidia.

  4. ButteringSundays says:

    “A 16-core CPU is likely-going-on-definitely irrelevant for PC gaming.”

    My thoughts exactly.

    “But im gonna write a bunch a stuff about it anyway”

    Oh.

    • jezcentral says:

      This will have a big impact on the PCs that make those games, though.

      • ButteringSundays says:

        Anything that happens to any hardware that goes into PCs affects games in some way or another.

        My CPU hasn’t been a bottleneck for me having fun in any game for 5 years, and it’s an i5. What does any of this actually matter?

        • ZippyLemon says:

          I work in hardware PR and I can’t answer that question.

    • ravenshrike says:

      Thing is, the Threadripper series of CPUs (incidentally, Jeremy, that’s going to be the chip’s actual name, not Ryzen 9), in combination with Epyc and Ryzen, completely saturates every level of the market. This puts immense pressure on Intel, as last year they had none. Combine that with the aggressive roadmap that AMD has laid out for die shrinkage and general processor improvement, and Intel has been scrambling. You can see this in their early (by about 6-8 months) release of the X299 chipset and processors. Now, Intel does have an advantage with Optane for servers, but the number of data sets that could take advantage of it, and that would find the load-time boosts upon system restarts that 6 terabytes of Optane can give economically feasible at $2-4 per GB, has to be pretty damned low.

      • Buuurr says:

        ” This puts immense pressure on Intel, as last year they had none.”

        Yeah, Intel is terrified to lose that other 15% of the market that gives them a ginormous lead in R&D and marketing and income. That 15% provides Intel with the best competition that it could possibly ever want to go against… competition that is always on the back foot. Sorry, competition that is always in the air, about to hit the mat after a proper uppercut. If you think Intel is fearing anything, go ahead and check the stocks and cash reserves to find all the ‘fear’.

        AMD’s release was nothing more than a last gasp from them. Years ago, when they were emergent, they offered some great, genuine competition to Intel’s chips. They really did. But they fell and haven’t gotten up yet.

        Yeah, they released some new chips that do a touch better than the Intel line released a year ago. Great. Hi, welcome to the party. Intel releases the new ones soon. As usual, they will destroy.

        • ravenshrike says:

          Yeah, you don’t seem to understand AMD’s play. They’re making a serious push for the server and content-creation market out of the gate, with servers that will cost 60-70% as much as comparable Intel servers. Moreover, Intel doesn’t look to have more than their usual 5-10% real-world performance gains in their current architecture, assuming the past three refreshes are anything to go by. Whereas this is the first iteration of Zen, and it’s immediately going through a process shrink, disregarding any other improvements to the underlying architecture itself. Unless Intel pulls a series of rabbits out of their hat, AMD is going to seriously cut into their marketshare over the next 5 years. Especially if Apple goes with Raven Ridge for the new MacBook Pro.

        • neshi says:

          They fell?
          Really? They were forced out of the market by Intel, which conspired with vendors to make it so. Intel got a slap on the wrist for that… but it has severely hampered AMD ever since.

  5. ephesus64 says:

    What is a Threa Dripper, and why do I want one in my computer? I am skeptical.

    More seriously, is anyone considering an upgrade, and why? I’ve been a cheapskate for so long that the only things which really excite me are the high-value inexpensive things like that Raven Ridge APU. I’ve had a couple of A8 series APUs which were firmly in the “better than nothing” category, but did great with old games.

    Now I’ve got an i5 3550 that a family member gave me, which is stubbornly holding on at the very bottom of the highest tier in Tom’s Hardware’s hierarchy, so I think Threa will have to drip somewhere else. link to tomshardware.com

    • Lukasz says:

      I’m building a new system a couple of weeks from now.
      Why? Because my gaming laptop cannot handle The Witcher 3 at playable quality (I still played forty hours of it).
      Many old games have issues choosing the dedicated GPU over Intel’s shit HD graphics.
      I’m playing at 1600×900 on a 17-inch screen.
      And I’ve wanted a new desktop since 2011, but life, moving to different continents and study debts got in the way.

      I do agree waiting till July or August would make more sense, but meh. There is always something awesome coming up in the future. And I wanna play now.

      • ephesus64 says:

        Always waiting for the newest stuff does seem self-defeating, unless the only fun thing about PC gaming for you IS making it faster. A really fast customized car or motorcycle isn’t that useful either, but that’s hardly the point, I think. Shiny new stuff is fun.

        Where I have been glad I waited is when I had a vague idea of what I wanted and what I could afford to pay for it, and just watched for deals. That’s how I ended up with a GTX 1060 3GB for just over 160 USD, and I’m happy as a clam with that card.

        The Tom’s Hardware GPU hierarchy I linked to is also helpful, because (even if it’s oversimplified) it shows how silly it would be to drop two or three hundred bucks just to go up one tier for almost no noticeable gain.

        link to tomshardware.com

  6. snv says:

    Well, link to xkcd.com, but my guess is that we can thank VR for the restart of the hardware spiral.

    Also, since games are still mostly limited by single-thread performance (when the CPU is the bottleneck), I find the news about Intel’s 4.5GHz CPU more interesting.

  7. MajorLag says:

    I’d really like to see AMD get seriously competitive again. Not because I have any particular love for them or dislike of Intel, but just because it’d be better for the market if they did. In my heart of hearts, I hope they do something as game-changing as when they invented x64, and that somehow their new advantage is parlayed into an avalanche-like shift to Linux in PC gaming. Not because I’m a Linux evangelist, mind you, but just because Windows 10 is an abomination we only put up with for compatibility reasons.

  8. ZX k1cka55 48K says:

    “it’s a full generation behind AMD’s Pascal graphics cards, such as the GeForce GTX 1080.”
    What?

  9. milonz says:

    The Ryzen line of products could be a good deal for the professional market, in virtualization (ESXi, Xen, KVM) or render farms (3ds Max, Maya, etc).
    Not as a pure performance product (the Xeons rule) but as the best performance / value / power-efficiency match.
    With virtualization or 3D, you don’t really care about a single box’s performance, but about how many boxes you can cram into a datacenter for a defined budget.
    AMD has yet to resolve the compatibility problems Ryzen has on ESXi, but it already seems that Ryzen CPUs are a good performance / value investment in the render-farm field.
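
    To make that concrete with purely made-up numbers: if a Xeon render node costs $4,000 and renders 10 frames an hour while a Ryzen node costs $2,500 and manages 8, a $100,000 budget buys 25 Xeon boxes doing 250 frames an hour, or 40 Ryzen boxes doing 320. The slower, cheaper chip wins the farm.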

    For the enthusiast gamer market, things are a little bit tricky…
    10 years ago, 2 cores were enough, because multi-threading a game is a complex job.
    Now every single game that comes out is 4-way optimized; it took the gaming industry nearly 10 years!

    Multi-threading means separating the game logic between multiple threads, which run in parallel.
    You can, for example, separate the main thread, the sound thread, the AI thread and the physics thread.
    But there’s no point in splitting up the main thread itself, because it’s a sequential process, not parallel by nature. It can’t be separated; it’s effectively “atomic”.
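
    A minimal sketch of that split in C++ (the update functions are hypothetical stand-ins, not real engine code; build with -pthread on gcc/clang): the parallel-friendly systems each loop on their own thread, while the main game-logic loop stays sequential.

        #include <atomic>
        #include <thread>

        std::atomic<bool> running{true};

        // Hypothetical per-system updates; real engines do far more here.
        void update_sound()   { /* mix and submit audio buffers */ }
        void update_ai()      { /* run pathfinding and decision-making */ }
        void update_physics() { /* integrate rigid bodies, resolve collisions */ }

        int main() {
            // Sound, AI and physics are parallel by nature, so each
            // gets its own worker thread...
            std::thread sound([]   { while (running) update_sound(); });
            std::thread ai([]      { while (running) update_ai(); });
            std::thread physics([] { while (running) update_physics(); });

            // ...while the main thread runs the game logic sequentially:
            // each frame depends on the one before, so it can't be split.
            for (int frame = 0; frame < 1000; ++frame) {
                // advance game state, hand results to the other threads
            }

            running = false; // tell the workers to stop
            sound.join();
            ai.join();
            physics.join();
        }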

    The future of Ryzen’s many-core line of products could be determined by how many threads the gaming industry adapts to in the not-so-distant future, and what uses it finds for them.
    More threads might mean more AI or more physics, which for now are quite rudimentary.
    Or a completely new use for the threads, like machine learning, which could greatly enhance the player’s experience.

    In any case, AMD Ryzen could drive Intel’s CPU prices down, which is a great thing for both the professional and the gaming market.
    The worst case in any market is when a single company has a monopoly, and for many years Intel had a real monopoly on CPUs =/