AMD’s Ryzen: A gaming CPU worth waiting for?

Something good is about to happen. I’m fairly sure of that. RPS isn’t exactly hardware rumour central, of course. There’s plenty of that elsewhere and, frankly, I can’t compete. But after the downbeat tone of my recent Intel Kaby Lake coverage, I reckon it would be remiss not to balance things out with a quick preview of what to expect from AMD’s new Ryzen CPU. It’s definitely coming soon and will probably go on sale in around six weeks. Exactly how good is Ryzen going to be? I don’t know. But all the indications are that it’s going to be at least good enough to make AMD CPUs relevant for gaming again.

The unavoidable context to all this is the question of whether CPUs even matter for gaming. We haven’t really the space to do that justice. It’ll have to suffice to say that I think CPUs do matter and I think having faster CPUs will enable better games.

That aside, what’s for sure is that AMD’s inability to really compete over the last, say, five years has allowed Intel to sandbag pretty spectacularly. We are all paying more for less processor performance than would have been the case had AMD kept Intel honest.

With that party-political polemic despatched, let’s talk about Ryzen. As the name implies, it’s based on AMD’s new Zen CPU architecture. I believe it’s pronounced with a long ‘i’, as in ‘high’, but it’s also seemingly a play on ‘risen’, which has a short ‘i’. It’s a fairly odd name for a CPU, but then marketing has never been AMD’s strong point.

I’ve covered the basics before, but the short version is that the new chip is something of a back to basics design. It’s all about proper old-school CPU cores that get a lot of work done each operating cycle, just like Intel’s CPUs.

In fact, AMD’s approach is arguably even more old school than Intel’s, since it won’t insist on lumping the CPU die with a huge amount of non-processor circuitry, as Intel does with many of its chips. You can no longer buy a mainstream Intel CPU without integrated graphics.

I reckon that’s critical. Check out the die map of a quad-core Intel processor above. It’s a Broadwell chip (so a Core i7-5xxx Series) with the biggest integrated graphics option. Roughly half the chip is graphics. That’s an awful lot of largely useless baggage you have to pay for.

Admittedly, most of what you might call Intel’s gaming-centric mainstream desktop chips don’t have the biggest available graphics solution. But here’s a Kaby Lake map of the die used in the Core i5 and Core i7 chips I covered last week.

Yup, it’s 35, maybe 40 per cent graphics. OK, the graphics bits are used for more than just gaming. But the point is that Intel blows a lot of die space on stuff that we gamers mostly don’t care about on the desktop, because we all have a proper graphics card.

That’s a huge let off for AMD and helps offset the fact that Intel will typically have a process advantage. If Intel used all those transistors for CPU cores in its mainstream chips, AMD wouldn’t have a chance. But Intel doesn’t. So AMD does, thank goodness.

Anyway, the alleged specifics of AMD’s product plans have now emerged and there will purportedly be three basic models – the four-core SR3, the six-core SR5 and the eight-core SR7 (AMD has said publicly that the first chips will top out at eight cores, each with the ability to process two threads a la Intel’s HyperThreading).

There will also be AMD APUs based on the same core architecture and with graphics built in. But most of the pure-CPU Ryzen chips will be priced up against mainstream Intel chips with that integrated graphics baggage.

Speaking of pricing, supposedly it’ll be $150 for the quad-core chip, $250 for the six-core model and $350 for the full eight cores, plus a special overclocking model for $500 (though all models will be unlocked and overclocking friendly, which is another easy win versus Intel and its restrictive approach to overclocking).

But what of performance, I hear you cry? AMD has said all chips will have a minimum baseclock of 3.4GHz. Of course, clockspeeds don’t mean much when you’re comparing different architectures.
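A back-of-envelope way to see why: per-core throughput is roughly instructions-per-clock (IPC) multiplied by clock speed, and IPC differs between architectures. Here’s a minimal sketch of the arithmetic in C, with IPC figures invented purely for illustration:

    #include <stdio.h>

    /* Per-core throughput ~ IPC x clock speed. The IPC numbers below are
       made up for illustration; only their relative values matter. */
    int main(void) {
        double ipc_a = 1.00, ghz_a = 4.2;   /* higher clock, lower IPC */
        double ipc_b = 1.15, ghz_b = 3.8;   /* lower clock, higher IPC */

        double perf_a = ipc_a * ghz_a;      /* = 4.20 */
        double perf_b = ipc_b * ghz_b;      /* = 4.37 */

        printf("A: %.2f  B: %.2f  (B is %.0f%% faster)\n",
               perf_a, perf_b, 100.0 * (perf_b / perf_a - 1.0));
        return 0;
    }

In that toy comparison the slower-clocked chip wins on throughput, which is why a headline 3.4GHz baseclock tells us very little until we know Zen’s IPC.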

However, AMD has performed various public demonstrations in recent months. Probably the most intriguing involved an eight-core Ryzen versus an eight-core Intel Core i7-6900K. The AMD chip was slightly faster in a range of demo benchmarks including, critically, Battlefield 1. The 6900K is a $1,000 CPU. You can watch the demo here on AMD’s YouTube channel. Hmmm.

Various alleged benchmarks of early engineering sample Ryzens have also been doing the rounds and make for a similar story. I’m not sure I totally buy the idea that AMD is going to be selling a chip that matches or beats a $1,000 Intel chip for just $350. But I think AMD has almost certainly closed much of the per-core performance gap to Intel.

It’s also worth noting that the financial markets are much more bullish about AMD of late. As recently as February last year, AMD was trading at $1.80 a share. It’s now up over $10. Clearly, the self-styled master race of financiers are pricing in expectations of something very good coming down the line.

Similarly, I cannot help but notice that at around the same time Intel would very likely have found out just how competitive Ryzen is going to be (if anyone outside AMD knows what to expect from Ryzen, it will be Intel), a six-core CPU first appeared on Intel’s mainstream CPU roadmap. I do not believe that is a coincidence.

The bit of the story I won’t cover here is the supporting motherboard chipsets that come with Ryzen. Arguably, they’re just as desperately needed as the CPU itself, so utterly antiquated have AMD’s desktop chipsets become. The outlook here looks fairly promising, too. But we’ll cover the details when the chips finally launch, probably at the beginning of March.

Much of the above, therefore, is just a bit of fun. I don’t think Ryzen is going to blow Intel away. If the rumoured Ryzen pricing is right, that indicates a competitive CPU, not an Intel killer. But that’s all we actually need. Something competitive to wake Intel up and keep prices honest.

I reckon a six-core Ryzen for $250 / £225 with a spot of overclocking could well be the gamer and sensible-money PC enthusiast’s weapon of choice this summer. But worst case scenario, Ryzen will mean we’ll all have to pay less for a gameable CPU, whoever we buy it from. And that cannot be a bad thing.

48 Comments

  1. Sp4rkR4t says:

    Good write-up, although the last I heard there were doubts about any 6-core models, at least in the launch window, since the core packages come in batches of four, unlike previous AMD chips, which came in twos. If they do bring a 6-core to market it will be because of heavy binning, and you might even be able to switch the disabled cores back on, like with the early AMD multi-core chips.

  2. Generico says:

    On the CPU relevance in gaming, they could be a lot more important than they are. I think a lot of their seeming irrelevance is due to the fact that most devs really don’t put a lot of effort into their AI code, which is where the CPU can really make a difference. I think a lot of this is the result of generations of consoles that had in-order processors that are great for nice clean operations like graphics and physics, but really terrible for unpredictable highly conditional things like AI. The long standing impotence of console processors in that category has, I think, dulled the perception of what AI could do in games for an entire generation of game designers.

    • Bremze says:

      You’d think that if AI was such a big deal, there would be PC games making use of the beefy OoO CPUs with good branch prediction and loads of cache. The truth is, AI that makes you think it’s good is generally more convincing and way less performance- and development-intensive than actually good AI, because people don’t care whether the behaviour is generated by clever hacks or actual simulation, nor should they.

      • Don Reba says:

        But wouldn’t it be nice if Civ actually got better AI instead of cheating at higher difficulties?

        • fuzziest says:

          Writing good gameplay AI is way more complex than just throwing more CPU at it. You can brute-force improved graphics, but there’s no way to get better AI without a lot of engineer and designer hours.

          I’ve heard many stories of devs spending time on complicated, smarter AI that was way less fun to play against than something dumb but easier to tweak, so they ripped it out.

          Often, when gamers have angst over a cut feature that a company was “lying” about at trade shows, they don’t realise devs probably spent ages getting it even 90% of the way there, but that last bit might have taken years to sort out or make consistent.

          • dashausdiefrau says:

            The problem is rarely that features are cut. The problem is that the changes are not communicated before release, and then they withhold review copies. Put simply, they deliberately try to mislead the consumer.

          • Rumpelstiltskin says:

            Yeah, AI quality depends mostly on the amount of R&D time spent on it. What does scale well with CPU power is physics/simulation.

          • Hedgeclipper says:

            They’d find the money and the hours if AI was something you could sell. The trouble is that it’s not an easy point to explain or market. With graphics (and note how much time and money goes into this on AAA games) you can literally see the difference, and it doesn’t change gameplay to have a range of settings for low- to high-power graphics cards. AI, though, is much more ‘hidden’: you can’t see the difference between CPUs unless you play a while at different computers, and I’m not sure how you’d even approach game design where the AI worked more effectively for some customers.

          • Don Reba says:

            You can brute-force improved graphics, but there’s no way to get better AI without a lot of engineer and designer hours.

            Not at all. AI often scales well, especially in a strategic game. When you have more computing power, you can plan more moves ahead, take more variables into account, use more exact algorithms.
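            A toy sketch of that scaling, assuming nothing about any real game: a depth-limited search over a made-up take-away game, where a bigger CPU budget simply buys deeper lookahead and therefore stronger play:

                #include <stdio.h>

                /* Toy Nim: players alternately take 1-3 stones; whoever takes
                   the last stone wins. Depth-limited negamax: more CPU budget
                   means a deeper search, which means stronger play. */
                static int negamax(int stones, int depth) {
                    if (stones == 0)
                        return -1;          /* opponent took the last stone */
                    if (depth == 0)
                        return 0;           /* out of budget: call it unknown */
                    int best = -1;
                    for (int take = 1; take <= 3 && take <= stones; take++) {
                        int score = -negamax(stones - take, depth - 1);
                        if (score > best)
                            best = score;
                    }
                    return best;
                }

                int main(void) {
                    /* 13 stones is a forced win for the first player, but
                       shallow searches miss it: the value only flips to +1
                       once the depth budget reaches 7 plies. */
                    for (int depth = 1; depth <= 8; depth++)
                        printf("depth %d: value = %d\n", depth, negamax(13, depth));
                    return 0;
                }

            Same program, same algorithm; the only thing a faster CPU changes is how deep a search you can afford.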

  3. Penguin_Factory says:

    Interesting. I’m planning on getting a new gaming PC some time after the release of the Nintendo Switch (sweet gamer $$$ are locked up for that at the moment), so I’ll definitely be keeping an eye on this.

  4. kud13 says:

    Hmm. I upgraded from my barebones AMD chip to the 8-core FX-8350 two (I think?) years ago, in anticipation of The Witcher 3. It cost me less than $200 in a regular retail store. Never had any issues with my machine after that.

    Not sure how AMD was considered “not relevant” for years.

    • Raoul Duke says:

      Ditto, running the same chip and yet to be cpu bound in anything.

      • Xenotone says:

        You’re not wrong, but you’d be getting better framerates with an Intel setup even if you’re not strictly cpu bound. You’d pay more, sure, but AMD has no equivalent in the high end consumer segment.

    • phuzz says:

      “Not sure how AMD was considered “not relevant” for years.”
      Because you could spend less on an Intel chip and get more for your money. Unless you had a massively parallel workload (i.e., not games), there has been no point in buying AMD.
      That said, CPU has been much less of a factor in gaming for the last five(ish?) years. Spending money on a better graphics card is a better use of your hard-earned.

      • Horg says:

        “Because you could spend less on an Intel chip, and get more for your money.”

        Hell no. Intel have been far worse than AMD in terms of cost/power ratio for a long time. AMD CPUs have been the go-to for budget buyers for several generations because the equivalent Intel chip is frequently more than twice as expensive.

        • Moraven says:

          AMD has only been competitive in the $100 range.

          • Horg says:

            AMD haven’t been competitive at any range. They have been affordable for several generations for people on a budget, while Intel have not.

    • Moraven says:

      An i5-35xx for $200 would have been better value than a $180 (USD) AMD FX-8350. The i5 is 20-30% better in single-thread performance (most games), and the smaller Intel chip also had better power consumption.

      • Horg says:

        Those prices are way off. The FX8350 has never been that expensive, it’s always been closer to $140.

  5. montorsi says:

    Uh huh. Roughly 0% chance that this will be competitive for anyone who isn’t on a very tight budget, which is fine. They are what they are and it’s an important market to serve, but it’s about time we stopped implying or suggesting AMD is really going to be competitive with anyone (again). They aren’t. They’re their own thing. It’s OK.

    • Lukasz says:

      No. Chances are high that they will be competitive with Intel in the mainstream market. Maybe not at the 800-1000 dollar CPU level, but at the level where the majority of us operate.
      No idea where you are coming from with that zero percent.

      Secondly, AMD not being competitive is a really bad thing, as we have seen over the past five years. Crappy progression on Intel’s part combined with very high prices… a decent CPU is over 300 AUD… That’s silly.

    • SquarePeg says:

      The IPC (instructions per clock) difference between Haswell and Kaby Lake is only around 5.7%. This is a good write-up on Skylake. Since, for gaming, Kaby Lake is just a process optimization with zero IPC increase versus Skylake, it still very much applies.

      link to anandtech.com

      Ryzen has already shown that it can overclock to 5GHz on air. So if you can grab a 6c/12t Ryzen R5 CPU for the same cost as a Kaby Lake i7 7700K and overclock it to near 5GHz with just a good $30 cooler, that negates Intel’s minor IPC advantage. Ryzen will compete with Kaby Lake on all fronts: efficiency, performance, and bang for buck.

    • Nosada says:

      It’s good to be sceptical of hype, but we shouldn’t go overboard either. Everything indicates that AMD has fixed its IPC deficiency, so if even half of what they claim is true, near-parity with Intel is certainly an option.

      When people say “but Intel has a far larger R&D budget than AMD, so catching up to them is impossible”, I can only assume they weren’t around for the launch of the Athlon 64. AMD had a far smaller R&D budget back then as well, and pretty much wiped the floor with Intel. Gamers back then didn’t even consider Intel’s Pentium 4s an option; they were that inferior to AMD’s architecture.

      One can only hope Jim Keller (the architect of the Athlon) performed another miracle (he was hired for Zen as well).

    • PoulWrist says:

      What an absolutely dismissive opinion you had to share. If you’d been following any of this you’d also know that the 8c/16t CPU has a 95W TDP; that’s around 60% of the Intel equivalent’s.

      So that’s rather competitive.

      • ColonelFlanders says:

        When did the watts thing get so important? I don’t recall ever seeing an appreciable change in my electricity bill after using my computer a lot.

        • Don Reba says:

          Fewer watts don’t just show up on your electricity bill; they also mean less heat, and therefore less noise from cooling systems and fewer problems from overheating.

  6. Saiko Kila says:

    Me only cares for single-thread performance, i.e. Dwarf Fortress. And with that, for now, Intel wins. Funnily, until Kaby Lake it was winning with a three-year-old chip (the i7-4790K), because its successor (the i7-6700K) was slower in Turbo mode unless overclocked (and after burying three motherboards I say “thanks, no” to overclocking).

    Anyway, I’m really looking forward to real-life single-thread tests. Also, I lied earlier about caring ONLY about single-thread performance, because I also play AAA titles and such, though ultimately it is the deciding factor for me.

    • thenevernow says:

      Dwarf Fortress requires a top-level CPU? I’m baffled.

      • ColonelFlanders says:

        Dwarf Fortress is the most processor-expensive game on the market. Doesn’t matter how shit your graphics are; if you want to simulate the world, you need a good CPU. And Dwarf Fortress does a LOT of simulating.

        In fact, I’m pretty sure there isn’t a processor on the market that wouldn’t choke on your fortress eventually.

    • MacPoedel says:

      If you killed three motherboards by overclocking, you were doing something wrong. Most likely your voltages were too high, either because you were a bit overambitious or because you left them on automatic (voltages in automatic mode tend to get way too high when overclocking). You also need an appropriate motherboard for overclocking; not every board has power circuitry that can handle the extra stress. I’m not saying you need a €400 RoG Elite whatever, but you do need to pay attention to the components. There are perfectly fine sub-€150 overclocking Z170/Z270 boards.

      I have a six-year-old motherboard running a 1GHz overclock on a Core i7 860 and it’s still 100% functional (and waiting to be replaced by Ryzen, or Kaby/Coffee/Cannon Lake if AMD fails to meet the hype).

  7. HumpX says:

    I’ve had a cheap 6-core AMD CPU for the past 4 years, along with a middle-to-lower-end Nvidia GPU. I’ve been able to play every game in that time to my satisfaction. The “chase” for premium hardware ended for me many years ago. It’s futile and it drains one’s finances. Stop reading hardware reviews and hardware sites. Buy a middle-of-the-road system, enjoy the games and ignore the framerate numbers. You’ll be far happier.

  8. Don Reba says:

    I find the AVX2 instruction support exciting. Eight cores, each processing 8 floating-point operations at a time — there is so much you can do with this!
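    For a concrete picture, here’s a minimal sketch using x86 vector intrinsics (strictly speaking this add is an AVX instruction rather than AVX2, but the 256-bit, eight-float width is the point; compile with -mavx):

        #include <immintrin.h>
        #include <stdio.h>

        int main(void) {
            /* Two vectors of eight single-precision floats each. */
            __m256 a = _mm256_set_ps(8, 7, 6, 5, 4, 3, 2, 1);
            __m256 b = _mm256_set1_ps(10.0f);

            /* A single instruction adds all eight lanes at once. */
            __m256 sum = _mm256_add_ps(a, b);

            float out[8];
            _mm256_storeu_ps(out, sum);
            for (int i = 0; i < 8; i++)
                printf("%.1f ", out[i]);    /* 11.0 12.0 ... 18.0 */
            printf("\n");
            return 0;
        }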

    • Don Reba says:

      Oh, wait, just checked, and it looks like it does it at 2 cycles per instruction — so, the instruction set is supported but gives no performance benefit over SSE. That’s no fun.

      • Jeremy Laird says:

        Yeah, IIRC, Zen’s floating point units are still 128 bits wide, versus Intel’s 256-bit units.

      • FriendlyFire says:

        Even Intel’s AVX implementation isn’t ideal. Apparently running it full-tilt at 256-bit width produces a tremendous amount of heat, so throttling happens really fast.

        • Don Reba says:

          My 4-core Haswell with the stock cooler has been able to run heavy AVX2 computations for hours and days on end with no overheating. I guess it could be a problem for 8-core or overclocked chips.

  9. geldonyetich says:

    I had predicted when it came time to finally drop this FX-8120 that I would graduate to an Intel CPU. But in light of this news, Intel’s layoffs, and Intel’s refocusing on mobile devices, perhaps not.

    Well, since when was predicting technology not a fool’s errand? When it comes time to upgrade, I’ll put the chip that performs best on benchmarks without costing many times the runner-up in my system, per usual. If Intel wants that to be them, they’ll have to exert themselves a little more.

  10. kazriko says:

    If you watched their reveal stream where they announced the name, they were emphasizing that it was a new “horizon” for AMD, which is why Ryzen is pronounced with the long I. Nothing at all to do with “risen.”

    • dozurdogbite says:

      But wait…
      On the horizon! A new dawn seems to be rising.
      Yes I can see them now, the Zon cpu family.

  11. ColonelFlanders says:

    To be honest, I don’t see the point of getting horny over new CPUs. The i7 2600K, which launched 6 years ago, is still more than capable of running every game on maximum settings. Anything above and beyond that is just us falling for their bollocks marketing. CPUs are so fast and changing so little, and the instruction set has been mega efficient for years. What AMD and Intel BOTH need to be doing is making graphics cards work a bit harder for their supper – they are horribly inefficient and need replacing every 3 years so that you don’t have to turn your games down to ‘snes mode’ to get them to run. Also widen them buses.

  12. Grovester says:

    Whilst people are right in saying that even a 4-5 year old CPU cuts the mustard in most games, there’s the counter-argument that supporting technologies such as NVMe SSDs have come in over the last few years, and a new motherboard will make use of them. That makes it worthwhile to upgrade once every five years or so, and if Ryzen is a good-value CPU with a strong motherboard chipset behind it, then it could well be worth it.

  13. slartibartfast says:

    All I think when I hear people querying if a CPU is really that important for gaming is “but have you played ARMA?”.

    Seriously though, there are a lot of games (albeit a lot of them early access) that depend massively on single-core performance. I speak as a former AMD user who found this out the hard way.

    • slartibartfast says:

      Also I wonder how they intend on avoiding confusion with the Riesen chocolate chew???

  14. Saturday Boy says:

    I see the AMD CTO is a keynote speaker at the Imagination Technologies graphics summit in Santa Clara in about six weeks’ time. Could AMD be about to license PowerVR or ray tracing?

  15. Thomas Gaskins says:

    I agree, Ryzen looks like it will be a big hit for AMD, letting everyone overclock without spending thousands on PC hardware.

    I will buy a Ryzen rig and test it then. Stay tuned!