Update: AMD’s new graphics and CPU awesomeness


It’s all kicking off at AMD, peeps. The new Vega graphics chip is now more than merely a press release and has finally been released into the wild. Meanwhile, the insane ThreadRipper CPU with 16 cores and 32 threads has also landed. It’s all a far cry from just a few months ago, when AMD was soldiering on with an elderly graphics product and a deadbeat CPU line-up. Time to catch up with AMD’s latest hardware awesomeness.

To quickly recap, AMD has wheeled out its latest graphics tech in high-end form. That means a trio of pretty pricey cards aimed at the enthusiast end of the market. We’re talking $400 / £400 minimum, or thereabouts.

You can bone up on the speeds and feeds here. But the basics involve three cards based on the same underlying graphics chip, albeit with varying specs, namely the new Radeon RX Vega 56, the Radeon RX Vega 64 and the Radeon RX Vega 64 Liquid.

AMD is styling this new graphics tech as the fifth generation of its long-established GCN architecture, but this time it’s purportedly all change. AMD says the main compute units that contain the brains of the graphics processing hardware have been redesigned from the ground up. AMD has also apparently pinched some features from its new Ryzen CPUs, including high-speed SRAMs and the clever Infinity Fabric interconnect.

That’s all great, but whatever AMD has done, it isn’t immediately translating into a big uptick in actual game performance from an architectural point of view. Not across the board, at least. Often it performs pretty much as you’d expect of any old GCN graphics chip from AMD, given the headline shader count and clockspeed specs of the new boards. But hold that thought.

My understanding is that much of the increased transistor count in the new Vega GPU versus the old Fiji chip found in the Radeon R9 Fury boards has been spent on enabling higher clockspeeds through deeper pipelines and other related features. In other words, not on adding computational hardware.

However, if Vega does have an on-paper strength, it’s the potential for performance in the most advanced game engines. The specific special sauce here is the so-called Rapid Packed Math feature, which essentially doubles floating point throughput by packing two 16-bit operations into each 32-bit shader operation, at the cost of precision.

As it happens, the last word in precision isn’t necessary for graphics rendering much of the time. The upshot of which is that this high speed mode should be a boon for handling advanced lighting calculations, HDR rendering and all that jazz. As yet, it’s a somewhat theoretical feature as no games currently support the Rapid Packed Math stuff, but titles are reportedly incoming.
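To get a feel for that speed-versus-precision trade, here’s a minimal sketch using Python and numpy’s float16 as a stand-in for the 16-bit shader maths (the values are purely illustrative, not anything AMD ships):

```python
# A sketch of the packed-math trade-off, with numpy's float16 standing in
# for 16-bit shader ops. Rapid Packed Math packs two 16-bit operations
# into each 32-bit lane, doubling throughput at the cost of precision.
import numpy as np

a32 = np.float32(1.0001)
a16 = np.float16(a32)   # float16 keeps only ~3 decimal digits
print(a32, a16)         # 1.0001 1.0 -- the last digit is lost

# Two float16 values occupy the same 4 bytes as one float32, which is
# what lets the hardware retire twice as many operations per clock.
pair = np.array([1.5, -2.25], dtype=np.float16)
print(pair.nbytes, np.float32(0.0).nbytes)  # 4 4
```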

Could it be true that here in my mortal hand I do hold a nugget of purest graphics?

In the meantime, the yardsticks that matter are games you can actually play right now and when you boil down all the benchmarks, the overarching narrative to Vega’s performance goes something like this. If we’re talking about the air-cooled Vega 64 board, it generally falls somewhere in between an Nvidia GeForce GTX 1070 and a 1080. In older games, it’s closer to the 1070. In newer games, make that the 1080. There are of course exceptions, with Doom being an obvious example where the Vega 64 beats the 1080 fairly comfortably.

But as a broad picture, the “1070-ish in older games, 1080-ish in newer games” thang is a reasonable rule of thumb for where Vega 64 (air) currently sits. What that doesn’t capture, however, is power draw. Normally, it’s not something I care a great deal about. But this new GPU is extraordinarily power hungry, sucking up 150W or more above a GTX 1080 under full load.

If that matters, it’s because it indicates AMD is likely running the chip right at the ragged edge in terms of clockspeeds and voltages, which isn’t terribly desirable. That said, it may put off the cryptocurrency mining crew, for whom the cost of feeding the board all those watts will be off-putting. And that should at least help keep prices in check.

Speaking of the sordid matter of money, pricing of the first high performance Vega boards is high enough to make them irrelevant to most of us, even if they stick close to the recommended retail stickers of $499 and $399 for the Vega 64 and 56 respectively (UK prices in £ won’t be much lower, if at all, I suspect). So, in many regards, what really matters is the indication these first boards give about what we can expect from more mainstream Vega-based graphics boards in the coming months.

That doesn’t look hugely promising from where I’m slouching. The chip used in these first Vega cards is absolutely massive and it struggles to compete with a fairly old and much smaller second-rung Nvidia GPU in the form of the GP104 item found in the GTX 1070 and 1080 boards.

It’s hard to see how a much smaller GPU based on this Vega architecture is going to radically shake up the space immediately below the 1070. The possible exception to that is if AMD manages to pull off something really special vis-à-vis clockspeeds. But that’s pretty speculative.

All in all, then, I’m not saying Vega is a disaster. But it does feel like a GPU design that is waiting for games to catch up and I’m not convinced that’s a terribly good thing. Nvidia is brilliant at making GPUs that work great in the here and now and the problem for AMD is that a GPU isn’t for ever. It’s barely for next year.

So performance today is ultimately more important than potential for tomorrow. Whatever, one thing I am sure about is that Vega isn’t as good as AMD would have liked. Not by a fair old whack.

Happily, that isn’t something you can say about its new Ryzen CPUs. For the most part, Ryzen surely at least matches, and probably exceeds, AMD’s hopes for its new CPU architecture.

Fly my pretties! 32 threads. Count ’em!

It’s a pity for us that the one area where Ryzen isn’t a complete smash hit is games. But it’s good enough most of the time in current games, and I reckon it will get significantly better over time as developers get to grips with its strengths and weaknesses. It’s a CPU that’ll be around for a long time in some form or other, so I’m confident doing that work will be seen as worthwhile.

As for the new 16-core ThreadRipper chip, well, it’s really a technological curiosity rather than a realistic proposition for gamers. But I couldn’t help taking it for a quick spin anyway. It’s pretty much Ryzen redux in most games. But a brief zap in Total War reveals a level of chronic stutteriness that’s even worse than that of the more mainstream Ryzen chips.

It’s a very expensive chip, so that arguably makes its gaming performance somewhat academic. But the stuttering was certainly bad enough that it would put me off bagging a ThreadRipper for now, if I had the inclination to spend that kind of money on a CPU. Which I don’t.

Overall, it’s interesting times, far more so than a mere 12 months ago. I do rather wish Vega was a stronger competitor. But it’s probably just good enough to keep Nvidia on its toes rather than sand-bagging, Intel style. Moreover, if Ryzen sells as well as it deserves to, AMD should have some money to plough back into the graphics business and give Vega the polish it probably needs.

46 Comments

  1. Kreeth says:

    Checking out Scan, the cheapest Vega 64 is almost exactly £100 MORE than the cheapest (factory OC’d) GTX 1080 (£587 vs £489). OCUK it’s £550 vs £500.

    I’d guess this might calm down a bit after the initial rush but buying one of these for gaming seems… unwise.

  2. NightRaven says:

    According to OC3D those are just launch prices.

    • Flopper says:

      Glad to see AMD releasing competition 15 months after Pascal dropped. Shame Volta will be dropping in spring to crush this current offering that has finally caught up to Pascal.

      Now AMD has to dig themselves out of constantly being 1 year behind in speed. At least it’s some progress though. Hopefully that does something to GPU pricing across the whole market because it’s getting out of control.

    • Malcolm says:

      Yes, some fairly disgraceful marketing tactics on display there. They’ve essentially subsidised the first few hundred cards at select retailers so that the price/performance calculation looks a lot better than it actually is – presumably with the aim of getting some rave reviews in the bag on the basis of a hypothetical launch price.

  3. Ragnar says:

    I think the Vega 56 is the real star here. As fast as the 1070 in older games, faster than the 1070 in newer games, not significantly more power hungry, can come close to Vega 64 speeds just by overclocking the memory, and it’s cheaper than the 1070 against which it’s competing.

    As fast or faster than the 1070 for less money seems like a win to me. Very much catching up rather than launching the next generation, but still a win. It’ll hopefully keep Nvidia competitive with its pricing for the next gen.

    • Fade2Gray says:

      Yeah. The 56 definitely looks more interesting to me. Depending on where prices settle in a few months, I might end up buying one for my next upgrade.

    • fray_bentos says:

      Except the Vega 56 draws 368 watts vs the 290 watts of the 1070 under load. That is 26% more, resulting in substantially more heat/fan noise (+8 dB), and possibly enough for someone to need to upgrade or buy a more expensive, beefier PSU. AMD are yet again taking a brute force approach just to keep up with Nvidia. Nvidia could easily add 25% more transistors and be 25% faster at the same power usage.

      • Sakkura says:

        Those wattages are wildly exaggerated unless both cards are heavily overclocked. More likely, your source was quoting full system power consumption rather than measuring just the graphics card.

        As for noise, just skip the reference cooled version and you’ll be fine. Vega 56 draws less power than the GTX 1080 Ti, which nobody considers a loud card (at least not the non-reference versions).

        • fish99 says:

          TDP is 150W for GTX1070 and 210W for the Vega 56.

          The card looks legit to me though, especially the DX12 performance.

        • GurtTractor says:

          Undervolting Vega is actually where it’s at, I think. You can drop the voltage from stock and raise the clocks, for a slight increase in power draw but significantly lower temperatures at a given speed, which helps core frequency stability too.

          • Rozza says:

            Genuine question – if this is the case, why aren’t they like this from the factory?

      • Asurmen says:

        It’s not actually that much more, it outperforms the 1070, and where do you get 8 dB from?

      • Kamikaze-X says:

      Except Vega 56 will be limited on release, it will be limited beyond release due to miners, and it will be priced above MSRP. When you have GTX 1070 Minis due out from Zotac and the like next week, priced at £329 or thereabouts, the Vega 56 is dead in the water if you use your brain instead of thinking of graphics cards as the product of some weird tech charity. We are in AMD’s Fermi period. They do not deserve our custom, much like nVidia didn’t when the 480 was consuming more power than equivalent AMD/ATi cards at the time. We should be beyond the old Netburst let’s-just-throw-more-wattage-and-clockspeed-at-it thinking, and AMD has in the past made some great leaps with GPU design.

    • Flopper says:

      Wait until Bitcoin miners start buying up all the Vega 56s. The only reason 1070s cost what they do now is because they’re the choice card for Bitcoin mining.

    • phuzz says:

      I’ll probably be upgrading around the end of this year, but as I’ll be watercooling, an OC’d Vega 56 sounds like a good option.

  4. Raelalt says:

    I wish there had been some comparisons to older AMD cards such as the RX 580. I don’t think I would upgrade, at least not just now, but some information on what kind of increase there is over them would be helpful.

  5. The Almighty Moo says:

    Still worth holding off on that brand new rig then Jeremy?

  6. Buuurr says:

    Called it! Nvidia wins again. Is AMD ripping us all off, just a bit? Lulz! #thirdworldgaming #Icantbuyarealcard #settleforlessandwhine

    • toastmodernist says:

      hang in there buddy.

    • Mikemcn says:

      Maybe be less of a fanboy and be happy there is some competition still for graphics chips? This is good for customers.

      • Buuurr says:

        “Maybe be less of a fanboy and be happy there is some competition still for graphics chips? This is good for customers.”

        See, that’s the thing. I am not really a fanboy. I am all for performance. I guess, as I have said before, you could call me a fanboy of performance. If AMD had come out with something to make Nvidia up its game, I would be excited. They didn’t. So, as I said weeks ago, we are stuck with more mediocre, over-hyped AMD gear. Truly the fanboys are AMD fans who can’t see all the hype they (AMD) build and then fail to deliver. These new Ryzen chips were supposed to crush the Intel chips. Didn’t happen. Vega was supposed to bankrupt Nvidia, it would be so good. Not even close.

        I’m sorry but I think AMD fans need to hold their hardware makers to account for the bullshit. I mean what are they going to do when the 2000 series is out in a few months? How about the next Intel line in a few months? I mean the stats are going to be insane at that point. Right now between Intel and Nvidia vs AMD it is like watching a pit bull shake a pug. Soon, it will be like watching a bear eat a kitten.

        There really are only two companies pushing clock speeds and graphics ahead in this world, and for some reason (as per reviews on this site) they are hated. I get it. They are expensive. I don’t see anyone bitching at Bugatti for their cars though. Top of the line was, and is, always going to be at a premium. That said, the mid-range, poverty chips really do need to do more to keep up.

        I used to love the early-to-mid 2000s, when AMD and Intel had a true rivalry performance-wise. ATI was a thing and it was good if the drivers held up. Now though. Now AMD fans are hitting a bad stretch. They are about to hit the Windows user vs Linux guy stretch… always clinging to the one thing that makes it reasonable to hold on… price.

        Sorry. That isn’t good enough. Keep up.

        • dr.denton says:

          Whoever talked about “crushing” anything was either working in PR, a complete idiot, or both.

          Of course there is no way AMD is going to completely trump nVidia’s and Intel’s products. It’s simply a matter of numbers – AMD has a fraction of the money to spend on R&D, so they have to make more compromises in their chip designs than nVidia or Intel.

          The fact alone that AMD is able to even play in the same ballpark as either company is a great achievement and should be appreciated as such.

          Looking at the mob-like business practices Intel employed in the early 2000s, when there was real competition from K7 and K8, one can sort of understand the hatred against them.

          Not that it matters to anyone today, but the technological merits of Vega will become clear over time. As has been the case with pretty much any ATi/AMD µarch since at least R520.

      • RichUncleSkeleton says:

        People keep saying this and yet Nvidia and Intel keep price-gouging and releasing dinky revisions to existing hardware. That whole ‘competition drives innovation’ thing only works if there’s any degree of parity between the supposed competitors. AMD is a high school football team to Nvidia’s Super Bowl champs.

        • Buuurr says:

          Right, because they can. No one forces anyone to buy them though. Personally, I wait for the actual advances before upgrading. I was going to get the X299 platform but saw, for me, it didn’t offer anything mind-blowing over what I already have. So, I will skip at least a generation or two. I think that if everyone did this they would actually have to put something out there to entice an upgrade. AMD. Nvidia. Intel. All of them. If this was held over their heads it would change. It won’t though. Big business ordering the latest and supposedly greatest is what drives it… not the everyday enthusiasts.

  7. fish99 says:

    Competition is good (and overdue).

    I already have a GTX1070 so I won’t be changing GPU anytime soon, but the DX12 performance of the Vega cards looks really impressive. Will be interesting to see the prices going forward.

    • Mikemcn says:

      Nvidia so badly needs good competition. It’s weird to me how hard it is for other companies to compete in the graphics space.

      • ThePuzzler says:

        There’s enough potential competition from AMD that NVidia feels the need to spend a lot of money on R&D to make sure they don’t fall behind. Any outsider who wanted to get in on the market would have to spend a fortune to have any chance of catching up, with no immediate prospect of making any profit.

    • -Spooky- says:

      DX12 in general? Or just a handful of specific DX12 games?

      • fish99 says:

        Not entirely sure what you mean, but having looked at a bunch of benchmarks I see Vega 56 pulling comfortably ahead of the GTX1070 in most DX12 games and matching the 1070 in DX11 (the one exception I saw was GTAV).

        link to guru3d.com

  8. Voldenuit says:

    I’m looking at a Ryzen 5 6-core build (for NLE and gaming) but there is no way I would touch Vega with a 10-ft stick.

  9. Jason Moyer says:

    The Vega 64 is slower than a GTX 1070, as far as I can tell. And they can throw as many cores at their CPU tech as they want; if the per-core performance is still slower than a 10-year-old i5, then who cares. No gaming relevance whatsoever.

    • Asurmen says:

      No, it’s faster and competes against the 1080. What reviews have you seen showing the 64 slower than a 1080?

      Also, the gaming performance is better than on a ten-year-old i5.

    • shaydeeadi says:

      As far as Ryzen goes, 1 or 2 frames either way is hardly a chasm in performance (ref: link to legitreviews.com, some benchmarks from April). And if you use your PC for any sort of creative activity as well, then Ryzen is a very tempting proposition.

    • fish99 says:

      In the benchmarks I’m looking at the slower Vega 56 is beating the 1070, and beating it comfortably in DX12 games. Dunno where you’re getting your info from.

      link to guru3d.com

  10. ThePuzzler says:

    Has anyone come up with a form of bitcoin mining that benefits humanity in some way, instead of just wasting electricity and burning out high-end graphics hardware?

    • syndrome says:

      bro do you even understand what mining is….

      if something is beneficial for the whole of humanity, it has to be blockchain. you can’t have blockchain without mining; bitcoins are just the side-effect that sucks profit-minded people into the value-belief system.

      when you have real-world electricity (and effort) burnt up so that bitcoin can accumulate value, that is what imbues the work involved with intrinsic value. AND the network.

      it is all very similar to digging for gold. if anyone can have it without any work whatsoever, WHERE IS THE VALUE?

      the real thing that mining does is that it makes it necessary to spend time to validate the network (or more precisely, blocks). by calculating problems that are impossible to approximate, you preserve the data integrity. that means that tinkering with the network data would require spending more time, making it possible for the network to heal itself (thanks to redundancy), due to the time it takes to flood it with corrupted blocks.

      even if someone were to create a superior computer that could crack these math problems in a fraction of the time, it would still have to spend some time processing such vast amounts of data, and vast amounts are required to fool most of the network anyway. in other words, tinkering is prevented by system design.
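      roughly, the grind looks like this. a toy python sketch, not real bitcoin (which double-hashes 80-byte block headers with SHA-256 against a far harder target), but the loop has the same shape:

      ```python
      # toy proof-of-work: grind nonces until the hash clears a difficulty
      # target. cheap to verify, costly to produce: that asymmetry is the point.
      import hashlib

      def mine(block_data: str, difficulty: int = 5) -> int:
          target = "0" * difficulty  # require this many leading hex zeros
          nonce = 0
          while True:
              digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
              if digest.startswith(target):
                  return nonce  # proof found
              nonce += 1

      n = mine("some transactions")  # ~16^5 hashes on average at difficulty 5
      print(n, hashlib.sha256(f"some transactions{n}".encode()).hexdigest())
      ```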

      there, I hope you can see why it’s important to have it “waste” the electricity and burn out high-end graphics hardware, so that your grandsons are freed from the banks, monetary funds, and national reserves monopolizing the worldwide economy and cutting into everybody’s ledger. their entire job is to “guarantee” the state of the ledger, and to record and certify the individual transactions in order to “prevent” fraud, massive devaluation, and crashing of the economy. it is basically a BELIEF SYSTEM, much like religion is: “this is a holy man, he would never touch a boy!”

      as we all know, the banking job is highly corruptible, but historically we couldn’t find any option but to empower some people beyond all reason! so stfu and inform yourself on blockchains.

      • ColonelFlanders says:

        Except that the crypto-currency bigwigs are just as corrupt as the banking bigwigs. The only way you’re going to free yourself from the jaws of the evil money spinning banks (or whatever bollocks you want to spout) is to teach human beings not to be fuck heads.

      • konondrum says:

        As I thought, a bunch of cultists chasing imaginary loot.

  11. KenTWOu says:

    Well, at least the Linux community seems happy with Vega and its open source driver.

    • MajorLag says:

      The Linux Desktop community has tragically low standards though. I mean, just look at the Linux Desktop.

  12. Diziet Sma says:

    As an owner of a Vega 64 Liquid, I’d like to chime in with some positive news. Paired with a 4K Freesync monitor, it absolutely is 4K-gaming capable. I now have the games I used to have to play at 1440p running natively at 4K with the same settings, at the same FPS or approaching 60fps. Freesync takes care of the rest.

    Batman: Arkham Knight at 60fps in 4K was the most surprising result for me, I think. I still need to see how The Witcher 3 and Deus Ex: Mankind Divided perform, but everything I’ve played so far has seen a marked improvement, including No Man’s Sky and XCOM 2.

    As for VR gaming, I’ve not played much yet, but it’s certainly made Rallycross races in Dirt Rally playable on High in VR, whereas they weren’t on Low/Medium settings for me before.

    It’s the AMD/Freesync pairing that made the purchase a no-brainer for me. I like Freesync, and having only recently got such a monitor, the thought of switching sides and then forking out for a new display plus paying the NVidia G-Sync tax left me a little cold. Undoubtedly the NVidia chips are more efficient and faster, but I’m happy with the AMD/Freesync ecosystem I have.

    Now… if only AMD would get something like Ansel out there… I enjoy taking screenshots and Ansel looks sweet.

    • GurtTractor says:

      Yeah, Freesync has been the major draw for me with AMD cards, and I’d like to upgrade to another AMD card so I can keep using Freesync on my monitor and move to a reasonably priced 1440p 144Hz display in the future. But I’m getting a bit frustrated with Vega right now. The performance per watt and the extra heat (and noise) compared to the 1080 are one thing, but if the prices stay £100 above the initial RRPs then it’s pretty much dead to me; that would be seriously overpriced, and the extra cost would start to eat into the savings from going the Freesync route anyway.

      Hopefully the prices will stabilise a little after the aftermarket cooler cards get released, and good deals can be had by the end of the year.

      At this point, all Nvidia would have to do to just about totally kill the competition is add Freesync support to their cards; then there would be little reason to buy AMD. They probably won’t do that though.

  13. Carra says:

    Unlike Intel, Nvidia hasn’t been sleepwalking these past few years.

    Bought a 1060 a year ago which is massively better than my previous Nvidia graphics card. My Intel processor, however, isn’t worth upgrading; CPUs have barely improved.