AMD’s new RX Vega gaming graphics revealed at last


It’s been an arsingly long time coming but AMD finally has some new graphics tech to flog and for all of us to game upon. The new Radeon RX Vega generation of gaming cards has been announced. Inevitably, we’ll have to wait just a little longer to find out exactly how fast they are but we know enough to begin answering some key questions and posing a few more. Is this the graphics revolution we’ve all been waiting for, for instance, or is it one derivation too many of AMD’s successful GCN architecture? Strap in and let’s go.

Let’s quickly cover what exactly has been announced. It essentially boils down to three different video boards, all of which are derived from the same underlying GPU, known as Vega 10.

It’s a 14nm chip, if you care about that kind of thing, and packs 12.5 billion transistors. To put that into some kind of intelligible context, AMD’s previous ‘big’ GPU known as Fiji and found in the Radeon R9 Fury X was 28nm and 8.9 billion transistors.

Immediately, you might think that transistor count is modest given the major step change from 28nm to 14nm. However, the GPU at the heart of Nvidia’s mighty Titan Xp graphics card clocks in at 12 billion transistors. So in terms of raw, dumb complexity this new AMD beast is right up there. It’s an attempt at a proper high-end gaming graphics card.

As for the other major specs, well, we’re talking 4,096 of AMD’s pixel-prettifying shaders, 64 ROPs for outputting pixels and 256 texture units for, ya know, texturing stuff and shit (technical term, shout if I am losing you).

Now the slightly odd thing about all this is that those numbers – the shaders, the ROPs, the texture units – are all the same as the aforementioned and frankly rather elderly Fury X. It is not, of course, a simple matter of a direct comparison. But hold that thought.


The new Vega 10 GPU differs, however, in a couple of key areas. First, it sports a second-generation version of the HBM or high-bandwidth memory tech first seen in the R9 Fury series. Apart from anything else, this second generation version kicks off with 8GB of graphics memory where the first-gen effort was capped at a problematic 4GB.

The other big change involves clockspeeds. The old Fiji chip was only good for about 1GHz. This new GPU is good for more like 1.7GHz. So even with all other things being equal, and a lot of things do indeed look equal, this new RX Vega GPU should be getting on for 70 per cent faster by at least some metrics, which is a very nice performance step.
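If you fancy checking that maths, here’s a back-of-the-envelope sketch. It assumes, crucially, that per-clock throughput is unchanged between Fiji and Vega 10, so peak performance scales with clock speed alone; that’s an assumption for illustration, not a benchmark result.

```python
# Crude speed-up estimate: same shader count, so peak throughput
# scales with clock speed alone (an assumption, not a benchmark).
fiji_clock_ghz = 1.0  # old Fiji, roughly
vega_clock_ghz = 1.7  # new Vega 10, roughly

speedup_pct = (vega_clock_ghz / fiji_clock_ghz - 1) * 100
print(f"Peak speed-up: about {speedup_pct:.0f} per cent")
# Peak speed-up: about 70 per cent
```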

Of course, at this point we have no independent performance numbers. That said, the mere fact of where AMD is pricing this card gives us some indication of where it sees its performance potential.

Notwithstanding some current pricing oddnesses due to a spike in GPU demand driven by cryptocurrency miners, AMD has priced the three versions of the new GPU to roughly align with Nvidia’s GeForce GTX 1070, 1080 and 1080 Ti boards at $399, $499 and $699.

Which reminds me, I haven’t detailed the three. The entry-level effort is the Radeon RX Vega 56. It gets 56 compute units (hence the name) and thus 3,584 shaders, along with 64 render outputs, 224 texture units and a maximum boost clock of 1,471MHz. Next up is the RX Vega 64 with all 64 compute units and thus 4,096 shaders enabled, along with 64 render outputs and 256 texture units. It clocks in at 1,546MHz, max.

Finally, there’s the liquid-cooled RX Vega 64 Liquid, which ups the ante to 1,677MHz but otherwise offers the same numbers as the air-cooled 64. All three pack 8GB of that HBM2 memory on a 2,048-bit bus.
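For those who like their flops counted, the peak numbers fall straight out of those specs. A quick sketch, using the standard rule of thumb of shaders × 2 ops (one fused multiply-add) × clock for peak FP32 throughput; real-world gaming performance is another matter entirely:

```python
# Back-of-the-envelope peak FP32 throughput for the three cards.
# Rule of thumb: shaders x 2 ops (fused multiply-add) x boost clock.
cards = {
    "RX Vega 56":        (56, 1471),   # (compute units, boost MHz)
    "RX Vega 64":        (64, 1546),
    "RX Vega 64 Liquid": (64, 1677),
}
for name, (cus, boost_mhz) in cards.items():
    shaders = cus * 64  # 64 shaders per GCN compute unit
    tflops = shaders * 2 * boost_mhz * 1e6 / 1e12
    print(f"{name}: {shaders} shaders, {tflops:.1f} TFLOPS peak")
    # e.g. RX Vega 56: 3584 shaders, 10.5 TFLOPS peak
```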

At this point there’s not a great deal more to say other than wait and see. If you twisted my arm, my spidey sense tells me that this family of GPUs is probably AMD’s long-established GCN graphics architecture (and I believe that Vega is largely a respin of GCN, not something substantially new) stretched to its limits.

It will probably suffice at the high end, but I feel the battle in the heart of the mid-range will be tougher for what is a rather elderly technology, at heart. Given that GCN has given AMD so much success and is indeed found inside both of the major games consoles, you can understand why AMD has stuck with it this long. But eventually it will become a liability in a PC gaming context.

That may not quite have happened with Vega. But it might be a close-run thing.


  1. Sakkura says:

    These cards are based on brand-new GCN 5, not the same old GCN you find in consoles and graphics cards going as far back as December 2011.

    • Buuurr says:

      Nice. I know nothing about this card. I will go ahead and say it will clock in at or a little lower than the 1070. I don’t know this… I just know AMD is always a little lackluster. Who knows? I may be surprised.

    • Flopper says:

      They’re also primitive in that they use double the wattage for being equal or still lesser than the Nvidia 10xx series. Don’t see any mentions of power in this article.

      • guidom says:

        Power consumption on the Vega 56 is similar to the 1080 Ti series, which has comparable raw power in Tflops. You want more flops, you’re going to need more watts.
        Speaking of the floppites, glad to see AMD catching and beating Nvidia with raw power for once. The Vega 56 running at 10.5 flipping floppers! I am still flogging my ancient R9 290, waiting for AMD to release something with more than half a dozen zigga flops. Maybe the time has come…

      • Ragnar says:

        I thought that was a feature, to discourage miners from buying them all.

  2. Premium User Badge

    Drib says:

    But will they have suitable drivers to actually make them worth using?

    Last time I had an AMD card, it crashed constantly on anything newer than a year or two before. I asked their tech support, and got “Yeah, it’s a known issue for the past eight months. We’re not fixing it”.

    I will never buy another AMD card.

    • nikgtasa says:

      Yeah, relive always crashed on me too some drivers ago. And replay function just bugs out still.

    • b00p says:

      i had this exact same thing happen.

    • Crusoe says:

      And me. Their software is total garbage. I’ve purchased two cards from them before in an attempt to curb cost, but it’s just not worth it.

    • MadArcher says:

      I must be lucky then. Pretty happy with my 3-year old setup.
      And Nvidia can wipe their asses with their nosy software.

      • GenialityOfEvil says:

        Geforce Experience is optional. They talked about making it mandatory 2 years ago but never bothered. I’ve always just downloaded the drivers from their website.

        • sosolidshoe says:

          Experience may be optional, the tracking software is not, it started installing itself even with the base drivers around the end of 2016. It’s also not clear whether or not it can be fully turned off – you can disable a telemetry service but that’s not a guarantee.

          As for AMD’s supposedly garbo drivers – eh, prior to my current 1080 I used a 7970 for years and before that a 5850 and they worked fine. There were occasional performance dips on certain games, but I’ve had the same thing happen with nvidia drivers.

          • GenialityOfEvil says:

            You can turn it off, it’s a Windows service, all of which can be disabled.

    • SyberSmoke says:

AMD has really been on the ball with drivers in the last couple of years. They have been far more stable and they have been releasing the drivers at a more consistent interval. Things change… otherwise I would still be using a 3dfx GPU. ;-)

    • Jokerme says:

      When was this? 1997?

    • Michael Anson says:

      Chiming in to say that I haven’t had an issue with AMD cards for years.

    • gruia says:

      same here. thats why in the last 5 years i had nvidia. and i have no reason to let go. always dependable. plus invested in gsync

    • Frosty Grin says:

      AMD’s drivers are great now. Stable and with a ton of new useful features.

    • GrumpyCatFace says:

      Same, and no support offered anywhere. I ended up going back 3 versions into legacy drivers, just to get a semi-usable display. Never again.

    • KenTWOu says:

      Sounds more like your PSU was the problem.

    • syllopsium says:

      Not to mention the dropping of support for older cards sooner than NVidia. Now, OK, you might say it’s reasonable to expect an upgrade every so often, but NVidia is supporting some cards seven years after their release.

      AMD? Gave up over two years ago on similar timeframe cards, the drivers were sub par, and they still don’t support stereoscopic 3D by default.

      Saving a minimal amount of money doesn’t seem worth it when experience shows their long term support is questionable. Great for those people that change their cards every two years, not for those who play older games, or shift their cards to other machines and would still like them to work.

      • Psihomodo says:

Way to flop the situation, nVidia is the one shunning their older cards and even slowing them down to make the new ones seem needed. AMD on the other hand supports theirs even longer than needed. Sure, some less used functions stop working but you can play games without much problems for years. Just like I did with 9700 Pro, 1600x, 5870, R9 290 still. And it saved me a lot of money for needless upgrades.

    • Premium User Badge

      phuzz says:

      The drivers for my R9 290 work fine. Never had any crashing problems with them, and it’s even got built in overclocking controls which is nice.
      Pretty much on par with the nVidia drivers when I last used them a couple of years ago.

      • Psihomodo says:

        Can’t remember any problems in the last 15 years. Of course, I know how to setup, use and maintain my PC. Also, changing drivers every other week like nVidia needs, no thank you.

    • Shlork says:

      I’ve been alternating between NVidia and AMD cards for a long time now and I only ever had issues with NVidia cards. My 4 years old Radeon is still working just fine, no issues with the drivers. That said, I’m giving NVidia another chance, as I’m not sufficiently impressed with what was revealed about Vega so far.

    • Otterley says:

      Until about two months ago I had an RX 470 (which I was able to sell to a miner at only €10 loss) and was also disappointed with the drivers. Initially, viewing videos in any media player resulted in the fans jumping to max speed and staying there until I rebooted. WattMan kept forgetting my settings with every driver update and dual-monitor use increased power consumption far too much. Then, of course, ReLive would only work sporadically – which was a pity, because it was great when it worked.

As far as the actual card was concerned, I was quite pleased. The performance was close to the RX 480 with the factory overclock, and it was additionally possible to undervolt and overclock (simultaneously), bringing power consumption down to about 90W. All in all I was still happy to see it go, though :/

    • Malcolm says:

      I’ve been running one AMD card or another for the best part of a decade (4870 followed by a 7850 – admittedly not exactly cutting edge these days). The only game stopping issue I remember was a bug in id’s Rage on release, which was fixed fairly promptly.

      The desktop tools used to be slow and clunky (mainly because they failed to follow best practices) but things improved a lot with the advent of the Crimson drivers a year or so back. Had an issue for a couple of months with a crash on windows shutdown – irritating because it interrupted shutdown until you’d ok’d the error message. But since that was fixed it’s been plain sailing.

      I don’t have any recent nVidia experience to compare it to (my work machine runs an ancient Quadro, but that doesn’t seem a useful point of comparison). But based on my personal experience I would not dismiss AMD on driver concerns.

    • engion3 says:

      I’ve been lifelong amd guy, around the 6970 they fixed drivers and now with the new suite they are pretty awesome.

    • Moraven says:

      No problems with my 5+ year old 7950 with drivers and Crimson.

    • mactier says:

      I never had any driver issues in 2016 and 2017. I think they are great.

    • Ragnar says:

      I’ve been using the RX 480 for the past year. The only issue I’ve encountered was with Wolfenstein The New Order, which was partially resolved by updating the driver.

      AMD has really upped their game over the past couple years, with frequent driver updates and new features. They just added adaptive V-sync, bringing them closer to parity with Nvidia, and their multi-monitor features with Eyefinity are way ahead of Nvidia’s features with Surround.

      I’ve been firmly in the Nvidia camp, having used them for the past decade, but bought AMD this gen because the price/performance was too good to pass up. Based on my experience, I wouldn’t hesitate to recommend AMD or buy it again.

  3. GenialityOfEvil says:

One thing I found interesting on Gamers Nexus was that AMD have barely talked about Crossfire, despite it being in leaked marketing materials earlier, and are only really maintaining its existence rather than investing in it. Along with Pascal downsizing support to 2-way SLI, it looks like the foolish venture into lazily cobbled together multi-GPU setups might be coming to an end.

    • Chorltonwheelie says:

      Crossfire? Have you seen the power draw on these things? You’ll need to build a small nuclear power station in your garden.

  4. DamnCatte says:

    Just gonna say it, I think this card is ugly. Like, really ugly. There’s just something about that flat aluminum looking cover that reminds me of cheap knockoff electronics you’d find at flea markets.

    • Quickly says:

      The lighting of the card isn’t doing it any favors. Underwhelming first impression, looks-wise, yeah.

    • SalaciousJames says:

      I’m pretty sure I read on PCWorld that’s some kind of limited edition pre-order shell or some such, for those who like that mid-twenty-aughts Apple brushed metal design, I guess. I would think that the various GPU manufacturers would create their own designs, like most cards.

    • NightRaven says:

      I completely agree! When I first saw the Founders Edition, I thought, damn this is ugly. I just don’t see the aesthetic appeal of what essentially looks like a cheap shell.

    • Addie says:

      While that is true, it does fulfill my only requirement of a graphics card: is there enough surface area available to grip it while you push it into place, before resealing the pc box and pushing it back under the desk, not to be seen again until it is replaced? And to be honest, I’d rather they didn’t push up the cost with lights and frippery which will inhibit airflow and cooling, anyway.

      • Ragnar says:

        Don’t you know that it’s not a gaming product unless it’s bright red and black, has cooling fins jutting out everywhere, and enough lights to run a small nightclub?

        I’ve come to accept that I’m not a real PC gamer, because my case doesn’t have a window, my RAM doesn’t have any fins, and my (beige) keyboard and mouse are hidden on a keyboard tray under my desk where no one would see their light show if I had one.

    • Shlork says:

      I don’t know, to me the plain styling of Vega looks better than the juvenile designs that most manufacturers tend to come up with.

    • mactier says:

      I think the Vega FE design (colour scheme) is gorgeous, but this isn’t ugly either. I like the new metallic design (either pure or that other blue) as opposed to pure black.

  5. noiseferatu says:

    To counterpoint the previous commenter I kinda love that lame-ass Descent 1 aesthetic the card has going on. Finally gonna get those 60 frames in Descent!

    • Premium User Badge

      keithzg says:

      Oh man, I should dig back out my game files, fire up one of the open-source engine reimplementations, and replay Descent 1. What an awesome game that was.

  6. UnholySmoke says:

    No mention of FreeSync, I assume it’ll play nice with FreeSync 1 screens? Of NVidia’s many cynical moves, putting a big premium on their frame syncing tech is what might push me into switching back to AMD.

    • Asurmen says:

      Er, yes it will play nice with Freesync.

    • Nouser says:

      It supports Display Port 1.4, so adaptive sync is a given.

    • GenialityOfEvil says:

G-Sync is set at a premium because Nvidia makes the scaler chip, AMD doesn’t for Freesync. It’s just a standard so it’s up to the manufacturer, who usually just puts in a cheap one.

  7. AutonomyLost says:

    Glad AMD is finally getting their new line launched; hopefully it helps increase their market-share. However, I will be smitten for some time to come with my 1080 Ti Hybrid and am looking forward to how it will perform with this Fall’s lineup of new titles.