AMD’s new RX Vega gaming graphics revealed at last
What happens in Vega
It’s been an arsingly long time coming but AMD finally has some new graphics tech to flog and for all of us to game upon. The new Radeon RX Vega generation of gaming cards has been announced. Inevitably, we’ll have to wait just a little longer to find out exactly how fast they are but we know enough to begin answering some key questions and posing a few more. Is this the graphics revolution we’ve all been waiting for, for instance, or is it one derivation too many of AMD’s successful GCN architecture? Strap in and let’s go.
Let’s quickly cover what exactly has been announced. It essentially boils down to three different video boards, all of which are derived from the same underlying GPU, known as Vega 10.
It’s a 14nm chip, if you care about that kind of thing, and packs 12.5 billion transistors. To put that into some kind of intelligible context, AMD’s previous ‘big’ GPU, known as Fiji and found in the Radeon R9 Fury X, was a 28nm chip with 8.9 billion transistors.
At first glance, that transistor count might seem modest given the major step change from 28nm to 14nm. However, the GPU at the heart of Nvidia’s mighty Titan Xp graphics card clocks in at 12 billion transistors. So in terms of raw, dumb complexity this new AMD beast is right up there. It’s an attempt at a proper high-end gaming graphics card.
As for the other major specs, well, we’re talking 4,096 AMD-style, pixel-prettifying shaders, 64 ROPs for outputting pixels and 256 texture units for, ya know, texturing stuff and shit (technical term, shout if I am losing you).
Now the slightly odd thing about all this is that those numbers - the shaders, the ROPs, the texture units - are all the same as the aforementioned and frankly rather elderly Fury X. Of course, it’s not quite as simple as a direct like-for-like comparison. But hold that thought.
The new Vega 10 GPU differs, however, in a couple of key areas. First, it sports a second-generation version of the HBM or high-bandwidth memory tech first seen in the R9 Fury series. Apart from anything else, this second generation version kicks off with 8GB of graphics memory where the first-gen effort was capped at a problematic 4GB.
The other big change involves clockspeeds. The old Fiji chip was only good for about 1GHz. This new GPU is good for more like 1.7GHz. So even with all other things being equal, and a lot of things do indeed look equal, this new RX Vega GPU should be getting on for 70 per cent faster by at least some metrics, which is a very nice performance step.
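If you fancy seeing where that 70 per cent figure comes from, here’s a quick back-of-envelope sketch using the usual peak FP32 throughput formula (shaders × two ops per clock × clock speed). The round 1GHz and 1.7GHz figures are the approximations above rather than exact boost clocks, so treat the output as indicative, not gospel.

```python
# Back-of-envelope peak FP32 throughput: shaders x 2 ops per clock (FMA) x clock speed.
# The 1.0GHz and 1.7GHz figures are the rough numbers quoted in the text, not exact boost clocks.

def peak_tflops(shaders: int, clock_ghz: float) -> float:
    """Theoretical peak single-precision throughput in TFLOPS."""
    return shaders * 2 * clock_ghz / 1000

fiji = peak_tflops(4096, 1.0)   # Fury X at roughly 1GHz
vega = peak_tflops(4096, 1.7)   # Vega 10 at roughly 1.7GHz

print(f"Fiji: {fiji:.1f} TFLOPS, Vega: {vega:.1f} TFLOPS")
print(f"Uplift: {vega / fiji - 1:.0%}")  # roughly 70 per cent, clock-for-clock
```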
Of course, at this point we have no independent performance numbers. That said, the mere fact of where AMD is pricing this card gives us some indication of where it sees its performance potential.
Notwithstanding some current pricing oddities due to a spike in GPU demand driven by cryptocurrency miners, AMD has priced the three versions of the new GPU to roughly align with Nvidia’s GeForce GTX 1070, 1080 and 1080 Ti boards at $399, $499 and $699.
Which reminds me, I haven’t detailed the three cards. The entry-level effort is the Radeon RX Vega 56. It gets 56 compute units (hence the name) and thus 3,584 shaders, 64 render outputs and 224 texture units, plus a maximum boost clock of 1,471MHz. Next up is the RX Vega 64 with all 64 compute units and thus 4,096 shaders enabled, along with 64 render outputs and 256 texture units. It clocks in at 1,546MHz, max.
Finally, there’s the liquid-cooled RX Vega 64 Liquid, which ups the ante to 1,677MHz but otherwise offers the same numbers as the air-cooled 64. All three pack 8GB of that HBM2 memory on a 2,048-bit bus.
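And for a rough idea of how those boost clocks translate into raw shader grunt, here’s the same back-of-envelope formula applied to each board’s listed maximum boost clock. These are theoretical ceilings; sustained clocks under load will be lower.

```python
# Theoretical peak FP32 throughput for each RX Vega board at its listed
# maximum boost clock. Sustained clocks in real games will be lower.

def peak_tflops(shaders: int, clock_mhz: int) -> float:
    return shaders * 2 * clock_mhz / 1_000_000

boards = {
    "RX Vega 56":        (3584, 1471),
    "RX Vega 64":        (4096, 1546),
    "RX Vega 64 Liquid": (4096, 1677),
}

for name, (shaders, clock) in boards.items():
    print(f"{name}: {peak_tflops(shaders, clock):.1f} TFLOPS peak")
```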
At this point there’s not a great deal more to say other than wait and see. If you twisted my arm, my spidey sense says this family of GPUs is probably AMD’s long-established GCN graphics architecture stretched to its limits (and I believe Vega is largely a respin of GCN, not something substantially new).
It will probably suffice at the high end, but I suspect the battle in the heart of the mid-range will be tougher for what is, at its core, a rather elderly technology. Given that GCN has given AMD so much success and is indeed found inside both of the major games consoles, you can understand why AMD has stuck with it this long. But eventually it will become a liability in a PC gaming context.
That may not quite have happened with Vega. But it might be a close-run thing.