Is AMD’s New Fury GPU A Titan-Killer?

AMD wheeled out a whole new family of graphics cards at E3 this week. And there was much rejoicing. Well, there was much scripted triumphalism on the keynote stage, at any rate. I say a whole new family. Inevitably, there’s some (read: a lot of) rebadging of existing GPUs. But there is one entirely new GPU, known as Fury. It’s an absolute beast and it costs a bomb. So not many of us will be buying it. But it does debut that snazzy new HBM memory tech. Anyway, as Uncle Ben would say and would probably make for more compelling dialogue than the gunk that actually makes the big-screen cut, with new GPU badges sometimes comes much improved value for money…

(As ever, TL;DR at the bottom.)

First up, the graphics chip known as Fiji. This is the properly new one and it’ll be flogged under the new ‘Fury’ branding. Fury does for AMD what Titan does for Nvidia. In other words, it’s a standalone sub-brand sitting at the very top of the range.

Fiji is an absolute monster with 8.9 billion transistors and no fewer than 4,096 of AMD’s GCN stream processors. Silly numbers and enough for symbolic victories in the PR war versus Nvidia.

The Titan X, you see, is a piffling 8 billion transistors and 3,072 CUDA cores. Not that GCN processors and CUDA are remotely comparable. But who cares? 4,096 is a lot more than 3,072! How do you like them apples, Nvidia?

A more direct comparison is with AMD’s outgoing flagship GPU, Hawaii, as found in the Radeon R9 290X. That packs 6.2 billion transistors and 2,816 GCN processors. Since Fiji is based on a lightly revised AMD GCN architecture, you can get a rough idea of the raw shader power on offer.

Fury X will be water-cooled as standard

For those of you who like numbers, other key metrics include 256 texture units, 64 ROPs and a core clock just over 1GHz.

Where Fiji definitely differs from AMD GPUs passim is its memory subsystem. In this age of bonkers panel resolutions, graphics cards are having to pump millions of pixels upwards of 60 times a second to achieve really smooth gaming.

4K at 60 frames per second means pushing nearly 500 million fully processed pixels out to your monitor every second. It’s pure insanity when you think about it in those terms. How is that actually possible at the same time as the BIOS on my motherboard taking 10 seconds simply to wake the f’up?
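If you fancy checking my maths, the arithmetic is trivial. Here’s a quick back-of-envelope Python sketch (the list of resolutions is purely illustrative, and these are display-output pixels only; the GPU shades far more fragments per frame before the final image goes out):

```python
# Back-of-envelope pixel throughput for common resolutions at 60Hz.
RESOLUTIONS = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K UHD": (3840, 2160),
    "5K": (5120, 2880),
}

def pixels_per_second(width, height, refresh_hz=60):
    # Pixels per frame times frames per second.
    return width * height * refresh_hz

for name, (w, h) in RESOLUTIONS.items():
    print(f"{name:>7}: {pixels_per_second(w, h) / 1e6:,.0f} million pixels/s")
```

4K UHD works out at roughly 498 million pixels per second; 5K is north of 880 million.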

Anyway, at modern resolutions bandwidth is a major issue and AMD has decided to shake things up with a new memory tech known as HBM or High Bandwidth Memory. I’ve covered this previously, but the basics of HBM involve stacking memory chips in 3D-style à la 3D SSDs and sticking those stacks on the same package as the GPU.

HBM memory makes for mega bandwidth but also frame buffer limitations…

Oh, and with HBM comes a mega-wide memory bus: in Fiji’s case it’s fully 4,096 bits wide. Cue ghastly marketing puns regarding 4K memory tech.

Anyway, the upshot is a big uptick in bandwidth despite using slower clocked memory. Fiji is good for as much as 512GB/s. Crumbs. For context, using old tech AMD cards topped out at 384GB/s.
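The sums behind those numbers are simple enough: peak theoretical bandwidth is just bus width multiplied by per-pin data rate. A quick Python sketch, using the commonly quoted per-pin rates rather than anything AMD has officially confirmed:

```python
def memory_bandwidth_gbs(bus_width_bits, data_rate_gbps):
    # Peak theoretical bandwidth: bus width in bits times per-pin data
    # rate in Gbit/s, divided by 8 to convert bits to bytes.
    return bus_width_bits * data_rate_gbps / 8

# Fiji's HBM: a 4,096-bit bus at a modest ~1 Gbit/s per pin (500MHz DDR).
fiji = memory_bandwidth_gbs(4096, 1.0)   # 512.0 GB/s
# A 512-bit GDDR5 bus at 6 Gbit/s per pin, roughly the old-tech ceiling.
gddr5 = memory_bandwidth_gbs(512, 6.0)   # 384.0 GB/s
print(f"Fiji HBM: {fiji:.0f}GB/s, 512-bit GDDR5: {gddr5:.0f}GB/s")
```

In other words, HBM’s bus is so wide it can run the memory far slower and still come out a third ahead.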

There is, however, a snag. Putting the memory on the same package as the graphics chip in this new stacked format has led to limitations in terms of the amount of memory. A Fiji GPU-plus-HBM package is limited to 4GB of memory.

How much of a limitation that will be in practice is currently unknown. But I personally don’t like the look of it in the context of the bullish noises AMD has been making about 4K and 5K performance for the new GPU.
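For a sense of scale, the render targets themselves are only a small slice of that 4GB; it’s textures and the like that do the real damage. A rough Python sketch of raw buffer sizes at 4K (illustrative numbers only, real engines juggle many more buffers than this):

```python
# Rough sizes of the obvious 4K render targets. The buffers themselves
# barely dent 4GB; it's textures, shadow maps and geometry that fill it.
W, H = 3840, 2160

def buffer_mb(width, height, bytes_per_pixel=4, samples=1):
    # bytes_per_pixel=4 assumes a 32-bit RGBA colour (or depth) format.
    return width * height * bytes_per_pixel * samples / 1024**2

colour = buffer_mb(W, H)              # ~31.6 MB
depth = buffer_mb(W, H)               # ~31.6 MB
msaa_4x = buffer_mb(W, H, samples=4)  # ~126.6 MB with 4x MSAA
print(f"colour {colour:.1f}MB, depth {depth:.1f}MB, 4x MSAA {msaa_4x:.1f}MB")
```

So even a handful of full-fat 4K buffers costs a few hundred megabytes at most; the worry is everything else competing for the remainder.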

In any case, AMD hasn’t dished all the details yet, but here’s what we know. There will be four Fury boards. The single-GPU performance leader will be the R9 Fury X with water cooling as standard. The plain old R9 Fury will be air cooled. Both of those will launch shortly. Pricing is $649 and $549 respectively. Call that £550 and £450 in old money with the Chancellor’s 20 per cent cut.

Two Fijis, one board…

Later this year there will also be a dual-Fiji Crossfire board, but arguably the most interesting of the new Fijis is the R9 Fury Nano. This is a titchy 6-inch board. AMD hasn’t provided full specifications, but it’s likely to lose some of the functional units but still be the most powerful card yet in this kind of form factor. Nano arrives later this summer and could be killer for small form factor rigs.

Speaking of performance, AMD was pretty cagey. 1.5x versus the old 290X was the repeated refrain. But direct comparisons with Nvidia’s Titan X came there none. Combined with the pricing, it therefore seems that Fury X will be second best in pure performance terms. Which is fine, given that it’s cheaper, but still a pity for AMD in PR terms if true.

Finally, even the top Fury X board is rated at 275 watts, only a little higher than the 290X. Not bad considering AMD remains stuck on 28nm silicon tech, just like Nvidia. In fact, in that context Fiji is about as good as you could reasonably hope for given that a die shrink and smaller, more efficient transistors still aren’t available to AMD.

The Fury Nano six incher is the most interesting new board

As for those rebadges, well, say hello to the Radeon R7 and R9 300 series. At the top we have the 390X for $429, which is essentially the same as a 290X but packs 8GB and immediately makes for an awkward comparison with the 4GB Fury boards.

By the same token, the new 390 is basically the old 290 priced at $329, and the 380 is the old 285, yours for $199 for the 4GB version. Finally, the new R7 370 is a dead ringer for the old R7 265 at $149 and the R7 360 looks just like the old R7 260 but at $109. UK numbers aren’t currently available, but keep your scanners peeled to the usual online retail suspects.

With AMD’s production partner TSMC largely responsible for keeping the graphics industry stuck at 28nm, the carry-over technology is perhaps predictable. Then again, Nvidia made the most of a bad situation by developing its undeniably impressive Maxwell tech that really does seem to deliver the benefits of a die shrink in terms of both performance and efficiency despite also being stuck on 28nm silicon.

Those (not very) new 300 series boards in full

AMD’s GCN tech remains great for gaming. But it’s not as space efficient as Nvidia’s Maxwell tech and it’s very likely that hurts AMD margins. Not good if you believe that a healthy AMD is crucial for the health of PC graphics in general.

Whatever, the immediate appeal of AMD’s new 300 series will come down to pricing. As the 290 was before, I have a feeling the new 390 will be the sweet spot for really serious gamers. An 8GB 390 for circa $300 / sub-£250 would be very tempting.

TL;DR
AMD has a new mega GPU. It’s called Fury and it looks to be about 1.5x more powerful than the old 290X. But probably not as fast as Nvidia’s Titan X. Thanks to a fancy new high-bandwidth memory tech designed for the demands of really high resolutions, Fury boards are limited to 4GB, which ironically could be a problem at 4K resolutions and beyond.

AMD has also launched a new family of Radeon R7 and R9 300 boards. These are essentially rebadges of AMD’s existing R7 and R9 200 Series. Some of which are themselves rebadges. Fun, fun, fun.

55 Comments

  1. gunny1993 says:

    AMD don't need a titan killer, they need to start making better mid range cards, the 390X is a waste of money for anyone with money issues compared to a 970 or 290X

    (The difference between a 970/290X and 390X is 10 fps average and 100 GBP on overclocker right now)

    I really hope the fury is great because it might bode well for the next gen, right now AMD need to come out swinging

    • TacticalNuclearPenguin says:

      Still, they also need a monster, because they sell. Any Titan model with the exception of the Titan Z far exceeded sales expectations.

      Buyers interested in the high end usually simply purchase the best, money takes a serious backseat, and whoever can offer that reaps some serious rewards.

      Besides, if AMD can do that ( i’m unsure of that, but it would be nice for competition ) they can just scale down the monster and offer a better midrange, as you say, but it all starts from the “big thing”.

    • Baines says:

      Sadly, AMD might have been able to ride (and even inflame) the backlash that Nvidia received over the GTX 970 if they’d had a better mid-range card to promote.

    • Ufofighter says:

      I have to agree with this, I’m looking for a replacement for a 7870 with artifacts between 250 and 350, and there’s no reason to buy an AMD except maybe dx12 performance(will see), at same price they use around 30 to 40W more and they are hotter and not especially cheaper or more powerful compared to the Nvidia ones.

      The money in gaming gpus is in the 200 to 350€ range for 1080p gaming, where AMD used to have very power efficient and cool cards (5750 to 5850, 7750 to 7870, etc). Right now depending on the money the gtx 960 or the 970 are almost a no brainer.

  2. remon says:

    Eh, the typical RPS hardware article. Full of half-truths and blatant Nvidia favoritism.

    • Jediben says:

      You’d have to be a fool to have the 2nd best as a favourite.

      • tellemurius says:

        You would be a fool to not look where the money is at, 2nd best or not.

        • jrodman says:

          Thanks all. This comment thread has really shed a lot of light on the hardware situation. I’ve come away from it with a much richer understanding of the lay of the land, and the fine details of the technology.

          • Heavenfall says:

            I’d like to step in at this point and express my gratitude for a quick lesson in the finer nuances of irony. Having almost reached the end of that baseless debate, jrodman through his wit encouraged me to keep reading for much longer than was needed. Surely, having neither gained nor lost from doing so, my lack of interest more or less remains undefined.

          • BTAxis says:

            I agree with everyone here.

    • TacticalNuclearPenguin says:

      Uhm, if i’m not mistaken Jeremy keeps suggesting the 290 as the best thing you can buy since, like, 24 years or so.

      • All is Well says:

        But it’s not a proper GPU article until someone gets accused of fanboyism. The finer points, like “who” or “why”, are really of secondary importance.

    • thedosbox says:

      Citation needed.

      Are many of AMD’s “new” products rebadges, or rebadges of rebadges? Yes
      Is the Fury’s frame buffer limited to 4GB? Yes
      Does AMD make any performance comparisons to the Titan X? No

      In other words, it’s clear who the fanboi is – and it’s not RPS.

      • Asurmen says:

        Is any of that a problem as the article is making out? No.

        • thedosbox says:

          If a company is going to make a big marketing splash for a new product line, the consumer ought to know that a significant chunk of that product line consists of old products.

          If a company is going to make a claim about how big a performance leap their new hotness provides, the consumer ought to know that this claim ignores the competition.

          If a company touts a whizbang new feature that will better support 4K resolution, then the consumer ought to know that feature is potentially hobbled.

          • Asurmen says:

            None of those are problems/unique to AMD and Jeremy is ignoring information out there that answers some of those.

  3. TacticalNuclearPenguin says:

    Then again, there is no need for a Titan X killer, it already died when the 980ti came in.

    This one, well… it’ll need some serious performance, the 4GB thing might be a problem for some, in some extreme situations even at 1440p ( possibly ), and it will all come down to the overclocking potential ( and scaling, more importantly ) of the liquid cooling one.

    The benchmarks cannot come soon enough, AMD really needs to stop competing on price if they want to stay healthy, they need the absolute best with no strings attached.

    With all this said and done i believe whoever wanted to go for a 980ti can do stop now, i can’t see a price cut on the horizon but the regular 980 might drop a tad more, although it already lost 50 bucks or so, allegedly.

    • geisler says:

      I think they should have called it: “GM 200 killer?” In my opinion TitanX isn’t “killed”, at least not by anything in NVIDIA’s lineup. In performance terms, there is no real TitanX killer, not even a 980ti overclocked with custom PCB on water can get far enough in real world performance to be called a “killer” (at least not when you overclock the Titan as well).

      Value is another story of course, although even 6GB VRAM could be a problem on some titles for someone who wants AA at 1440p or higher. So in my opinion the huge framebuffer on the TitanX is added value for the enthusiast demographic. They have no problems throwing money at these problems for the best graphical fidelity at high framerates.

      • xyzzy frobozz says:

        Well.

        As someone with a 1600p monitor, I have seen no title use 6GB with AA on. GTAV comes closest with all of the ultra settings on, using just over 5GB.

        • geisler says:

          Ever downscaled from 2160p? Bingo. I own a 1440p monitor myself (powered by 2 Titan X GPUs), but downscale a lot of titles from 4K. GTA V is far from the only title coming near to 6GB. Try Shadows of Mordor, Dying Light, and other open world games at 4K.

          You’ll see 6GB getting saturated by a lot of games soon, at least if you want high resolution with AA type of fidelity.

        • geisler says:

          You can add Batman: Arkham Knight to the list, even without downscaling (so at 1440p native), i’m seeing 6.3GB VRAM use. The game is a technical disaster though, might be due to a leak.

  4. caff says:

    Lazy question because I can’t be bothered to google it… but are the fans connected to heatsinks on watercooled cards quiet – generally? I’ve always been a stickler for quiet fans.

    • pepperfez says:

      Lazy answer because I can’t be bothered to check my sources: I believe the fans on this particular watercooled card have been identified as some model of…Noctua? Scythe? Some reputable brand, anyway. So probably pretty reasonable sounding.

    • TacticalNuclearPenguin says:

      As with other sealed loop coolers ( but not all ), the fans are often nothing really exotic but they simply don’t need to spin that fast.

      Actually, to be more precise you’re sort of soft-limited by the radiator fins: if the layout is not that dense, any airflow above a certain threshold will be overkill and won’t do much at all. Some radiators require more static pressure than others; the very dense layout helps make the most of a limited surface area.

      But to answer your question, when it comes to GPU cooling anything below an ultra noisy fan for your radiator will result in far less noise compared to the coolers we’re used to. So yeah, you don’t need anything special in this case to keep noise low, it’ll just be the result of far superior cooling capability.

    • Person of Interest says:

      Silent PC Review wrote a series of articles on quiet liquid-cooled gaming PC’s this year: link to silentpcreview.com

      My takeaway from the articles is: if the cooler doesn’t come with a quiet fan, you can replace it with one. What you have less control over is the pump noise, which can become the most noisy component (not necessarily in raw decibels, but in how distracting it sounds) once all your fans and airflow are optimized. So that’s what you should pay attention to if you are shopping for a quiet watercooled card to put in a near-silent system.

    • mattevansc3 says:

      Generally it’s just a generic case fan so they’re quite quiet – the larger the fan the more air gets pushed through per rotation. This allows them to operate at lower speeds which in turn makes them quieter.

      Also in the majority of these all-in-one coolers the fans are not physically a part of the unit and can just be swapped for any case fan you can buy at retail.

    • SteelPriest says:

      Lazy question because all the cool kids are doing it – are these waterblocks easy enough to plumb into a proper watercooling loop?

  5. Allenomura says:

    “But it’s not as space efficient as Nvidia’s Maxwell tech and it’s very likely that hurts AMD margins.”

    I thought the form factor of the new AMD series was distinctly reduced from that of the older cards?

    • sirroman says:

      @Allenomura, AMD is better at mm2.

    • Jeremy Laird says:

      My bad re clarity. I meant architecturally space efficient – in other words performance / die size, which Maxwell basically owns.

      • Sakkura says:

        Not really. Fiji is smaller than GM200, and apparently performs on par. AMD’s internal benchmarks show the Fury X outperforming the Titan X by a slim margin, but of course that’s just what AMD would want you to think. More likely it’ll end up essentially a wash, varying from game to game.

  6. L3TUC3 says:

    Actually, by the looks of it it looks like a standard 80mm fan, which is a sound practical choice as it’ll fit practically any tower case but typically aren’t that great when looking at the decibels under load. Generally, bigger fans with lower rpms will give the same amount or better airflow while being more quiet.

    If you want quiet, buy a case that can fit a 120mm fan and wait for the 3rd parties to bring out their own closed loop packages with a 120mm radiator + fan, or retrofit it to your GPU yourself. I’ve moved to a closed loop top mounted cooler for my CPU and it makes a big difference. Plus it’s an OK tea warmer.

    Anyway, the closed loop solution by itself should result in quieter cooling.

  7. J. Cosmo Cohen says:

    HBM

    Heh, heh…BM.

  8. satan says:

    I appreciate the TL;DR, ty.

  9. Wedge says:

    I mean, nVidia already has 970’s around the Nano form factor, but this would theoretically blow those away in terms of performance. I just want to see if they scale it down further, so you could get something with around the graphical power of my trusty 7950 into a low-profile bracket HTPC fitting GPU for around $200. I was hoping for some big advances in iGPU tech to come along, but if you can get a small enough dedicated GPU I’ll take that too.

    • mattevansc3 says:

      Not sure how much more space AMD can save. The big PCB reduction in the Fury cards is the move from GDDR5 to stacked memory. GDDR5 has to be laid along the PCB so the more VRAM the longer the card. Stacked VRAM effectively takes up no more room than a single chip of GDDR5 on the PCB.

  10. Katar says:

    UK prices aren’t as good as they should be for a rebrand. The 390 is going for £270ish which is at least £20 too expensive, the 390X is even worse at £350ish which is £50 too expensive.

    If you can find a 4GB 290X still on sale at £220 or a 290 under £200 those should be the AMD cards you should be buying until the 300 series cards drop significantly in price.

  11. GAmbrose says:

    Was really looking forward to the AMD Fury X, but not putting HDMI 2.0 on a 2015 GPU is a bizarre decision.

    Particularly as they seem to be targeting the Nano at high-performance small form factor (HTPC)

    Yeah it might not affect desktop gamers, but when you can’t actually connect up to a 4k TV and get 60hz, it becomes a bit useless for big-screen gaming.

    FYI – Nvidia have HDMI 2.0 on all their ‘Maxwell 2’ cards (960, 970, 980, 980Ti, Titan X)

    I just can’t understand AMD’s lack of foresight here.

  12. Rumpelstilskin says:

    If they want a “Titan killer”, they’d have to name it “Zeus” or “Olympian”. Would be a bit of Saul Goodman-style marketing, but AMD was always a Saul Goodman against both Intel and Nvidia.

    • pepperfez says:

      If a hardware company respected their customers enough to think they’d get that, I’d buy whatever they were selling (proving myself undeserving of respect).

  13. Razumen says:

    The Fury Nano DOES look really interesting. If it’s a decent contender it would make a great replacement for my HTPC build, freeing up the space the current GFX card is hogging for three more HDD slots.

  14. xrror says:

    Ugh, we’ve been stuck on 28nm node too long. Overall I’m disappointed that no “fully enabled” Tonga card came out – supposedly the GPU used in the 285X/380 really does have a 384-bit memory controller, but it’s never been fully enabled except for one weird Apple iMac SKU.

    Also not real sure about seeing Pitcairn dragged along for yet another generation (78xx, 270, 370) – especially since it will probably be the biggest bulk seller due to price – and not even offered fully enabled. Pitcairn doesn’t have any of the features/improvements added later so it holds back adoption for devs. It’s not even really DX12 compliant… (it “is” – but in the same way USB 3.1 was neutered by Apple to just be old USB 3 with the new connector. Retconned specs for marketing.)

    I’m gonna stop babbling. Fury seems very promising with HBM, but gen 2 of HBM can’t come soon enough so they can have 8GB. Realistically 4GB will be fine for 90% of gamers… but it’s that psychological part that 4GB is “last gen” now. Hopefully by the time 4K gaming is really affordable Fury gen2 will be here.

    AMD just has to keep the momentum going.

    • Katar says:

      Currently AMD doesn’t have a 380X card or anything in the £200-£250 range, neither does nVidia really. So we might see a “full” Tonga card come out at some point in the future.

      8GB is a bit strange as unless you are going to Crossfire or SLI up multiple cards there isn’t really any card that is powerful enough to cope. It’s at best psychological, at worst cynical marketing trying to “con” people into thinking these rebrands are far more powerful than the old cards when they aren’t.

  15. Hobbes says:

    The problem isn’t the hardware. ATI’s hardware is remarkably decent and always has been since the Radeon HD days. It’s the drivers.

    Their drivers have been designed by a pair of men called Budd and Bob, and hammered together with planks and nails, then wound together with duct tape, before Budd and Bob stand back and call the Hardware guys over and go “HEY Y’ALL CHECK THIS OUT!”, not since the dark days of Windows ME have I seen such terrible software performance extracted from such powerful hardware, and that’s saying something.

    I imagine the Hardware division must look at the Driver people and go “Why did we hire a million monkeys and a million typewriters?” because the works of Shakespeare the Catalyst is not.

    As shady as Nvidia may be at times, at least they make a driver that -works-.

    • Asurmen says:

      Never really see this argument in practice.

      • Hobbes says:

        No? Want to hear about the one about the PC with the ATI graphics card where the Catalyst drivers locked up due to a known bug with Skype and the ATI audio transcoder that would result in a hardlock of the system, mandating a forced reboot? The only fix was to delete the offending driver file wholesale after performing a system rollback to figure out that the Catalyst was at the root of the mess and then googling up the issue?

        Or perhaps the one about Catalyst drivers causing untold problems with Project CARS and me, patient soul that I am having to go through friends config files because I wanted them to enjoy the game only to find out that AMD is merrily blaming SMS when in fact it’s their useless drivers.

        Their drivers are junk, and on Windows they’re only -mostly- junk. If you’ve had the misfortune of experiencing their Linux drivers then you have my deepest, most heartfelt sympathies. For that is a hell I would not want my worst enemy to suffer through. I mean that, I really do, they are the most miserable, hateful little pieces of bug ridden, hackity, unusable code that I’ve borne witness to. They should be burned, then buried somewhere deep, where their evil will never see the light of day again.

        • FuriKuri says:

          I could pull anecdotes out of my arse about how terrible nvidia’s drivers are, but what’s the point? With either brand they’re fine 99% of the time until they’re suddenly not. The only issue I had with amd (when I had one a year back) was with Rage and, frankly, there was nothing to blame there but the game itself…

          Not that anyone will or should care but I’ve certainly had a lot more hardware failures from nvidia than I ever did with amd. Even as I write this I’m awaiting a replacement 780 via RMA, in the middle of a steam sale no less!

          Woe, woe is me :(

          • Hobbes says:

            Oof :( RMA on a 780? Owwie. That’s a painful thing. I’ve seen Nvids fail to be sure, and make no mistake, I have just as much of an issue with Nvidia’s business practices as I do with AMD’s cleetus drivers. I suppose my thing is that it boils down to which evil would you prefer – The shady underhand sales pitch where you get “4 GB” that is not really 4GB or drivers that potentially gimp older Kepler cards, or drivers hammered together by incompetent cross eyed ham fisted hicks?

            I decided I’d rather the smarmy used car salesmen, at least if I’m being ripped off, there’s a chance that what I get will work -roughly- as it’s supposed to.

        • zarthrag says:

          I’m going to have to agree here. AMD 6950/CF drivers made me *quit* Ubuntu linux on my desktop. Triple-monitor configuration/setup was difficult to get running, and any tiny automatic update broke everything irreversibly. Games on windows rarely worked with CF enabled (Deus Ex was particularly disappointing). I eventually just gave up and got a 780 Ti. The R9 debacle didn’t help my opinion, either. Now that I’ve seen these cards, the fact that they didn’t provide a benchmark against the 980Ti/TitanX/Z tells me they aren’t worth waiting for. (Maybe some more “golden samples” are on their way? Oh wait, those were attributed to bad drivers too!)

          I’m no Nvidia fanboy, especially when it comes to the lock-in I’m starting to feel. But AMD isn’t helping. CF/Catalyst profile tweaking isn’t something I’d wish upon my worst enemy. Nvidia has broken exactly one game for me (Alien Isolation), and they fixed that quite promptly. New titles almost always work day one. Gamestream, and their controllers, however… that’s another story.

          • Hobbes says:

            “I’m going to have to agree here. AMD 6950/CF drivers made me *quit* Ubuntu linux on my desktop.”

            So many times have I heard similar statements or variations of that.

            *the tiger slides over a cup of coffee and a shot glass of a strong drink of your choice*

            Nvidia Linux drivers were … troublesome, but most of the time behaved, even if they were very blackboxy and thus made me a bit wary when for the rest of the system I could always get arm deep in the code if there was a -real- problem to untangle. AMD’s linux drivers were just plain abysmal. It was like someone hired an intern and locked them in a luggage trunk with a 386 and told them to code the drivers using spaghetti basic. Only pausing to let them out for air, food and water once a day.

    • hpstg says:

      Their driver in terms of single gpu performance has always been on par, and if you really are the kind of person who follows this, check the guru3d forums. There are leaked drivers that put the 7970 on par with the 780/Titan. They will be officially released by the end of this month, probably as Catalyst 15.30.

      Their Windows 10 driver (apart from multimonitor configuration stuff, I’m talking raw performance), looks fantastic. It seems that they’ve really made great strides to put WDDM 2.0 to good use.

      • Hobbes says:

        It’s the Win 10 driver that has me interested, but mostly if I can eliminate the Catalyst horror that comes with it and replace it with a control panel that isn’t made out of duct tape and dried gentleman’s relish. Overall performance on Win 10 looks like a massive boon for AMD, and looks to level the field, finally redressing the current dominance Intel and Nvidia have maintained as regards Win 7 and Win 8.

        That and AMD are finally putting Bulldozer so far underground that it won’t ever be mentioned again *spits*

  16. chris1479 says:

    That’s nice… but it’d be even nice, AMD, if you could FIX YOUR GOD DAMN DRIVERS AND REPLACE THE TINY GEORGE FOREMAN GRILLS WITH ACTUAL GRAPHICS CARDS K. THANKS GUYS.