AMD’s Radeon R9 380X Graphics And ‘Crimson’ Driver

XFX's 380X - other 380Xs are available...

Rejoice, for among us mere mortals walks a new AMD graphics card. But hang on. Is it actually new? If it isn’t, what is going on with PC graphics these days and why do we keep having to make do with these thinly disguised rebadges? The answer is simple and the solution, happily, is imminent. Meanwhile, AMD has a new graphics driver out, and by that I mean not just a driver update but a whole new interface and platform. Give it up for AMD Crimson and kiss goodbye to that awful Catalyst interface.

Take the jump for an overview of the new AMD Radeon R9 380X and Crimson and a hint of why 2016 is shaping up to be the most exciting year in PC graphics since the early days of hardware T&L…

The new(ish) AMD Radeon R9 380X
So, the new AMD Radeon R9 380X. What, exactly, is it? In simple terms it’s an AMD Tonga graphics chip, ‘Tonga’ being one of those internal codenames the online rumourati love to bandy about.

Tonga is not a new chip. It first appeared a little over a year ago in the then-new Radeon R9 285. Of course, you could argue even Tonga was just a rehash of the Tahiti chip that first appeared in the Radeon HD 7970, shortly after cyanobacteria evolved the ability to photosynthesise carbon dioxide and water into sugars. Early 2012, I think it was.

Admittedly, Tonga sports the ‘1.2’ version of AMD’s prevailing GCN or ‘Graphics Core Next’ graphics architecture where Tahiti is merely GCN 1.0. But we are not talking about dramatic differences. In reality, Tonga exists to do the same job as Tahiti but more cheaply, thanks to its less complex 256-bit memory bus. Except it actually has more transistors, which normally makes for a more expensive chip. Confusing? Hang with me.
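
If you fancy the napkin maths behind that cheaper bus, peak memory bandwidth is simply bus width times effective memory data rate. Here’s a minimal sketch in Python, assuming the commonly quoted reference memory speeds (5.5Gbps for the 7970, 5.7Gbps for Tonga-based boards); your particular card may well be clocked differently:

```python
# Peak memory bandwidth = (bus width in bits / 8) * effective data rate in Gbps.
# Reference memory speeds assumed; partner cards often clock differently.
def bandwidth_gb_s(bus_bits: int, gbps: float) -> float:
    return bus_bits / 8 * gbps  # bytes per transfer * billions of transfers/sec

print(f"Tahiti (HD 7970), 384-bit @ 5.5Gbps: {bandwidth_gb_s(384, 5.5):.0f} GB/s")
print(f"Tonga (R9 380X), 256-bit @ 5.7Gbps: {bandwidth_gb_s(256, 5.7):.1f} GB/s")
```

Tonga claws some of that deficit back with GCN 1.2’s lossless delta colour compression, which is how it gets away with the narrower, cheaper bus.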

A Radeon R9 285. Which isn’t anything like the new 380X. Not one little bit

Of course, we’ve already seen the Tonga chip rebranded. What was the Radeon R9 285 became the Radeon R9 380 with virtually no changes at all. In both cases you got 1,792 of AMD’s GCN 1.2 shaders for doing all those fancy graphical effects, 112 texture units for processing, er, textures and 32 ROPs or render outputs for spitting out the end result. OK, the 380 was clocked up from 918MHz to 970MHz, but little of consequence had changed.

The saving grace for this ‘new’ Radeon R9 380X is that it’s what you might call Tonga fully unleashed for the first time in a desktop graphics card. That’s because the Tonga chip itself has always had 2,048 shaders and 128 texture units, it’s just that AMD hadn’t previously enabled them all. That’s what made previous Tonga implementations cheaper. The odd broken shader or texture unit didn’t matter due to the built-in redundancy.
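
For a rough sense of what those extra units buy on paper, peak single-precision throughput is shaders × two FLOPs per clock × clock speed. A minimal sketch using the reference clocks mentioned above, with the usual caveat that real games rarely track paper FLOPS:

```python
# Paper FP32 throughput: shaders * 2 FLOPs per clock * clock speed.
# Real-world game performance rarely tracks these numbers exactly.
cards = {
    "R9 285":  (1792, 918),   # (shader count, core clock in MHz)
    "R9 380":  (1792, 970),
    "R9 380X": (2048, 970),
}

for name, (shaders, mhz) in cards.items():
    tflops = shaders * 2 * mhz * 1e6 / 1e12
    print(f"{name}: {tflops:.2f} TFLOPS")
```

That works out to roughly 3.3, 3.5 and 4.0 TFLOPS respectively, so the fully enabled chip is about 14 per cent quicker on paper at the same clock.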

What does this all mean in the real world? Well, it’s a bit faster than a 380. But the 380X along with the 380, the 285 and indeed the 280X and 7970 before them all deliver a similar subjective gaming experience. They’ll generally beast any game running at 1,920 by 1,080 pixels, otherwise known as 1080p.

Ramp things up to 2,560 by 1,440 pixels or 1440p with the details set to full reheat and it will then be a matter of what game you are running. Something arcadey like GRID Autosport will return frame rates in the high 50s and be very playable. Something more demanding like Battlefield or GTA V will be mid 30s or maybe even high 20s and so marginal at full graphical detail.
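
That drop is mostly pixel arithmetic. A crude model treats frame rate as inversely proportional to pixels drawn; real games are never purely pixel-bound, so take this Python sketch (with a made-up 60fps starting point) as a rough guide rather than a prediction:

```python
# A crude fill-rate model: frame rate scales inversely with pixels drawn.
# Real games are never purely pixel-bound, so this is a rough guide only.
pixels_1080p = 1920 * 1080
pixels_1440p = 2560 * 1440

print(f"1440p draws {pixels_1440p / pixels_1080p:.2f}x the pixels of 1080p")

fps_at_1080p = 60  # hypothetical 1080p frame rate, purely for illustration
fps_at_1440p = fps_at_1080p * pixels_1080p / pixels_1440p
print(f"{fps_at_1080p} fps at 1080p -> roughly {fps_at_1440p:.0f} fps at 1440p")
```

With around 78 per cent more pixels to draw, a game pinned at 60fps at 1080p lands in the mid 30s at 1440p, which is about what the 380X delivers in the heavier titles.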

The Nvidia GeForce GTX 960: The new 380X’s nemesis?

As for the Nvidia comparison, most of the time the new 380X is a fair bit quicker than a GeForce GTX 960 but also a lot slower than a GTX 970. In the end it all comes down to price and with the 380X coming in around £185 / $229 it’s inevitably positioned between those two Nvidia chipsets. You pays your money. You takes your choice.

Graphics in 2016
It’s certainly a nice board for the money. The catch is that there’s a revolution coming in 2016 and anything bought in the next, say, three to six months could find itself rapidly outdated in a more dramatic fashion than has been the norm of late.

That’s because pretty much all GPUs have been made using 28 namometer silicon since way back in late 2011. But now, finally, that’s going to change. Both AMD and Nvidia are reportedly gearing up to unload new graphics chips built on much more advanced silicon. This won’t be an incremental die-shrink thanks to slightly smaller transistors, but a really big jump that incorporates fancy stuff like FinFET technology, truly monumental transistor counts up near 20 billion and perhaps the promise of true 4K-capable performance. Genuinely, it is all very exciting.

nullHere’s an AMD 290. Which then turned into a 390 18 months later…

I don’t want to get too bogged down in the details here as I’ll wrap that all up in some end-of-year coverage that’s all about why 2016 is looking like a corker for the PC generally and not just for graphics. But it certainly seems like the 380X is the last in a long line of what have often felt like stopgap graphics card. And a good thing that is, too.

AMD’s ‘Crimson’ driver
But what of Crimson, AMD’s new graphics driver? Well, say goodbye to AMD Catalyst. Say hello to simply Radeon Settings. That’s the name for the new interface and from the initial installation dialogue through to the final UI it’s all much cleaner, much more modern. It also makes ‘discovery’ of features a lot easier. I’ve been playing around with AMD’s Virtual Super Resolution purely because its existence was so much easier to stumble upon in the new interface. I think it’s actually been in the driver since late last year.

AMD’s new Crimson driver isn’t just a pretty face. Well, that’s my story and I’m sticking to it

The panel for the individual game profiles and optimisations is particularly sweet, too – it’s the sort of improvement that I think will actually encourage AMD users to get the most out of their cards, which can only be a good thing.

Having said that, clicking ‘Additional Settings’ in some parts of the interface kicks you back into the old ‘Catalyst’ style menus for some of the more advanced settings. That looks and feels clunky and does rather raise the question of whether the changes are more than skin deep. It’s redolent of Windows 10’s patchy mix of old and new and not in a good way. Sorry, but even if all the important code is actually ancient, I don’t want to be reminded!

Even the install process looks cleaner and simpler

As for performance, well, AMD has released a long list of tweaks with the new driver platform. That includes general game performance and specific features like its LiquidVR tech, which is designed to optimise AMD graphics for virtual reality rendering; one key feature is reduced latency, which is critical for VR. There are also improvements for FreeSync, 2D video decode, Eyefinity multi-display and more.

On the subject of quality control, AMD says it has ramped up testing to include more PC configurations than ever in order to reduce the bug count. Of course, it turned out the first Crimson release had a bug in the fan control software that actually killed some cards, which is so typically AMD you couldn’t make it up. But there’s now a beta version available that fixes the problem.

‘Discoverability’ of new features is much improved…

In practice, it seems like the new driver is only adding a few percentage points to game performance but overall I’m feeling pretty upbeat about the changes, even if 20 seconds of black screen during install did give me some harrowing flashbacks of earlier AMD driver installs that have completely broken my OS. It even seems to have fixed my multi-display HDCP problems in Amazon Prime Video.

As you leaf through the reams of changes and upgrades, it’s hard not to marvel at what has become something of an OS within an OS. The levels of complexity are borderline alarming these days. But at the very least this new AMD effort is a lot nicer to use and look at. If you have an AMD card and haven’t tried it, it’s worth a quick spin.

But some throwbacks from the Catalyst era remain

52 Comments

  1. flashman says:

    I got my 390 the other day and it is a joy (having upgraded from a six-year-old HD 5870). There’s nothing like dropping a new piece of hardware into your case and jumping from 30FPS on Medium to 60FPS on Ultra.

    • Rich says:

      Agreed. I went from an HD 7790 to a second-hand R9 290. As you can imagine, the difference is amazing.

      • Cryio says:

        260X jump to 290, not bad.

        I’m still using my 560 Ti, so I’m also sitting somewhere between the 260X and 265 performance wise. Can’t wait to upgrade.

  2. Gordon Shock says:

    After years and years of supporting AMD for my graphics cards I recently switched to Nvidia and bought a 970. Can’t say that I regret it one bit as everything has been super smooth so far, even the new Black Ops 3, and the rest of my rig is good at best.

    • Jason Moyer says:

      Just bought my 970 a week ago after going straight AMD for the past 5 years or so. I was debating getting a 380/380X/390 but a 970 costs basically the same for similar performance and is a hell of a lot more efficient in terms of power and heat dissipation.

      • Cryio says:

        Given the fact that 390s are selling at the same price or cheaper and are faster WITH more VRAM, I don’t know any reason why anyone would buy the inferior product today.

        Do yourself a favor, return the 970 and get the 390.

        • Pulstar says:

          Thiiis.

        • Don Reba says:

          If the 970 were more quiet, that could be a big reason to prefer it.

        • Jason Moyer says:

          The Sapphire R9 390 is going for $334; I bought my EVGA factory OC 970 for $290. While it’s true that the 390 has slightly better performance in most situations, getting an extra 6 FPS in Crysis 3 isn’t really worth the extra 140W in power draw for me. I’m OK with 60 FPS on Ultra/1080p in Fallout 4 until the real next-gen cards come out that aren’t just old tech being pushed to higher frequencies.

    • PoulWrist says:

      Why would it be cause for regret at the moment? Both manufacturers make excellent cards.

  3. raiders says:

    I have TWO AMD cards.
    And, yes, I have tried it for a spin; since day one.

    My verdict: it’s pretty damn sweet. Knocked my load temps down 7°. I’m no longer gaming above 90° anymore. The VSR is remarkable. I couldn’t believe how much cleaner/sharper/crisper my resolution became.

    But all ain’t peaches & cream. That UI you gush over is too cumbersome. I mean, I like my apps, but nobody wants to look at a spreadsheet all the time. Especially when they’re looking for the setting they really want to exploit.

    All-in-all, it’s much better than CCC. Hope they can get their drivers this clean.

  4. ZippyLemon says:

    Interesting. I’m looking to build my first decent gaming PC in about six months. Should I sit and wait for the advanced silicon GPUs? It sounds like a definite leap in quality.

    • jorjordandan says:

      Even if there is a huge jump in what the manufacturers are capable of, the increase in performance for consumers will still be incremental. Slowly doling out the improvements is in their business interests, and probably necessary for them to pay for R&D and manufacturing improvements. Still, prices will go down a bit, performance will go up a bit. If you hold your breath, you’ll be disappointed.

      • Cryio says:

        Except when they made the last jump, from 40nm to 28nm I think, AMD increased performance by a factor of two. The 7970 was two times faster than the 6970, competing easily with the 6990.

        So it’s not unprecedented.

    • gunny1993 says:

      Only thing you’re likely to see with the new gen is better support for DX 12, lowers temps/power usages and a marginally increased FPS but you’ll be buying them at top price.

    • Unclepauly says:

      There may even be teething problems with the new process, so a wait-and-see approach is best for people without monopoly money. Performance is expected to be a 50% leap or more though, so we can all expect to be paying a nice premium to test this process out for them.

    • phuzz says:

      There’s another upside of waiting until the new graphics cards start coming out, which is that the older cards will get more affordable. Plus if you don’t mind buying second hand, you might be able to pick up a bargain from someone upgrading to the latest and greatest.
      On the other hand, there’s always something new and shiny about to come out, and whatever you buy this month, would be that bit cheaper and/or better if you’d waited until next month.

      • Sakkura says:

        There isn’t much room for the older cards to get more affordable. They’ll just go EOL, maybe you can get lucky with a clearance sale, then prices skyrocket as stock dwindles.

  5. bhauck says:

    Is it rude to ask for advice on video cards not related to the topics in the article above? Hope not:

    I just upgraded from a 17″ 1280×1024 screen to a 24″ 1920×1200 screen (both 60Hz). I don’t expect to upgrade my monitor again for 3-5 years. Is a 2GB card like a 750 Ti or 260X enough to expect to play any games released in that time at that resolution with at least mediumish settings, or would I need a 4GB card to be safe? Thank you and/or I’m sorry.

    • Clavus says:

      Probably. You can expect to at least run things medium-ish during this console cycle. Don’t expect to do much fancy stuff such as VR gaming with it though.

    • Lyon says:

      I’m still gaming on an AMD HD 7850 (overclocked to 1050Mhz) which is a 2gb card, I game @ 1920 x 1200, 90% of my games run on highest settings at native res.

      Some tougher games like BF4 run on high (settings go one higher to ultra) at 1920 x 1200 with some AA at 50fps. Honestly, I’m slightly annoyed the card still performs so well! I can’t justify an upgrade until more games start dropping to 30fps. I haven’t bought a big AAA title in a while, though I did try the Battlefront beta and it ran similarly to BF4.

      If I were to buy a card now I would make it a 4GB for future-proofing, but if budget is a concern I seriously wouldn’t regret getting a 2GB card.

    • gunny1993 says:

      I play @ 1440p with a 2GB card; memory is far less of a problem than simply raw power … you’ll be fine.

      • Unclepauly says:

        Depends on your texture settings, which are the main contributor to VRAM being filled up. Pretty much just drop the textures until the game fits snug in that 2GB. Almost all graphically demanding games from here on out are pushing 2K or 4K textures at their high and ultra settings, which requires a 4GB card. Many games are leaving medium textures at 1K for 2GB cards though.

    • Cryio says:

      Absolutely not. 2GB is a bottleneck for almost everything nowadays, regardless of resolution. You need a 4GB card to be future-proof. For 1080p, the best options are the 380X 4GB, 380 4GB or 960 4GB (but really don’t get the 960 because it’s by far the slower card).

    • bhauck says:

      Thanks for the help everyone. Based on the multiple votes for 2GB and the one fervent vote against, I think the best plan is to try and find a sub-$100 2GB card that’s a decent enough upgrade on my 1GB GTX 285 to last me until late 2016/early 2017 when we’ll have a better idea on what 14nm is doing to the market. Whether a 14nm mid-range card or a then-cheaper card that’s out now, I should be able to buy something for ~$150 that can last me until the end of 2020 when I’ll see what 4k monitors are going for. Thanks again!

      • mao_dze_dun says:

        I’ll give a different advice – get a second hand 280/280x which run at about 100 bucks and overclock it. They’ve 3GB of VRAM and will eat through anything at 1080p.

        • bhauck says:

          A handful of 280s have sold on eBay since the beginning of December for $135-$175 (including shipping). One 280X did sell for $110, but overall, 59 functioning 280Xs have sold since December 1st with an average price of $176 including shipping.

          Both of those cards are re-branded HD 7XXX cards using GCN 1.0 from 2012. I may be proven incorrect, but I’m going to feel more future-proofed through 2020 if I can get a card made with 2015-2016 tech, not just 2012 tech at 2015-2016 prices.

          I bought a 265 from Amazon for $100 (with simpler shipping than something from eBay) that I expect to get 1-1.5 years out of, then I’ll see what I can get for around $150.

          I’m sure no one is still monitoring this thread, but this is fun for me.

  6. Baines says:

    had a bug in the fan control software that actually killed some cards, which is so typically AMD you couldn’t make it up

    To be fair, the news would have been just as believable if it was Nvidia instead of AMD.

    • Sakkura says:

      In fact the same thing has happened with Nvidia cards, except worse. The AMD cards were mostly getting stuck at 20% fan speed, while the Nvidia bug was shutting the fan off entirely.

      Anyhow, the cards shouldn’t really be dying just from the lower fan speed; most likely there were other problems with the cards that the fan speed bug just exacerbated. And there’s no telling how many cards actually did die.

    • Banks says:

      I had this bug, sooooo annoying. Every time I booted up my computer the fan speed turned to 100% and I had to manually manage the settings again. It did not kill my GPU but it surely killed my patience.

  7. Capt. Bumchum McMerryweather says:

    A word of warning for you commentators; this Crimson business isn’t all roses. My upgrade path has been a FUCKING NIGHTMARE. Updating to Crimson broke Assassin’s Creed Unity and Fallout 4, and updating to the beta drivers that were suggested for Just Cause 3 worked OK… for a bit; now I get constant CTDs and stuttering.

    Also the application itself seems to suck a bag as well; every time I go into Radeon Settings and click the display heading (because it got rid of the underscan on my DFP telly), the application crashes and refuses to relaunch unless I restart my computer. I tried uninstall-reinstall, restore points, repairs, everything, and it’s still a nightmare.

    I know for sure as bastard mustard that my next card will be an nVidia. I have had AMD cards since they were ATi, and the best card in the world was the 8800GT, but I can’t be doing with this any more.

    • Sakkura says:

      Probably a problem somewhere else. It’s running beautifully for most people. And Nvidia has definitely had more driver issues this year than AMD.

      • Eirik says:

        Well, see, pointing out that the competition allegedly has/had more problems (your other comments appear to cross into AMD apologist territory) doesn’t help someone experiencing problems with a company’s product.

        • Raoul Duke says:

          It doesn’t, but if someone is complaining “I’m sick of my car getting flat tyres, I’m selling this Ford and buying a Honda” then pointing out to them that Hondas also get flat tyres makes sense.

          The most compelling argument for nvidia is their lower power draw, and as a result heat, thanks to newer technology. Drivers are universally awful. But AMD is doing a good job of its usual approach, being good value for money in the moderate part of the market.

    • PoulWrist says:

      Try going for a driver cleaner of some sort, like DDU from Guru3d.

    • DigitalSignalX says:

      I was pretty disappointed to discover my HD 7970 was *not* supported by Crimson, despite the architecture being identical to some supported “R” series. Gaming at 1080p is still 60+ fps for anything I throw at it, including FO4 on ultra. Only the godray and distant shadow failures in optimization drag it down to the 40s, and mods have fixed that already.

  8. TacticalNuclearPenguin says:

    Nvidia’s Pascal might end up being one of their biggest jumps as of several years, and i’m only talking about the chip, let alone the fact that it will sport HBM2.

    With that, plus the fact that it seems 98.3% confirmed that Broadwell-E will finally have an 8-core variant as its “mid”-priced option, well…

    I’m not sure my wallet is ready.

    • PoulWrist says:

      One thing you can be sure of is that both nVidia and Intel are quite ready to eat your wallet :p I fully expect all these options to once again increase the price-ceiling by around 20%, regardless of how much we used to think that smaller manufacturing process meant cheaper hardware.

      • TacticalNuclearPenguin says:

        This is my fear as of now indeed.

        It seems Nvidia this time around might release the “full fat” version straight away (think 780 Ti vs 680) with HBM2, and the “middle” version with GDDR5+ or whatever it’s called.

        If that’s the case, my expectancy for the big beast is 800 euro or so for a decent non stock option, otherwise i’m gonna bail out.

        When it comes to Intel, I’d be curious whether they are really willing to break their rule of 999 dollars for the highest option; if they don’t, the “mid” one should stay around the 600-700 mark again. I’m probably still going to wait, since I want an X platform but not an “old” chipset.

        Either way, a smaller process can mean some savings when we talk about refreshes on a different node, but when you double the transistor count, or even more when it comes to GPUs jumping two nodes, well… it can’t be cheaper.

        • TacticalNuclearPenguin says:

          Besides i don’t think AMD will be in a much better position when it comes to price/performance, they’ll probably stay a little on the cheaper side but still in the “bucketload” scale when it comes to the high end, most likely.

          Thing is, unless we move away from conventional silicon the prices will never go down; it’s increasingly difficult to shrink things further, and that’s reflected in stretched roadmaps and yield hurdles.

  9. axfelix says:

    I’m excited for 16nm but my enthusiasm has dampened a bit after Nvidia have failed for the past couple generations to put out a card at the $200-250 price point that seems properly future-proofed — unfortunately I still tend to compare everything against the 9800 GT in terms of value so there’s really no helping me, particularly as it still seems to be the entry point for most non-AAA titles, as it’s now tied with the average Macbook onboard chipset. At this point I wouldn’t buy anything weaker than a 970, and given that those still cost more than $400 CAD, I’m not sure there will be enough incentive to drop prices.

    • PoulWrist says:

      Yea, long gone are the days of the 8800/9800GT and their near-top-tier performance for a reasonable buck. The follow-up in the 5850 and 5870 over at AMD was also just super great value at the price point, which was about the same as those nvidia cards at launch.

      To get similar performance today, i.e. the top end for today’s titles, you have to shell out twice what those cards initially retailed for. And no one’s bothered about that; all they’re talking about is how “AMD SUCKS GO NVIDIA” while someone at nvidia is just rubbing their hands together in greed as they release the next boring, conservative product.

      • TacticalNuclearPenguin says:

        Ahhh, good old ATI times.

      • Fataleer says:

        As someone who had a good time with an 8800GTX and then a 9800GTX+ (the 8800GTX died on me after some electrical problem in the house, but they RMA’d it anyway for the 9800 series) before jumping to an HD 5870, I agree.

        Only… the little quirks I had to endure with AMD made me lose confidence in them. The whole power management problem, where the card did not use 3D clocks for rendering if hardware-accelerated video was running, and the need to do a full uninstall of drivers (safe modes and whatnot) to get the system running problem-free, made my upgrade decision much easier. (980 Ti)

        But still, without competition we would be in CPU world: zero competition in the mid-performance field and higher, single price points for same-performance CPUs within a generation, no price wars… It is no longer the time of the first Core 2 Duos/Quads and their crazy good pricing (E6400/Q6600, yay :))

        So hey. Not to say AMD is bad, but my experience has been much better running Nvidia cards.
        Hopefully I will see something that makes me a bit more confident in them.

  10. alms says:

    Could you be a little more condescending, Jeremy? I think this fell a little short of the mark. Maybe add another section about “value monitors” comprising about 2 (TWO) models. Gah?

  11. drewski says:

    Having used Crimson ever since I finally got around to upgrading to Win 10 about a week ago, I can indeed confirm that it makes AMD cards functional and hasn’t yet crashed.

    So, y’know. That’s a pretty big step up for AMD.

    I jest, I actually haven’t ever had significant problems with the card in this machine. So 10/10 for not breaking what wasn’t broken, at least.

    • waltC says:

      Actually, if you check out AMD’s Catalyst/Crimson site you’ll see that by no means is everyone pleased with Crimson. The big problem is that Crimson was not ready for prime time when they released it–it completely lacks such a bedrock GPU function as resolution switching…! That’s right, you cannot switch Windows resolutions from inside Crimson–you have to drop out to Windows Display Settings (or another 3rd-party application) to change screen resolutions, believe it or not. Crimson also includes a new utility–a custom resolution utility that allows users to create their own custom resolutions & refresh rates–which is long overdue, but there’s one tiny problem. After you make your custom res and test it, Crimson doesn’t provide any way for you to *use* it… lol…! (Because Crimson doesn’t yet support resolution switching.)

      Good news is that AMD employees on AMD’s site have stated that they plan to eventually move all of the Catalyst features over to the Crimson drivers–*and* that until then you can revert back to the Catalysts. More good news: the Crimson driver itself works fine inside the Catalyst shell…! Install a set of Catalysts, then use the device manager to load in the Crimson driver–it works perfectly. All standard Catalyst functions including resolution switching (and using custom resolutions!) work as they always have. Why AMD released Crimson in this fashion I have no idea–the driver is still beta and, as far as Catalyst features are concerned, it’s only ~66% finished.

      • alms says:

        You seem to speak English well enough to understand that what you describe (the new control app not exposing some controls) is different from what you are claiming (the driver not supporting changing resolution).

  12. Carra says:

    My Geforce 670 still runs most games but I’ll be upgrading once the next generation is here. Should be a big enough improvement to warrant upgrading.

  13. one2fwee says:

    Does the new Eyefinity actually let you enter the bezel compensation numerically? It always used to annoy me that it wasn’t possible, especially as whenever you did a clean driver install when upgrading you would have to redo all the settings.

    Mind you, if games are implemented properly, they shouldn’t need fake Eyefinity bezel-compensation resolutions.
    However, sadly only a few driving sims (and probably flight sims) actually do triple screen properly.
    Mainly because, as usual, the industry is ignorant and doesn’t care.

    Much like how totalbiscuit is completely arrogant and ignorant about FOV, something he has shown time and again to have no understanding of.

  14. billyunaire says:

    I don’t care for the new crimson interface that much. It was giving me issues trying to control my GPU fan. It was conflicting with MSI Afterburner. I had to uninstall it, keep the crimson driver, but reinstalled Catalyst Control Center.

    The program still seems raw. Needs a little more work.