AMD Aiming To Smooth Ports Between PC And Console

Both the Xbox One and the PS4 are going to contain AMD graphics chips. Which must be lovely for them, and deeply annoying for Nvidia. Of the current gen, the Xbox 360 has an AMD GPU, but the PS3 sports Nvidia’s idiotically named RSX ‘Reality Synthesizer’. The next-gen consoles are both basically PCs in a box, and as such both are going to feature a version of AMD’s Radeon – the GPU line that fills so many desktop PCs. And indeed both contain AMD CPUs too. According to a report on PC Advisor, Advanced Micro Devices (as I’ve just learned their name stands for) are hoping this means they can make ports far less of a faff.

Called the “Unified Gaming Strategy”, this was an initiative the company mentioned at GDC this year, though of course without being able to talk about what was in the Xbone. The aim is to make porting much easier, letting games run across PCs, consoles, and indeed portable devices, with relative simplicity. That’s in large part because both new consoles will now be using x86 processors, just like the daddy PC.

It puts AMD in an interesting position. They’ve fallen hugely behind in terms of CPUs, but their next-gen console coup rather reinforces their position in the GPU market. Which is, it must be said, second place on PC. According to last month’s Steam Survey, Nvidia cards are still in the slight majority of gaming PCs. That’s something AMD are hoping to change with their as-yet unreleased chip, code-named Kaveri, due later this year.

The company says they’re working with developers to create this new-found smooth path from console to PC, and it does seem that this is the direction they intend to move things. According to PC Advisor, AMD’s Lisa Su said,

“It is absolutely the end goal to create a development ecosystem where first-party games will be written to the games consoles first … but providing the capability to leverage that investment into PC market, into mobile form factors, into cloud. Definitely there’s that desire.”

Although she did acknowledge that “a lot of console games are developed on the PC first”, and talked about how their tech can make this easier. This tech being something called HSA – Heterogeneous System Architecture. To the best of my understanding, this is a setup where AMD CPUs and GPUs can share resources, working better together. And since that’s likely to be inside both new consoles, their hope is that this utopian design across consoles, PCs and portables (and indeed “cloud”, as they insist on saying) means that programming for one will be much the same as programming for another.
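
For the curious, here’s a toy C sketch of the difference HSA is meant to eliminate – an illustration of the concept only, not AMD’s actual API (the real thing is exposed through HSA-aware runtimes and compilers). In the traditional discrete-GPU model the CPU has to copy data into separate device memory and copy results back; with a shared, coherent address space the “kernel” can simply work on the same buffer the CPU uses:

```c
#include <assert.h>
#include <stdlib.h>
#include <string.h>

#define N 1024

/* "Discrete GPU" model: the kernel can only see device memory,
   so data is copied in, processed, and copied back out. */
static void discrete_gpu_double(const float *host_in, float *host_out) {
    float *device = malloc(N * sizeof *device);   /* simulated device buffer */
    memcpy(device, host_in, N * sizeof *device);  /* host -> device copy */
    for (int i = 0; i < N; i++)
        device[i] *= 2.0f;                        /* the "kernel" */
    memcpy(host_out, device, N * sizeof *device); /* device -> host copy */
    free(device);
}

/* "HSA" model: CPU and GPU share one coherent address space,
   so the kernel works on the caller's buffer directly - no copies. */
static void hsa_gpu_double(float *shared) {
    for (int i = 0; i < N; i++)
        shared[i] *= 2.0f;
}
```

Same result either way; the win is that the copies (and the duplicated memory) disappear, which matters rather more when the buffer is a few hundred megabytes of textures than when it’s 4KB of floats.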

I’ve bluffed understanding as far as I can here, but it strikes me that this sounds great, except that I’ve got an Nvidia GPU and an Intel CPU in my PC. Where does this leave me? Presumably with a clumsier port. With AMD still behind on PCs, it’ll be interesting to see just how effective their strategy can be.


  1. FriendlyFire says:

    Honestly, AMD’s weakness and the reason they’re still trailing a bit isn’t their hardware. Their hardware is good, it used to be cheaper than Nvidia’s equivalent, and it’s very reliable.

    The issue is that their awful drivers are still trailing behind badly. They’ve made progress, yes, but they’re still behind Nvidia for things like GPU switching on laptops, game profiles, multi-GPU support in games, graphics research, etc. Even plain old support of games is often patchy.

    If they’d dedicate proper resources for an almost total re-engineering of their software suite (Catalyst), they could take the crown back from Nvidia.

    • Christo4 says:

      Indeed, this is the sole reason I prefer Nvidia to AMD.
      But, truth be told, on my gaming laptop the Nvidia drivers are kinda screwy lately, but at least they solve the problems pretty fast!
      Before that I had a laptop with an ATI GPU and didn’t even have a good release driver to download for it; I had to try some betas before reverting to the driver that the laptop originally came with.

    • BTAxis says:

      You pretty much said it. I’ve stuck with nVidia for the past years mostly because of this.

    • grundus says:

      I’m putting my hand up as well. I would like to have a 7970 instead of my 680 (for the whole extra GB of RAM for not a lot more money) but the software either still sucks or the general impression people have is that the software still sucks. I haven’t used an AMD card since 2011 (it was actually an ATI 4870, so I’ve never used an AMD card… Though I have used the Catalyst software) and friends of mine who have AMD cards have software trouble a lot of the time. One even used his 24″ monitor with a 1 inch border around each side because his 5670 wasn’t filling the screen, and the workaround was so convoluted he just couldn’t be bothered to fix it.

      Anyway, with the way Intel and Nvidia are going (mobile), I would like it quite a lot if AMD got their shit together, made a CPU that outperformed a high end i5 and got some talent to work on their software. The whole reason I went with Nvidia is because I was building my PC just as Rage was coming out… Dodged a bullet there.

      • Grey Poupon says:

        ” One even used his 24″ monitor with a 1 inch border around each side because his 5670 wasn’t filling the screen”

        The “workaround” is one setting in CCC (you don’t even have to use CCC if you don’t want to). Just because the default setting isn’t perfect for your computer doesn’t mean the drivers are broken. If people put as much effort into trying to fix their issues as they do into whining about them, the internet would be a much quieter place.

        • WoundedBum says:

          I think all it takes is switching off overscan, right?

          • KenTWOu says:

            Yeah, it’s underscan/overscan settings.

          • yojimbojango says:

            Correct, you open the control panel and click on one of the options (I just had to do this with a new pair of dell ips panels, but I forget what it was called) then you grab a slider and move it all the way to the right.

            I mean the issue is still dumb (it detected the monitors as televisions for some screwy reason) but it’s not hard to fix and it probably took me under a minute from, “Well this is weird, I’ll google it.” to “Ok it’s fixed now.”

          • Colthor says:

            For over a year and countless driver revisions I couldn’t get that setting to stick on my 4870. Found a program to run in the background and set 0 underscan (why isn’t that the default anyway?) every time something changed resolution, but that didn’t fix its occasional habit of running interlaced or at 24Hz or some nonsense.

            Eventually fixed both problems with a 560ti. Everything’s gone more smoothly since then, and I’d be unwilling to give AMD another chance, as good as their hardware is.

      • engion3 says:

        I’ve had a 7970 (for a year, I think?) and it’s been a huge improvement over previous cards. I’ve always had ATI and my 4870 and even 6970 had tons of issues. So far this year I’ve played every major game without a hitch on my 7970. I did notice they updated CCC interface and refined a ton of things so that may be it.

    • Grey Poupon says:

      I think both drivers suck equally, just at different things. I haven’t had a problem, apart from graphical issues in games, for a very long time, and the frequency has felt more or less the same as it did with nVidia cards. Obviously TWIMTBP (or whatever it’s called) games are going to work better at launch with nVidia cards, since nVidia often tells the devs not to give the code to AMD for testing before the game is launched. At least AMD’s drivers haven’t fried any GPUs. Can’t say the same about nVidia. At the moment AMD is rewriting the memory controller to reduce frame latency on the new architecture. I’d say that’s a lot bigger deal than designing yet another CCC.

      I don’t even like defending AMD, as their drivers are far from perfect; I just don’t find them much worse than nVidia’s, as long as you’re even a bit computer savvy and know how to debug issues.

      • Flopper says:

        Maybe I’ve just been lucky. But I switched to Nvidia when they released the 8800. Since then I’ve owned 8800, 9800, 280, 480 and now I own 2 580’s in SLI. Haven’t felt the need to upgrade to 600 series because I’m still maxing out all my games with 60 frames.

        The only time I’ve ever had an issue with any of my cards was when Rage released and it was a god damn disaster. But I think that was less on AMD or Nvidia and more on the fact that Rage was a buggy piece of shit designed for consoles and ported at the last minute to PC.

    • GameCat says:

      “they’re still behind Nvidia for things like GPU switching on laptops”

      Oh, welcome to the land of “installing drivers that aren’t ~2 years old will make you stick to the integrated Intel 3000 graphics chip, because there’s no fucking way you will use your better NVidia chipset”.

      DO NOT buy laptops with NVidia Optimus.

    • Asurmen says:

      Am I the only one who hasn’t had a problem with AMD drivers?

      • KenTWOu says:

        Definitely not the only one.

      • Major Seventy Six says:

        You are not, I sail a smooth ride. I have an older desktop with a 4870, a laptop with a 5650M and a desktop with a 7870 and all run very tightly.

        The last Nvidia card was a GEForce 7950GT it was a nice card then.

        • Ryuthrowsstuff says:

          Right, I’ve had a 5770 running in my box for years without a single issue. With the previous AMD/ATI card, problems were infrequent and usually just needed a driver update. I haven’t used Nvidia since the late 90s/early 00s, when everything on PC was frustrating and stupid.

      • PopeRatzo says:

        I’m also a happy user of AMD graphics. Right now there’s a 6850 and a 7870 humming away in the two game-capable machines in my house.

        I can play any contemporary game on either machine. Just finished Metro:Last Light, in fact. Smooth.

        The 6850 is about $100 less than the least expensive nVidia card that will run that game. Occasionally, I’ll have to install a beta driver for some new game that comes out, but that’s the extent of my involvement.

        • jezcentral says:

          I’m currently an Nvidia user, but my 4890 was a true star. (A bit loud, though, but that’s because I bought a cheap one with a whizzy cooler fan). AMD really nailed that generation of GPUs.

          As we all know, those of us that are happy will be too busy happily playing games to complain on forums about things.

      • zdeno84 says:

        Nope. I’m running a 7870 plugged into a plasma HDTV and it all set itself up and works smoothly on its own, which kind of impressed me, because it’s usually the OS (W7) giving me trouble with custom DPI settings. Fortunately all the games’ graphics work without my having to tweak anything in CCC. I think I turned that on only once, out of curiosity, but I have to admit the user interface feels a tad unorganized and could be restructured. Something like a ‘want to do some gaming?’ button which lets you do all the tweaks and gives some explanation of each setting.

    • Major Seventy Six says:

      I work in a game-testing environment and we definitely have all types of configs possible.
      Right now, the most problems are found using Intel CPUs with Nvidia GPUs.

      Last year it used to be Intel CPUs with AMD GPUs.

      But consistently AMD CPUs paired with their GPUs are a smooth experience.

      What will happen soon is that developers will start to optimize everything to work with the consoles’ AMD hardware, and those with PCs of similar spec (AMD APUs or a recent AMD CPU/GPU pairing) will likely require less beefy hardware to get the same sort of performance as their Intel/Nvidia colleagues.

    • VelvetFistIronGlove says:

      Not related to their driver quality, but—my favourite feature of CCC is that, when it’s running at the same time as Photoshop, together they stop the escape key working in any program.

      • Nicodemus Rexx says:

        Glad to see I’m not the only one this happens to.

        Or maybe I’m sorry to see it, because it really shouldn’t be happening at all. Curse you, semantics!

    • Joshua says:

      AMD’s driver-related woes have come from them mainly emphasizing quick performance gains over reliability over the past few years (due to things like a bi-monthly release cycle). Since last year, however, they have turned this strategy around, and they are making major strides in the driver department. AFAIK it’s a result of the AMD merger.

    • iridescence says:

      Funny, my experience is the exact opposite. I had an Nvidia gfx card on my last system and was always running into problems with the buggy drivers which got really annoying after a while so on my current system I went with an AMD card and it’s been humming along for almost a year with no driver problems at all.

    • Wedge says:

      I’ve been on AMD/ATI since I won a computer with 4870’s in it a while back. Got a 7950 for my new computer last year and it’s been great. The only time I had issues was with some graphical glitches and screenshots crashing in Guild Wars 2, but amazingly, I reported these problems to AMD and THEY GOT FIXED in the next driver version. Could’ve just been luck, but made me feel like my reports actually mattered.

    • BrightCandle says:

      I maintain a list of confirmed bugs that impact gamers and while both solutions have their problems the severity of bugs on the AMD cards is quite a bit higher. NVidia bugs are largely limited to individual games and mostly to particular parts or settings within the game. AMD has quite a few bugs that make a game utterly unplayable regardless of settings, bugs that impact its Windows 2D experience and impact every single game.

      What still surprises me is that AMD, with clearly superior hardware, barely beats out a card with some 30% less theoretical compute power. The gains they found in drivers alone, and the massive rewrites they are doing now to fix stuttering and multi-GPU support, show just how much of the problem is purely their software, because without a doubt their hardware should perform quite a bit better.

    • El_Emmental says:

      +1 on the driver issues.

      I owned 2 ATI cards and 3 nVidia cards (one mobile GPU).

      Never had serious issues with the ATIs myself, but when digging through forums I frequently noticed driver issues, especially in the first 3-4 months of release (some games even hard-crashing on launch). And recently we got the frame latency issues on AMD (ex-ATI) cards. I very rarely noticed such driver-related problem with nVidia.

      On the other hand, as far as I know, only nVidia pulled a G84M/G86M fiasco (hundreds of thousands of faulty chips, refused to replace them despite clear failure on their side, proved by several experts in courts, only accepted to pay a small compensation/coupons FOR CUSTOMERS RESIDING IN THE USA ONLY after a class-action took place there – and later publicly boasted about their positive financial results despite that fiasco, how it didn’t hurt their sales, how they were proud of it – sickening behaviour from corporate *ssh*les).

      AMD really needs to keep improving its driver department, and they’ll rapidly gain a bigger market share.

  2. bstard says:

    I’m picturing hell as a place filled with consolfication and people who correct spelling.

  3. lijenstina says:

    Yeah, good to learn something new. They were founded in 1969 – it’s difficult to keep up with all the names of new firms popping up in the x86 arena. :)

  4. Tyrmot says:

    Interesting developments. It makes me wonder whether, in a year or two, the best setup for a gaming PC may end up being a 6/8+ core AMD chip with an AMD GPU as well. Like John, I currently have an Intel/nVidia setup – and while I expect that to stay in front in raw power, since gaming is my main (only) requirement of all this power, things may change…

  5. Lewie Procter says:

    What’s the difference between a “PC in a box” and a “PC”?

    • Misnomer says:

      Snark and derision?

      More likely it is a “PC with harder to remove case screws.”

    • BTAxis says:

      “in a box”.

      • darkChozo says:

        ” in a box”, actually. You’d make a terrible revision control system.

    • Fred S. says:

      A “PC in a box” presumably comes all in one box.
      If you build your own, your PC comes in lots of boxes.

    • FuriKuri says:

      The ‘in a box’ implies shame.

      Like what you keep in that box at the back of your wardrobe.

    • Giuseppe says:

      I dunno. Last I heard all PCs were housed in some sort of box-like enclosure. Unless THIS is how you prefer to keep your PC. :D

      • phelix says:

        That is brilliant. Now I feel inclined to build a PC inside a refrigerator just to see if it works and stays cool.

        • BulletSpongeRTR says:

          Won’t work due to condensation and the fridge’s compressor would burn out quickly trying to remove the heat.

    • pakoito says:

      Slightly different architecture while still using x86_64 instructions. Special emphasis on GPU memory and CPU-GPU data transfer.

    • golem09 says:

      A PC in a box has ATI hardware.
      A PC has PC hardware.

    • belgand says:

    Presumably because there isn’t an upgrade path. You’re stuck with the system the way it was designed and don’t have any latitude to rebuild it. Rather than a traditional PC, where you generally do as you like, it’s dedicated hardware that’s not just an assemblage of more-or-less off-the-shelf components.

    • povu says:

      The box is locked and only Sony/Microsoft get to decide what goes in them.

    • Lenderz says:

      Well I don’t know about you, but I keep my PC in the cat, the taxidermist worked wonders

  7. Screamer says:

    Oh God no, I don’t want anything to do with Catalyst drivers again.

    • Sakkura says:

      Go Nvidia 196.75 then.

    • Asdfreak says:

      You haven’t tried nVidia laptop drivers then. Every ten minutes the display driver would crash; it didn’t matter whether I was using old drivers, the newest ones or betas. Minecraft was especially bad with this – something about the OpenGL stuff didn’t go well with the drivers. The card itself should have worked fine; it more than fulfilled the requirements. It was a real pain in the ass, random framerate drops everywhere, but no matter what I tried, it always boiled down to the goddamn driver. I don’t know how good or bad the AMD drivers are, as I now have a tower with an nVidia card instead of a laptop, but you all act as if the nVidia drivers were some kind of holy piece of perfection that became code.

  8. Theory says:

    Ports…and emulators. :)

    • Raiyan 1.0 says:

      Yes, please! It’s bad enough that games like Phantom Crash, Vanquish and Bayonetta are going to be pretty much lost when their respective platforms’ hardware eventually fails, without losing the benefit of archiving on PC through console emulators that every previous console generation has benefited from. That’s especially true for the Xbone, whose games are all going to be bricked when the authentication servers are eventually shut down.

      What remains to be seen is how many games this generation are going to be worth the effort.

      • Theory says:

        Agree. It’s shocking to someone used to PC to discover that in general you simply cannot find proper copies of PS2/Xbox/Gamecube games (used or ludicrously expensive Ebay sales aside).

      • TheSwitch says:

      Had to sign up and reply to this comment. Phantom Crash was easily my favourite Xbox title; then I discovered its PS2 counterpart, S.L.A.I. [Steel Lancer Arena International]. Same gameplay, pretty much, just different story/execution. If you weren’t aware of it you should check it out.

  9. Phantom_Renegade says:

    If they want me to switch to AMD cards, they need to get rid of Catalyst. Seriously, it’s like the GFWL of driver updating. No amount of crappy pc ports will get me to struggle through that again.

    • DrGonzo says:

      I recently switched from AMD over to Nvidia. Don’t understand what is so good about Nvidia drivers, and what is bad about the Catalyst drivers.

      • ch4os1337 says:

        The ignorance party is out to play today; both drivers are equally bad. Nvidia’s used to be better but are just as bad now.

      • MacTheGeek says:

        It’s just FUD being FUD.

        I have two desktops at home that run 24/7, one with an AMD GPU and one with an Nvidia. Never had a problem with either one.

  10. Fitzmogwai says:

    Really, all you people bitching about AMD’s drivers – the 90s called and asked for their joke back. The drivers are fine. They work and have done for quite some time now.

    If you’ve got nothing better to do than have a go at graphics driver support, turn your guns on Intel, because they really DO need dragging over hot coals.

    • Flopper says:

      You’re living in a fantasy world.

    • andytt66 says:

      I take it you’ve never had the pleasure of trying to get a CrossfireX system running smoothly!

      • Sakkura says:

        What about the Nvidia driver that would fry the graphics card?

      • HisMastersVoice says:

        I’ve run three different CFX setups in the last two years, including one that was essentially a quad card rig. Smooth sailing with the notable exception of Shogun 2 being a jerk.

        • andytt66 says:

          Did it handle Witcher2 okay? I’m willing to entertain the notion that I was just being technically inept, but I consider myself pretty computer literate and I simply could not get that to not CTD during the intro video.

          Single HD7970 is lovely, mind you, but two of them in one box just wouldn’t play nice together :(

          • HisMastersVoice says:

            At least two of them (one of which was a double 6990 setup) had no issues with Witcher 2.

            I actually have two 7970 in the PC I’m typing from and I have yet to encounter a serious issue with any game. In fact, the only problem I’ve had so far was with PS2 where I had to force alternate frame rendering to avoid negative scaling, which is as easy as it gets with Radeon Pro.

  11. US_D3ltaF0Rc3Warr10r says:

    “Nvidia cards are still in the slight majority of gaming PCs.”

    *slight* LOL

    Among Top 10 DX10/11 capable GPUs, AMD has 1(one) entry – HD5770, Intel has one, and the rest is Nvidia.

    • John Walker says:

      You’re misreading. Look how tiny the percentages are in that “top 10”. Overall, Nvidia have around 52%, and AMD have 33%.

  12. mattevansc3 says:

    Well, that all sounds nice from AMD, but I don’t think it was differing hardware setups that caused Mass Effect 3 and Dragon Age 2 to be designed for gamepads yet ship with no gamepad support and low-res textures, or for Skyrim not to handle more than 4GB of RAM on a 64-bit system and ship with poor-quality graphics and gamepad-centric menus.

    The reason PCs get shoddy ports is because publishers know they can put in the smallest amount of effort and it will still sell, and no amount of hardware parity will fix that.

  13. Torn says:

    So does this mean more quick & dirty console ports with little to no proper PC features, or are we talking about things all PC ports should have, like configurable graphical options, field-of-view sliders, key remapping, etc.?

    If the engines the devs create, or use, don’t support these things then I don’t see how an ‘easier porting process’ would necessarily mean better ports.

    • gi_ty says:

      It would mean exactly that, actually. Rather than devoting the time required to rewrite code for completely different platforms, the labor savings could be applied to platform-specific enhancements and needs.

  14. deke913 says:

    I currently have paired an 8150 8core with a evga gtx 680. Never used an AMD or ATI card but would if I had to balance the cost and the AMD/ATI came out cheaper.

    I wonder if in a few years I will be better served with an amd/ati card now however.

    The 8150 with the nvidia card is just damn smooth though. I oc’d the 8150 from 3.6 to 4.0 and it idles at 17c with water cooling.

    Never had a single problem from drivers for any nvidia card I’ve owned all the way back to before nvidia bought them and I was running a voodoo 3.

  15. Liudeius says:

    On a not-so-related note, I really don’t get why companies don’t just put an unbelievable amount of RAM into their machines. RAM is pretty cheap as far as I’m aware, so doubling what you think you need is only going to raise the price by… well, it seems 8GB of DDR3 RAM is $60 on Amazon at the moment, though I’m sure mass production and buying direct from the manufacturer would cut that down.

    As for PCs, maybe there is some processing/driver/architecture issue of which I am unaware, but an excess of RAM (well, VRAM) seems to me a pretty cheap way to ensure the only thing holding your system back is how much you have to spend on processing power (and even then, low processing power means a slowdown; low RAM means a crash).

    • HisMastersVoice says:

      VRAM (as in the RAM you get on your GPU) is actually much more expensive than the DDR3 stuff the system uses. The recent 780 is essentially a Titan with half the memory, and it costs something like 300 bucks less. Not all of the difference is in memory, but it makes up a decent chunk.

  16. Don Reba says:

    You just made a powerful enemy, mister.

  17. Velko says:

    “Kaveri” is Finnish for “friend” or “pal”.

  18. Saul says:

    “Many” games are originally developed on PCs? Where are the others developed?

    • Don Reba says:

      On consoles, I imagine. During development, you could run your game on a PC or on a devkit. There are drawbacks and benefits in either case.

  19. BrightCandle says:

    AMD has always been a company that tried to push standards into the marketplace. They know all too well that Nvidia is in the PC market space, so I hope what they do is make a solution that ensures all three parties (Intel, AMD, Nvidia) can benefit. If it’s an initiative to attempt to sell more APUs, then it will largely fail to change a thing and will likely backfire.

  20. VengefulGiblets says:

    Much of the benefit will probably come from all platforms using DirectX 11. My guess is that AMD will get a performance benefit in a way that is similar to how Nvidia’s “The Way It’s Meant To Be Played” campaign provides a benefit to Nvidia hardware.

    That’s my take on it anyway. Guess we’ll see.

  21. mganai says:

    I don’t think it matters whether you’ve got an Intel or AMD CPU. There will just be differences in handling. Perhaps someone better versed in the intimate differences in how things are handled can explain.