Why £200 / $250 Is The 1080p Graphics Card Sweet Spot

Is this the best sub-£200 board you can buy?

What’s the best graphics card mere mortals can buy for around £200 / $250? This is a question for the ages. Or at least for a slow Thursday evening. In all seriousness, the £200 / $250 price point ticks a lot of important boxes. It’s been in and around the sweet spot for balancing price and performance for properly gameable graphics for a while. I reckon it’s also pretty near critical mass in terms of how much you lot are willing to spend on a video board. At a push, most of us can stretch to £200 / $250 if the payoff is great gaming. Luckily, it is.

As it happens, it wasn’t all that long ago that AMD argued this price point was where its efforts would increasingly be focused. The implication at the launch of the Radeon HD 4870 back in 2008 was that anything more expensive amounted to little more than pixel-shaded point scoring for PR operatives. Not many people spent more.

Sadly, it turned out those sentiments were mere convenience. At the time, AMD didn’t have a GPU to compete at the high end, so the £200 / $250 focus made for a plausible and punter-pleasing argument. As soon as AMD had a competitive high-end chip again, guess what? It jumped straight back into the £300 – £500 market. It was ever thus.

But that was then. Right now, £200 / $250 lands you right in the meat of the gaming graphics market and with an awful lot of options. As ever, you’ll have to start by grappling with a few classic trade-offs. You need to compromise on something.

You can have the very latest technology, for instance. Or a card with true high-end features like a really wide memory bus and a tonne of memory. What you can’t have at this price point is everything. Then there’s the whole AMD versus Nvidia thing, not to mention the question of matching your graphics card to the rest of your rig.

AMD’s Radeon R9 285 is nice for the price. But the 280X is a more bona fide high-end board…

I’m thinking here of things like CPU performance and monitor resolution. If you have a really feeble processor, money spent on graphics will basically be blown. Just how feeble need your CPU be to render a GPU upgrade redundant? On the Intel side, almost anything remotely recent with at least a couple of cores and decent 3GHz-plus clocks is pretty gameable and worth pairing with good graphics.

For AMD CPUs it’s a little more complicated. For starters, my experience with AMD chips in-game has dwindled so much in recent years, I’ve lost that gut feel for what to expect. But if it’s not at least four cores and probably 3.5GHz, I’d be worried about CPU bottlenecking, that’s for sure.
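
If you want a quick sanity check rather than relying on gut feel, there’s a classic test: drop the render resolution sharply and see whether frame rates move. If they barely budge, the CPU is the bottleneck and money spent on graphics really would be blown. A minimal sketch of the idea in Python (the helper and the numbers are purely illustrative, not from any real benchmarking tool):

def likely_cpu_bound(fps_native, fps_low_res, threshold=1.15):
    # If dropping the resolution barely raised the frame rate, the GPU
    # was never the limiting factor; the CPU is.
    return fps_low_res / fps_native < threshold

# Example: 52fps at 1080p versus 55fps at 720p suggests a CPU bottleneck,
# so money on a new graphics card would largely go to waste.
print(likely_cpu_bound(52, 55))  # True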

As for the monitor thing, you just need to bear in mind the loads generated by different resolutions and the critical need to run modern LCD panels at native resolution for optimal image quality. 1080p (ie 1,920 by 1,080 pixels) is the most common resolution by a mile currently, and it’s not actually all that demanding for modern GPUs.

In other words, if you’re running 1080p with a slightly old graphics card and getting decent frame rates, maybe you don’t actually need a new board. On the other hand, the step up to 1440p (2,560 by 1,440 pixels) is a biggie in terms of GPU load. You’ll need something pretty snazzy in the graphics department if that’s the direction you want to go.
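
To put a number on that step up, here’s the raw pixel arithmetic (a back-of-envelope comparison only, since real-world performance doesn’t scale perfectly linearly with pixel count):

base = 1920 * 1080  # 1080p, the baseline
for name, w, h in [("1080p", 1920, 1080), ("1440p", 2560, 1440), ("4K", 3840, 2160)]:
    print(f"{name}: {w * h:,} pixels ({w * h / base:.2f}x 1080p)")

# 1080p: 2,073,600 pixels (1.00x 1080p)
# 1440p: 3,686,400 pixels (1.78x 1080p)
# 4K: 8,294,400 pixels (4.00x 1080p)

So 1440p pushes nearly 80 per cent more pixels every frame. Hence the need for something snazzy.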

Anywho, with those provisos in mind, what are your options?

For the 1080p-ers among you looking to keep costs down, you could do a lot worse than Nvidia’s GeForce GTX 960. Instinctively, it’s a GPU that makes me very uncomfortable with its pitiful 128-bit memory bus. But it does sport what you have to concede is the best graphics tech currently available, Nvidia’s Maxwell architecture, and it knocks out great numbers at 1080p. Not bad for just £150 / $200 for a 2GB version. Just don’t expect things to fly if you upgrade to 1440p. It’s just not that kind of card.

At $260 in the US, the AMD 290 is impossible to ignore.

It’s also worth noting that a 2GB frame buffer could become increasingly marginal, even at 1080p, with more demanding future titles. It’s tricky to know just how much of a limitation that could prove.

The AMD alternative here is the Radeon R9 285 2GB. It’s available here in the UK for sub-£130, which looks appealing. But US prices seem stuck up around GTX 960 levels of $200 for reasons I can’t quite fathom. Still, it’s another good 1080p card so long as you’re not expecting epic future proofing. Oh, and it has a 256-bit memory bus. Decent.

The next step up involves a pair of distinctly elderly cards that somehow still remain relevant. I’m talking AMD Radeon R9 280X and Nvidia GeForce GTX 770. Availability on both is patchy, but I still prefer them to the newer alternatives (ie the AMD 285 or aforementioned Nvidia 960).

That’s because both were once true high-end boards with features like big memory buses to match. For the Radeon, that means a full-on 384-bit bus and 3GB of memory for £200 / $250. Meaty.
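
If you fancy checking the arithmetic on those buses, peak memory bandwidth is simply the bus width in bytes times the effective memory data rate. A quick sketch using the reference GDDR5 specs for each card (note this ignores Maxwell’s delta colour compression, which narrows the real-world gap for the 960):

def bandwidth_gbs(bus_bits, data_rate_gbps):
    # Peak bandwidth in GB/s = bus width in bytes * effective data rate.
    return bus_bits / 8 * data_rate_gbps

for card, bus, rate in [("GTX 960", 128, 7.0), ("R9 285", 256, 5.5),
                        ("GTX 770", 256, 7.0), ("R9 280X", 384, 6.0)]:
    print(f"{card}: {bandwidth_gbs(bus, rate):.0f} GB/s")

# GTX 960: 112 GB/s, R9 285: 176 GB/s, GTX 770: 224 GB/s, R9 280X: 288 GB/s

That 384-bit bus gives the 280X well over twice the raw bandwidth of the 960, which is why these old boards still cope when the pixel count climbs.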

AMD factory coolers look like this. Avoid them.

The GTX 770 is probably even harder to find. But it can be done, with 4GB variants available within our £200 / $250 remit. That’s a lot of hardware for the money, even if it’s old-school Nvidia Kepler clobber, not the latest Maxwell gubbins.

Both of these boards will do the business very, very nicely at 1080p but also make a pretty good stab at 1440p gaming, albeit with a few compromises on the eye candy required to get some games running really smoothly.

As a final twist, ponder this. Bump the budget to £250 / $300 and suddenly the likes of the AMD Radeon R9 290 and Nvidia GeForce GTX 970 enter the equation.

Nvidia’s 970 looks like value here in Blighty.

Actually, in the US the 290 seems to go for as little as $260 while the 970 is more like $320. Such are the perils of trying to cover both sides of the Atlantic. Either way, what you’re getting is a slightly cut-down version of each outfit’s finest current GPU (well, bar Nvidia’s new Titan X which is almost a category of its own). And that for me has always been the optimum.

Again, it’s complicated by vagaries of pricing each side of the Atlantic. But $260 for an AMD 290 looks like a steal to me. In the UK, I’d lean towards an Nvidia 970 for about £250 simply because I continue to suffer significant driver-related headaches with AMD graphics. It’s a cliché, but it’s a cliché because it’s actually true. AMD drivers remain more problematical.

As for what specific card to buy once you’ve chosen a chipset, I’ve never personally found it matters all that much. Occasionally something compelling appears. But of late that’s chiefly been the compulsion to avoid the awful factory cooler on AMD cards. Fortunately, almost any AMD 290 or 280X you can buy today will have custom cooling. So, I’d say choose according to price, a supplier you are happy with and the warranty cover. Good luck.

148 Comments

  1. Continuity says:

    Are we still caring about 1080p for some reason?

    • Rich says:

      The majority play at 1080p, so yes.

    • dontnormally says:

      “1080p […] is the most common resolution by a mile currently”

      Because not everyone is you, believe it or not!

    • doswillrule says:

      Most people looking to upgrade will already have a 1080p monitor. If the choice is buy a new screen and a more expensive card, or keep the screen and buy a cheaper card with the same performance at a perfectly serviceable res, that isn’t much of a choice. That will change as more games and TV/film provide the option, but a bit like a mechanical keyboard, 1440p is something you can’t easily sell the benefits of: you have to see it.

      • malkav11 says:

        Yeah, skimping a bit on monitor resolution is a step that pays significant hardware load dividends in the long run. I’m sure 1440p is nice, but it’s not more money on all my components nice and until it’s a default (which may not be until the next major console generation, if then), I won’t be biting.

        • Unclepauly says:

          I bought this 27″ Monoprice IPS 1440P for 300 usd and it’s the second best monitor I’ve ever owned (1st place goes to my sony fw900). I chose to skimp on the graphics card to get the screen and I feel it was the MUCH better choice. I just upgraded to a gtx 970 which maxes out most games at 1440P but my old gtx680(overclocked to 1300mhz) actually managed most games at high settings with OK framerates. Putting more significance on the screen imo is the wiser choice in 95% of instances for the simple reason that we can lower settings until that GPU upgrade.

        • TacticalNuclearPenguin says:

          1440p will probably be the new 1080p when the next GPU generation hits, and i’m talking 16nm, with 4k for the enthusiast minded who might just need a single high end GPU for most games, and SLI for the few others ( I’m talking TW3 and the likes ), while 1440p will be achievable on the same budget this article is targeting.

        • Geebs says:

          1080p may be more practical for gaming, but 1440 is much more practical for actual work, as in you can have two decent sized documents open side-by-side, etc. At 27″, the DPI is just about high enough that you can run games at 1080 and have them not look too much like arse, unless you’re running Far Cry or an Unreal engine game (I don’t know why, they just seem to scale horribly). And, yeah, as UnclePauly said, a 680 runs most current games at either 1080 or 1440.

          Plus a 680 confers the major advantage of not being made by AMD *duck and cover*

          • Geebs says:

            As a recent Retina convert, though, I wish I had enough money for a decent 4k monitor for my desktop.

          • TacticalNuclearPenguin says:

            Yeah, but if you want “retina” and still stay cheap on the hardware, then you’re no longer looking at games+work but work alone.

            Mostly because i still can’t believe how someone who feels the need for higher density would ever consider running non native, at least if we’re talking about perfect visual acuity ( over 10/10 )

            There are many myths around, like 3840*2160 being able to display 1080p properly since it’s double on both sides, but that’s untrue as well because of the subpixel structure of every pixel. You’d want 4 of them to perform as a single one, but that’s impossible as the whole block should divide itself into a single RGB pattern but instead there are 4 of them in a repeated pattern. Those 4, non unified patterns will destroy the sharpness of the image and everyone can try as it’s very easy to spot.

            I know i’m derailing your point a little but i find it important that it never be forgotten that anything non native on an LCD panel, no matter the situation, is always going to have a problem in one way or another.

            I respect your opinion about it if you find the tradeoff decent enough for you, but i’m writing all this simply because i noticed that more and more people are starting to err on the uninformed side of this matter, and i’m not talking about you there. Again, sorry for using your post as a platform for that.

          • souroldlemon says:

            @tacticalnuclearpenguin, you’re right about such a system being work-oriented.
            The subpixel geometry means that adjacent pixels are not identical, and this slightly (imperceptibly) distorts the image. Replacing that pixel with 4 pixels reduces that distortion. So the 4x resolution monitor showing a lower resolution image will not precisely reproduce the artefacts of the lower resolution monitor, but that’s a good thing.
            Displays use complex subpixel geometries, but a simple example configuration could be:
            R G
            G B R B
            R G
            G B R B

          • souroldlemon says:

            I’ll try again:
            **R*****G**
            G**B**R**B
            **R*****G**
            G**B**R**B

          • TacticalNuclearPenguin says:

            I think you have some confusion here, read this thing over there: link to lagom.nl

            Every single pixel has 3 subpixels, the layout can change but the point is that 4 pixels “stitched” together won’t display a uniform, bigger subpixel layout, but 4 discrete ones, as in:

            RGB RGB
            RGB RGB

            Instead of a super big RGB four times the size.

            You can try this yourself by setting your screen at exactly half of your resolution, both horizontal and vertical, so that you’re feeding the information of 1 pixel to 4 of them. It’ll look like a mess.

          • Geebs says:

            Oh, I totally agree that native resolution looks much better on an LCD monitor, certainly up to 1440 at 27″. On my laptop, “non-retina” applications look like total crap but the dot pitch is sufficient that 3D games actually look OK. I was kind of hoping the same would be true of, say, a 27″ 4K monitor, so that I could run native for text-based work and lower res for games. It’s disappointing to hear that it’s not (although I do get a fair amount of free antialiasing from my visual system these days :) )

          • souroldlemon says:

            @TNP: yes, what this means is that if your operating system is trying to squeeze out a bit more antialiasing, at the expense of colour resolution, by doing subpixel rendering, and the display hardware and the rendering middleware disagree about the resolution, it’ll make things look worse.
            If you want to see how images would appear on your monitor at half the resolution, you’ll also have to disable subpixel rendering in the OS.
            If the native resolution of the image is 1080p, and BOTH the monitor resolution AND the operating system resolution setting are twice that in each axis (which is how things are in practice for a high-res monitor) then there will be no artefacts, whether or not subpixel rendering is enabled.

          • TacticalNuclearPenguin says:

            Oh, i think i get what you mean now, thanks for the debate!

          • TacticalNuclearPenguin says:

            Oh and yeah Geebs, gaming wise the free “antialias” thing is sort of nice, i’m just worried that if i notice it in the UI it would spoil the game. Well at least for my taste, that is.

    • fish99 says:

      Everyone I know still games on 1080p screens.

    • McPartyson says:

      I care about 1080p. I bought a 50″ 1080p LED 2 years ago and I use it as my main pc gaming screen. I don’t plan on replacing the thing any time soon and games look just great on it.

      • melnificent says:

        Same here… We have a 42″ Samsung. I don’t see the point of ditching a good tv for more pixels, when I’ll then have to upgrade GPU and CPU to take advantage of them.

        • Unclepauly says:

          Well one other reason to ditch those TVs for pc gaming is the input lag. TVs range anywhere from 40ms (the very best TVs get this) to (I shit you not) 500ms. The average LCD TV gets around 100ms input lag, while my 27″ 1440P IPS monitor gets around 8ms and the average pc monitor gets between 15 and 25ms.
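
          To put those milliseconds into frames (a quick sketch using the rough figures above, at a 60Hz refresh):

          frame_ms = 1000 / 60  # one frame at 60fps lasts ~16.7ms
          for display, lag_ms in [("fast monitor", 8), ("typical monitor", 20),
                                  ("average TV", 100), ("worst TVs", 500)]:
              print(f"{display}: {lag_ms}ms = {lag_ms / frame_ms:.1f} frames behind")

          # average TV: 100ms = 6.0 frames behind; worst TVs: 30.0 frames behind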

          • TacticalNuclearPenguin says:

            The only thing you can do with a TV is disable everything and hope for the best.

            After all, each single one of those special features, like artifact-happy enhanced sharpness, magical color and contrast modes and any other kind of silly crap is extra processing, which means input lag, and i never saw a single TV set ( and i saw many ) that was clever enough to use any of those magical features and not ruin something somewhere.

            Skin tones are usually the first thing to die a horrible death, with grass as a direct follower. In the worst case fire and explosions can get a weird tint, or other stuff like the sea, especially if we’re talking about a tropical island in which case nausea might occur.

          • McPartyson says:

            I don’t notice any of the input lag you speak of. I have several 20-22″ monitors rated at 5-8ms in my family’s house, and I personally cannot see or feel any input lag on the 50″ in comparison to the smaller monitors. The 50″ is a basic Samsung LED 1080p/60.

          • TacticalNuclearPenguin says:

            Remember that input lag has nothing to do with the rated milliseconds though, that is simply a measurement ( and the methods for that are debatable ) of pixel response time.

            Sorry if you already know what input lag is, but since you said that i’m just taking for granted that you don’t. It’s basically the time it takes for any input to be actually displayed.

            It’s also something that you can definitely get used to if you don’t have many other different examples, plus not everyone is so disturbed by it. One of my heist companions in GTA lives pretty close to me and he plays on a TV set that also happens to be one of the worst offenders i ever saw, and when i play with that it’s absolutely noticeable. I simply feel less connection.

            It’s worth pointing out that he actually uses every possible filter and “enhancement” his TV is capable of, and it happens to have a LOT of them. I’m pretty sure he could shave a lot of time just by disabling most of that stuff. The sharpness is so cranked that you can see halos and other artifacts around objects, but he seems to think it’s ok, and that speaks volumes about how much one can get used to things.

    • iainl says:

      Yes, because I feel less bad spending money on my PC for my photography than I do for my games, so I went for a good, calibrated 1080p instead of a more “gaming” monitor with more, faster but inaccurate pixels.

      Since that also means I don’t need anything faster than my 770, that leaves more money for decent glass, too…

      • TacticalNuclearPenguin says:

        Hah, and decent glass is where the real money starts flying in all directions as if there’s no tomorrow.

        Oh, and welcome to the world of properly calibrated monitors, from which it is impossible to go back. It’s hard to make converts because few people really want to splurge on something they see no value in. If only they could see what we see in front of our eyes right now…

        • Unclepauly says:

          I’m with you on this, my TV (panny plasma) and pc monitor are both calibrated to standards, and now when I see other people’s displays they usually look like candy coated cartoons are playing when in fact it’s a national geographic documentary.

          • pepperfez says:

            I dunno, documentaries being rendered into cartoons sounds like a pretty big benefit to me.

      • Reavergold says:

      For photography, a 4k monitor and a calibrated 1080p monitor have been invaluable. Being able to see my image at almost the resolution I shoot at is amazing.

    • MikhailG says:

      Yes mister PC masterrace. Not everyone has money to blow on high end monitors.

    • Alfius says:

    I’ve mixed feelings about the convergence on 1080 as the de facto standard.

      On the one hand, it means developers can optimise their wares to look good and run smoothly at a small number of common resolutions, of which 1080 is the baseline and hardware manufacturers can (presumably) take this into account as well.

      On the other, the £200 1920×1200 24″ Samsung monitor I bought five years ago is still better than anything on offer at that price point today. I’d like to upgrade at some point but unless I want to drop £300-400 at least there’s nothing for me to upgrade to.

  2. caff says:

    I would be pushing for 1440p or 4k. I’m running 4k on a 970gtx without too many issues. It’s time the PC master race rises up and goes way beyond what crappy consoles can do.

    • Asurmen says:

      Won’t happen until it doesn’t cost your kidney to do so.

    • Vin_Howard says:

      60fps at 1080p is a lot better than the consoles’ current 30(ish)fps at 720p

      • melnificent says:

        30ish at 900p on PS4 and 792p (yes really) on the Xbone. I believe MS was getting upset about the “only 720p” stuff that was happening and pushed for a more than 720p approach.

        Either way though 1080p at a locked 30fps with every single graphic knocked to the max on a PC is still a large step above the consoles. I know, I know 60fps. But I prefer the locked 30fps my 270x is capable of versus the 40-60 variable rate if I push for 60.

        • DanMan says:

          Which is nuts by MS, because you usually sit much further away from your TV than a monitor, so the game resolution isn’t that important, unless you have a really huge TV (>60″).

        • TacticalNuclearPenguin says:

          But you still have an advantage as your locked 30 fps at least probably never dwindles, which was often not the case with the previous and current generations of consoles.

          I absolutely remember Mass Effect being for me unplayable or GTA5 always jumping from 15 to 25+ fps.

    • Nasarius says:

      I’m planning to build a shiny new PC late this year after Skylake is available, and my primary concern is the possible demands of VR. The Valve thing is two 1080×1200 screens running at 90fps. If my math is correct, that’s almost the same number of pixels per second as 4K at 30fps.

      So, I’ll definitely be looking at 4K benchmarks before I buy my next GPU.
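
      The arithmetic does hold up, near enough:

      vive = 2 * (1080 * 1200) * 90   # 233,280,000 pixels per second
      uhd30 = (3840 * 2160) * 30      # 248,832,000 pixels per second
      print(vive / uhd30)             # ~0.94, i.e. within about 6%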

    • Hebrind says:

      That’s all well and good, but I’m going to be gaming at 1080p until it doesn’t cost you £600-800+ to get the kit to game at 60fps on ultra at 4K. I have a life to lead, y’know .____.

      • DanMan says:

        Same for me, really. I finally have a system now, that I expect to handle everything I throw at it just fine at 1080p60. As much as I like to make the step to 4k, it’s still too early IMO. It doesn’t help that the OS and desktop software still don’t work perfectly with high ppi displays. But the horsepower you need for 60fps @max details makes it too expensive in the end.

        I could see myself make the switch to a 27″ 1440p G-Sync monitor though, since it wouldn’t have to run at 60fps constantly anymore.

    • hamilcarp says:

      1080p 60fps is still more than the new consoles are capable of in most new games. Besides, not everyone has USD$1.5k + to blow on a new rig. And cut out that “master race” nonsense, we shouldn’t have to constantly reaffirm the superiority of PC hardware, it comes off as insecure.

    • CookPassBabtridge says:

      @caff – In case you are feeling depressed, just to let you know that some enthusiasts do still exist on this site: I recently upgraded from my poor little gaming laptop to an X99, SLI 980 rig and have the Acer 144Hz IPS G-Sync 2k Monitor coming tomorrow. It’s the rig I have been wanting to treat myself to for years and years (read decades) and I finally saved up and went for it. Performs beautifully and can’t wait to see what it makes of that Predator ;)

      • MichaelGC says:

        Aha – is that the Acer XB270HUbprzomfgwtfbbq? (With the important bit being the ‘U’. Obviously…) I’ve been keeping an eye on those, but I’m waiting for Amazon UK to stop showing “Usually delivered in 1 to 2 months.” Usually? Usually. Righto, Amazon, whatever you say…

        Anyway, I’m very interested to know how the XB270HUqwertyuiop performs in the wild*, so if you happen to feel like sharing your experience with it in the future, please do, and I will keep an eye out!

        *As it were. Not a commentary on the state of your garden.

        • CookPassBabtridge says:

          Lol yeah I have no idea what all the letters are for but SCAN and Overclockers (UK) have them in now after months of filling backorders. I think newegg had them too. I will come back and let you know over the weekend, if I remember, what it’s like. I felt 2k was a better spot to aim for as it looks great, but I’m more likely to get those 120 (MAYBE 144, probably not consistent) FPS with SLI 980’s than if I went 4k.

          WTFBBQ
          Rofl :)

          • CookPassBabtridge says:

            OK so don’t know if you are checking this thread, but I have the monitor. In general it looks great, and when getting high frame rates the gameplay is super smooth with no tearing even though the FPS jump about. Looks like the G-SYNC is doing its thing. Colours look fantastic and the 2k resolution makes for beautifully sharp images. It even seems to add ‘presence’ to games – I had a shot at Metro: Last Light and it gave people a much greater sense of weight and a sense of almost being there. Not quite VR, but a strong effect.

            The major downside to it though is the backlight bleeding, which is really noticeable even in semi-dark scenes (which much of Metro is made up of). Apparently it seems to be a fault with a large amount of them. Mine is going back for exchange, but if the next one has the same problem I will probably go refund. I think if you spend £700 on a monitor it shouldn’t have yellow halos in each corner :/

          • MichaelGC says:

            Many thanks for the update! Yes, I don’t think I could possibly agree more with your final point, there… Fingers crossed for your replacement, and I think I will hold off for a little while longer and see if they can get those problems ironed out. Oh well, at least Peggy the Piggy Bank will be pleased to hear of her stay of execution…

        • TacticalNuclearPenguin says:

          Yeah, gotta love monitor names, right? I recommend checking Philips’ product range.

          • CookPassBabtridge says:

            The weird thing is usually stuff has a daft name OR a number. Ford Mondeo. BMW 320d. For some reason ACER decided to have both a GI Joe action figure name AND a daft number. ACER PREDATOR XB270HU. Now they’re just showing off (I think amazon sensed the daftness and made the other letters up for a laugh, as overclockers and SCAN don’t list them).

    • OmNomNom says:

      Lol. 4k on a 970 must be like watching snails crawl across your screen. All the 4k screens so far are poor for gaming too.

      A few good 1440p ones now though

  3. Rich says:

    Are stock coolers OK for Nvidia cards? The £250-ish ones are all reference cards.

    • frenchy2k1 says:

      Yes, Nvidia’s recent reference coolers are fine, some are even great (GTX980/970).
      They are all blowers, but work reasonably well. Their trade-off is a bit more noise, but they exhaust all the hot air out the back. Open coolers pump most of the heat back into the case, requiring better cooling there.

      Their high end coolers have been great since the first Titan and the 700 family. As they’ve kept the same model since, while their cards use less power, they are overdesigned for those cards and work really well.

      • CookPassBabtridge says:

        I have the EVGA ACX 2.0 cooler on my cards, and whilst it is a great cooler, it is really annoying having the air vent into the case, for 2 reasons. One is as you say – that heat is now in the case and must be got rid of (not too much of a problem with a nice big 140mm fan on exhaust duty), but the bigger irritation is that my case has some really nice dust filters keeping the thing nice and free of build up. The ACX coolers just bypass all of that, and you can see a film of dust beginning to form on the backplate of the card below it. Maybe it was a necessary evil for the cooler to work as it does, IDK, but it does keep a superclocked card nice and cool.

        • TacticalNuclearPenguin says:

          Remember that, after all, said cooler is even more impressive than it initially sounds, especially if you take into account that it has pretty much the same performance as others ( for example from ASUS ) that use a veeeeery dirty and cheap trick:

          They only bother cooling the chip itself, ignoring the extra heat from everything else, so their life is far easier but the cards are absolutely a LOT less happy about that idea.

          The ACX cooler works for the whole card, which is the only honest and genuine way of doing that job, and i can assure you that other developers would do the same if they were forced to show you the thermal readings of the other stuff they are ignoring.

          Of course i’m not making precise examples as i can’t remember which brands ( and models ) are using approach A instead of B, but i know for a fact that the ACX cooler is like that and some models at least from ASUS cheaped out on it. I can’t be more specific than this but as usual, always read a lot of really in depth reviews before buying stuff, and in your case you can absolutely be happy about your purchase.

    • DanMan says:

      Merely OK. Custom ones are usually quieter and better at cooling.

  4. mkreku says:

    “It’s a cliché, but it’s a cliché because it’s actually true. AMD drivers remain more problematical.”

    Really?

    Could you please specify which games are giving you problems? Or what other problems you have with AMD’s drivers?

    I keep reading this general complaint a lot, but no-one ever specifies what’s wrong, and it’s so opposite to my own experiences with AMD cards/drivers.

    • thedosbox says:

      For those handful of games that use it, OpenGL support isn’t the greatest. And while it’s been over 18 months since I last used an AMD card (a Sapphire 7870), texture shimmer was common in a variety of games.

    • Rad says:

      He and/or she also says that they have had little experience with AMD cards in recent memory:

      “For starters, my experience with AMD chips in-game has dwindled so much in recent years, I’ve lost that gut feel for what to expect.”

      As for AMD’s drivers, I haven’t had a single problem related to the drivers for about 5 years, and the current 14.12 Omega drivers are very nice.

      • Dukey says:

        He was referring to AMD CPUs there.

        Personally, I used to have a fairly budget AMD card in my laptop (which for a long time was the only thing I could game on). For what it was, it performed pretty well in most games. The big problem I had was that not long after I started using it, the Catalyst Control Centre stopped working and never started again. The process was there in my Task Manager but it just wouldn’t open. I couldn’t change any settings whatsoever for my graphics card ever again. The next four years were spent trawling the internet for a solution and all I ever found was people with the same problem and no fix. I tried absolutely everything.
        On a laptop where this was my only method of changing the LCD panel scaling settings, this was a huge problem for me. I also had to dig quite deep in the AMD website to find if there were new drivers available and when there were, they would often fail to install.

        So when I finally upgraded to a new PC I practically wept tears of joy when all the accompanying software with my new Nvidia card actually worked. I’m now on my second Nvidia card and I don’t think I could ever go back. Just the whole experience of getting things running smoothly, tweaking settings and finding and updating drivers is so much easier, and actually works 100% of the time, instead of just when I’m lucky.

        (obviously this was all a few years ago and based purely on my own experience)

    • Vin_Howard says:

      Well, having owned an AMD GPU for the last 4 or so years, I have only had two issues:

      1. an update broke my computer; although I suspect there was something else wrong with my computer at the time and I ended up doing a factory reset

      2. My mouse cursor breaking when playing RTS-interface games (this has been an issue with AMD drivers for several, several YEARS now)

    • Jeremy Laird says:

      It’s not so much game support as problems with dual-screen, dual-card and things like boot-to-black screen post driver update. The latter might not be game-specific, but it’s hard to game with an entirely black screen!

      I’ve had numerous such issues in the past 18 months with various AMD cards. I’ve had to do clean installs on at least one occasion I can recall due to AMD boards refusing to do anything but boot to black screen. It’s beyond exasperating. I know other hardware guys have a similar attitude to me – always a bit nervous of testing AMD boards for work due to the elevated risk of ballache and days lost to problems. I can only report what I find.

      • FriendlyFire says:

        I know it’s somewhat anecdotal, but while debugging graphics issues for the mod I’m working on, I’d say something like 75% if not more is AMD-related. Generally, people don’t update their drivers, and for whatever reason the older drivers tend to be a lot more problematic on the AMD side. Even with updated drivers I get some crash/bug reports and every single one of them has been AMD so far, reports which I just can’t reproduce on my Nvidia card.

        It’s rather frustrating in my case because I’d much rather run into as many bugs as possible myself so I can fix them faster! Yet, I prefer being able to play without issue.

        • Grizzly says:

          A year or two ago, AMD drivers tended to push performance at the cost of stability, with a two-month release schedule. They have since dropped this practice as it was silly.

    • Maltose says:

      I would have agreed with you 3 months ago, but 2 recent releases gave me some problems, with both the latest stable and beta drivers. My card is an AMD 7950 (basically an R9 280).

      1) GTA 5 – MSAA is broken, it causes the bottom right corner of my display to flicker and artifact (imagine the “hacking” graphical effect in Watch Dogs, only as a bug, not a feature)

      2) MKX – Serious artifacting around characters, even at lowest settings.

      • TacticalNuclearPenguin says:

        With the equivalent of a 280 you’re actually quite lucky that MSAA is sort of broken, that way you don’t feel compelled to sacrifice the majority of your processing grunt for it.

        With only FXAA you might actually be able ( on 1080p ) to raise a lot of other good things, maybe not on Ultra, but if you stay clear of abusing the “grass” settings and keep tessellation very low ( which is AMD’s Achilles’ heel ) you might be in for a decent time.

        • All is Well says:

          This is really true. GTAV performs quite well on the HD7950. I have one too and I can max out the settings and still get a consistent 60 FPS, but if I turn on MSAA, that drops quite sharply. 2x MSAA is around 20-30% performance decrease and 4x is around 50, sometimes more.

    • phuzz says:

      Far Cry 2 just refuses to run on my rig ever since I moved from nVidia to AMD. I’ve also had a few crashes to desktop in various games, but keeping the drivers up to date seems to fix that.
      I’d say that AMD drivers are still a little behind nVidia, but not by much these days. It also depends a huge amount on what games you’re playing.

    • Moraven says:

      I must not play the problem games. Never had an issue with AMD/ATI cards for a decade.

    • BlueTemplar says:

      I can give you specific examples here:
      1.) StarDrive 2, recently released, Unity 4.4, ATI seems to have more of this problem:
      link to steamcommunity.com

      2.) Zero-K, OpenGL, many issues with ATI:
      link to zero-k.info
      (though it would seem they have been mostly fixed now?)

  5. Vandelay says:

    I’ve been thinking I will probably hold tight on my current computer until sometime early-ish next year and then build something new. Do we know whether there will be any new cards coming up in this sort of price range?

    My current card was in the £200-250 range (a 6950.) To be honest, I’ve been a tad disappointed in its performance. Having come from a 4850 (which, I think, was about £120,) it definitely was an improvement, but it wasn’t the mind blowing performance boost I was expecting from a card that I would have thought to be in another class. Instead, it felt like I had just gone and bought an equivalent modern card of what I had before. I suppose that might be more a testament to the superb 4850, rather than the 6950 not being great, but it was a bit of a disappointing surprise for me.

    Not that it is a terrible card. I’m still using it very nicely at generally medium-high settings, getting between 40-60fps in most games. I probably will be staying clear of AAA games this year, but I expect it would run them okay, mostly.

    • Maltose says:

      link to reddit.com

      Since there are no good price trackers for newegg, I use this subreddit as a de facto price history by searching it for whatever I’m looking for. You can get pretty good deals too.

  6. Chaosie says:

    This is a strange argument I try to make to my friends or literally anyone who asks whenever I outline my new PC builds. I don’t generally spend more than the 200-250 range on a gpu because of diminishing returns.

    When I do a build I have a price point in mind, and generally it’s based on what I’ve historically found delivers the most performance per dollar. Yes I can spend more and get more, but I’m just blowing money at that point. When it comes to brand, it is, for me, whatever is working at that price. For most applications one brand or the other is ‘fine’. It’s not at all like the days of the 3d accelerator race.

  7. fish99 says:

    The big drawback to the 960 is most of them are 2GB. I’m seeing quite a few of the current generation ports using over 3GB vram. The 4GB versions are £200 so you may as well get a 970 at that point, that extra £50 gets you a lot of extra performance.

    I would say that though, I bought a 970.

    • Baines says:

      Wasn’t the controversial issue with the 970 that it only buys you 3.5GB of quality performance, and that games accessing the last 0.5GB of RAM could cause visible stuttering?

      • zat0ichi says:

        I’ve read the same.
        Some people have done some serious analysis and have found a measurable performance drop off.

        In real world situations I’ve found it hard to generate any 0.5GB-related stuttering.
        Farcry4 at 1440p DSR to 1080p was janky.

        I push GTAV and I get stuttering with everything maxxed, but there is a line between 1440p (DSR) with MSAA 2x and 4x where the stuttering kicks in. I also use that nvidia MFAA optimisation thing. The other irritation is that GTAV at my preferred settings runs between 45 and 60 fps, which means Vsync is at its worst and without it the screen tearing is horrid.

        970 the perfect 1080p card? It should have been but isn’t.

        • Unclepauly says:

          To be fair Far Cry 4 stutters even on cards with 6gb vram and GTAV is a vram HOG. Rockstar did a great job optimizing the engine for vram overflow though and it just seamlessly spills into system ram if you go over. People with really fast system ram can actually get performance increases because of this.

          • fish99 says:

            I’m getting good results in GTAV with 1440p DSR and FSAA. Good compromise between AA, perceived resolution and framerates on my 970. Performs better than 1080p with MSAA or TXAA.

      • fish99 says:

        True, but it’s still better to have 3.5GB than 2GB. I have no idea if the 4GB 960s have the same issue.

        As for whether it’s a big deal, I haven’t seen major problems when my 970 pushes past 3.5GB, dunno if that’s because nvidia have improved things with the drivers, but I have seen some games stick stubbornly to 3.5GB as if they don’t know the extra 0.5GB is there.

        • Baines says:

          From what I recall, Nvidia said that they weren’t going to try to do anything about it. Nvidia says the performance hit is acceptable, the card works as intended, and that poor performance is the fault of poor memory management in the OS/program.

  8. pepperfez says:

    Since this is the GPU recommendation place, does anyone have any experience with the Asus/MSI mini 760/970? I’m afraid of it being jet engine-loud.

    • Siimon says:

      Temps can become a limiting factor, especially in a small form factor case. You’ll have to run at lower clock speeds, where full form cards can run at their full boost/OC performance.

    • thanosi says:

      I’m not going to pretend to be an expert but i built a small form factor itx pc a couple years ago when the 760 came out and purchased the Asus mini, and with very few exceptions i’m still running everything on at least high if not ultra at 1080p (i game on the tv) if the game is older than a year. It’s super quiet and i only get heat issues if i do something crazy like turning on whatever the high end processing feature in the Witcher 2 was called.

      • pepperfez says:

        I’ve got a long enough backlog that stock 970 performance will probably hold me until quantum computing as long as it’s quiet, so that’s reassuring.

  9. John Connor says:

    CTRL + F “144hz” – not found.

    CTRL + F “120hz” – not found.

    Did I actually stumble into the Republic of Peasant Systems?

    Higher resolutions are nice but higher framerates feel amazing.

    • Siimon says:

      RoPS? Really? Y’know, at <$250 you're probably not talking about high-end gaming at 144hz anyway.

      A) If someone is buying a 200-250 graphics card they'll probably opt for a monitor in the 100-200 range, not the 200-600 range.

      B) For most games, a ~GTX960 card won't get 144+ FPS anyway.

    • Unclepauly says:

      Hmm, the article is focusing on mainstream gaming at 1080P. Pretty sure those refresh rates fall outside this realm.

    • thanosi says:

      Sounds more like you stumbled up your own ass

    • HidingCat says:

      There always has to be someone bleating about high refresh rates…

    • OmNomNom says:

      This is what confuses me as well. People are more worried about 1440p and 4k, but really 1080p @ 144hz with a little anti-aliasing is the better package.

      You need a LOT of power to get decent (> 100) framerates at 1440p, let alone 4k.

      • HidingCat says:

        That’s for you. I prefer a decent colour representation and high resolution screen.

    • phuzz says:

      If you can afford a monitor that can run at 120Hz+ then you should be able to afford to spend more than £200 on a graphics card.
      Most of us can’t.

  10. Siimon says:

    I think you’re underselling the 960 here. Especially the 4GB version. For $100 less than the 970, it is a great card. With the new tech in the 960 (vs 7xx/6xx) you get much better relative AA, tessellation and AO performance.

    As for “Nvidia’s Geforce GTX 960. Instinctively, it’s a GPU that makes me very uncomfortable with its pitiful 128-bit memory bus”

    That 128bit bus isn’t as bad as it sounds. With their compression tech the low bus width isn’t /as/ bad as one would think.

    link to extremetech.com (Metro: LL is a good example)

    • Unclepauly says:

      Not true. The 6xx/7xx series had very strong AA performance. For example my 680 gets 15% higher framerates than my 970 when supersampling. Turn off AA though and just rely on shading performance and the 970 is like 30-40% faster.

  11. L3TUC3 says:

    I’m actually at this pricepoint looking at something suitable to replace my HD7850 1GB. It doesn’t meet Witcher 3 specs.

    For AMD the 285 looks ok, but I’d much rather get something 2gb+ so I can maybe squeeze an extra year out of it. I don’t feel much for the 280(X) or the 290s.

    For Nvidia I can’t find anything my wallet likes and does equal or better performance wise. But, there are deals for 970s with the Witcher 3 included, so that’s not so bad either.

  12. ZIGS says:

    I currently hate my GTX 670. Too old to run (most) games maxed/60 FPS but still good enough to warrant being replaced :(

    • Unclepauly says:

      The 670 is an awesome card still, especially at 1080P it should be maxing most everything out there barring any vram limits. SLI 670’s with 4gb vram is still a monster setup.

    • McPartyson says:

      My GTX 670 handles everything just fine 1080p 60fps. I am running it with an i5 3570k clocked @ 4.5…nonetheless, I don’t always get the highest of the ultra settings and I don’t usually use MSAA, only FXAA in most recent games….but it works great! and I honestly just don’t think it’s worth the upgrade to a 970. I’ll wait until the next chipsets are released.

  13. Radiant says:

    For 30 quid more I can buy an Xbox One console.
    This is why PC gaming remains niche.

    • Unclepauly says:

      Xbox plays games at 720/900p and 20-30fps? Also “niche?” GTAV just sold over a million units on steam in less than a week. How is that “niche”?

      • Unclepauly says:

        Lol there was almost 10 million players on steam a couple hours ago. If it was like 200,000 I would agree with your niche claim but there’s more players on steam than on xbox one.

        • Radiant says:

          Put your pitch fork away I’m typing these pithy comments from a pc.

    • Deadly Sinner says:

      Maybe if you got a refurbished One. At the prices I’m seeing for new, they’re over £70 more. And that’s without the Kinect.

      • Oktyabr says:

        Agreed. And until consoles support generic USB keyboards and mice as *gaming* input devices they will fail to replace the PC for gaming. I had a buddy, who just bought a GTX 960 based PC, ask me the other day if he could use his console controller to play Arma 3. ~rolls eyes~

        Steam Box anyone?

    • McPartyson says:

      nice try, have fun with that console for a week or 2, then it will gather dust while you go back to your PC. I think it’s safe to say we’ve all been there and done that at some point in our lives…

      • Phasma Felis says:

        It baffles me that we’re still having holy wars over platforms.

        Seriously, this boils down to “I enjoy MY games more than you enjoy YOUR games!” “Nuh uh!”

    • OmNomNom says:

      Even if this were true, how much more will you spend in a single year on games and subscriptions to services? It seems for the pleasure of owning that cheap console you pay ~25% more for each game.

      Of course this goes without saying but… enjoy your 900p @ 30fps.

    • fish99 says:

      TBH you make the difference back with cheaper games on PC, and by not having to pay for multiplayer.

  14. *Junon says:

    A little hunting and a little patience found me a Sapphire R9 290X for $300 USD. It replaced a 2GB Radeon 6950 that had been aging gracefully until recent titles like NBA 2K15 and GTAV. The old card was also $300 back in 2011 so I’d say it was a decent run. Totally satisfied with the performance of the 290X, especially in addition to the much cooler running temps. Part of me wonders how a GTX 970 might have been but I’m happy to spend a few more years on AMD.

  15. MegaAndy says:

    I’m on the fence about upgrading from a 470gtx to a 970gtx. It still runs most games on at least high/medium but I really fancy maxing all games again.

    Thanks for the comment on CPU bottlenecking, I have been wondering if my 4/5 year old i7 950 3.06 running at 3.2ghz would become the bottleneck if I upped the GPU.

    • OmNomNom says:

      The 970 is an awesome card for the money. The only other thing I would consider is if you find a cheap 780 gtx or ti. They’re very nearly as fast as the 900 series and may be going for a lot less (maybe you can afford 2?)

    • DanMan says:

      I made the jump from a highly OC’ed 460 to a 970, while keeping my moderately OC’ed i5 750. It still made a big difference, but it depends a lot on where the game is more demanding. The more CPU-bound a game is, the less of a difference you’ll see. But I think it was worth it all things considered. The power consumption hasn’t really changed either, so I didn’t need a new PSU.

    • Unclepauly says:

      CPU bottlenecking with that proc is going to be on a game by game basis. GTAV? bottlenecked. RTS games? bottlenecked. Crysis 3? severe bottleneck. most first person shooters you are fine but there are some that require massive cpu power. Don’t even try to play arma 3 multiplayer

      • DanMan says:

        You’re over-dramatizing. It just means that you have to turn a few settings a notch down or two.

        • TacticalNuclearPenguin says:

          If you have a CPU bottleneck there is not much you can do with graphic settings though.

          • Asurmen says:

            There’s quite a lot you can do with graphics settings that affect the CPU depending on the game. Arma 3 is one of them.

          • DanMan says:

            Geometry detail affects CPU, for example.

          • TacticalNuclearPenguin says:

            Of course you can, but many games see very little difference, and let’s not even get started on most MMOs and other almost single-threaded games. Plus as you might know, trying to reduce the CPU load in Arma3 can’t compare to actually upgrading, meanwhile this is not true for GPU bottlenecks where settings can have absolutely huge results.

            A 760 should have little problems in 1080p for GTA5, yet a friend of mine can’t solve his less-than-60-fps situation even with 90% of the stuff on normal ( which is the lowest ) and no AA, and that’s thanks to his Phenom. This is just a random example among many others.

            Again, you can often find a way around your issues in a PC game with plenty of settings, i’m not denying that, but the issue of a CPU bottleneck nowadays is so easy to fix, and so rewarding considering all the extra motherboard features you get, that i wouldn’t think twice about that. If you do it now especially the amount of future proofing is also incredibly huge.

      • MegaAndy says:

        I have a feeling I ran crysis 3 at almost highest settings anyway actually. I imagine GTA 5 really is a CPU hog though, currently run it on medium/high.

  16. TheSplund says:

    Running a 970GTX with a 1050 display (though looking at a triple head – 3×1680*1050 – setup soon) – bought the 970 as my two 560ti’s couldn’t cope – simple really.

    • OmNomNom says:

      Unless you really love your sims, I wouldn’t bother with triple head. Causes more problems than it is worth often.

  17. A Gentleman and a Taffer says:

    Currently planning/dreaming of a 27 inch 1440p monitor + GTX 970 combo in the near future. An upgrade costing as much as the whole PC originally! I get the impression the extra £50 to jump from 960 to 970 is worth the investment, especially if you want to be churning 1440p plus at decent frame rates.

  18. Delicieuxz says:

    1080p, or 16:9 aspect ratio is for PC gaming amateurs. The true PC Gaming Master Race uses only 16:10 aspect ratio, or 1920×1200.

    • Unclepauly says:

      I have ascended from PC master race to my true form as PC GoD.

      I have a 24″ sony fw900 that does 2304 x 1440 which is 16:10 at 1440p resolution. It is truly a sight to behold.

      • Delicieuxz says:

        That must be nice. You have ascended to a PC Super Sayajin God.

  19. Katar says:

    It’s currently sold out everywhere but ASUS are selling an R9 290X with a proper cooler for £230, and the R9 290 can be found for under the £200 mark though you might have to go for a stock cooler and/or a crappy brand.

  20. Dread says:

    I’ve been pondering about this question for a while now and hoped this article would give me some insight to make a decision, but I’m still in the same conundrum.

    Here is the situation, my PC is four years old now. i5 2500 with 3.3GHz, 8GB RAM, Geforce 560 TI with 1GB, 1080p monitor. Normally, I would replace the entire machine this year, but considering next year the Geforce 10 with full DX12 support hits, I’m more inclined to wait a year longer for a completely new PC. However, the graphics card simply is not up to snuff anymore, as it’s even below Witcher 3s minimum specs.

    I’ve been considering whether to get a 960, 770 or 970 for a while now and I just can’t make a decision, as it’s mainly a price and value/money issue. Here in Germany, all the graphics cards appear to be quite significantly more expensive than in the UK or US.
    The 970 goes for around 330-360€ here (=360-400$), the 770 is only available used on ebay auctions for about the same price as a new 2GB 960 – 200€ (=220$), there are also 4GB 960s available for about 270€ (300$). If the 970 was available for that price, I would buy one in a heartbeat.

    I really don’t know what to do here, I feel the 960 is not powerful enough, but then again, it only needs to run 12-18 months, before I buy my new PC anyway. The 970 is good, but it’s so expensive, considering the short lifespan it’s going to get. And with the 770, I have a personal aversion towards using used electronics and I just can’t think it’s worth buying a used 770 for the same price as a new 960.

    • zat0ichi says:

      get a 2nd hand 770 4gb then sell on again in a year.

      • Dread says:

        4GB used 770s are in the same price range as 4GB 960s (270€). The 770s for around 200€ are all 2GB variants.

        You think this would be worth it?

        • zat0ichi says:

          I didn’t realise the prices were so close…
          4gb is a must IMO

          770 has the edge (about 10% higher FPS) but will run hotter and hungrier, and as the 960 is newer tech it might hold its retail value a little better.
          960 will have a warranty, 770 you’d have to hope the previous owner has cared for it.

          Maybe try the red team? I don’t say that lightly as a life long Green team member but in a pinch where outlay needs to be at minimum the Red team does seem to punch above its price point.

          4gb though, don’t skimp on that.

    • Asurmen says:

      Why are you going to get a short life span from the 970?

      • Dread says:

        As I said, I plan on upgrading my whole machine sometime next year, after the Geforce 10 series hits the market. So, a 970 will only run for 12-18 months before I replace it.

        • Dread says:

          By upgrading my machine, I mean of course build a new one.

        • Asurmen says:

          Get 970 now and another one in SLI when you rebuild?

          • Dread says:

            No, the whole point of me waiting another year is because I want a Geforce 10 gpu in my new rig.

    • TacticalNuclearPenguin says:

      A good cooler and your 2500k on 4.4 ghz, you’re sorted for another 5 years. Just focus on the GPU!

      And yes, the 770 as the article said has more of a high-end spin to it. Maybe it’s going to be similar to other newer but lower end cards in a few benchmarks, but when something starts to push at the memory bus or just requires a more rounded architecture, you’ll see the gap widening.

      • Dread says:

        I guess so, but it’s a ritual for me to get a new machine about every 4 years. Besides, there is other stuff which needs upgrading, I don’t have an SSD for example, as back then they cost almost 2€ per GB, compared to 40 cents now. Literally, the only reason I don’t build a new rig right now is the fact that DX12 and the Geforce 10 series are on the way and I just don’t think a Geforce 9 card will provide the longevity I want because of this.

        Well, that’s two votes for the 770 so far. I think, I’ll try to find one of those then and if possible with 4GB.

        • TacticalNuclearPenguin says:

          Oh don’t get me wrong, mine was just the cheap way advice, i never oppose a good solid refresh!

  21. Dr_Barnowl says:

    What about VR?

    The key thing for VR is stable, high framerates (in excess of 60Hz and preferably 90Hz, I think). I presume this is going to be at 1080p for the first generation of consumer VR headsets.

    An indicator of the best single-card solution for a VR headset would be great.

    I agree with the people saying 1080p is a sensible target now, for this reason amongst all the others. For cross-platform console ports, 1080p will be where the design is for a while, and the assets just won’t look any better rendered at 4k. I’d much rather have smooth and good looking at 1080p than an incredibly pretty slide show at 4k. For much the same reason my passive-video watching is still in the SD era; 1080p resolution doesn’t add a sparkling plot or an improved script to the SD version of content.

    • Oktyabr says:

      Thank you! FINALLY someone said it! I truly believe that 2015 is the year that VR really becomes mainstream.

      It’s why I’ve held off on 4K and 1440p monitors and continue to game on a single 24″ @ 1080p (with a second one for skype, etc.) Same with video card. I’ve felt the itch to jump on a new card (current HD6950 OC’d to 6970 is playable) but why buy *anything* now until I see what sort of requirements the new VR headset I am *definitely* going to buy has?

      • Wisq says:

        That’s a pretty long time to hold off. I’ve been gaming on 1440p IPS for over four years now.

        I’m looking forward to the Rift, Vive, and/or any other comparable VR solution as much as the next person, but video card technology also probably isn’t going to make any quantum leaps between now and then. And VR isn’t going to completely replace the desktop for a long time, maybe never.

        May as well upgrade to 1440p, which is (IMO) a nice compromise between the datedness of 1080p and the unrealistic-for-gaming 4K.

        • Oktyabr says:

          Nah, I disagree.

          It may never completely “replace the desktop” but it’s definitely the direction we are headed in. The games I play today will benefit from VR tomorrow… mostly ARMA 3 and some racing sims that already have “free look” head movement built in. More importantly it will change the way developers think about games. Will you be able to play it on a monitor? Sure, probably… but more and more new games will be “backwards compatible” with 2D tech (monitors of any kind), designed with either full VR and/or “augmented” reality as the primary target. Seen the tech video of the augmentation stuff that Microsoft is working on? A dude kicks back on the couch and something like a 70+ inch *virtual* big screen TV appears on his living room wall with the Netflix logo emblazoned on it. Oculus from the start envisioned a virtual movie theater experience you could share with friends… think IMAX in 3D, without ever leaving your home. And who knows what tricks Valve will bring to the show. We are less than a year away from beginning of the great VR wars, and yeah, until some game or application comes out that justifies “upgrading” to 1440p before that, I’m pretty content to wait just a little bit longer.

          For the rather high entry fee of a quality 1440p experience today, what would it really give me? As I said I already run two 24″ 1080p monitors for desktop work and a single is fine for the games I play. And a year from now they will work just as well when I hang up my VR/AR headset(s) for the more mundane.

  22. The Sombrero Kid says:

    Please Please Please Please Please stop calling the total gpu ram the frame buffer. I’ve no idea why all games journalists seem to have gotten this wrong, but the frame buffer is the buffer in which the completed frame to be rendered on screen is stored, i.e. at 1080p it is 1920×1080 pixels at 24 bits per pixel – at that resolution it is always 6MB. A 2GB frame buffer would be absolutely preposterous!
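
    For the record, that arithmetic checks out (a quick sketch; one completed 1080p frame at 24 bits, i.e. 3 bytes, per pixel):

    frame_bytes = 1920 * 1080 * 3  # one finished 1080p frame
    print(f"{frame_bytes:,} bytes")  # 6,220,800 bytes, roughly 6MB
    # The other ~2GB on a card holds textures, geometry, render targets
    # and so on, not the frame buffer.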

  23. TimRobbins says:

    My rule is to not worry about gpu upgrades until the $200-250 range doubles the memory of my current card. At that point, developers usually have gotten used to the mass-licensed engines that utilize the tech and monitors that support it have dropped in price. Wait for a black friday sale and you’ve got a fairly inexpensive but huge upgrade to last 5-7 years.

  24. Grovester says:

    Don’t do what I did and buy a 970 two weeks before they do a promotion to get a free copy of Witcher 3.

    And six weeks before the Witcher 3 and GTA V bundle.

    Sigh.

  25. cammackarzie says:

    I bought a gaming pc from DINOPC the other week and I’ve been using it almost a week now. It came with a 6 core processor, each core at 3.5 ghz, as well as a radeon r9 280x graphics card. I’ve been playing Evolve at the highest detail at a constant and steady framerate, with lag only when skydiving; apart from that I have no trouble. Alien Isolation runs at ultra perfectly with no lag. For a graphics card around 200 pounds, I would recommend it. Although this is my first PC gaming machine, so I’m a little naive. Glad I made the right choice according to this article :D