
Hard Choices: Graphics Cards

The only 4 graphics cards you need to even consider buying

Hello, good morrow and, well, graphics. After my début – and, let's be honest, definitive – dissertation on PC processors last month, this time around we're talking pixel pumpers. The bad news is that this instalment won't be nearly as neat as the first. With CPUs, I can point at the Intel Core i5 2500K and bark, “buy it”. Job done. Things are a lot more fluid and complex when it comes to GPUs – but even so, when it comes down to it you only need to trouble yourself with four cards today. The buying decision remains rather easy.

With that, dig in, get comfy and let's begin. Last time I left you with a couple of not-terribly-perplexing numbers to ponder, namely 1,920 and 1,080. No mystery, I was talking about pixel grids. Like it or not, 1080p has become the de facto screen resolution standard for mainstream PC monitors on sale today. As it happens, the latest Steam survey confirms 1080 is indeed by far the most popular res in terms of the installed base of gamers, too.

On that note, an informal survey of the screen res propounded by the pukka RPS crowd rather than the mouth-breathing, Steam-powered masses would be fascinating. So commenters, please preface your opening salvo with a quick screen-res stat. I thank you.

Back on message, 1080 is absolutely what current and future consoles are all about, too. Mobile aside, it's the resolution the whole gaming industry has consolidated around. And that, for the most part, is a good thing. It's rendered ultra-high-end graphics chips pretty much irrelevant.

There will always, of course, be exceptions. As RPS's very own Alec 'Fingers' McMeer will attest, I have 30-inch uber-res (read 2,560 x 1,600 pixel) panels in almost every corner of my sumptuously appointed Georgian apartment. Then again, as Alec would also attest while appealing to the alleged and, I would argue, slanderously fictitious collection of dead bodies in my underground vaults, I'm not like other people.

30 inches and a couple of grand's worth of PC monitor you almost definitely don't have

The point is that, taken as a whole, you lot aren't likely to be pumping more than 1,920 x 1,080 pixels per frame any time soon. And that makes a big difference when it comes to the sort of graphics chip you need to buy.

But before we take a look at what boards you should be bagging right now, first some context in the form of the back story of how we got where we are today. This isn't just about chips and bits and transistor counts. It's about the punctuation points graphics technology has inserted into the history of PC gaming.

T n' L, baby: It all started with the NVIDIA GeForce 256

The first GPU ever was the NVIDIA GeForce 256. That's a simple fact, mainly because I say so but also because a GPU, or Graphics Processing Unit, is not the same as a mere graphics chip. The 256 was the first graphics processor for the PC with hardware transform and lighting acceleration. It made a huge difference to performance and image quality and it set the tone for PC graphics as we know it today.

It was around that time in the late 90s that PC gamers got their first glimpse of high resolution 3D graphics with smoothly filtered textures. For me it was actually the earlier NVIDIA TNT2. No matter, I'll never forget the first time I saw Tomb Raider running on a 3D card with proper texture filtering. It may have been courtesy of a miserable 15-inch goldfish bowl of a monitor (Trinitron, yo). But nothing has been nearly as dramatic since. Nothing.


Since then, there have been plenty of significant waypoints. The NVIDIA GeForce 3 and ATI Radeon 8500 introduced the programmable shaders that underpin many of the photo-realistic effects, such as oh-so-shiny water, we all take for granted today.

Other big milestones were the ATI Radeon 9700 Pro and NVIDIA GeForce 6800 Ultra. Both delivered preposterous increases in performance and image quality compared with previous generations and created expectations the industry has been chasing ever since.

Actually, it was the 6800 that really delivered on the notion of ultra high quality, high resolution gaming for the first time. I vividly remember struggling to believe the images in front of me were being rendered in real time as I soaked up the buttery smooth frame rates the 16-pipe 6800 kicked out in Far Cry at 1,600 x 1,200. Verily, it was the stuff of geeky gaming nirvana.

Full of buttery goodness thanks to the GeForce 6800 Ultra. Apparently.

The next big change was the shift to unified shader architectures, when the focus went from counting pixel pipes to totting up stream processors. Such is the life of a PC hardware hack.

To balance the big successes, there were some even larger failures along the way. NVIDIA dropped the ball horribly with the GeForce FX 5800 Ultra, aka the DustBuster, of 2003 and its broken 32-bit maths. Meanwhile, AMD hit the wall with 2007's Radeon HD 2900 XT, which was just awful all round. Overall, however, progress was relentless.

But so too was the inexorably upward creep in prices. That GeForce 256 card sold for around £200 back in 1999. By 2004 and the 6800 Ultra, you were looking at £400 for a high-end board. Today, it's as much as £600.

GeForce 6800 Ultra: 16 pipes and silly money

Frankly, it's gotten out of control. Funny thing is, AMD (or ATI as it was then) proved it didn't have to be this way. In 2008, ATI wheeled out the Radeon HD 4870. It wasn't the fastest GPU you could buy. NVIDIA's GTX 280 was quicker. But for £200 it gave you 80 to 90 per cent of the performance at a fraction of the price. It was a no-brainer.

At the time, AMD promised us this was the model for the future. Instead of a futile willy waving contest over ultimate performance and the ever increasing prices that went with it, the plan was to target the £200 price point and give gamers the best possible experience. Hallelujah.

It was a brilliant idea and, for me at least, PC graphics has never been the same since. When the 4870's successor, the Radeon HD 5870, launched at £300, well, betrayal and mortification figured highly. Things didn't get much better with the 6900 series and with the new Radeon HD 7970 up at £450 and beyond, it's as if the 4870 never happened.

Top end performance for £200? That was the Radeon HD 4870.

At least it would be if it weren't for the minor matter that kicked off this column, the emergence of 1080 as the standard screen res. In the context of 1080 panels, I put it to you that spending much more than £200 on a graphics card makes absolutely no sense at all.

If that's a clear and categorical message, the question of video board specifics is a lot more complicated. For starters, the sheer range of graphics chipsets on offer is huge. Both AMD and NVIDIA knock out all sorts of chips to suit different parts of the market. Then there are variations within chipsets according to things like the number of memory chips, clockspeeds and coolers.

Things change a lot faster in graphics country than they do in processor land, too. The threat of an all-new GPU that makes everything before seem irrelevant always looms large. CPU development tends to be more incremental. And anyway, CPUs are less critical to gaming performance. So long as you have that Core i5 CPU I've already told you to buy, the performance bottleneck is almost always going to be the graphics card.

Still, what I can do is whittle the list down to manageable proportions at a few key price points and also provide some general rules of thumb that make life an awful lot easier. The first lesson involves memory, both the chip count and the bus width over which they communicate with the GPU.

Do not be seduced by sheer memory amount. There is nothing more gruesome than a low end GPU buried under a hill of crappy memory. Worse than being no benefit at all, the slower memory chips used in cheap cards with big buffers actually result in worse performance than their more modestly memoried brethren. Moreover, that 1,920 x 1,080 pixel grid puts a cap on the amount of memory you actually need. 1GB gets the job done today and will do so for some time to come thanks to the likely specs of the next generation of consoles (more on that later).
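
To put a rough number on that 1080p cap, here's a back-of-envelope sketch of the framebuffer arithmetic in Python. The render-target count is an illustrative assumption rather than a figure taken from any particular game:

    # Back-of-envelope VRAM sums for a 1080p frame (all figures are
    # illustrative assumptions, not measurements from a real game)
    WIDTH, HEIGHT = 1920, 1080
    BYTES_PER_PIXEL = 4  # 32-bit RGBA

    frame = WIDTH * HEIGHT * BYTES_PER_PIXEL  # 8,294,400 bytes, ~7.9MB

    # Double-buffered back buffer plus a 32-bit depth/stencil buffer...
    core_buffers = 3 * frame  # ~24MB

    # ...and, say, a dozen full-screen intermediates for post-processing
    post_targets = 12 * frame  # ~95MB

    print(f"{(core_buffers + post_targets) / 2**20:.0f}MB")  # ~119MB

Even on that generous reckoning, the render targets account for barely a tenth of a 1GB card; the rest is there for textures and geometry.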

Edit: There's been some chatter below on the subject of graphics memory and the question of whether 1GB is enough to get the job done. For the vast majority of games, it is. There will always be exceptions to this, including Battlefield 3, which is particularly memory hungry. Similarly, you're much, much more likely to run into the limits of the memory buffer if you tweak your games with uber-res texture packs. A slightly more nuanced view would therefore concede that 2GB is worth considering, but only where it commands a very modest premium over the 1GB equivalent and where you've double checked that the extra memory hasn't come at the cost of memory frequency. At the £200 price point and 6950 chipset I favour, it's a moot point as the 2GB version is within budget. It certainly wouldn't make sense to skimp to the tune of £10 and go with a 1GB model.

Can it play Crysis? Not if it's lumbered with cheap, crappy memory

Bus width is just as important when it comes to memory performance on graphics cards. The rule here is simple. Touch nothing with a bus smaller than 256 bits. Anything more than that is gravy. Anything less isn't worthy of HD gaming.
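
If you want to see why, the peak-bandwidth sum is simple: bus width in bytes multiplied by the effective memory data rate. A minimal sketch, assuming a ballpark 4,000MHz effective GDDR5 clock:

    # Peak memory bandwidth = (bus width in bytes) x (effective data rate)
    # The 4,000MHz effective GDDR5 clock is a ballpark assumption
    def bandwidth_gb_s(bus_bits, effective_mhz):
        return bus_bits / 8 * effective_mhz * 1e6 / 1e9

    print(bandwidth_gb_s(256, 4000))  # 128.0 GB/s - HD gaming territory
    print(bandwidth_gb_s(128, 4000))  # 64.0 GB/s - same chips, half the bandwidth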

Next up is multi-GPU in the form of NVIDIA's SLI and AMD's CrossFireX. The basics here are straightforward. Don't bother. Now, that's going to upset people successfully running multi-GPU rigs, particularly SLI systems, which are definitely preferable to AMD's flaky CrossFireX technology.

But the case against easily outweighs the case for. Number one, multi-GPU in all flavours is less reliable. It's hard to be sure it's working optimally. In my view, guaranteed if slightly lower performance is preferable to the constant worry that comes with multi-GPU. What's more, multi-GPU performance isn't always what benchmark results suggest. One reason why is micro-stuttering. We've not the space and I've not the inclination to explain it fully here. If you want to learn more, I suggest you start here. If I were you, I'd just trust me on this one. Life is happier with single-GPU.
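
For a flavour of the problem, though, here's a toy illustration of how an average frame rate can flatter a multi-GPU rig. The frame times are invented purely for the example:

    # Toy micro-stutter illustration - the frame times (in milliseconds)
    # are hypothetical, invented purely for the example
    single_gpu = [20, 20, 20, 20, 20, 20]  # steady 20ms frames
    multi_gpu = [10, 30, 10, 30, 10, 30]   # alternating 10ms/30ms frames

    for name, times in (("single-GPU", single_gpu), ("multi-GPU", multi_gpu)):
        avg_fps = 1000 / (sum(times) / len(times))
        print(f"{name}: {avg_fps:.0f}fps average, worst frame {max(times)}ms")

    # Both report 50fps, but the alternating pattern is paced more like
    # 33fps - which is how a benchmark-winning setup can still feel juddery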

Then there's the question of APIs. On the PC that means DirectX and right now it's DX11. New versions of DirectX don't appear all that often and anything even remotely worthy of your consideration already supports DX11. The only real differentiator in this area is tessellation technology.

In really simple terms, tessellation boils down to increasing geometric detail by orders of magnitude using hardware acceleration. Right now, it's not being used by many games and most that do are pretty pants in terms of gameplay. But there's nothing wrong with the technology itself. It can produce spectacular results. Assuming it features prominently in next-gen consoles, it'll be rampant a few years from now.
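
To get a feel for the scaling, consider a simplified model in which each tessellation level splits every triangle into four. Real hardware uses per-edge tessellation factors rather than uniform subdivision, so treat this as illustrative only:

    # Simplified tessellation scaling - assumes uniform 1-to-4 subdivision
    # per level, which real per-edge tessellation factors only approximate
    base_triangles = 10_000  # a hypothetical low-poly mesh

    for level in range(6):
        print(f"level {level}: {base_triangles * 4 ** level:,} triangles")

    # level 5 yields 10,240,000 triangles - three orders of magnitude more
    # detail, generated on the GPU rather than stored in the game's assets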

The tessellation-heavy Heaven benchmark. It's triangle-tastic.

Until the introduction of the new AMD Radeon HD 7900 series, NVIDIA had the edge in tessellation. Arguably it still does, given the punitive pricing of the 7900 series. It's something to bear in mind, even if I reckon it'll be a couple of years yet before tessellation hits critical mass.

Finally, we have what I'll call the rule of top-rung but cut-down. The idea here is that the best and most effective cards are very often cut-down versions of high end GPUs rather than maxed-out members of the next rung down. Similarly, at the lower end of the price range, cut-down versions of second-rung cards regularly reign.

All of which pixellated perambulations just leaves us with the question of what you should actually be buying. My view is that £200 is the sweet spot you should aim for, and that means you've got two choices. From the AMD camp come Radeon HD 6950-based boards such as Sapphire's Radeon HD 6950 Flex Edition. From NVIDIA we have the GeForce GTX 560 Ti with 448 cores (yes, that's its proper title), a prime example of which is the MSI N560GTX-448.

1080p is a piffling pixel grid for a beast like the Radeon HD 6950

The Sapphire Radeon board is particularly attractive right now thanks to the arrival of the all-new 7900 series. Strictly speaking, that makes it a cut-down version of AMD's previous generation of GPUs. But it also means it's very attractively priced at £200 on the nose. It's a super card that will render just about anything at 1080 with the details maxed out (quiet at the back, Crysis 1 doesn't count). And it's now at the price AMD should have launched it at a year ago.

As for the GTX 560 Ti 448, contrary to the name, it's the real high-end GPU deal and uses exactly the same three-billion-transistor GF110 graphics chip as the GTX 570 and GTX 580 cards. It absolutely renders the 570 redundant and, if I had to pick between it and the Radeon 6950, the NVIDIA GPU would get the nod for its more reliable drivers and more consistent performance. It kicks off at £225, making it a little more expensive than the Radeon.

Something like MSI's moderately awesome GeForce GTX 560 Ti-448 is all you'll ever need

In an ideal world, that's where I'd leave it. Unfortunately, some of you people are poor and financial realities impinge. Some extra options further down the range are helpful. At a little over £150, my clear pick is the GeForce GTX 560 Ti chipset, something along the lines of the MSI N560GTX-Ti. It's based on NVIDIA's second-rung GPU. It's got loads of cores, texture units and render outputs and, critically, a 256-bit memory bus. At 1,920 x 1,080, it'll love you long time with all the eye candy enabled in most games.

Boards with NVIDIA's GeForce GTX 560 Ti kick off around £150

Our final candidate is the Radeon HD 6850; Sapphire's vanilla Radeon HD 6850 will do. It's the second-rung member of what will soon be a defunct AMD line. Again, that makes it sound unattractive. But the second rung in the new Radeon HD 7000 series, the 7770, hasn't arrived yet and the 6850 is one hell of a card for £100.

With nearly 1,000 AMD-style shaders (note: they're not directly comparable with NVIDIA's shaders) and a 256-bit memory bus, the reality is that for most games, most of the time, it will deliver an experience indistinguishable from a top-end £500 board. I really mean that. I've conducted the blind comparison test. People cannot tell the difference. That includes you. The only snag is that you'll have to be a bit clever on occasion with the eye candy, working out which options you can knock down without kyboshing the visual splendour.

Parsimonious pixel pumper: A Radeon HD 6850 is one hell of a card for £100

As a postscript for anyone with money to burn, the easy answer is the Radeon HD 7950. It's a seriously quick card, it overclocks like a beast and it's well up to the job of driving 30-inch 2,560 x 1,600 and 27-inch 2,560 x 1,440 displays. You may as well have the latest technology available (the 7900 series largely solves AMD's tessellation problems), while the price premium on the flagship 7970 chipset is totally indefensible.

So, maybe it's not so bad after all. You've just four chipsets to choose from that are available at roughly these price points:

£225 NVIDIA GeForce GTX 560 Ti with 448 cores
£200 AMD Radeon HD 6950
£150 NVIDIA GeForce GTX 560 Ti
£100 AMD Radeon HD 6850

Simple enough, eh? Next time, I'll be telling you why a good monitor is absolutely the most important investment you can make, how right now is literally the best time ever to buy one and that we've Apple to thank for it. It makes me sick to say it, but it's true all the same. It really is all down to Apple. Good luck. God speed. And enjoy your graphics.

Addendum:

The overarching point of my proselytising on RPS is helping you choose stuff to buy today. Problem is, with graphics there's always something a lot better just around the corner.

The most immediate threat is the chip known as GK104. As ever, the usual caveats apply. We won't know for sure what it will look like until it appears. Still, there are a few near-enough facts to ponder. It's an NVIDIA chip. It'll form part of NVIDIA's upcoming GeForce 600 range. It's due out around March or so. But it's not NVIDIA's next high-end chip. That's not due until a month or two later at the earliest.

The rumours regarding GK104 are legion and the big question is how well it will perform. NVIDIA is putting it about that GK104 will give the Radeon HD 7900 cards a serious run for their money for around £100 to £150 less cash. If so, AMD will have no choice but to dump its prices. The implications for anyone buying a GPU today are pretty obvious.

Further out, the influence of the next generation of games consoles enters the mix. There's more chat about the next Xbox than the PlayStation at the moment. Xbox Next, as we'll call it, should appear in 2013 but is probably already influencing game development.

Firm facts about Xbox Next are very hard to come by. At this point an interesting thought experiment comes in handy. The transistor count for the main processor chips in the first Xbox was roughly 50 million. For Xbox 360, it was 500 million. Could Xbox Next be 5 billion?

That would be enough to buy you, for instance, the new six-core Intel Sandy Bridge E CPU and a GeForce GTX 580, which are around the 2 billion and 3 billion mark respectively. A better bet might be that Core i5 I told you to buy and AMD's dashing new Radeon HD 7970, which weigh in around 1 billion and 4 billion a pop.

Tantalising stuff, isn't it? Except I can't believe it will happen. Personally, I'm not expecting a 5 billion-transistor chip. Heat and power issues probably preclude it. The whole thing needs to fit inside about 200W tops. Then there is the Red Ring of Death experience with the 360, the success of the fidelity-eschewing Wii, the frankly astonishing performance developers have squeezed out of the ancient 360 (Crysis 2 on the 360 is ridiculous), the quest for better profit margins from day one – all are among the many reasons to expect Microsoft to be more conservative this time.

My bet is something nearer 3 billion transistors, including a host of functions normally found on a PC motherboard, all squeezed into a single SoC. It's also almost guaranteed it won't be x86-compatible à la PC. I think the absolute best-case scenario for Xbox Next is four IBM Power7 cores and a GPU analogous to the upcoming Radeon HD 7700. It may not even be as good as that.

At best, then, if you want equivalents to today's PC hardware, we're looking at Radeon HD 6800 series levels of performance or maybe GeForce GTX 560 Ti. It's another reason not to unload on a £500 video board.
