Hard Choices: Graphics Cards

By Jeremy Laird on February 7th, 2012 at 2:27 pm.

Hello, good morrow and, well, graphics. After my début – and let’s be honest, definitive – dissertation on PC processors last month, this time around we’re talking pixel pumpers. The bad news is that this instalment won’t be nearly as neat as the first. With CPUs, I can point at the Intel Core i5 2500K and bark, “buy it”. Job done. Things are a lot more fluid and complex when it comes to GPUs – but even so, when it comes down to it you only need to trouble yourself with four cards today. The buying decision remains rather easy.

With that, dig in, get comfy and let’s begin. Last time I left you with a couple of not-terribly-perplexing numbers to ponder, namely 1,920 and 1,080. No mystery, I was talking about pixel grids. Like it or not, 1080p has become the de facto screen resolution standard for mainstream PC monitors on sale today. As it happens, the latest Steam survey confirms 1080 is indeed by far the most popular res in terms of the installed base of gamers, too.

On that note, an informal survey of the screen res propounded by the pukka RPS crowd rather than the mouth-breathing, Steam-powered masses would be fascinating. So commenters, please preface your opening salvo with a quick screen-res stat. I thank you.

Back on message, 1080 is absolutely what current and future consoles are all about, too. Mobile aside, it’s the resolution the whole gaming industry has consolidated around. And that, for the most part, is a good thing. It’s rendered ultra-high-end graphics chips pretty much irrelevant.

There will always, of course, be exceptions. As RPS’s very own Alec ‘Fingers’ McMeer will attest, I have 30-inch uber-res (read 2,560 x 1,600 pixel) panels in almost every corner of my sumptuously appointed Georgian apartment. Then again, as Alec would also attest while appealing to the alleged and, I would argue, slanderously fictitious collection of dead bodies in my underground vaults, I’m not like other people.

30 inches and a couple of grand’s worth of PC monitor you almost definitely don’t have

The point is that, taken as a whole, you lot aren’t likely to be pumping more than 1,920 x 1,080 pixels per frame any time soon. And that makes a big difference when it comes to the sort of graphics chip you need to buy.
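For a sense of scale, here’s a quick back-of-the-envelope sketch (illustrative Python, not figures from any benchmark or any particular card) comparing the raw pixel workload at 1080p with the posher panels mentioned above, assuming a 60 frames per second target. Real GPU load also depends on shader complexity, overdraw and anti-aliasing, but the raw pixel count is where the bill starts.

    # Rough pixels-per-second workload at a 60fps target (illustrative only).
    RESOLUTIONS = {
        "1,920 x 1,080 (1080p)": (1920, 1080),
        "2,560 x 1,440 (27-inch)": (2560, 1440),
        "2,560 x 1,600 (30-inch)": (2560, 1600),
    }
    TARGET_FPS = 60

    for name, (width, height) in RESOLUTIONS.items():
        pixels = width * height
        print(f"{name}: {pixels / 1e6:.2f} Mpixels per frame, "
              f"{pixels * TARGET_FPS / 1e6:.0f} Mpixels per second at {TARGET_FPS}fps")

The 30-inch grid works out at very nearly double the pixels of 1080p, which is the arithmetic behind why those panels demand silly-money GPUs while a £200 card covers 1080p comfortably.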

But before we take a look at what boards you should be bagging right now, first some context in the form of the backstory of how we got where we are today. This isn’t just about chips and bits and transistor counts. It’s about the punctuation points graphics technology has inserted into the history of PC gaming.

T n' L, baby: It all started with the NVIDIA GeForce 256

The first GPU ever was the NVIDIA GeForce 256. That’s a simple fact, mainly because I say so but also because a GPU, or Graphics Processing Unit, is not the same as a mere graphics chip. The 256 was the first graphics processor for the PC with hardware transform and lighting acceleration. It made a huge difference to performance and image quality and it set the tone for PC graphics as we know it today.

It was around that time in the late 90s that PC gamers got their first glimpse of high resolution 3D graphics with smoothly filtered textures. For me it was actually the earlier NVIDIA TNT2. No matter, I’ll never forget the first time I saw Tomb Raider running on a 3D card with proper texture filtering. It may have been courtesy of a miserable 15-inch goldfish bowl of a monitor (Trinitron, yo). But nothing has been nearly as dramatic since. Nothing.

Since then, there have been plenty of significant waypoints. The NVIDIA GeForce 3 and ATI Radeon 8500 introduced the programmable shaders that underpin many of the photo-realistic effects, such as oh-so-shiny water, that we all take for granted today.

Other big milestones were the ATI Radeon 9700 Pro and NVIDIA GeForce 6800 Ultra. Both delivered preposterous increases in performance and image quality compared with previous generations and created expectations the industry has been chasing ever since.

Actually, it was the 6800 that really delivered on the notion of ultra high quality, high resolution gaming for the first time. I vividly remember struggling to believe the images in front of me were being rendered in real time as I soaked up the buttery smooth frame rates the 16-pipe 6800 kicked out in Far Cry at 1,600 x 1,200. Verily, it was the stuff of geeky gaming nirvana.

Full of buttery goodness thanks to the GeForce 6800 Ultra. Apparently.

The next big change was the shift to unified shader architectures, at which point the focus went from counting pixel pipes to totting up stream processors. Such is the life of a PC hardware hack.

To balance the big successes, sometimes even larger failures featured along the way. NVIDIA dropped the ball horribly with 2003’s GeForce FX 5800 Ultra, aka the DustBuster, and its broken 32-bit maths. Meanwhile, AMD hit the wall with 2007’s Radeon HD 2900 XT, which was just awful all round. Overall, however, progress was relentless.

But so too was the inexorably upward creep in prices. That GeForce 256 card sold for around £200 or so back in 1999. By 2004 and the 6800 Ultra, you were looking at £400 for a high end board. Today, it’s as much as £600.

GeForce 6800 Ultra: 16 pipes and silly money

Frankly, it’s gotten out of control. Funny thing is, AMD (or ATI as it was then) proved it didn’t have to be this way. In 2008, ATI wheeled out the Radeon HD 4870. It wasn’t the fastest GPU you could buy. NVIDIA’s GTX 280 was quicker. But for £200 it gave you 80 to 90 per cent of the performance at a fraction of the price. It was a no-brainer.

At the time, AMD promised us this was the model for the future. Instead of a futile willy waving contest over ultimate performance and the ever increasing prices that went with it, the plan was to target the £200 price point and give gamers the best possible experience. Hallelujah.

It was a brilliant idea and, for me at least, PC graphics has never been the same since. When the 4870’s successor, the Radeon HD 5870, launched at £300, well, betrayal and mortification figured highly. Things didn’t get much better with the 6900 series and with the new Radeon HD 7970 up at £450 and beyond, it’s as if the 4870 never happened.

Top end performance for £200? That was the Radeon HD 4870.

At least it would be if it weren’t for the minor matter that kicked off this column, the emergence of 1080 as the standard screen res. In the context of 1080 panels, I put it to you that spending much more than £200 on a graphics card makes absolutely no sense at all.

If that’s a clear and categorical message, the question of video board specifics is a lot more complicated. For starters, the sheer range of graphics chipsets on offer is huge. Both AMD and NVIDIA knock out all sorts of chips to suit different parts of the market. Then there are variations within chipsets according to things like the number of memory chips, clockspeeds and coolers.

Things change a lot faster in graphics country than they do in processor land, too. The threat of an all-new GPU that makes everything before seem irrelevant always looms large. CPU development tends to be more incremental. And anyway, CPUs are less critical to gaming performance. So long as you have that Core i5 CPU I’ve already told you to buy, the performance bottleneck is almost always going to be the graphics card.

Still, what I can do is whittle the list down to manageable proportions at a few key price points and also provide some general rules of thumb that make life an awful lot easier. The first lesson involves memory, both the chip count and the bus width over which they communicate with the GPU.

Do not be seduced by sheer memory amount. There is nothing more gruesome than a low end GPU buried under a hill of crappy memory. Worse than being no benefit at all, the slower memory chips used in cheap cards with big buffers actually result in worse performance than their more modestly memoried brethren. Moreover, that 1,920 x 1,080 pixel grid puts a cap on the amount of memory you actually need. 1GB gets the job done today and will do so for some time to come thanks to the likely specs of the next generation of consoles (more on that later).

Edit: There’s been some chatter below on the subject of graphics memory and the question of whether 1GB is enough to get the job done. For the vast majority of games, it is. There will always be exceptions to this, including Battlefield 3, which is particularly memory hungry. Similarly, you’re much, much more likely to run into the limits of the memory buffer if you tweak your games with uber-res texture packs. A slightly more nuanced view would therefore concede that 2GB is worth considering, but only where it commands a very modest premium over the 1GB equivalent and where you’ve double checked that the extra memory hasn’t come at the cost of memory frequency. At the £200 price point and 6950 chipset I favour, it’s a moot point as the 2GB version is within budget. It certainly wouldn’t make sense to skimp to the tune of £10 and go with a 1GB model.
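To put some very rough numbers on the memory question, here’s a hypothetical sketch of what the basic frame buffers themselves cost at 1080p. The buffer count and anti-aliasing level are assumptions chosen purely for illustration, and real-world memory use is dominated by textures, shadow maps and other render targets, which is exactly why memory-hungry titles and uber-res texture packs are the exceptions noted above.

    # Very rough frame buffer maths at 1,920 x 1,080 (sense-of-scale only).
    WIDTH, HEIGHT = 1920, 1080
    BYTES_PER_PIXEL = 4   # 32-bit colour
    MSAA_SAMPLES = 4      # assume 4x multisample anti-aliasing
    BUFFERS = 3           # assume double-buffered colour plus a depth/stencil buffer

    buffer_mb = WIDTH * HEIGHT * BYTES_PER_PIXEL * MSAA_SAMPLES * BUFFERS / 1024**2
    print(f"~{buffer_mb:.0f} MB of basic buffers")   # roughly 95MB, a small slice of 1GB

Even with generous anti-aliasing the buffers themselves are a small fraction of 1GB; it’s the texture budget that decides whether 2GB earns its keep.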

Can it play Crysis? Not if it's lumbered with cheap, crappy memory

Bus width is just as important when it comes to memory performance on graphics cards. The rule here is simple. Touch nothing with a bus smaller than 256 bits. Anything more than that is gravy. Anything less isn’t worthy of HD gaming.
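The reason the bus width rule works is simple arithmetic: peak memory bandwidth is roughly the bus width multiplied by the memory’s effective data rate. Here’s a rough sketch; the per-pin data rates are typical GDDR5 ballpark figures for cards of this era rather than the spec of any particular board.

    def peak_bandwidth_gbs(bus_width_bits, effective_gbps_per_pin):
        """Rough peak memory bandwidth in GB/s: bytes per transfer times transfer rate."""
        return bus_width_bits / 8 * effective_gbps_per_pin

    # Typical 2012-era GDDR5 runs at roughly 4 to 5 Gbps effective per pin.
    print(peak_bandwidth_gbs(256, 5.0))   # 160.0 GB/s - comfortable for 1080p
    print(peak_bandwidth_gbs(128, 4.0))   # 64.0 GB/s - where cheap cards start to choke

Halve the bus and you halve the bandwidth, no matter how much memory is bolted on, which is why a fat buffer on a narrow bus is such a false economy.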

Next up is multi-GPU in the form of NVIDIA’s SLI and AMD’s CrossfireX. The basics here are straightforward. Don’t bother. Now, that’s going to upset people successfully running multi-GPU rigs, particularly SLI systems, which are definitely preferable to AMD’s flaky CrossfireX technology.

But the case against easily outweighs the case for. Number one, multi-GPU in all flavours is less reliable. It’s hard to be sure if it’s working optimally. In my view, guaranteed if slightly lower performance is preferable to the constant worry that comes with multi-GPU. What’s more, multi-GPU performance isn’t always what it seems to be based on benchmark results. One reason why is micro-stuttering. We’ve not the space and I’ve not the inclination to explain why here. If you want to learn more, I suggest you start here. If I were you, I’d just trust me on this one. Life is happier with single-GPU.
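If you just want the one-line gist of micro-stuttering, it’s that an average frame rate can hide very uneven frame delivery. Here’s a toy illustration with entirely made-up frame times, purely to show the principle rather than to represent any real pair of cards.

    # Toy example: identical average fps, very different smoothness.
    single_gpu_ms = [20, 20, 20, 20, 20, 20]   # steady 20ms frames
    multi_gpu_ms = [10, 30, 10, 30, 10, 30]    # alternating quick and slow frames

    for label, frames in (("single-GPU", single_gpu_ms), ("multi-GPU", multi_gpu_ms)):
        avg_fps = 1000 * len(frames) / sum(frames)
        print(f"{label}: {avg_fps:.0f} fps average, worst frame {max(frames)} ms")

Both runs average 50 fps, but the alternating one keeps dipping to 30ms frames, which your eyes read as stutter even though the benchmark number looks identical.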

Then there’s the question of APIs. On the PC that means DirectX and right now it’s DX11. New versions of DirectX don’t appear all that often and anything even remotely worthy of your consideration already supports DX11. The only real differentiator in this area is tessellation technology.

In really simple terms, tessellation boils down to increasing geometric detail by orders of magnitude using hardware acceleration. Right now, it’s not being used by many games and most that do are pretty pants in terms of gameplay. But there’s nothing wrong with the technology itself. It can produce spectacular results. Assuming it features prominently in next-gen consoles, it’ll be rampant a few years from now.
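To put a ballpark on ‘orders of magnitude’: with DirectX 11 tessellation the triangle count a patch expands into grows roughly with the square of the tessellation factor, and the factor can go as high as 64. The sketch below is order-of-magnitude only, since the exact count depends on the partitioning mode.

    # Very rough triangle counts for a single quad patch under uniform tessellation.
    def approx_triangles(tess_factor):
        return 2 * tess_factor * tess_factor   # N x N quads, two triangles each

    for factor in (1, 8, 16, 64):
        print(f"tessellation factor {factor:>2}: ~{approx_triangles(factor):,} triangles per patch")

Going from factor 1 to the DX11 maximum of 64 multiplies the geometry per patch by roughly 4,000 times, hence the triangle-tastic Heaven benchmark below.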

The tessellation-heavy Heaven benchmark. It's triangle-tastic.

Until the introduction of the new AMD Radeon HD 7900, NVIDIA had the edge in tessellation. Arguably it still does given the punitive pricing of the 7900 series. It’s something to bear in mind even if I reckon it’ll be a couple of years yet before tessellation hits critical mass.

Finally, we have what I’ll call the rule of top-rung but cut-down. The idea here is that the best and most effective cards are very often cut-down versions of high end GPUs rather than maxed-out members of the next rung down. Similarly, at the lower end of the price range, cut-down versions of second-rung cards regularly reign.

All of which pixellated perambulations just leaves us with the question of what you should actually be buying. My view is that £200 is the sweet spot you should aim for and that means you’ve got two choices. From the AMD camp comes Radeon HD 6950-based boards such as Sapphire’s Radeon HD 6950 Flex Edition. From NVIDIA we have the GeForce GTX 560 Ti with 448 cores (yes, that’s its proper title), a prime example of which is the MSI N560GTX-448.

1080p is a piffling pixel grid for a beast like the Radeon HD 6950

The Sapphire Radeon board is particularly attractive right now thanks to the arrival of the all new 7900 series. Strictly speaking, that makes it a cut down version of AMD’s previous generation of GPUs. But it also means it’s very attractively priced at £200 on the nose. It’s a super card that will render just about anything at 1080 with the details maxed out (quiet at the back, Crysis 1 doesn’t count). And it’s now at the price AMD should have launched with a year ago.

As for the GTX 560 Ti 448, contrary to the name, it’s the real high-end GPU deal and uses exactly the same three-billion transistor GF110 graphics chip as the GTX 570 and GTX 580 cards. It absolutely renders the 570 redundant and if I had to pick between it and the Radeon 6950, the NVIDIA GPU would get the nod thanks to more reliable drivers and more consistent performance, more of the time. It kicks off at £225, making it a little more expensive than the Radeon.

Something like MSI's moderately awesome GeForce GTX 560 Ti-448 is all you'll ever need

In an ideal world, that’s where I’d leave it. Unfortunately, some of you people are poor and financial realities impinge. Some extra options further down the range are helpful. At a little over £150, my clear pick is the GeForce GTX 560 Ti chipset, something along the lines of the MSI N560GTX-Ti. It’s based on NVIDIA’s second-rung GPU. It’s got loads of cores, texture units and render outputs and, critically, a 256-bit memory bus. At 1,920 x 1,080, it’ll love you long time with all the eye candy enabled in most games.

Boards with NVIDIA's GeForce GTX 560 Ti kick off around £150

Our final candidate is the Radeon HD 6850; Sapphire’s vanilla Radeon HD 6850 will do. It’s the second-rung member of what will soon be a defunct AMD line. Again, that makes it sound unattractive. But the second rung of the new Radeon HD 7000 series, the 7770, hasn’t arrived yet and the 6850 is one hell of a card for £100.

With nearly 1,000 AMD-style shaders (note: they’re not directly comparable with NVIDIA’s shaders) and a 256-bit memory bus, the reality is that for most games, most of the time, it will deliver an experience indistinguishable from a top-end £500 board. I really mean that. I’ve conducted the blind comparison test. People cannot tell the difference. That includes you. The only snag is that you’ll just have to be a bit clever on occasion with the eye candy, working out which options you can knock down without kyboshing the visual splendour.

Parsimonious pixel pumper: A Radeon HD 6850 is one hell of a card for £100

As a postscript for anyone with money to burn, the easy answer is the Radeon HD 7950. It’s a seriously quick card, overclocks like a beast and is well up to the job of driving 30-inch 2,560 x 1,600 and 27-inch 2,560 x 1,440 displays. You may as well have the latest technology available (the 7900 series largely solves AMD’s tessellation problems) and the price premium on the flagship 7970 chipset is totally indefensible.

So, maybe it’s not so bad after all. You’ve just four chipsets to choose from that are available at roughly these price points:

£225 NVIDIA GeForce GTX 560 Ti with 448 cores
£200 AMD Radeon HD 6950
£150 NVIDIA GeForce GTX 560 Ti
£100 AMD Radeon HD 6850

Simple enough, eh? Next time, I’ll be telling you why a good monitor is absolutely the most important investment you can make, how right now is literally the best time ever to buy one and that we’ve Apple to thank for it. It makes me sick to say it, but it’s true all the same. It really is all down to Apple. Good luck. God speed. And enjoy your graphics.

Addendum:

The overarching point of my proselytisms on RPS is helping you choose stuff to buy today. Problem is, with graphics there’s always something a lot better just around the corner.

The most immediate threat is the chip known as GK104. As ever, the usual caveats apply. We won’t know for sure what it will look like until it appears. Still, there are a few near-enough facts to ponder. It’s an NVIDIA chip. It’ll form part of NVIDIA’s upcoming GeForce 600 range. It’s due out around March or so. But it’s not NVIDIA’s next high end chip. That’s not due for another month or two at the earliest.

The rumours regarding GK104 are legion and the big question is how well it will perform. NVIDIA is putting it about that GK104 will give the Radeon HD 7900 cards a serious run for their money for around £100 to £150 less cash. If so, AMD will have no choice but to dump its prices. The implications for anyone buying a GPU today are pretty obvious.

Further out, the influence of the next generation of games consoles enters the mix. There’s more chat about the next Xbox than Playstation at the moment. Xbox Next, as we’ll call it, should appear in 2013 but is probably already influencing game development.

Firm facts about Xbox Next are very hard to come by. At this point an interesting thought experiment comes in handy. The transistor count for the main processor chips in the first Xbox was roughly 50 million. For Xbox 360, it was 500 million. Could Xbox Next be 5 billion?

That would be enough to buy you, for instance, the new six-core Intel Sandy Bridge E CPU and a GeForce GTX 580, which are around the 2 billion and 3 billion mark respectively. A better bet might be that Core i5 I told you to buy and AMD’s dashing new Radeon HD 7970, which weigh in around 1 billion and 4 billion a pop.

Tantalising stuff, isn’t it? Except I can’t believe it will happen. Personally, I’m not expecting a 5 billion-transistor chip. Heat and power issues probably preclude it. The whole thing needs to fit inside about 200W tops. Then there is the Red Ring of Death experience with the 360, the success of the fidelity-eschewing Wii, the frankly astonishing performance developers have squeezed out of the ancient 360 (Crysis 2 on the 360 is ridiculous), the quest for better profit margins from day one – all are among the many reasons to expect Microsoft to be more conservative this time.

My bet is something nearer 3 billion transistors, including a host of functions normally found on a PC motherboard, all squeezed into a single SoC. It’s also almost guaranteed it won’t be x86-compatible à la PC. I think the absolute best-case scenario for Xbox Next is four IBM Power7 cores and a GPU analogous to the upcoming Radeon HD 7700. It may not even be as good as that.

At best, then, if you want equivalents to today’s PC hardware, we’re looking at Radeon HD 6800 series levels of performance or maybe GeForce GTX 560 Ti. It’s another reason not to unload on a £500 video board.

220 Comments

  1. skyturnedred says:

    I saw it. Fast fix though.

  2. Brun says:

    Running a GTX 480 here (EVGA). For a while I considered getting a second one and running in SLI, but to the tune of $550 I wasn’t that interested, given that a single 480 seems to be able to handle anything I can throw at it by itself. I intend to wait for at least the 6XX series of GeForce cards before upgrading.

    • Caleb367 says:

      I very recently – I’m still busy reinstalling stuff right now – upgraded to an i5 and put on it a Geforce 440, which is a cheapo DX11 card. You know what, it’s doing great. Just tested it on Saints Row 3, 1280×1024, top detail ‘cept AA, and works smooth as silken butter.

    • frenchy2k1 says:

      @Caleb:
      As stated in his article, your perfs are very resolution dependent.
      If you only play at 1280×1024, your card does not need to push that many pixels and your CPU will be taxed more (relatively), so your configuration works. If you raise your resolution, your performances will drop quickly (GT440, based on GF108, cannot push that many pixels).

      His advice is targeted to a 1920×1080 resolution, which is close to the standard lately.

    • DuddBudda says:

      hate to break it to ya, but 1280*1024 hasn’t been ‘top detail’ for a decade

  3. Dolphan says:

    I got a 6850 HD in November – an XFX which I’ve overclocked by something close to 200mhz without the slightest problem (or particularly high temps). I only need 1440×900 for my size of monitor, so I can crank everything else all the way up on BF3 and it’s fine. Lovely card.

  4. Orija says:

    I’m currently using a CRT monitor with a native res. of 1024×768. Yea, yea.

    Could you guys recommend a decent monitor? I’m currently thinking of getting the Dell U2311H.

    • Brun says:

      I’d recommend against getting a Dell monitor since, in general, you’ll pay more for it than equivalent or better monitors from other brands.

      I run two monitors – a 22″ Dell LCD and a 23″ Acer LED. The Acer is an inch bigger and built on better technology but cost the same as my Dell monitor (about $180). The Acer even came with the DVI cable. Granted, I did buy the Acer almost a year after the Dell, and I know there was a drop in monitor prices during that time.

      What I’ve been looking for for quite a while is a monitor with > 60hz native refresh rate. I can’t stand tearing when playing games so I usually run with vsync on, which incurs a performance hit. Bumping up my refresh rate would obviously alleviate that somewhat, but no one seems to make monitors these days with more than 60 or 70 hz refresh rates. Strange, given that my TV goes up to at least 120 hz.

    • FriendlyFire says:

      The Dell U2311H is an excellent monitor. I have three of them and heartily recommend them. Brun doesn’t know what the hell he’s speaking about.

      Acer and other crap corporations will sell you extra-cheap TN monitors which have terrible viewing angles and poor color reproduction. Dell’s monitors are all IPS screens with superb viewing angles and excellent color reproduction; factory calibration is surprisingly good for the price.

      As a bonus, Dell monitors of the U class are all VESA-compatible and the default mount has pivot. You’ve never properly worked on a computer until you’ve worked on a portrait monitor. Text editing, programming, even surfing the web are all much more enjoyable with portrait.

      Honestly, the only consideration would be whether you want to go with the U24 model instead (I think it’s U2412h?), which I believe has higher-end IPS tech. You can also look at HP’s professional 24″ model.

    • Derppy says:

      @Brun Dell has the best monitors in 1080p-1200p range. If you appreciate colors, contrast and viewing angles, get IPS-panel from Dell and stay away from poor quality TN-panels Acer mostly makes.

    • Ignorant Texan says:

      Brun,

      3D Capable LED monitors use a 120Hz refresh rate. I believe some of the higher end 1920×1080 LED monitors that are marketed to gamers with a 1 or 2 ms GTG response time are also using a 120Hz refresh rate.

    • Svant says:

      DELL U2412M <- using that since it was basically the only 16×10 monitor available at a decent price when I got it. 1920×1200 gives you that extra vertical space that widescreen monitors really lack.

    • Orija says:

      I don’t really see much of an advantage of getting a u2410 other than for the increased size and 16:10 ratio. Is the 16:10 res better to have than the 16:9? I thought 16:9 was the one that was in general use.

    • Jeremy Laird says:

      Hold tight re the monitor. That’s the next installment. Recent months have seen some fantastic affordable panels appear. It’s been a long time coming, but you can now have something other than a nasty TN screen for well under £150.

    • Prime says:

      I shall look forward to that article with some interest: I’ve long wanted to get above 60Hz.

    • Orija says:

      That’s swell though I’d very much appreciate articles on RAMs, Cabinets, PSUs, Motherboards and anything else that comprises a PC. Seriously, the first time I looked up what components I needed to get for building a PC it felt more like I was gonna build the TARDIS.

    • Ignorant Texan says:

      Mr Laird,

      I know monitors are in the next installment, but what GPU would you recommend for a 2560×1600 monitor? While the GTX560ti 448 is a wonderful card, I don’t think it has enough ‘umph’ to run satisfactorily at that resolution.

    • oceanclub says:

      “Dell’s monitors are all IPS screens with superb viewing angles and excellent color reproduction; factory calibration is surprisingly good for the price.”

      After experiencing only LCD monitors in work, I was always very skeptical and had a huge CRT until about 2 years ago. When it finally packed in, I got the (at the time) highly rated Dell 2209WA (one of the early IPS monitors that had no ghosting problems) and was completely converted. I would definitely stick with Dell myself unless convinced otherwise. Can’t see a need to upgrade personally – while it does have a (now) unusual ratio of 1680 x 1050 it’s never bothered me.

    • zaphod42 says:

      @Brun Wouldn’t upgrading your monitor’s refresh rate make tearing worse rather than better? Seems like forcing triple buffering would be a better solution.

    • Brun says:

      Tearing happens when your graphics card’s output framerate is higher than your refresh rate. The GPU sends a frame to the monitor and it begins to refresh the screen, but before it can finish, the GPU sends another frame to the monitor. So your screen ends up with half of the first frame, and half of the second frame, which causes the mismatch or tear.

      Vsync caps your GPU’s framerate output at whatever the refresh frequency of the monitor is (usually 60 Hz, so 60 FPS). All you really need, though, is a refresh rate higher than your framerate.

    • Dude Imperfect says:

      I was going through the same progression you are making about 6 months ago. I can not recommend enough an IPS panel. The transition to IPS from TN is not as great as say SDTV to HDTV, but by George it’s about as close as you’re going to get anytime soon. The big knock on IPS’s has been their comparably slow response times to a TN’s. I’m here to tell you anything under ~18ms is not going to be noticeable at all; whereas, the gain in color fidelity will be striking.

      Personally I found the best value for my money to be the NEC EA231WMi which I found for a steal at $250. The 231s have been replaced by the NEC EA232WMi which features an “upgrade” to an LED backlight. The LED does provide a decrease in power consumption but at the cost of a bit of color reproduction. If you can find an (increasingly rare) 231 you may want to go with that but regardless the screen of the EA23XWMi series is fantastic and highly recommended.

      PS: Mr. Laird do check out this monitor series before writing up your next feature…I did my homework on them and have not been disappointed.

    • LintMan says:

      @Orija – I love my Dell U2410. The U2412 is very similar, but the U2410 has a slightly faster response time and more accurate color (mostly important if you view your photos on your PC, or more especially if you do stuff like photoshop). The display is gorgeous, and I couldn’t find anything else close to this size and quality for the price (< $450 on sale).

      About the aspect ratios: I think 16:10 was more common, but when more and more TVs started coming out at 1080p with 16:9, monitors started following the trend, since it’s cheaper to make 16:9 LED panels than it is to make 16:10 LED panels. I don’t see any user benefit to 16:9, it just makes for smaller screens.

      Personally, I think it's annoying that horizontal monitor resolution (and monitor resolution overall) has pretty much stagnated for well over 10 years. I bought a 1600×1200 CRT monitor in the late 90's, and I'm only running 1920×1200 now 13 years later, while most monitors are still stuck at something like 1440×1080, barely an improvement over the 1280×1024 I was using in 2005.

    • Kadayi says:

      +1 on the Dell tbh. They might seem a little costly, but you get great picture quality/colour fidelity, robust build & an extremely flexible stand. I’ve been using the U2412 for about a year and I’ve no complaints.

    • NamelessPFG says:

      CRTs don’t have native resolutions…

      That aside, you can try looking through thrift stores or craigslist for any FD Trinitron or Diamondtron NF CRT monitors in the 21″ or over range. How does 1600×1200 at 95 Hz (or possibly more, up to 160 Hz depending on resolution) sound, especially when it only costs about $6 to $20 per monitor?

      If you must have an LCD, then you’re going to have to make a tradeoff…120 Hz refresh rate and lackluster TN image quality, or IPS image quality and 1920×1200 options at 60 Hz?

      “I don’t really see much of an advantage of getting a u2410 other than for the increased size and 16:10 ratio. Is the 16:10 res better to have than the 16:9? I thought 16:9 was the one that was in general use.”

      I wouldn’t be so against 16:9 if they didn’t make the stupid decision to reduce vertical resolution instead of expanding horizontal resolution further, like 16:10 does relative to 4:3. We’re supposed to be expanding resolution in both directions, not regressing. Isn’t improvement the point of new technology?

      1920×1080 is too short to fit 1600×1200. 1920×1200 isn’t. Thus, the former has no place in the computer monitor realm, yet manufacturers shove it in anyway because everything has to be held back to HDTV standards. (And yes, I do still play old games that only go between 1280×1024 and 1600×1200 as their top resolutions.)

      I don’t mind letterboxing at all, so if the extra FOV really is that big of a deal, I can *gasp!* run 1920×1080 on a 1920×1200 monitor.

    • Shortwave says:

      Just going to add a very simple thought to this.

      I freakin’ LOVE my 120hz monitor.. LOVE LOVE LOVE IT.
      I can sit here right now and zoom my mouse back and forth across my 120hz and my 60hz and see the difference instantly. (For those nay-sayers in the technology, please try it sometime!). It even makes scrolling while reading much much nicer on the eyes. It’s also saved me from getting headaches while gaming. If you suffer from eye fatigue and headaches while gaming I highly suggest you go try one out somehow/sometime.

      I’m using the BenQ super Uber CS:S OMG PEW PEW LEET XL2410T and now there’s no looking back. Also, I find the image quality to be amazing once adjusted manually/properly. But I will admit the viewing angles can be pretty bad. Keep in mind this is a GAMING monitor though and it includes one of the most versatile and high quality stands I’ve ever seen on a monitor.

    • phylum sinter says:

      If you’re looking for a monitor that is both modern and NOT made more for movies than anything else (16:9 and web browsing, doing data work, graphics work, etc. do not really benefit from such a wide screen) you can also get a 16:10 monitor – whose resolutions are 1920×1200.

      Both Dell and NEC make good monitors in this form factor. Right now i have a rather oversized monitor (28″) from a lower priced company called Hannspree that i game on. It was only $270 and i have absolutely no complaints beyond a little light leaking out the back vents.

      I really would consider when buying a monitor whether you’ll be using it for just movies or games as well – as even though my television (and gameboxes connected to it) run at 16:9 form factor/1920×1080, i much rather prefer something closer to the old 4:3. There aren’t any limitations as to whether any of the cards listed here will support this resolution too.

      For reference, the card i use is an ATI 5850 – a dx11 card, with 1gb ram. It cost me $180 a few years ago. The smart buy for gfx cards is to look 1-3 models back in time, and purchase there. The newest cards are almost always overpriced for a good 6 months after release.

    • kemryl says:

      Huh. I thought that interlaced HDTV’s and 3D monitors claimed to have 120hz refresh rates because they were displaying two images simultaneously at 60hz each, and they just used 120hz as a fake selling point.

    • Thiefsie says:

      Apart from Macs, Dells are pretty much the best screens out there as long as you get their better (IPS) ones.

      I absolutely adore my U2410 and now have two of them.

      The stand, the design, the buttons, the ports…. all exemplary, backed by a great warranty if you order direct, and of course, Dell have good sales/coupons on from time to time which is when you should hunt their screens down.

    • Shortwave says:

      kemryl, as far as I understand, you would be mostly correct.
      For example the Quatron HDTV’s claim to have 120hz, when in actuality they only repeat and shift the last frame, which requires processing, which creates delay. Which makes me have sad faces!

      I swiftly returned it and bought a true 120hz PC monitor. : )

  5. TormDK says:

    Good purchase guide for those that really can’t be arsed to browse through a ton of technical sites.

    I’m still patiently awaiting NVIDIA’s response to the 7900 with Kepler, but it looks like 2012 is going to be the year where future hardware upgrades become redundant if our future monitors do not go up a notch.

  6. CMaster says:

    1680 x 1050 here (with a second monitor at 1280 x 1024, but I don’t play games on that one).

    I’ve never spent more than I think £140 on a graphics card.
    Have to say after each purchase bar one, I’ve never seen a need to have spent more either. Still, waiting for the money to buy a new i5 based system before I replace my current (perfectly serviceable) HD4850 I think.

    Also, might be worth mentioning that sites like Tom’s Hardware do a bimonthly (or so) “best graphics cards for the money” chart?

    • DrGonzo says:

      I upgraded from a 4850 to a 6870 at the end of last year. At 1680×1050 it flies along on everything I’ve tried so far. BF3 and Metro 2033 look great and smooth. Got it for £130 so I’m sure it’s even cheaper now if you are considering upgrading.

    • MikoSquiz says:

      I’ve spent variable sums over the years, but I’ve come to the conclusion that as long as the game runs, the spec of the video card doesn’t make the blindest bit of difference to my enjoyment. Might as well buy the cheapest one and cut a wad of £10 notes into decorative shapes to brighten up the apartment.

      So “Never spend more than £100 on a video card” has gone into my rules for life, nestling somewhere between “If you don’t know what it is, don’t snort it” and “Always make sure you get the gentleman’s address or license plate”.

      (8800GT at the moment; it’s the only part of this ca. 2007 desktop box that’s been upgraded.)

  7. yhalothar says:

    I have a 2500k with 570 GTX driving a 1920 x 1200 display (otherwise I’d go for the 560) and I approve of this article. Everything pretty much runs nicely.

  8. WMain00 says:

    I bought a Nvidia 560-without-a-ti card a couple of months back and don’t regret it. It runs everything quite happily at 1920×1080. I bought that over a ti because I couldn’t really afford the extra cost and because I wasn’t entirely sure whether I’d notice any difference since my processor is starting to show its age (a Q6600).

    Can’t honestly afford a massive upgrade at the moment, but saying that I haven’t noticed any game really struggle so far and I usually play things on quite high settings. Even the dreaded Metro 2033 ran away quite happily.

  9. deadly.by.design says:

    I’m the weird guy running 1680×1050, apparently.

    (gogo 460GTX 1GB)

    • Arclight says:

      Same, because 16:9 is for movies. Real gamers use 16:10. :D

      Kinda wish I had waited and gotten the 560, but the humble 460 overclocks happily.

    • DrGonzo says:

      I’m also running at 1680 on my pc monitor. I sometimes game on my tv at 1080p with a pad though.

      And all my gaming friends use 1680 as a resolution, it’s not that scientific a study I admit, but it’s not an unusual or uncommon resolution.

    • Edawan says:

      I have a 16:9 monitor because 16:10 is impossible to find at 24” nowadays.

    • LintMan says:

      @Edawan – Check out the Dell U2412 and U2410. Those are both quality 16:10, 1920×1200 monitors. I have the U2410 and love it. They are both probably more expensive than competing 24″ 16:9 monitors, but Dell frequently has major sales and coupon deals if you keep your eye out. I got my U2410 for $150+ off.

    • ffordesoon says:

      Why anyone would want to run games at anything beyond 1680×1050 is honestly beyond me. I understand that it’s pretty, but so is 1680×1050. After that, you get diminishing returns on performance for not a whole lot more fidelity, and the time spent scrolling across the screen with your mouse is just unacceptable. I mean, I know you can accelerate the speed of your mouse, but why should you even have to? Fie on beauty at the expense of functionality! Fie, I say.

      As for my setup, I’ve got a two-year-old Intel Quad-Core Extreme, the model number of which I can never quite remember, and I upgraded to a 560 Ti last month for $240 from an 8800 GTX Ultra. While that card’s performance was still fair in most games, the 560 Ti runs everything I throw at it flawlessly at max settings. Well, except Metro 2033, but that’s no surprise, and all I have to do is bump the AA down a notch or two, and it’s smooth as silk.

    • phylum sinter says:

      @ffordesoon 1920×1200 is especially noticeable if you have a monitor larger than 23 in. or so. i really don’t notice pixels at this resolution on a 28 in. monitor.

    • PeteC says:

      1680 x 1050 here too on a Samsung SyncMaster 2233 monitor with a 120hz refresh rate (that last bit means nothing to me but it’s apparently quite nice.)

    • Andy_Panthro says:

      @PeteC

      Same monitor here, Samsung SyncMaster 2233 – it’s really rather good, although my last monitor (still sitting nearby) was a 1280×1024 Dell, so doesn’t provide much comparison.

    • sonofsanta says:

      You’re not the only one, 1680 x 1050 is a lovely resolution. We have some full HD monitors at work and they just feel so… cramped. 16:10 till I die!

      Also: agreed on Crossfire/multi-GPUness being a POS. My current set up is a pair of 5770s, and having done it once, I’ll never do it again. It costs more than just the cost of two cards as you have to overspec on PSU and mobo, there’s a fair chance that any game will only use one card anyway, launch titles are a disaster (see: Witcher 2, others), you’re constantly worrying about how effectively it’s being utilised instead of just enjoying the pretties, and worst of all, once you notice the microstutter, you can never stop noticing it.

      Single cards are the only worthwhile way of doing it.

  10. Sensai says:

    I actually found a remarkably good deal a few months back on a dual fan 6870. I believe it went something like this:

    $185 for the card, minus a 30 dollar mail in rebate. Came with Deus Ex: HR, Shogun 2, and Dirt 3. I was going to buy HR anyways, I traded Shogun 2 (I tried the demo and really didn’t like it) for Limbo and Bastion, and due to the debacle with Dirt 3 Steam codes being leaked, I couldn’t get rid of it. I also sold my old card for ~65 to a friend (which was a little low, but he’s a friend, c’mon).

    All said, I paid 90 dollars for a sizable upgrade, and got with it Dirt 3, HR, Bastion and Limbo. Considering I was going to buy the latter 3, that puts me at (effectively) spending 15 dollars for an upgrade. Such a good deal.

    [Edit:]

    Also, 1920×1200 is the way to go. Sure, consoles and TVs may be going to the inferior 1080p, but we PC gamers have always loved our excess have we not? We need that extra 120 vertical space!

  11. Clavus says:

    My HD5850 runs BF3 on high at a solid 50 fps most of the time. There isn’t much of a reason to spend a lot on your graphics card unless you have an absurd multi-monitor setup nowadays.

    • Fierce says:

      Clavus, if you can find another 5850 (I know, I know) and CFX them, you’ll be able to play on Ultra like I do, except with Shadows on High and Motion Blur + AA Off. I rarely drop below 60fps, even on Karkand maps, and I always play with ‘renderer.drawfps 1’. Using 12.2 previews and these results were last seen last night when I knifed my 400th person. Good times.

      For the article:

      I’m running a 24″ BenQ V2400W which is 1920*1200 (Review & Pictures @ http://hardforum.com/showthread.php?t=1315565) and I wholeheartedly agree that real gaming happens at 16:10.

      It may be hard to find, it may be “just a few more pixels”, it may be whatever else people will rationalize it to be, but it’s a mathematical fact that it is also closer to the golden ratio more instinctively enjoyed by the human eye. It’s also great for productivity work too. Snapping an inbox to the right side of the screen while snapping a word processor to the left side is functional and doesn’t look squished.

    • Clavus says:

      I’m not a fan of multi-card setups. They’re not very stable, plus they generate a lot more noise and heat. Also I would need to upgrade my PSU for it since 620 watt with only 2 gfx power connectors wouldn’t cut it.

    • Fierce says:

      Just so you know, if the PSU is your chokepoint, then you’re going to need to upgrade it anyway moving forward. I mean, if you’re concerned with the noise and heat produced by multi-GPUs and the power the PSU would need to supply them, I can’t imagine how you’d get anything say ~GTX 570 or newer~ without having the same problem.

      As for multi-card stability, everyone has a story, and I don’t know what cards you tried pairing and on what PSU that you subsequently encountered instability on. Just want to point out that there’s nothing inherently unstable about a SLI / CFX config, it just doesn’t provide the 1:1 performance increase ratio people expect it to for the cost they pay.

      My advice to you stands solid however. Your PSU (be sure to upgrade to FULLY MODULAR ONLY) and Case permitting, the difference between BF3 Ultra @ >70~95fps and BF3 High @ 50 fps is only one example of more than a few games I’ve personally experienced where the multi-card path provides real dividends. A fact to keep in mind for when times are better, to be sure.

      Happy gaming. Don’t become my 401st!

  12. dancingcrab says:

    I have the MSI GTX560Ti-448, and it’s fantastic at 1080p! Unfortunately, my Phenom II 955, even overclocked, is throttling some games – in particular, pre-v1.4 Skyrim was suffering. Thank you Bethesda for fixing that!

  13. fauxC says:

    Wow you high-spec PC people are scary. So many numbers and abbreviations.

    • TormDK says:

      It comes with being a part of the PC MASTER RACE!

      Didn’t you get that memo?

    • fauxC says:

      I think the memo may have been too graphically demanding to appear on my steam-powered anachronism.

  14. Tuskin38 says:

    I’m running Dual AMD 6970s, very nice.

    • FriendlyFire says:

      Dual 6950s here, though I have to say I’m not exactly pleased by them. I also run Eyefinity and support is sketchy at best. CrossfireX support is slow to appear and just about never works on older games and Eyefinity causes tearing in the non-primary adapter (since they couldn’t put three bloody DVI ports) which is extremely apparent when using all three monitors in portrait.

      There are very high chances I’ll be going Nvidia in my next build, even if they usually are more expensive.

    • DrGonzo says:

      And to think you could have bought a modest GPU with no noticeable difference. And bought so many packets of Malteasers with the difference.

      Think of the Malteasers!

    • Shortwave says:

      I’m totally pleased with my choice to pop in a second 6950.
      I installed an aftermarket cooler on the main one to reduce noise, and put it on top, as the aftermarket cooler works BEAUTIFULLY and doesn’t need as much space to breathe. That allows the second card to control its fan automatically without ever pushing 70c, which keeps me happy. Since it only powers up when I’m in game, I don’t even hear it anyways. Though my needs for it are most definitely not the same as most people’s. I like having the extra juice to run in eyefinity when I want. It does much better than the single GPU in those regards. But usually I just like having one dedicated to my single monitor while I have TV playing on another, and any chats I might be running on the other. It really takes a load off of things and allows the single 6950 to soar beautifully. But again, of course most people shouldn’t really need anything more than a single 6950 as stated.

      I was quite pleased that the 6950 was his main AMD recommendation, as it truly is the most intelligent GPU solution on the market still.

  15. Deano2099 says:

    Any of these offer any noticeable benefits over my 5850? This is around the time I’d expect to upgrade but whenever I’ve looked, advances seemed pretty marginal.

    • gimperial says:

      I had the same problem as you, and what I did was buy a second hand 5850 for cheap. In CF they are more powerful than a 580/6970, and a lot cheaper. Personally I’ve never had any problems with drivers etc but YMMV.

    • Fierce says:

      @gimperial

      Mmmm, I wouldn’t say they are more powerful than a 580/6970, though the performance is certainly up there… not unless you’re talking with some massive overclock of course. Careful not to misrepresent the strength of 5850s in CFX to the poor guy. It would be a disservice.

  16. HostFat says:

    Remember also that AMD cards are better than Nvidia at mining Bitcoin :)

  17. Skusey says:

    I’ve two monitors, both 1920×1080. But I’m only running a 5770, which could probably do with an upgrade. Though I’ll wait for NVIDIA to get off their arses and force AMD to stop playing silly buggers with pricing. Hoping that within a few months I might be able to pick up a 6950 for something closer to £150. Also, do those cards still have a good chance of unlocking into a functional 6970? Because that would be fantastic.

    • LaunchJC says:

      My 5770 has been the most amazing value, I really am not looking forward to the difficult upgrade decision!

    • Skusey says:

      Yeah, I got mine a few years ago and it’s still performing remarkably well.

  18. AmateurScience says:

    I upgraded from an ATI 5770 HD (which really was a great, cheap wee card) to the slightly meatier 560 Ti mentioned above. Really very impressed with it indeed. Still only running 1680 x 1050. Mostly because the only fault I can find in my current monitor is that it *isn’t* 1920 x 1080 and that’s not a good enough reason to upgrade for me. I’m sure it’ll happen in due course.

    Next up is switching up the processor to the 2500k from an AMD 555 at some point in the distant future where I actually have any money.

  19. tehfish says:

    “So commenters, please preface your opening salvo with a quick screen-res stat.”
    1920×1200
    (Plus a 1920×1080 secondary screen)

    Article seems pretty much spot on. I have a 6950 brought for £200ish myself and recently suggested a 6850 for a friend after a cheaper card, so the suggestions seem to match up :)

    Only niggly issue I noticed was the whole “nvidia drivers are better” comment.
    It depends on your own experiences really, but I switched to ATI mainly to get away from the awful nvidia drivers, and the view of friends of mine varies from being less-than-impressed with nvidia to ‘there’s no difference anymore’. Whilst nvidia’s drivers being superior was once the case, I feel it’s now a meme past its time

    • joel4565 says:

      Yeah I haven’t noticed any problems with Ati drivers for my 6950, but then again I didn’t really notice many problems with my nvidia driver for my 8800GT last generation for the most part. I did have some blue screens of death blamed on one particular set of nvidia drivers, but the issue was fixed by a later set of drivers a month or two out.

      I flip between the two video card companies almost every generation and almost never have any driver issues. My only problem with a video card in the last few years was with how hot the 8800gt was set to run. The fan was set to not go up above 29% until it was on fire, so some games caused it to overheat unless I used a video card utility to manually set the fan speed.

    • Shortwave says:

      Ditto, I’ve experienced no driver issues with my 6950s at all. Usually if anything it was an issue with a specific game for a while instead.

    • Torgen says:

      I switched back to Nvidia a few years back (9600 GT) because of driver problems with City of Heroes, and have heard of far more problems with Ati drivers than Nvidia ones. That, and PhysX in some games, has stopped me from upgrading my card recently. I only upgrade every three years or so.

      EDIT: Acer P215H, 1440×900

    • Antsy says:

      I had an xfx 4870×2 since release and had more compatibility issues than I can remember. I got really tired of reading “an issue with amd cards” in game support pages and forums so when I upgraded last year I punted for a gtx570 and haven’t looked back

  20. James Allen says:

    1920×1080. GeForce 560 Ti. Works awesomely.

    • Jhoosier says:

      Same here, works nicely. A bit sad I didn’t get the 448 core version, and I’m sure the monitor article will have me in tears as I just bought a new BenQ 24″. But that way lies madness.

  21. InternetBatman says:

    It’s a good guide, but I haven’t felt the urge to upgrade for years. I got a 430gt when my last card died and it runs games. That’s all I want my cards to do and it does it. I’m trying to get my limping 5 year old computer a year or two into the next generation of consoles before it dies.

  22. Casimir Effect says:

    1920×1200

    Have a GeForce GTX 275 running things at 1920×1200 on a 27″ HannsG monitor. Handles most things pretty well although The Witcher 2 gave it some issues. Typically I may have to drop some AA or reduce shadow quality, but the entire rig is getting on for 3 years old and wasn’t top-of-the-range to begin with.

  23. AbyssUK says:

    I am a scientist and I agree with this article. Unless you have a quadHD 200hz 50″ tv/monitor never spend more than 200 quid on a video card.

    Good rule of thumb, spend 150% of the price of your monitor on a gfx card and you’ll be alright.

    • Guvornator says:

      Speaking as someone who was going to spend all his cash on a base unit and plug it into two of the crappy CRTs I have lying around, why is the monitor so important? (I am planning to get new ones in the future when I can afford them/one/whatever)

    • Svant says:

      I’d say do it the other way around, don’t spend more on your gfx than your monitor. A good monitor will last many years and a shitty monitor will make everything look shitty no matter what fancy gfx card you are using.

    • joel4565 says:

      I am not sure what kind of crappy monitors you are used to dealing with. But lets do some math.

      $100 1440/1680p monitor = $150 video card (so horrible monitor, decent lower range card)
      $200 1080p monitor = $350 video card (okay 1080p monitor, too expensive of a video card for 1080p)
      $1000 1600p monitor = $1500 video card (so tri sli 580, which is overkill for almost every current game even at 1600p)

      So i agree with the poster who said spend the same on video card or less. No matter how good your graphics card is, it will look like crap on a 100$ monitor. But a $150 card (ie ati 6850) can look pretty nice on a nice starter IPS $300 monitor (ie Dell 2412m). So I just got an ati 6950 in my latest build, now I am saving up for a nicer monitor maybe a 24″ 6bit ips like the Dell 2412m or the HP ZR24w.

      I hope the monitor article addresses nice monitors, not just 1080p tn panels.

    • DrGonzo says:

      I’m not sure where you are shopping, but you can pick up a good 1080p screen for 100 quid. Then a graphics card to handle it very, very well for 150 quid, or a more modest one at 100 quid.

      I see no difference between those who spend insane amounts on pcs and those who use macs.

    • deadly.by.design says:

      Heaven help you if you buy an Apple cinema display.

    • AbyssUK says:

      I did say it was a rule of thumb, joel4565 and am in the UK :) Your own country may vary.

      Also with a $1000 monitor you’d be a graphics designer or something, so you’d be wanting a CAD card of some sort.

    • Wisq says:

      Yup, 27″ Apple Cinema display here. 2560×1440 on an ATI 5970 (dual GPU).

      At the time that I got it, there was no chance of getting a large IPS display under $1000, and oddly enough, Apple’s display was actually the cheapest of the large IPS displays (at exactly $1000). The Dell ones cost at least $200 to $300 more. And many modern video cards support the Mini DisplayPort connector required to use modern Apple stuff. Heck, Eyefinity cards rely on them, since there’s no other way to fit so many connectors onto the back of the card.

      Did I miss out on the de facto 1080p standard? Yeah. Do some games have issues going above 1920×1080? Absolutely. Do I regret it? No way. Looks positively stunning, works seamlessly with both my PC and my company-supplied Macbook Pro, and has amazing build quality and sturdiness, not to mention resale value.

  24. PoulWrist says:

    These articles are delightful. As someone who does system building for a living (at least partially), it always irked me how much crap was sold expensively with the promise of being something it was not; your articles are certainly very measured and you have the exact right outlook on how to build a good and balanced system. Rather than those nutheads you generally hear from.

  25. eraserhead says:

    Ha, I have two 1920 x 1200 – eat that! :-) I’ll never succumb to 1080.
    I got my AMD 5870 when it was fairly fresh, it was the most expensive card I ever bought. But I have to say, I’m glad I did. Because (also thanks to console gaming and its limiting effect on gfx evolution) it still handles everything I throw at it. I’ve never had the same gfx card for so long. So when the day comes and it can’t properly display new stuff, I’ll probably go all out and buy the best available card again (well no overclocked dual something). I have to say that energy efficiency in idle mode is also a huge factor for me, as sadly I’m working most of the time and the beast has to be quiet and not quadruple my energy bill.

  26. faelnor says:

    Main resolutions are 1680×1050 and 1920×1080 when plugged to the telly. The second hand 4870X2 that I got last year for less than one console game (50€) chews through everything I throw at it without distinction.
    I mean, it does, except when the CrossfireX drivers don’t work at all for a given game, which is really rare these days. I don’t know how well this card fares in comparison with much more recent chipsets, but I’m surprised it works so well in 2012. No reason for me to get another graphics card at the moment.

  27. Anorak says:

    I remember a very long time ago an article in PC Gamer UK (I think), which was trying to encourage gamers to agree on a couple of budget price cards, and stick to those for a year or two.
    The idea was to force a stop to the endless upgrade cycle of graphics cards, and force developers to be a bit more creative in how they gave us better graphics, not just throwing more power at it.
    It was hoped that more creative games would be made, that weren’t just focusing on shinier graphics.

    This has actually happened now, almost by accident. Because nearly all developers want to release a game on all platforms, they’re locked in to whatever kit is in the PS3 and the 360, so even though PC hardware has come on much further, there are not many devs taking advantage of this.

    I’m actually quite grateful for this- I’ve not had to upgrade in two years, and two years ago I bought a “budget” card. It plays newer games beautifully – Skyrim looks fantastic on it, and Human Revolution ran with nearly everything maxed out.

    This is essentially a counterpoint to people who argue that consoles are stagnating the PC cutting edge… I can never keep up with the cutting edge, financially, and I think very few of us can.

    • Prime says:

      I am also very grateful for this stability. It’s nice to install a game, set everything to maximum, and have it work flawlessly and smoothly without hours of fiddling to set the game to your PC’s particular ‘sweet spot’.

      Although, to any developers reading this, that does NOT mean the same thing as taking all our graphical options and tick-boxes away! Leave the control of such things with me, thank you very much, because you’re bound to get it wrong more than right if you attempt to dictate (See: id’s launch of Rage).

    • Guvornator says:

      Total agreement. The amount I’ve spent on PC hardware is, looking back on it, obscene. I remember Westwood Studios were very keen that their games should run and look decent on PCs that were not high spec. It will be interesting to see if this continues through the next console phase. If the next Xbox isn’t x86 compatible I can see a return to the bad old days…

    • Stellar Duck says:

      It could be argued that it has in fact done the opposite. It has stifled creativity in the PC space as a game has to be able to run on a console. Smaller levels, more loading, fewer systems interacting to create great experiences.

      While it’s indeed cheaper in hardware, there is also a price to be paid creatively. Look at the difference in scope in the level design and free form madness between Crysis and Crysis 2. Never mind the graphics, but look at how limited a playground Crysis 2 is.

      I for one am not sure it’s a good thing.

  28. tomeoftom says:

    1440×900. Feel super-justified with my 560Ti and 2500K right now, thanks.

    • joel4565 says:

      At 1440×900 you probably have quite a lot of extra horsepower just sitting there.

      1440 × 900 = 1,296,000 pixels
      1920 × 1080 = 2,073,600 pixels

      Since you are pushing only about 1.3 million pixels compared with more than 2 million, your system does not have to work very hard (see the quick sketch below this comment). Given how nice your processor and GPU are, I would encourage you to save up for a nice monitor to take advantage of the extra power.

      I myself just built a similar system, 2500K and ATI 6950, and am saving up for a nice 1920×1200 monitor. Something like the Dell 2412m.
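
      To put rough numbers on that comparison, here is a minimal sketch (purely illustrative, and assuming frame cost scales roughly with pixel count, which ignores geometry, shader and CPU costs) of the relative per-frame pixel load at a few common resolutions:

      ```python
      # Relative per-frame pixel load at a few common resolutions (a rough sketch;
      # real performance also depends on geometry, shaders and the CPU, not just fill rate).
      resolutions = {
          "1440x900":  (1440, 900),
          "1680x1050": (1680, 1050),
          "1920x1080": (1920, 1080),
          "1920x1200": (1920, 1200),
          "2560x1440": (2560, 1440),
      }

      baseline = 1920 * 1080  # the de facto 1080p standard the article is built around

      for name, (w, h) in resolutions.items():
          pixels = w * h
          print(f"{name}: {pixels:>9,} pixels ({pixels / baseline:.2f}x the 1080p load)")

      # 1440x900 works out to roughly 0.62x the pixels of 1080p, so a 560 Ti that is
      # comfortable at 1080p has plenty of headroom left at the lower resolution.
      ```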

  29. Lenderz says:

    2048 x 1152 res here on a 23 inch display – 1080p is so yesterday and you can really benefit from higher pixel density over larger screens in my opinion.

    Currently running two heavily OC’ed GTX 460s and waiting for the mid-range AMD and Nvidia next gen to surface before upgrading. It would be a bad time to buy a graphics card imo, as the die-shrunk next gen will have a lot to offer in cost, performance and power consumption.

    • Prime says:

      Funny, I had an argument the other day with a mate who owns a PS3 and a 40″ TV and believed that his setup beat any PC out there. He was most miffed when I told him the PC romped past that standard years ago. I think I even used the word “quaint”. Heh.

      Plus, our GFX cards are capable of drawing a lot more geometry and effects at that resolution than the PS3 is!

  30. Prime says:

    1920×1080 monitor.

    I got my Radeon 5850 (1GB video RAM) eighteen months ago, coupled with an overclocked i5 750 CPU running at 4.01 GHz, and there hasn’t been a single game I couldn’t play at the full 1080p, in highest detail. It eats Crysis for breakfast. And games that weren’t exactly smooth at this resolution to begin with – Skyrim, Albion Prelude – seem to have magically worked themselves up to glorious smoothness with patches and/or driver updates.

    At this rate I can’t see myself changing upwards for another year, perhaps two.

    • iucounu says:

      Sounds remarkably similar to my setup – same OC, same card, same res, same performance. I found I couldn’t run The Witcher 2 maxed out, but then that was before I upgraded to Win 7 – might give it another go.

    • Prime says:

      Witcher 2 is actually one I haven’t got around to yet. I’d heard it was a pretty one. Let me know how you get on?

    • iucounu says:

      Wilco. Might fire it up tonight, in fact, as I haven’t looked at the new patch either.

    • Fierce says:

      Neither of you will be able to play Witcher 2 maxed out as “maxed out” includes turning on Ubersampling, which is a ridiculously expensive application of AA. Everything has trouble with Ubersampling.

      And I say this running two 5850s in successful CFX.

      So no to Ubersampling, and for GOD’S SAKE BRING SHADOWS DOWN TO ‘HIGH’, and you may be fine when walking through the forest. Shadows in almost every game are extremely expensive for what they produce, and turning them down to High from Very High/Ultra will not produce a perceptible difference in a gameplay context. It will in a screenshot comparison context, but no one plays a video game staring at the shadows to make sure they’re anti-aliased.

  31. The Sombrero Kid says:

    Saying 16:9 is the most popular resolution is like saying the Intel GMA is the most popular gfx card.

    • Fierce says:

      While I agree with you, popularity is subjective and shaped by market factors. Any “popularity” 16:9 enjoys is, I would readily argue, a by-product of the market being flooded with 16:9 monitors from production-minded panel manufacturers, and not so much an actual preference over 16:10. The possibility of course exists that people genuinely like 1920×1080.

      Personally, of course, I think they simply don’t know any better.

  32. Derppy says:

    High resolution displays don’t cost thousands of dollars like the article implies. Look up the Hazro HZ27WC: it’s 2560×1440. However, in my experience you’ll need another 6950 to enjoy games at the sweet extra resolution. Worth considering if you work on the computer and have a little extra to spend.

    • Shortwave says:

      Correct, CrossFire is almost always only really useful at resolutions higher than 1080, save when dealing with HD texture packs and the like. Still, I do see a 10-20 (sometimes more) increase with CrossFire on, though I should note there are rare cases where it lowers the fps.

  33. Advanced Assault Hippo says:

    Basically, the 6850 is the best option for most people. Plays everything at high settings and probably will do for another couple of years. £100, job done.

    I bought one last year to replace my 4850. PC gaming is so cheap at the moment.

  34. Sarlix says:

    I bought an HD 6770 before Xmas for £70. I’ve bought a good many GFX cards in my time, but this was one of the best purchases, mostly due to Morphological Anti-Aliasing. It may not seem like a huge deal, but I can now effectively play a whole host of games without regular AA – this not only keeps my frames high, it’s also not a huge step down from 4x AA visuals-wise. I am very happy.

  35. phenom_x8 says:

    Vanilla HD 6850 user since Nov 2010 here, at 1680×1050. A powerful card that got me through 2011 (the best year, with The Witcher 2, Deus Ex 3, Batman, etc.) with complete satisfaction. No regrets at all, I just hope it will hold up in terms of performance until 2013.

  36. The Sombrero Kid says:

    Also, I’m afraid you’re totally wrong about the memory limit. I can use about five fullscreen render targets before we’re even talking about shadow buffers, special effects or anything else; trust me, I will fill your memory.

    • Jeremy Laird says:

      If you’re saying it’s possible to code a game to saturate more than 1GB of graphics memory, well, who could argue?

      In reality, developers target a pretty broad installed base of users. Coding a game that results in memory swapping over the PCI Express bus for even a significant minority of users would be fairly suicidal, which is why it doesn’t really happen. It’s very obvious when a card runs out of video memory. Right now, for the vast majority of games from Skyrim to Crysis 2, that isn’t happening with 1GB cards when running at 1,920 x 1,080 or below. I don’t think it’s likely any time soon, either. I very much doubt that the next gen consoles are going to push video memory requirements beyond 1GB.
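
      For what it’s worth, a quick back-of-envelope sketch of what fullscreen render targets actually cost at 1,920 x 1,080; the buffer formats and counts here are assumptions for illustration, not anyone’s actual renderer:

      ```python
      # Rough VRAM cost of fullscreen render targets at 1080p (a sketch, not a profiler).
      WIDTH, HEIGHT = 1920, 1080

      def target_mb(bytes_per_pixel):
          """Size of one fullscreen render target in megabytes."""
          return WIDTH * HEIGHT * bytes_per_pixel / (1024 ** 2)

      rgba8   = target_mb(4)  # 8-bit RGBA colour buffer, ~7.9 MB
      rgba16f = target_mb(8)  # 16-bit float HDR buffer,  ~15.8 MB

      # Five fullscreen targets, generously assuming two of them are HDR,
      # plus a 2048x2048 32-bit shadow map for good measure.
      five_targets = 3 * rgba8 + 2 * rgba16f
      shadow_map = 2048 * 2048 * 4 / (1024 ** 2)

      print(f"One RGBA8 target:   {rgba8:.1f} MB")
      print(f"One RGBA16F target: {rgba16f:.1f} MB")
      print(f"Five targets plus shadow map: {five_targets + shadow_map:.1f} MB")
      # Roughly 71 MB all told, well under a tenth of a 1GB card. It's texture
      # assets, not render targets, that actually saturate video memory.
      ```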

    • joel4565 says:

      Yeah, but don’t forget about people who want to load 20 Skyrim texture mods. I believe with the mods that Skyrim already has out, you can saturate the 1 GB buffer on graphics cards.

      Combine a nice city texture mod like this one: http://skyrim.nexusmods.com/downloads/file.php?id=607 with a few for characters, weapons and armor, and watch the video card memory usage skyrocket.
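
      As a rough illustration of why high-res texture packs blow past 1GB so quickly, here is a small sketch; the texture counts, sizes and DXT5 compression assumed here are illustrative guesses, not measurements from any particular mod:

      ```python
      # Approximate VRAM cost of DXT5-compressed textures: roughly 1 byte per texel,
      # plus about a third extra for the mip chain. All figures are illustrative.
      def dxt5_mb(size):
          return size * size * 4 / 3 / (1024 ** 2)

      vanilla_1k = dxt5_mb(1024)  # ~1.3 MB each
      modded_4k  = dxt5_mb(4096)  # ~21.3 MB each

      vanilla_set  = 400 * vanilla_1k                    # ~530 MB of ordinary textures resident
      upgraded_set = 360 * vanilla_1k + 40 * modded_4k   # swap just 40 of them for 4K versions

      print(f"Vanilla-ish texture set: {vanilla_set:.0f} MB")
      print(f"With 40 textures at 4K:  {upgraded_set:.0f} MB")
      # Replacing a few dozen textures with 4K versions adds the best part of a
      # gigabyte, which is how a modded game sails past a 1GB frame buffer while
      # the vanilla game sits comfortably inside it.
      ```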

    • Fierce says:

      God Jeremy, I sure hope the next gen consoles push video memory requirements beyond 1GB!

      Much like you said about 16:9 being the de facto 800-pound gorilla, like it or not, the unsustainable arms race of the cutting edge pushing envelopes like memory requirements is the only way developers are going to get off 32-bit (among other things) and start building the Ark Of The Future with materials other than brittle wood.

      The amount of progress in video games made since even 2001 has been incredible. I absolutely loathe the idea of it slowing down just so more tweens can afford a GPU. No offense intended to my financially struggling co-readers, as I’m struggling too, but it can’t be forgotten that Super Sampling AA and Tessellation were by no means cheap features when they were born, yet we are now ever grateful for their existence.

      The upgrading reprieve produced by the long lifecycles of the 360/PS3 has been enjoyable and I’m sure relished by many, but I wouldn’t want to see the entire industry’s innovation energies stagnate for the benefit of its continuation! I’ve seen enough console-res textures in my games already, and I would hope it is soon time to move on.

  37. TeraTelnet says:

    1920 x 1200!

    I’m still using my Radeon 4870, which I feel retro-justified in buying now with its inclusion in the article. Still good enough to run modern games at the above resolution, with most if not all of the bells and whistles enabled.

    • Sarlix says:

      The 4870 can still outperform newer cards such as the 6770. It really is a beast. The only downside is that it’s also a heat demon; I sold mine purely because it put out so much heat. And it’s not energy efficient either – goes without saying really.

  38. Retro says:

    Thanks for the article! Any chance you could spend a few words on the situation in the mobile (laptop) GPU market?

    • Sorbicol says:

      This would be much appreciated, although an article on gaming laptops (and please don’t say “don’t bother”) would be great. Any word on whether you are thinking of this for the future? Apologies if you’ve answered that before!

  39. Kdansky says:

    The usual 1920×1080 it is for me on most of my screens (PC primary / TV / Mac). I do have an old 5:4 (1280×1024) used in portrait, but that’s not for gaming. All resolutions but the 5:4 (which was incredibly badly supported by games) were pretty much forced on me by the manufacturers. They don’t sell anything else. I’m using a 4890, if I remember correctly. Hard to justify upgrading for 250€ when everything still runs acceptably.

    I could really do with a monthly hardware column so I can refer to it whenever I feel like buying stuff. Every time, I have to spend dozens of hours figuring out which cards are actually what I need, so I don’t accidentally buy overpriced mobile versions.

  40. marach says:

    “Moreover, that 1,920 x 1,080 pixel grid puts a cap on the amount of memory you actually need. 1GB gets the job done today”
    What? 1GB? MAYBE, just MAYBE, if you’re happy playing with no AA etc, but games like BF3 already fill a 2GB buffer and even 3GB still hits a limit. A little googling would have shown that.

    As for NV drivers etc, I’d actually call the driver itself worse than AMD’s, though its frontend is in some/many (depending who you ask) ways better. People report more crashes and graphics restarts on NV systems than on AMD where I work…

    • Jeremy Laird says:

      BF3 is absolutely an outlier when it comes to memory usage. Why it needs more than 1GB @ 19×10, I have no idea.

      The below link shows an aggregated table of performance results at high detail 1,920 x 1,080 across a number of different games. The table includes Radeon HD 6950s in 1GB and 2GB trim. You’ll note the 1GB card scores marginally higher.

      http://www.tomshardware.co.uk/charts/2011-gaming-graphics-charts/Enthusiast-Index,2674.html

      There will always be exceptions and BF3 is certainly one. But the vast majority of games, including graphics fests like Crysis 2, sit quite happily in 1GB @ 1,920 x 1,080.

    • Kdansky says:

      What about Rage with its ULTRAGIGAMEGATEXTURES? Did they optimize so hard for the 360 that it never needs more than 1GB?

    • Goodtwist says:

      @ Jeremy Laird
      I easily hit the 2GB VRAM cap when playing Fallout 3 with hires mods. Or any other mod, as a matter of fact.

  41. Bishop99999999 says:

    Yeah, 1920×1080 here.

    So let’s say I’m the kind of insecure jackass who can’t comfortably buy a piece of equipment if there’s a newer, shinier model with higher numbers on the box. Would taking the plunge for a 570 or 580 be as big a waste of money as the article implies? I do have a bit of cash to throw around for my new system (the first one I’ll ever build!) and after years of limping along with laptops I’d like something with a little punch to it.

    Also, would considering a 590 make people laugh at me?

  42. Daniel Klein says:

    (1920 x 1080)

    Damnit, I’m totally in the market for a new monitor and was going to buy one this week. Will hold off until the next article then. Any ETA for it? A month perhaps?

  43. Gizoku says:

    My resolution = 1920 x 1200.

    I got a Sapphire ATI Radeon HD 6870 in November last year for £125. In practical terms, how big is the gap between the 6850 and 6870? Worth an extra £20?

    • step21 says:

      Imho it can well be worth it; it’s not that much extra after all. However, as the article says, the difference is hard to notice in practice. I think there are also some other considerations between those two: the 6870 is based on something else (not on the same line as its name would suggest, iirc), needs more power (two PCI-E cables for juice instead of one), needs more cooling, and as a result is probably also louder. So depending on where you want to use it, and whether those things matter to you, the 6850 could still be the better choice. I run one in my HTPC, for example, and it’s still cool and quiet, which would be much harder with the 6870 (and I would need a stronger power supply).

  44. NathanH says:

    My resolution is 1024×768 on an old cheap monitor. I might increase it by a notch for a game with a busy UI but generally anything higher than 1024×768 makes me feel weird.

    • Lenderz says:

      I had the same issue before I upgraded. It took some getting used to, but a higher pixel density is totally worth spending the time and money on.

  45. Ridnarhtim says:

    Looking forward to your monitor article. I assembled my current PC less than a year ago (with an i5 2500K – high five – and a GTX 570), so neither CPU nor GPU will be updated soon, but my monitor is a few years old now and was cheap even then (1440×900).

  46. step21 says:

    <3 my Ati Radeon HD 6850, the Sapphire model is the best :)

  47. Gnoupi says:

    I was a huge believer in 3dfx.

    Anyone remember them?
    No, really?

    At the time it was 3dfx vs Nvidia. Saying “I have an ATI card” back then was the equivalent of saying “I have an Intel graphics chipset” nowadays.

    I remember when 3dfx fell (mostly because they never managed to make a new processor, and at the end were mostly just strapping existing ones together) and was bought by Nvidia. Journalists were commenting, “OK, and now what’s left to compete with Nvidia? ATI? Don’t make me laugh…” And at the time that was quite accurate.

    • Fierce says:

      Diamond Monster 3D II owner here.

      I bought it with my saved up allowance and asked my dad to put it in the computer. He said he’d do it later. I asked again a week later and he said the same thing. A week after that, he came home to find me in his room, cross-legged in front of the open computer case, holding his power drill and frowning earnestly at the PATA cables and PCBs I had on the floor around me. We looked at each other and he just walked out.

      I had it back together, card installed and had booted up QuakeGL by the time he returned from having dinner. I was 11 at the time. He said he was proud of me.

      I’ll never forget 3dfx.

    • Lenderz says:

      As was I. My first 3dfx card was the Voodoo Banshee, essentially a Voodoo2 that didn’t require a separate 2D card, and man did it make games shine. Prior to that I had a Matrox Mystique; shame they moved out of consumer cards.

  48. kael13 says:

    I’m staving off an upgrade til the next lot of kit comes out. Namely Ivy Bridge and Kepler tastiness.

    Never running AMD cards in Crossfire again. Mark my words.

    On that note, anyone interested in buying an extremely well-looked-after two generations old custom i7 PC? ;D

  49. Belsameth says:

    GTX 480 here running dual monitors: a 1080p one, obviously, and a 160xsomething :)

  50. Goodtwist says:

    This is obviously a splendid topic to debate. Everybody has an opinion – wait, what did Dirty Harry say about those?!

    I’m on 1920×1200 with ATI 6970, mildly OCed.

    So… I agree with Jeremy on the choice presented, to some extent. The point is, however, that you get what you pay for. And if you need more, you’ll need to pay more.

    You can easily hit the ceiling even of the most potent and rather well-established video cards short of the new 7900s, cards such as the 6970 and 580. All you need to do is max out anti-aliasing and/or play with video mods. Fallout 3 with hi-res mods in SSAA mode kills any and all FPS. Mind you, not exactly a new game.

    tl;dr – with mods and anti-aliasing you’ll always need more video power than is available –> <3