Hard choices: Graphics Card Update

By Jeremy Laird on June 1st, 2012 at 2:00 pm.


Knocking out hardware guides for RPS has been heap good fun, thanks in no small part to the enthusiastic après-post banter. But it’s also created a bit of a monster. Problem is, things move fast in ye olde world of tech and especially in graphics. It’s been getting on for four months since our first perusal of the world’s finest pixel pumpers.

That’s long enough for AMD and NVIDIA to roll out a small army of new graphics chips. The good news is that the four 3D boards I recommended back in Feb still look pretty sharp, partly because the arrival of new chips has pushed prices down. But there are also some new GPUs I reckon you need to know about and some broader trends to think about. So here goes.


NVIDIA’s GTX 680 is awesome, and yet…

The biggest news has been the arrival of a new high end GPU – well, high end of sorts – from NVIDIA. Variously known as GK104, Kepler (which is actually a whole family of new GPUs) or the GeForce GTX 680, it might just be the best graphics chip NVIDIA has ever made. And I kind of hate it.

On the one hand, it’s a preposterously impressive technical achievement from NVIDIA. It’s clearly the fastest graphics chip on the planet. In efficiency terms, otherwise known as performance-per-watt in industry parlance, it’s completely off the map.

Overall, the GTX 680 is so good it makes AMD’s Radeon HD 7970 look fat and wheezy and past it. And that’s a chip that only appeared at the end of last year. It’s ridiculous. But for upwards of £400, it’s also extraordinarily pricey when you consider that it’s physically a very small chip by high end GPU standards.

Remarkably, the GK104 GPU that powers GTX 680 boards is actually smaller than the chip inside the GeForce GTX 560 Ti, and you can buy one of those for £150. The explanation is a die shrink from 40nm to 28nm transistors, allowing the 680 to squeeze 3.5 billion transistors into less space than the 560 Ti needs for 2 billion of the little binary blighters. Ultimately, it’s chip size that most determines manufacturing cost.
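
To put rough numbers on that, here’s a quick back-of-envelope sketch in Python. The die sizes and transistor counts are the figures cited later on this page; the cost comparison deliberately ignores yield, which is genuinely worse on a fresh process.

    # Transistor density: GK104 (GTX 680) vs GF114 (GTX 560 Ti).
    # Die sizes and transistor counts as quoted in this article.
    chips = {
        "GTX 560 Ti (GF114, 40nm)": (332, 2.0e9),  # (die area in mm^2, transistors)
        "GTX 680 (GK104, 28nm)":    (294, 3.5e9),
    }

    for name, (area_mm2, transistors) in chips.items():
        density = transistors / area_mm2
        print(f"{name}: {area_mm2}mm^2, {density / 1e6:.1f}M transistors per mm^2")

    # If manufacturing cost scales roughly with die area, the 680's chip
    # is actually the cheaper of the two to stamp out:
    print(f"GK104 die area is {294 / 332:.0%} of GF114's")

Nearly double the transistors per square millimetre, on a die around 11% smaller: that’s the pricing gripe in two numbers.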


AMD’s once majestic 7970 is now the fat, wheezy kid

Meanwhile, NVIDIA has just announced a much larger Kepler-based GPU that looks much more like a traditional high-end part, the chip known as GK110. It’s absolutely massive. It packs 7.1 billion transistors. It’s about as sexy as silicon gets. But it may never make it into PCs.

That’s a story for another day, since GK110 isn’t appearing in any form until the end of the year and has so far only been announced as part of NVIDIA’s Tesla product family targeted at industrial number crunching.

Anyway, you could argue the size of a graphics chip is utterly irrelevant to gamers. If it’s fast and efficient, if it’s cool and quiet, it’s worth the money. But the GTX 680 still looks suspiciously like a mid-range chip that just happened to end up insanely fast. And I’d like to see NVIDIA cashing in a little less.


The GTX 670 is a more pleasantly-priced Kepler card

Mercifully, there is now a cheaper version of GK104, the recently released GeForce GTX 670. It’s essentially 7/8ths of a GTX 680, which means it’s still a monster performer. You can pick up cards for a little over £300. And it’s my first recommended buy this time around.

It’s a bit of a grudging recommendation since I think it should be about £100 cheaper and I’m generally loath to recommend cards for more than £200. But when you compare it to AMD’s Radeon HD 7900 series, it’s hard to conclude that it’s not good value of a sort.

There’s a little bit of give and take in the benchmarks, admittedly. So, the 7970 takes the spoils in Crysis: Warhead, for instance (does anyone actually play Crysis?), while the 670 hammers AMD in games that actually matter like Battlefield 3 and Skyrim. And when the 670 wins, it tends to win bigger.

Like I said, the 670 is a great card. I just resent the pricing. As for AMD, since last we spoke it’s thoroughly fleshed out the Radeon HD 7000 series. But the only two new GPUs you need to worry about are the Radeon HD 7850 and 7870.


Mid-range machine: AMD’s 7800 series is still a bit pricey

These are classic mid-rangers that ought to be right up RPS’s alley. They’ve got proper 256-bit memory buses and healthy clock speeds. But as with the GTX 670, I’m not really feeling the pricing. You’re looking at £180 and up for the 7850 and £250 for the 7870.

That more or less makes sense when you factor in the stream shader counts. The 7850 gives you 1,024, the 7870 1,280, while the full-on 7970 is a 2,048 shader beast. The problem for me is that the old Radeon HD 6970 can now be had for £200 with the 6950 sitting at around £160, down from roughly £250 and £200 since February – and the latter was for a 1GB card. So, it’s far from clear that the new cards are the better buy at current pricing.
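
A quick price-per-shader sanity check on those numbers, in the same back-of-envelope spirit. The 7850 and 7870 prices are the ones quoted above; the roughly £400 for the 7970 is my assumption based on where the big 7900s currently sit, not a quoted figure.

    # Price per stream processor across AMD's Radeon HD 7000 cards.
    # 7850/7870 prices as quoted above; the 7970 figure is an assumption.
    cards = [
        ("Radeon HD 7850", 1024, 180),
        ("Radeon HD 7870", 1280, 250),
        ("Radeon HD 7970", 2048, 400),  # assumed street price, not quoted above
    ]

    for name, shaders, price_gbp in cards:
        pence = price_gbp * 100 / shaders
        print(f"{name}: £{price_gbp} / {shaders} shaders = {pence:.1f}p per shader")

On that measure the 7850 actually works out cheapest per shader, which squares with it being the pick below if you can find it at the right price.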

The solution could be NVIDIA’s upcoming GeForce GTX 660, which should appear this summer and slots in below the 670. If it’s any good, it will push the 7800 series down to nearer the sweetspot.


Cheap chip: XFX’s Radeon HD 6950 can now be had for just £160

If you can, I’d recommend holding out a month or so for the GTX 660 to appear and the market to readjust. If you really must buy right now, you have three options. If you can stretch all the way to the GTX 670, that’s great. If not, a 2GB Radeon HD 7850 isn’t a bad buy if you can find it for £180 or less. Otherwise, hunt down the cheapest Radeon HD 6970 or 6950 you can find. There are some real bargains out there, like the 2GB XFX 6950 that Scan is currently doing for £160, or Pixmania’s Sapphire effort for even less. That’s a lot of card for the cash.


130 Comments »

  1. aircool says:

    Bought a GTX570 last year some time… Haven’t even thought about an upgrade since then as it seems to work fine with anything that’s thrown at it.

    • Kreeth says:

      Yeah, I bought a GTX580 pretty much the day they came out. Given that that’s now 18 months ago and the only thing it struggles with is Crysis at 2560×1440 with everything turned right up, it’ll probably do for a good while yet.

      I suppose it cost a lot at the time (£370ish?), but if I can get 2-3 years or so out of it it doesn’t seem that bad to me.

    • xavdeman says:

      Not true, I own a GTX580, and it can’t handle Battlefield 3 on Ultra (with motion blur and Post-Processing AA (FXAA) disabled) at a framerate acceptable for multiplayer @1920×1200 (60fps in gunfights). I have to dial down to Medium to get that consistently across maps (first world problem right there). The problem is: neither does the GTX680. So I’m not upgrading until probably the GTX 780 is out, because the gains right now are just too small. But it’s good to see NVidia is back, they got their ass handed to them for a while by AMD.

      • Axyl says:

        Then you’re doing something wrong, my friend.

        My GTX 580 runs BF3 flawlessly on the highest possible settings across the board. Never drops below 60FPS, even during the most intense 64 player online games.

        The ONLY thing wrong with my 580 is that last week, one of the fans on the Twin Frozr II cooler died. I’m RMAing the card and bought a 680 today instead.

        System Specs :

        Windows 7 x64 Home Premium
        Gigabyte Z68A-D3H-B3
        Intel i5 2500k Sandy Bridge @3.7GHz
        16GB G.Skill DDR3 RAM @2133MHz
        MSi nVidia GTX680 GDDR5 2GB Twin Frozr III (This is the new one, previously it was the GTX580 1.5GB variant)
        Creative SoundBlaster X-Fi Xtreme Audio
        Corsair 120GB Force 3 SATA 6Gb/s SSD
        2TB Western Digital Caviar HDD
        ViewSonic VA2248-LED 22″ Monitor @1920 x 1080

        It may be worth noting that the only thing installed on the SSD other than the OS, was BF3. Maybe that had something to do with the performance I had.

        *EDIT* I have Motion Blur disabled as it’s revolting, visually.

        • grundus says:

          I had a GTX 580 Twin Frozr II/OC as well and it couldn’t run BF3 on Ultra (the default Ultra settings I mean) at 1920×1200 much above 60fps, and it would dip below 60 from time to time. That’s with an i5 2500K at 3.3GHz, G.Skill Ripjaws 2x4GB 1600MHz DDR3 and… That’s about it. My GTX 680 Twin Frozr III/OC can run it at almost 90fps at 1920×1200 on Ultra, and if you stick it on Auto it gets 60-70fps at 5040×1050, which is bloody impressive if you ask me. Please, ask me.

          Having to dial BF3 down to medium to get good frame rates with a 580 sounds like a different problem; on High, or a custom setting which is basically Ultra but without the 4x AA, I was getting many extremely playable frames at all times.

          My 580 was only 6 months old but I upgraded to the 680 purely for nVidia surround on a single card for my sim racing cockpit, the 580 was great but the 680 runs so much cooler I feel a lot happier about maxing settings. Dead Island on ultra at 1920×1200 got my 580 up to 74 degrees, on my 680 I haven’t seen anything above 64 yet running anything, not even Arma II. Upgrading the 580 to a 680 was actually the cheapest option for triple monitor gaming, the alternatives were a Matrox Triple Head2Go (£235), a second GTX 580 (£280-300 for a Twin Frozr II) or SoftTH and a secondary cheap graphics card (free software and a cheap card), I sold my 580 for £220 and the 680 was £450 from Scan.

      • jimbobjunior says:

        I have a similar problem, but it’s not the graphics card at fault. BF3 appears to be CPU bound during bouts of action. You can see if this is the case for your own machine using the console and perfoverlayenable=true. If you can run round an empty MP server in ultra at 60fps+ but lag when in gunfire/explosion-towne then it’s probably the CPU.

        • veelckoo says:

          Not true. While my GPU (GTX570) is really hard at work (99%), my CPU (i5 2500K @ 4.3GHz) is not that engaged in BF3.
          You can see for yourself: try the Render.PerfOverlayVisible 1 command in BF3 and you will see which part is most used – CPU or GPU.
          Also there is no way that BF3 maxed will play at 60+ on every map in multiplayer mode. Simply won’t happen.
          Even nVidia confirms: http://www.geforce.com/games-applications/pc-games/battlefield-3/ops/Battlefield-3-NVIDIA-GeForce-GTX-580-OPS
          BF3 maxed with a GTX580 @1920×1200 gives you 54 frames ON AVERAGE. Which means you will get much less from time to time.

          • Steed says:

            @Velcro dunno man… dunno. My i5 (2500k,with nipples juiced up nicely) is normally between 60-100% on all cores, and BF3 runs the poor beast hotter than any other game. GPU is max squibby, but then that’s a product of it choosing to run at 100% of 80% power, rather than 80 of 100 etc for reasons beyond my understanding. It’s all gribble really. Get a solid 70 (capped) on Ultralisk (no MB or AA) @ 2560×1440.

        • aircool says:

          Doesn’t matter how hard or lazy your CPU is, it’s a clock speed thing, or something to do with single threads, whatever they are.

      • Fierce says:

        I concur with Axyl (though shame on him/her for not including the OS), something is wrong with your numbers, as I have no trouble getting >60fps the VAST majority of the time in firefights and Frostbite destruction scenes. And I only play on 64 player servers and with the FPS counter activated.

        i7 920 d0 @ 4.0GHz
        12GB Corsair Vengeance DDR3 1600MHz @ 8-8-8-24-1T
        2 x Sapphire Radeon 5850 in CFX
        Asus Xonar Essence STX
        BenQ V2400W 24″ 16:10 1920*1200 LCD
        Win7 Pro x64 SP1, OEM

        And just for clarification, I play on Ultra with Motion Blur and AA disabled as well, everything else on maximum. Lately I have turned down Shadows to High though since expensive shadows are hardly what I need during a fast paced multiplayer game.

        • Axyl says:

          Lol.. My apologies. I’m using Win 7 x64 (Home Premium).

          I think I just assumed that everyone would assume Win7 x64. My bad. :)

          Edited my earlier post to include OS. Thanks for the heads up. :)

          I should also add that I have Motion Blur disabled on the grounds that it’s ugly as hell.

      • zaphod42 says:

        Hm, yeah, you’re doing something wrong man. I’ve got a 560 and I can run BF3 on 1080, ultra settings, with some AA and AF, on rock solid 60fps. I can turn the AA up, but then the fps go down a bit. Still.

        Uh, what’s your CPU like? You know a really fast GPU can’t do that much if your CPU is ancient. I’m also finding that some games like BF3 are more CPU bound than people expect; with lots of games from the last few years you could easily get away with a crappy ancient CPU and a decent GPU and play them fine. But some new games now need the next gen of CPUs to keep up. You’re probably hitting a bottleneck on CPU performance.

        As someone else suggested, do you have HyperThreading on? That’ll wreck your CPU performance on games.

        • xavdeman says:

          I have an Intel Core i7 2600K. I believe it has Hyperthreading turned on by default and I can’t be bothered to disable it. It should just work ™. But yeah, there’s some strange bugs, like I can’t enable more than 4x deferred anti-aliasing, wtf.

    • Baresark says:

      In general terms, you probably wouldn’t notice a huge difference from generation to generation. So there is no reason to upgrade every generation. Your money would be much better placed in an SLI setup and buying the same card you already have for cheaper than a 600 series card.

      I’m running dual 460s, which perform almost equally to a single 580, but that is the best I’m gonna get, so I may scrap my two 460s for two 660s in the near future. If the numbers work out the same, I should blow away the 680, but who knows for sure. I usually wait for a good site like Tom’s Hardware to benchmark these things before any decisions are made.

      So, if you have a 580, it’s probably not worth upgrading to a 600 series card. As the author said, they are doing much bigger things at the end of the year. If the timing works out for my upgrade, I may wait to see how that turns out.

  2. MiniMatt says:

    Would love some further examination of that performance per watt metric – particularly as it roughly correlates with noise.

    Personally I’m pleased with “good enough”, “high settings fine, leave ultra for the fanatics” and my deciding factor is almost always noise (whether inherent or how tweak and mod-able it is to make it so).

    • Mr. Mister says:

      I remember reading when the GTX680 came out that its most outstanding feature was how the hardware itself controlled practically every (internal) setting automatically depending on factors such as temperature, fan speed, etc.

    • PoulWrist says:

      The GTX 680 is rather incredible with a TDP of way under 200 watts, resulting in a load power consumption of about 170 watts.

    • RegisteredUser says:

      I agree. FPS per watt used in various resolutions and settings should be a standard metric in GFX card testing.
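
      The metric itself is trivial once reviewers publish both numbers. A quick sketch of the arithmetic – the figures below are invented placeholders, not measurements of any real card:

          # FPS per watt, the metric proposed above. All numbers are
          # invented placeholders purely to show the arithmetic.
          def fps_per_watt(avg_fps: float, board_power_watts: float) -> float:
              return avg_fps / board_power_watts

          examples = [
              ("Card A", 60.0, 170.0),  # ~170W load, like the 680 figure above
              ("Card B", 55.0, 230.0),
          ]
          for name, fps, watts in examples:
              print(f"{name}: {fps_per_watt(fps, watts):.2f} fps per watt")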

    • LionsPhil says:

      Likewise, I am interested in what the state-of-the-art in passively cooled graphics has reached. My 8800GTX is still a nicely performant beastie for its “age”, but the heat and noise really do damn grate.

    • zokier says:

      Radeon 7750 is probably the best performing passively cooled card available. TechPowerUp routinely does perf/watt analysis in their reviews.

      • LionsPhil says:

        Hmm, those graphs are rather depressingly dominated by ATi at the high end, aren’t they. Sadly too many horrible experiences with their wares to entertain it.

        • zokier says:

          Note that the review I pasted was made before Kepler had launched. I’m hoping that when low-to-mid-end Keplers land, we’ll get some competition in the perf/watt arena.

    • stupid_mcgee says:

      Wattage is what has convinced me to go with the 7850 over the 570. The 600 line is just too costly for me. I tend to set my GPU price around $250 USD (~£160). I would need a new power supply if I went with the 570 and, from the benchmarks I’ve seen on Tom’s Hardware, the performance is nearly identical with the 7850 pushing out a minor difference of 1 or 2 frames more.

    • Rikard Peterson says:

      Yes. I’ve said this before in comments to “Hard Choices”, but I’d love to see more attention being paid to noise levels.

  3. Flukie says:

    Still happy with 5870 2GB here, waiting for next series of cards/mobos for now.

    • Mashiki says:

      I’m still quite happy with my 560Ti. I probably won’t get anything else until the next generation of CPUs and mobos roll in.

    • TechnicalBen says:

      I’m still happy with my ATI 4870, although I notice my games are distinctly less “shiny”. I’m not joking when I say that; I think it’s something to do with how ATI/AMD render the scenes, but I seem to get fewer reflections and less lighting than on an Nvidia card. That and the shadows work differently.

      However, most of that is the game devs not using the options ATI/AMD have. Instead just going Nvidia options or “generic broken” for everyone else. :(

  4. Shantara says:

    I built a new Ivy Bridge machine last month, but decided against upgrading my old Radeon 5770. I’ve yet to find a game that interests me that it won’t run smoothly. With the dominance of multiplatform releases I guess I can wait another year before looking to buy a new videocard. Maybe when Bioshock Infinite comes out…

    • Mr. Mister says:

      …but you got no PhysX.

      • xavdeman says:

        You do, just not hardware accelerated PhysX, which is the more impressive kind. The software version just hogs your CPU and has fewer effects.

        • Beef says:

          To be a bit more accurate, the GPU PhysX will not hog your cpu any less, it will just add more sparkles. Quite literally even.

          The game-sensitive physics, i.e. stuff you can interact with, are still on the CPU.

          • TechnicalBen says:

            Just to back you up, back in the day when you could run PhysX with an ini hack, I did just that. It ran super smooth on a single core (with HT) 3GHz mobile Prescott chip. :P

            It’s only the fact that Nvidia lock you into their cards or nothing that stops games using physics. Else, why do you think Havok, Unity and OpenCL (spell?) all work so well and fast even on AMD/ATI hardware. :P

    • Buemba says:

      I built a Sandy Bridge PC late last year and also decided to reuse the 5770 I had in my old machine. The only games I own that I can’t max on 1080p are Witcher 2 and LA Noire, but everything else runs very well.

      And I bet it’ll handle upcoming big releases like X-COM, Dishonored, Dark Souls, Bioshock Infinite, Quantum Conundrum and Darksiders 2 just fine.

    • Eddy9000 says:

      Would like to add to the 5770 love, bought mine for £150 about 2 years ago, not had any problems with a new release so far.

  5. iGark says:

    Where, approximately, will a Radeon 6770 put me? I’m hesitant to get anything different because then I’d have to upgrade my PSU too.

    • PoulWrist says:

      High-entry level. Like, it’s a good card for medium-high detail gaming at 1680×1050 and below. 1920×1080 and up it’s going to be out of breath quickly.

      • iGark says:

        That’s good news. High entry level is almost exactly where I want to be at this time. My computer, which is a refurb, did not come with a graphics card, so I’m looking for a high entry level. My highest resolution is 1920×1080 but I tend not to play fullscreen, so it sounds pretty much perfect, especially for the affordable price.

    • mrosenki says:

      The newer AMD7000 and GTX600 series cards have much improved power consumption over the last generation though, so you might fare better with one of their mid-range models.

  6. MacBeth says:

    I almost made a bit of an error when upgrading just in the last couple of weeks, due to what I reckon is a bit of a mismatch in recommendations in the various RPS articles – I went with the recommended top-end monitor Samsung SyncMaster S27A850D (absolutely gorgeous), and being 2560×1440 it really needs a top end card to drive it properly… the best RPS recommended card being the GeForce GTX 560 Ti with 448 cores. I went for a Sapphire 7970 OC edition in the end – the GTX 560 would no doubt have been fine for a smaller monitor but just wouldn’t have done the job with the big Sammy…

    I concede it’s one hell of a first world problem to need a £400 graphics card to power your £600 monitor but still…

    • Jeremy Laird says:

      To be fair, MacBeth, I did qualify the graphics coverage. The core theme was the dominance of 1080p monitors and how you didn’t need to go beyond the cards recommended for that resolution. Step up to 25×14 or 25×16 and obviously you’re going to need more horsepower.

      I included a few high res screens in the monitor grouping because I think the one thing worth spending big on is a screen. You get a lot more mileage out of it than out of components that depend on computer chips for their worth.

      • MacBeth says:

        Yes, I realised that on re-reading the articles when I was assembling my spec, and did some further research. I’ve found all your hardware articles really useful and refreshingly to-the-point, but a specific note to the effect of the above would have saved me a bit (more) brainthink…

        From your recommendations I’ve got the Z77 Sabertooth mobo, the i5-3570K CPU, the Samsung SSD and the aforementioned Samsung monitor, and they’re all performing brilliantly… the GPU recommendation was the one piece of the jigsaw that gave me trouble. For saving me from trying to compare motherboards alone, I thank you deeply… I’m grateful to be saved from analysing CPU and SSD performance too.

  7. PoulWrist says:

    I’m waiting and waiting and waiting for the Asus GTX 680 DirectCU II Top I ordered a month ago :| still unknown delivery date :( At least it was “cheap” when I ordered it, but still almost a third more than what I paid for my 5870 2.5 years ago :| but I can’t complain about the value of that card. It was a very good card for the time, and if I hadn’t upgraded my monitor and upped the ante to 2560×1440 I wouldn’t even consider upgrading.

    • Almond Milk says:

      You’re in a better place than me at least, I just ordered mine this week… I’ll be lucky if I see it by the start of fall :P

  8. whatisvalis says:

    The 7850 seems to be an excellent overclocker judging from a few forum threads. Some are reporting 30-50% OC

    Does this make it more appealing if it can be had for around the $220 mark?

    • PoulWrist says:

      If you’re into OC’ing, probably. You’re probably best off shelling out for a custom card instead of a reference design if you want to go messing about.

      • whatisvalis says:

        That’s the one, didn’t have it bookmarked on my PC. Obviously there are a few disasters in that thread, but it’s a pretty good read. Seems like you can get a good OC pretty painlessly, without risking your card. You can also go bananas with it, but that looks a little more high-risk.

        Right now they are around $260 in the US, which is a little high for me. Think I’ll wait for a deal/price drop.

  9. iainl says:

    I was about to buy an HD6850 as the £100-ish range is my budget (actually it’s more “new GPU plus an SSD” for £200, so I’m thinking a £90 Crucial M4 128GB). Waiting to see what happens in a month or two sounds like an ideal plan.

  10. mod the world says:

    There goes my hope that the jump to 28nm will provide us with a new generation of “massive bang for the buck” – cards.

    • Malibu Stacey says:

      Yeah the first release in a new chip family is always indicative of the general trends it will follow amirite?

      I suggest waiting a few months before you jerk that knee any more in case you do yourself an injury.

    • Jeremy Laird says:

      Yup, this is my major disappointment with the NV 600 series, too. Great cards, but would like to have seen much more aggressive pricing.

  11. MerseyMal says:

    I’m currently running on a pair of factory overclocked XFX HD 6870 Black Edition cards. Don’t see the point of upgrading just yet.

  12. internisus says:

    I’m saving my money bits for this monstrous beauty, the “TOP” edition GTX 680, a line of cherry-picked overclockers that runs nice and cool in a big tank of a 3-slot card. A $500 GPU seems justified since I’m still running a Geforce 9600 GT.

    • PoulWrist says:

      I have that ordered. The shop keeps pushing expected delivery date back :| on a positive note, the day I ordered it, it was at an all-time low price, at the same level as a reference 680.

      • internisus says:

        Yeah; hopefully their availability will level out soon, although I don’t mind the wait much myself because it will be at least a couple more months before I have the scratch to purchase this and the rest of the rather extravagant build I’ve been planning (Sabertooth Z77 in a HAF XM, 3770K with a Noctua NH-C14, Corsair AX1200, 256GB Samsung 830 SSD, 2TB WD Caviar HDD, etc.). As evidenced by my 9600, I don’t manage to buy computer parts more than once every 6 years or something, so I want to make sure that I am comfortable for a long time. I also reason that there is no upgrade path for current motherboards and processors since the next Intel architecture will require a different socket; therefore, I might as well max out my situation. The GPU is the one thing I can see myself upgrading, but the 3-slot giant is too sexy to resist.

        I can’t imagine actually being in a position where I am ready for the card and yet must wait for it to become available again, though; that must be quite frustrating. I don’t typically pay attention to the newest tech and so haven’t seen how quickly these new GPUs sell out before. It’s kind of nuts!

    • ThTa says:

      I’ve personally ordered the GTX670 version of that, since it’s still on-par with a default GTX680 but about a hundred euros cheaper. (It’s also ridiculously quiet; as in, more quiet on full load than my current (dying ;-;) HD4870X2 is on idle.)

  13. Axyl says:

    Bought an MSi GTX 580 1.5Gb Twin Frozr II in January.
    One of the fans on the cooler died last week, so that’s getting RMA’d for a refund.

    Bought an MSi GTX 680 2Gb Twin Frozr III today.. £480 all in from Overclockers.co.uk (including FREE Next Day, Saturday Delivery due to the Jubilee) and that’s arriving tomorrow.

    The 580 is an absolute monster, if a touch on the power-hungry side. Well worth the £380 I paid for it 5 months ago however, aside from the cooler failing (103 degrees C it peaked at when I noticed the cooler issue).

    If the 580 is any indicator of the sort of performance I can expect from the 680, then it’s worth the frankly painful price tag attached to it, especially as the 680 is much more efficient in terms of performance-per-watt.

    Tomorrow is going to be a good day. :D

    *EDIT* Here’s the 680 I’ve ordered. – http://www.overclockers.co.uk/showproduct.php?prodid=GX-156-MS

    • PoulWrist says:

      The 680 is about as powerful as a 590 and uses 70 watts less power than a 580. It’s a way, WAY better card.

      • Axyl says:

        Yup, that’s pretty much what I’m expecting.

        Cannot wait. :D

        This stuff is like porn…. Almost. :P

    • fitzroy_doll says:

      One of the fans on the cooler died last week, so that’s getting RMA’d for a refund.

      Same thing happened to me with that same cooler (MSI Twin Frozr II). I think there is something wrong with the fans MSI uses. I replaced mine with a rigged up pair of Noctua PWM fans.

  14. Devrey says:

    I’d like to say a big thanks to Jeremy and his Hard Choices. I used your guides for an el cheapo upgrade of my 5-year-old machine. I went with an AMD Phenom X4 965BE and a Sapphire Radeon HD 6850. Together with a motherboard, 8GB of RAM and a fresh copy of Win7 it only set me back €300. Thanks! Now I can finally play GTA4 at an acceptable framerate!

    • PoulWrist says:

      The 965BE is an awesome CPU :D have one myself, and it’s just fine for pretty much everything still. And I don’t see myself needing an upgrade anytime soon, what with my monitor bottlenecking all GPUs in modern titles.

    • deke913 says:

      I agree with Poulwrist, I have the same CPU and 8GB of G.Skill 1600 and a budget-bought GTX 480 from Newegg. I can’t find a game that won’t run at max settings. Very good considering I only spent about 700 on the whole system.

  15. MD says:

    I’m looking for a low-end replacement card at the moment.

    My current system:
    Motherboard: Gigabyte G41MT-ES2L
    RAM: 2GB Kingston DDR3-1333, will be upgrading to 4GB
    CPU: Pentium E6500 (2.93GHz dual core)
    Monitor: Samsung 2233RZ (22″, 1680×1050, 120Hz)
    Video card, now broken: ATI Radeon HD4650

    In this thread a couple of people have suggested the 6850, although it will apparently be heavily bottlenecked by the rest of my system. I’m considering some cheaper options, like a 6670 for half the price of the 6850.

    Any thoughts?

    • PoulWrist says:

      The 6850 isn’t low-end though. While it would be bottlenecked, you could later get a better CPU + mobo for relatively cheap, H61 + Core i3 for instance, and you’d be doing fine for a long time.

  16. cedarrapidsboy says:

    Bought a 5850 when they were released. Overclocked. Haven’t had any trouble since. Current games on PC, at least the majority, take it easy on graphics cards due in large part to either their multi-platform nature or their indie development roots (excepting outliers like StarForge). There are titles out there that benefit from these beefy new chipsets… but I think they are in the minority.

    That said… anybody want to buy a vintage 5850 for £400?

    • Malibu Stacey says:

      Hell, I’m running a GeForce GTX 550 Ti on a DFI DK X48-T2RS with a Core 2 Q6600 (oc’d to 3GHz per core) and still run current games at high settings without any issue at 1920*1080. The GPU is a little over a year old, as the 8800 GTX I had sent as an RMA replacement for my original 8800 GT kept overheating, so I replaced it last Easter with the GTX 550 Ti for just under £100. The rest of the machine is pushing 4 years old this summer.

  17. Malibu Stacey says:

    Remarkably, the GK104 GPU that powers GTX 680 boards is actually smaller than the chip inside the GeForce GTX 560 Ti, and you can buy one of those for £150. The explanation is a die shrink from 40nm to 28nm transistors, allowing the 680 to squeeze 3.5 billion transistors into less space than the 560 Ti needs for 2 billion of the little binary blighters. Ultimately, it’s chip size that most determines manufacturing cost.

    And? What has manufacturing cost got to do with retail price? Unless you’ve somehow figured out how to make R&D cost NVidia absolutely zero, then manufacturing cost will need to be inflated to account for things like that (let’s not get into stuff like shipping, assembly etc., shall we).

    • Mistabashi says:

      Exactly. I was reading that part thinking “what the hell is he moaning about?”; it’s really very silly to complain that the fastest single graphics card on the market should be cheaper because of some weird notion that chip size = chip value.

      • internisus says:

        I thought the same thing. I don’t know much about this stuff, but it seems to me that a new architecture and a die shrink imply a great deal of R&D cost that’s a more significant factor than the manufacturing process in the price tag.

        • ThTa says:

          It’s not just R&D costs, it’s also completely reworking your manufacturing process and doing so until you get a sustainable yield (they print these chips on what could be described as large plates and then cut them out, but if your manufacturing process isn’t up to snuff, a large number of those individual chips simply won’t work). Usually, they’ll actually start off with a really low yield but start manufacturing them at a large scale regardless, just to beat their competitors to the punch; to compensate for the large number of “lost” chips they have to mark them up, and the prices will start to drop as their manufacturing process improves. A die shrink is truly ridiculously expensive.
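
          To make the yield point concrete, here’s a toy sketch. The wafer cost and yield percentages are invented placeholders; the 294mm² die area is the GK104 figure quoted elsewhere on this page.

              import math

              # Cost per *working* chip under different yields. Wafer cost
              # and yield figures are invented; 294mm^2 is GK104's die size.
              WAFER_COST = 5000.0              # arbitrary units per 300mm wafer
              wafer_area = math.pi * 150 ** 2  # 300mm wafer area in mm^2
              dies_per_wafer = int(wafer_area / 294)  # ignores edge loss

              for y in (0.3, 0.5, 0.8):  # immature -> mature process
                  good = dies_per_wafer * y
                  print(f"yield {y:.0%}: ~{good:.0f} good dies, "
                        f"{WAFER_COST / good:.1f} per good die")

          Same wafer, same die; the only thing that moves is yield, and the cost per good chip falls by roughly two-thirds as the process matures.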

          • Jeremy Laird says:

            The point you guys are missing is this. GK104 is a follow up to GF114, i.e. the 560 Ti chip. It’s actually smaller than GF114. The R&D and die shrink costs apply to any new generation. It’s a moot point.

            Had AMD’s 7000 series been more competitive, GK104 boards would have been priced and positioned very differently. The pricing reflects as much the competitive landscape as it does engineering and manufacturing costs. If you are happy to send an extra £100 to NVIDIA simply because it happens to have won this round of the graphics wars, then so be it.

            Personally, I think it’s a shame that the trend the Radeon HD 4800 was supposed to set didn’t catch on. If it had, £200-£250 would have been top end for a GPU and that’s the price you’d be paying for 680s. Why you’re so keen to pay £400, I can’t say I fully comprehend.

          • LionsPhil says:

            Why you’re so keen to pay £400, I can’t say I fully comprehend.

            I daresay few of the people defending “new manufacturing processes are expensive to begin with” are actually dashing out to buy one of these things, if only because it follows that “…but they get quite a bit cheaper over not a huge deal of time”.

          • Malibu Stacey says:

            As stated in a reply to a previous comment, I’m running a year old GTX 550 Ti which cost me around £100 at the time in a 4 year old machine & I’m in no hurry to upgrade anything. Hell I’d still be running an 8800 GTX if it didn’t overheat & reboot my machine when under anything resembling 3D.
            I just think your comment in the article was rather disingenuous.

    • xGryfter says:

      The smaller die with increased transistor count, better performance, and cooler temps is a more valuable chip. This whole smaller chip = cheaper thing is pretty stupid. This BS about the 680 being a follow up to the 560Ti is getting old; the fact is this chip kicks the shit out of everything else on the market by a wide margin. Value is not determined by the name/classification of the chip but by its performance.

      If price is your only determining factor for buying PC hardware then so be it, but if you like to have a bit more info on your hardware you should probably look into more detailed reviews with real world performance testing (not canned benchmarks) and explanations of what makes a certain card better than others outside of pure numbers. While value is relative, and I’m sure we would all like things to be cheaper no matter what they are, there is a lot more that goes into determining the value of something than the obvious, and sometimes you need a better understanding of the technology to help convey those determining factors.

      The 680, no matter what the tin foil hat brigade says it should have been labeled as, is worth the cost of admission. To be fair, it still launched at a cheaper price point than the competition while giving the end user much more than just a frame rate boost.

  18. kataras says:

    Apart from having a fetish, I don’t understand the need for constant upgrades or for getting a top-of-the-line card. How many games today push these cards to their limits? Metro 2033 comes to mind, BF3, Witcher 2? My 560ti plays these with high settings and playable framerates. If there is one ‘advantage’ to most games being multiplatform, it’s that the requirements are lower.

    At the same time, the prices of new launches do not go down…

    • MacBeth says:

      Bigger screens have an effect, for one thing – affordability of a 2560×1440 or 1600 panel is improving, and you do need a better card for that, as I discovered…

      Sure, it’s not *necessary* to have a screen so big that you have to turn your head to see the edges, but it’s awesome…

      • kataras says:

        Hmmm true…

      • Howl says:

        You can pick up three 24″ screens and stick them in landscape mode for Nvidia Surround, for about the same price as a 30″ display. Then you do start needing more GPU muscle. It’s well worth it though. BF3 in surround is as good as it gets right now.

    • xGryfter says:

      There is also the fact that some people’s acceptable frame rate is different from others’. How a game engine renders those frames is also a deciding factor; some games feel great at 30fps while others feel sluggish at anything below 50 or so. Personally I like my games to run at a minimum of 55fps with everything maxed, and I prefer 120fps. I have a friend who is more than happy if most games hit the 25 to 30 frame rate mark, which would drive me bat-shit crazy.

  19. skyturnedred says:

    I just realized it’s been too long since I last upgraded, since I have no idea what exactly there is inside my PC.

  20. MommaB says:

    For god’s sake do yourself and your PC (and power bill) a favour and FORGET about the absolutely craptastic XFX cards – wow, I have never seen such a shockingly bad and cheap cooler :(

    • Mistabashi says:

      What the hell are you talking about?

    • DPB says:

      XFX are a no-go for me after the last card I got from them was dead on arrival. It could just be bad luck in my case, but I tend to see more complaints about their cards than any other brand.

  21. Icyicy9999 says:

    I’m still waiting for a GTX660, Nvidia.

  22. man-eater chimp says:

    I need to do a complete rebuild of my entire desktop rig when I go home this summer, so I may well just bookmark this and any more hardware features RPS does. Please do more!

  23. titan13 says:

    “But the GTX 680 still looks suspiciously like a mid-range chip that just happened to end up insanely fast. And I’d like to see NVIDIA cashing in a little less.”

    I literally could not care any less if I wanted to.

    “Anyway, you could argue the size of a graphics chip is utterly irrelevant to gamers. If it’s fast and efficient, if it’s cool and quiet, it’s worth the money.”

    This.

    • Jeremy Laird says:

      So, you’re happy paying £400 rather than £200? Because that’s the implication.

      • xGryfter says:

        No, that’s your implication based on rumor and hearsay. The fact is the performance of the 680 is in line with what one would expect from a top tier follow up to the 580, and then some.

        • Jeremy Laird says:

          Incorrect. It’s got nothing to do with rumour or hearsay. It’s interpretation of facts. These facts:

          GTX 280 – 576mm², 1.4b transistors, 65nm
          GTX 480 – 529mm², 3b, 40nm
          GTX 580 – 520mm², 3b, 40nm
          GTX 680 – 294mm², 3.5b, 28nm

          GTX 460 – 332mm², 2b, 40nm
          GTX 560 – 332mm², 2b, 40nm
          GTX 660 – tbc

          There’s no rumour or tin foil hats involved. The GTX 680 is very different from recent high-end NVIDIA GPUs.

      • titan13 says:

        What I meant was, if Nvidia’s die size is smaller and they make more money per card than the competition but their cards are also better or better priced, then I couldn’t give a monkeys about the die size or their profit margins, I’d buy Nvidia without a second thought.

        And also, good for them for being so darn competitive. They done good.

        Edit: I personally don’t resent companies for making money because that’s how capitalism works and I like capitalism because it provides me with stuff that I could not possibly hope to make for myself, by myself (and it’s much better than USSR style communism IMO).

        Edit: If you want to blame someone for Nvidia’s high profit margins, blame the competition, don’t blame Nvidia for being a capable company. They aren’t a charity. Businesses will always price their goods/services wherever they think they will make the most money and the more money they can make, the more likely they will survive and be successful in future. The truth is, they would be bad business people if they priced their products at any other level and they would be letting down their shareholders as well.

        So, would I be happy paying £400 for a card if I knew Nvidia could still have made a decent profit selling it for £200? I might be unhappy if I resented the lack of competition, but I don’t; I actually think we are lucky to have AMD competing at all, and I can’t blame Nvidia. So yes, I would be happy. Even at £400, it’s a lot of performance for the money, looking at it both historically and against the alternatives.

  24. Mungrul says:

    Not related to cards as such, but I’m wondering what Jeremy’s opinions are on Retina technology and when we’re likely to see that tech trickle down to the desktop.
    Of course, Retina desktop monitors will have absolutely stupid resolutions, so will likely need to be driven by equivalently stupid cards.
    Thoughts?

    • Howl says:

      It’s already happening at Apple. The next OS X release arriving this summer is geared up for HiDPI and the rumours are that the next set of hardware upgrades from Apple will have retina displays.

    • Jeremy Laird says:

      I’d love to see ultra-high DPI PC displays. Once you go high enough, the scaling and interpolation issues for games basically disappear, so it shouldn’t be a major drag on games performance (but won’t be a benefit either unless you make use of the extra pixels, of course). For desktop work, it would be great.

      IIRC, Windoze Vista was originally supposed to be fully scalable, vectorised etc, but it never came close to happening. I’m not sure if MS is even talking about eventually doing it. Haven’t had a chance to read up about high DPI from Apple machines. But it’s the sort of thing that could actually push me over the edge and have me running an Apple box for work.

  25. Clavus says:

    One thing to note is that the HD7900 seems to have more overclocking headroom. My new Sapphire HD7950 (OC version) is happily running at a 1GHz clock (a fat ~20% above default) with room to spare (review sites push it to nearly +50%). Consumes frames like a beast.

    • whatisvalis says:

      Surely the 7850 is a better deal though. $150 cheaper and can easily match stock 7950 speeds OC. Common stable OCs of 1200 / 5000. Hopefully the GTX 670 etc. will drive the prices down on the 7x series.

  26. noodlecake says:

    Boooo! I wanted to see mine listed in the “mid range” bit but it’s in the budget bit… Oh well. At least it’s getting a recommendation of some kind.

  27. Brun says:

    So far my single GTX 480 has stood up to everything I’ve thrown at it. Don’t intend to replace it until well into the 700 series of nVidia cards.

  28. sunaiac says:

    That article is full of bullshit. The “not even hidden” advertisement for the GTX680 was already borderline, but this, seriously?

    Read real tests (http://www.comptoir-hardware.com/articles/cartes-graphiques/17719-test-nvidia-geforce-gtx-670.html?start=0)(http://www.hardware.fr/articles/866-1/nvidia-geforce-gtx-670-test.html)
    Yep, some tests are not based on HAWX2 or Crysis 2 with tessellation in the middle of a city, but on a varied panel of engines and game styles.

    Then divide the actual performance of the cards by price.
    Only then choose.

    The GTX670 certainly doesn’t win by a large margin when it does win. And it doesn’t win overall. And the higher the detail settings get (AA, …), the less it stands its ground.
    Oh, and do not forget the card you buy could strangely have a turbo much worse than the press samples.

    The GTX580 was an awesome future proof card with undisputed performance (as were all nVidia high ends before it).
    The GTX680 is an overpriced joke.

    And before the AMD fanboy witch hunt begins, I have a 580 DCU 2 in my rig.

    • drewski says:

      I think you’re being a bit harsh on RPS here. Most hardware sites are gushing with praise for the 680 and, considering it’s the fastest single-GPU card available on the planet, it’s no surprise it commands a price premium.

      I personally don’t think it remotely represents value for money, but it’s still an insane piece of hardware. Once AMD hit back, I’m sure we’ll see more reasonable prices as nVidia stretch the architecture.

  29. Jamesworkshop says:

    I’d still be using a 9800GTX if it hadn’t died; 4 years is not a bad lifetime so I can’t complain. Replaced it with a £100 GTX560, so I won’t be getting a new card for a while. I wasn’t expecting the 600s to be as good as they are, winning in every metric you care to name.

    Graphics cards have been boring of late as only the mental multiple monitor set-ups really offer any value on these cards.

    Could you do us some decent portable gaming system reviews? The power and heat reductions in these architectures are getting me very interested in the lower end of the power scale devices.

  30. InternetBatman says:

    Is $200 really a budget card? I have a tiny screen and I don’t max out games, but I’m pretty happy with my old GT430.

    • Arglebargle says:

      I never spend more than $200 on a video card. Buy on the back end of releases, picking up the last generation’s good cards when they drop in price due to the goshwowboyoboy new cards.

      However, I’m of the opinion that the monitor, keyboard and mouse (or, ugh, game controller) are extremely important, as they are the things you use all the time. Get good ones. A decent monitor is money well spent.

  31. MythArcana says:

    There are lots of important factors missing here when comparing cards. I’d like to know what kind of power supply is required for any of these high end adapters. You don’t want your lights to dim when you turn the computer on…

    • mod the world says:

      Power usage is actually improved on these new cards. Even the most demanding cards use < 200 W at peak. (single gpu cards ofc)

  32. Jon Tetrino says:

    You forgot to mention that the 670 can clock quicker than the 680’s stock – resulting in equal or better performance. You can pick up £320-340 cards that are pushed to this point out of the box – and by that metric, it is certainly worth its value.

  33. RegisteredUser says:

    I quite honestly think heat, noise and, most importantly, long-term watts-per-FPS costs should factor in.
    I for one have been gaming less and less, and I want a card that in 2D mode and when watching HD content saves tons of energy and runs ultra cool with barely audible cooling.

    It’s one of the reasons I am in no hurry to replace the 5770 AMD I have, because it is one of the lowest wattage-eating parts I’ve seen. Admittedly, I haven’t compared in 2012, but last year there weren’t any real faster-at-same-price-and-wattage cards.

    Basically the challenge is: low energy use + just barely able to run 1920×1080 modern games at very high settings while being under 300 EUR.
    And so far I’ve done quite okay with even proper PC titles at at least high settings with this quite outdated card.
    That 80% of games are made for 8-year-old console hardware kinda helps with not frowning over the 3D card every day (even though I wish this were different and we’d actually progress again. But in meaningful steps, not the 6-month arms race cycle we’ve had previously).

    • Shortwave says:

      I gave my old 5770 to an ex and I miss it more than I do her. Haha.
      It actually is a really nice card for the exact reasons you said, I agree!
      And she’s now enjoying all the new games on the market still.
      And doing so at a higher fidelity than a console could provide.

      So yea.. 3 year old GPU.. It cost what now? In theory like 50-60 bucks?

      Life’s good, but the “PC gaming being way too expensive” myth still lives on.. Ha!

  34. Fox89 says:

    I recently bought a 680 with my new system. It’s amazing and I don’t regret a single penny of the cost. I certainly don’t recommend it to others though, the 670 is way better in terms of value for money. But sometimes you buy with your heart, not your head, and that’s why I got the 680!

    Needless to say, stepping up from a GTX 285 was…’noticeable’. Now if only Max Payne 3 would hurry up and decrypt I might be able to stretch its legs a bit…

  35. Wedge says:

    Yep. I’m waiting for nVidia to drop their midrange, to make the AMD midrange more price competitive. Making do with the integrated chip on my brand new Sandy Bridge system until then =D (yes, Ivy Bridge was out when I built it, but the 3570k was a touch more expensive and not noticeably better than the 2500k). Luckily, with all these indie games and such around, it’s not been so hard to make do. The onboard Intel GPUs are finally up to the level of a bottom tier video card, it feels like.

  36. wodin says:

    I bought a 6850 for just under £90. I’m very happy with it and it’s now OC’d to a 6870.

    Also, you’re better off buying a 6850 than a 7850, as the speed difference ain’t much compared to the price difference; the 7 series wasn’t a massive leap.

    My 6850 plays everything at max no problem… in fact it’s the first time I’ve ever been able to play a game like ARMA 2 at max settings quite happily. And modern shooters like BF3 all maxed? It flies. All for under £90.

    • Shortwave says:

      I’ve built like ten computers in the past year or two with 6850s in them.
      Not a single person has yet felt the need to upgrade or add a second card.
      Not one has fried or given us any grief either.
      Seems you made the right call.

      I think for 90% of people getting more than that is totally useless and just a waste of money.
      I mean, all it comes down to is FPS at the end of the day, right? It does just great!

  37. Shortwave says:

    In some places you look, the 7850s are selling for a measly 10-20 bucks more than the 6950.
    So just be very sure to take a look at that first, as it is a newer, smarter generation of card.

    With that being said, 6950s are still tearing through games easily and will be for a long time to come.

    Currently using dual 6950s and am able to get 120fps-plus in nearly anything I play. So joy!

    I really don’t see myself needing to touch them again for the next 2-4 years anyways.

    • drewski says:

      The 7850 is amazing bang for buck if you’re happy to go AMD, especially considering its tiny power draw.

  38. MordeaniisChaos says:

    I refuse to trust ATI. They CONSTANTLY have issues with games not enjoying their drivers, they have pretty limited driver control, and just generally underperform. I can’t even play Unreal Engine 3 games without shitty shadows because of my 5770. Going to 1080 with max shadows in Skyrim makes some bizarre artifacting occur that lets me see through shit, and is ugly as hell. I’ve had way too many issues with ATI and all it takes is a google search of visual glitches in a game and it usually comes up with 20 results saying “ATI USERS HAVING ISSUES.”
    It also sounds like a jet engine, god damn. This gen’s nvidia cards are sooooooo quiet. It’s the difference between OG 360 with disc spin-up vs shiny new 360 with an installed game.

    • Shortwave says:

      I wouldn’t really think about underperforming when we’re talking about a first gen DX11, 3 year old, low end GPU. JS. It’s no shock that some new games give you issues.. They are new games.. It’s an old low end card.. Doesn’t take a genius to figure out that it’ll run into some issues. And you realize Nvidia guys have just as many issues, but with other games, right?..
      Last night my buddy spent hours trying to get Dirt 3 to work and it came down to an issue with Nvidia..
      So yea. Don’t be such a spastic fanboy. It’s so distasteful.
      Just wait till one day your Nvidia GPU pisses you right off.. Lol, it happens to all of us..

      And what do you mean limited driver control? Is there something you needed to do and it wouldn’t let you?

      BTW, I have no issues or bugs when running either Skyrim or any Unreal game with forced max graphic settings (through Catalyst Control) across three 1080 monitors with my AMD GPUs.
      450 dollars for 4GB of VRAM bb. <3

      The only game I can think of that has had issues in the past year is Rage, but they fixed that now.

    • RegisteredUser says:

      Actually, ever since Catalyst went from two releases a year to basically monthly-ish updates, AMD has pulled its weight.
      I used to have Geforces for quite a while and they were a buggy mess just as much. ATI was way more broken back then though.
      The 5770 I have has maybe once run into a game that was completely against it in the last 3 years, and that’s about it.

      Shadows, from what I can tell, are crap because the way some engines do them is crap, rather than because of how one firm does them over the other. After all, that’s what unified DX and pixel shader standards are for, no?

      Okay, okay, not that any of that really always applies… I’ve read the dev complaints. :P

      Still, you make my card out to be somewhat of a monster. Outside of the noisy spinup, which in my case is likely because of a badly airflow optimized case, I don’t share most of your issues on the same card.

  39. Carra says:

    After waiting for nine months, I finally caved in and bought a new PC (thanks a lot for these posts, they were helpful). I put in a GTX670; it’s a very nice step up from my 8800GT.

    Sure I could wait another few months for the 660 version. Maybe it’s worth it, maybe not. In any case, it’ll be worse than what I have now.

    Now I just need a nice 27″ high res IPS screen… Those Hazros seem to be impossible to find in mainland Europe. Any other, similar screens worth buying?

  40. TaroYamada says:

    Still using a GTX 460 1 GB I bought around 18 months ago. Runs everything I want it to at the highest settings, the only thing that pushes it hard is Metro 2033 at highest settings on DX 9, I get great frames but the temp goes up quite a bit.

  41. drewski says:

    woo i have a 7600GS and a HD3000 laptop wooo

  42. The Tupper says:

    Hi Jeremy.

    I really love your hardware columns, despite being too broke to consider a system update at this time. Your priorities and writing style chime perfectly with everything else on the site. Any chance of a regular update along the lines of Lewis’s Bargain Bucket where must-buys can be highlighted?

    Apologies if this has been requested by others already. Let’s just say that it’s Saturday night and it’s late.

  43. hello, please change your username says:

    You rich Europeans/Americans do not need articles like this.
    With salaries of $100/hour you can just grab the most expensive card that’s available – and that’s it.