Week in Tech: Is Nvidia’s New £150/$200 Graphics Card Good?

Almost definitely not what a retail 960 will look like...

Ah, the glories of high-end graphics chips. The billions of teensy little transistors. The preposterous pixel pumping power. All terribly impressive. But not hugely helpful if you simply want half-decent frame rates on a plain old 1080p monitor without re-mortgaging everything short of the shirt on your back. In an ideal world, what most of us really need is an affordable £150/$200 graphics card that’ll hook up to that 1080p monitor and run almost anything you chuck at it without worrying about optimising the settings. Well, it just so happens Nvidia has a new GPU that fits the bill, on paper at least. It’s the Geforce GTX 960. Is this the mainstream marvel we’ve all been waiting for?

In RPS posts passim, I may have shown a teensy weensy tendency to preach the virtues of LCD panels with epic pixel grids. 1440p at a minimum, better yet 4K. That sort of thing.

Back in the real world, the latest Steam survey data shows that 1080p (or 1,920 by 1,080 pixels) remains by far the most popular screen resolution for PC gamers. 1440p and beyond? A glint in the game developer’s eye.

Thus, what most of us need is a video card that cranks out decent frame rates at 1080p. 4K graphical frolics simply don’t factor. You could, of course, make the case for high-end graphics by arguing that overkill today is future proofing tomorrow. And that would be true.

However, graphics cards undeniably suffer from diminishing returns. A £300 / $400 card usually isn’t twice as fast as a £150 / $200 card. Moreover, if you bought a £150 card today and then another in two or three years, you’d likely be better off with that second card than soldiering on with the elderly £300 / $400 card bought today.

The new GTX 960 is based on the same Maxwell gubbins as the mighty 980

At worst, performance with a newer £150 / $200 board would likely be on a par. And for sure, you’d have a newer architecture with better future prospects for driver and software support.

All of which brings us to the new Nvidia GeForce GTX 960. It ticks the cutting-edge tech box by virtue of its Nvidia Maxwell graphics architecture, which in my not altogether humble opinion is currently the best in the world.

In that sense, the 960 is actually pretty easy to understand. We’ve seen Maxwell graphics in the GTX 750Ti and more recently in the GTX 980 and 970 boards. And we know that it’s particularly good in terms of performance efficiency.

In other words, Maxwell delivers far more usable gaming performance per transistor and per watt of power than both Nvidia’s older Kepler architecture and AMD’s competing GCN graphics tech, which is found in every AMD graphics card of the last few years, along with the Xbone and PS4 consoles.

Anyway, the GTX 960 is based on a new graphics chip. Not that it really matters, but it’s codenamed GM206. The important numbers are these: 1,024 shaders, 64 texture units, 32 render outputs, a 128-bit memory bus and 1,126MHz core clockspeed.

For context, the desirable but pricey GTX 970 packs 1,664 shaders, 104 texture units, 56 render outputs (that’s a newly revised figure following a bit of a balls-up by Nvidia), a 256-bit memory bus and a 1,050MHz clockspeed. Oh, and the 750Ti rocks in at 640 shaders, 40 texture units, 16 outputs, a 128-bit bus and a 1,020MHz core clock.

AMD’s Radeon R9 285: Bigger bus equals better bet

Really roughly, then, what we’re looking at is a card that falls pretty neatly between Nvidia’s current high end and what you might call the lowest rung of genuinely gaming-capable graphics in the 750Ti. With one exception. The memory subsystem.

The 128-bit memory bus on the new 960 looks stingy. If you look back on Nvidia’s recent past, the 760, 660 and 560 all had wider buses and – here’s the kicker – more memory bandwidth. Yikes. The 2GB frame buffer is a slight concern in this age of uber textures, too.

But the GTX 980 and 970’s relatively modest 256-bit bus (in high-end terms) hasn’t stopped those GPUs from being seriously quick. So perhaps, combined with 7GHz data-rate graphics memory and the Maxwell architecture’s colour compression cleverness, 128-bit and 2GB is good enough at 1080p?
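
That bandwidth worry is easy to put numbers on: peak memory bandwidth is simply bus width times effective data rate. A quick sketch in Python (the bus widths and data rates are published specs as I recall them, so treat the figures as approximate):

    # Peak bandwidth (GB/s) = bus width in bits / 8 bits-per-byte * data rate in Gbps
    def bandwidth_gb_s(bus_bits, data_rate_gbps):
        return bus_bits / 8 * data_rate_gbps

    # (card, bus width in bits, effective GDDR5 data rate in Gbps)
    cards = [
        ("GTX 960", 128, 7.0),
        ("GTX 760", 256, 6.0),
        ("GTX 660", 192, 6.0),
        ("GTX 970", 256, 7.0),
    ]
    for name, bus, rate in cards:
        print("%s: %.0f GB/s" % (name, bandwidth_gb_s(bus, rate)))
    # GTX 960: 112, GTX 760: 192, GTX 660: 144, GTX 970: 224

The 960’s 112GB/s is well short of the 760’s 192GB/s, which is exactly the regression flagged above; Nvidia’s bet is that Maxwell’s colour compression claws most of that back.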

Nvidia knows what it’s about and how to price its graphics cards, so the answer is largely yes. But in somewhat unsatisfactory style. On the one hand, you’re looking at solidly playable frame rates at least in the 40s and 50s with the details set either to mostly high or maxed out at 1080p in modern / graphically zingy games. Crysis 3, Rome, Alien Isolation, Battlefield 4, Metro: Last Light, Mordor – all very doable at pretty high 1080p settings.

However, really ramp things up and the wheels do begin to fall off. This card conspicuously doesn’t like Shadow of Mordor’s high-res textures, for instance. More to the point, what the 960 isn’t is overwhelmingly faster than the old 760. Which is what you really want given that the 960 ostensibly replaces the 760. It’s faster much of the time. But not always and not often by all that much.

Mordor’s high-res textures will give the 960’s memory bus a battering

AMD’s Radeon R9 280 and 285 alternatives can make for some uncomfortable comparisons, too. In some ways it’s a real mismatch. The 280 especially is a hunky old thing with a 384-bit memory bus, no less than three times as wide as the 960’s.

That the 960 trades blows with the likes of the 280 is kind of impressive. But, generally, I remain a little uneasy about that meagre 128-bit bus. Even at 1080p, I worry about future games with even more memory-intensive textures and whether they’ll start bunging up the 960’s bus.

Saving graces? I would like to say overclocking. Most, maybe all, GTX 960 cards on sale are factory overclocked, for starters, and it looks like core clocks not a million miles away from 1.5GHz are usually realistic if you want to get your hands dirty. With a memory bump, you’re looking at about 10% additional performance. That’s nice, but not really enough to make a huge difference to the subjective experience.

Power consumption and noise are certainly upsides. This thing draws way less power than anything else with comparable performance and that makes it ideal for teensy PCs or silent systems in living rooms.

Long story short? I’m not recommending the new 960 as a no-brainer. It’ll fit well in certain scenarios where power and noise are critical. But thanks to that memory subsystem, it’s not a simple case of setting everything to high and letting rip. Existing cards like the GTX 760 and the Radeon R9 280 and 285, with their wider memory buses and in some cases 3GB of graphics memory, are pretty clearly better bets despite being based on older technology.

It’s a pity, because a £150 / $200 Nvidia Maxwell card ought to be the perfect affordable gaming solution. But on this occasion, Nvidia has just missed the target.

73 Comments

  1. FCA says:

    Two months ago, I had to choose: buy an R9 280 now, or try to get by on some Intel integrated graphics and buy a Geforce GTX 960 around now. I’m really glad I didn’t wait, even though some people were telling me that I was stupid for not waiting for the supposedly amazing GTX 960. I’ve seen these things often enough: the hype is never fully realized.

    The only thing I don’t like about my R9 280 (apart from the stupid name/numbering) is the power consumption, but so far it doubles nicely as a (very slight) room heater.

    • 3Form says:

      This is fairly reassuring to me too. I just took the plunge and decided to get a new system (I’m on an E6600 and 9800 GTX+) and my friend was advising me to get a GTX 960 for ~£180. But I noticed an ex-display R9 290 going for the same price and, having trawled through benchmarks and reviews, went for it. Now I’m being ridiculed for being an AMD fanboy of course! :)
      Little worried about the power draw, but my 9800GTX+ idles at 47 degrees and hits 90 under load so I’m not too fussed about temperatures.

      Either way, I’m glad to hear good things about your 280.

    • rustic says:

      One should never forget that there is ALWAYS something better due in “just a few months”. Unless it’s about some serious technological generational shift coming up, it’s better to just buy what’s available today as long as you think you need better performance and you have the money.

  2. fish99 says:

    Have RPS reported on that “bit of a balls-up by Nvidia” over the 970? I’m not sure what to make of it, I mean my 970 seems like an awesome card, and I think people getting refunds are maybe overreacting a bit, but at the same time I’d obviously rather the whole 4GB operated at the same speed.

    It’s probably a bigger issue for people running 4K or multiple screens.

    • thedosbox says:

      There’s a discussion in the forums. However, it shouldn’t be an issue for someone running @ 1080p.

      • fish99 says:

        Thanks. Does anyone know a good tool for monitoring vram usage?

        • Sakkura says:

          There are many tools that report it, MSI Afterburner for example, but none that are really “good”. Because they can’t really tell how much is being used actively, and how much a game has just decided to allocate to stuff it might end up using at some point maybe.
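
          (If you’d rather log that number yourself than eyeball an overlay, NVML – the library behind nvidia-smi – exposes it. A minimal sketch using the pynvml bindings, with the same caveat: it reports allocated memory, not actively-used memory.)

              import pynvml

              # Read VRAM allocation on the first GPU via NVML (what nvidia-smi reports)
              pynvml.nvmlInit()
              handle = pynvml.nvmlDeviceGetHandleByIndex(0)
              mem = pynvml.nvmlDeviceGetMemoryInfo(handle)  # .total/.used/.free in bytes
              print("VRAM used: %.0f MB of %.0f MB" % (mem.used / 2**20, mem.total / 2**20))
              pynvml.nvmlShutdown()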

          • fish99 says:

            Okeydokey thanks. I’ll just use it as a super rough guide. I’ve heard some games push >3.5GB in 1080p so wanted to know if it’s true.

            Either way though, I’m not going to feel unhappy about my 970 because it’s clearly miles better than my old 660.

          • TacticalNuclearPenguin says:

            But still, get yourself that Afterburner thingy nonetheless. You can make very granular corrections to the whole fan speed logic that are a) better than stock and b) tailored to your preference, and overclocking a GPU today is pretty safe given all the built-in measures against extreme power usage and high voltages.

            Either way, long story short, Nvidia is releasing (or has released) a driver that should improve things. They can’t solve a physical problem with lines of code, sadly, but what they can do is ask the card to be more clever about where to dump certain portions of data. Ideally your 0.5GB of slow VRAM would be used for stuff that requires less constant juggling.

          • iainl says:

            Personally, the reason I’d most recommend getting the Afterburner thing is that most GPUs hit an annoying noise at a certain fan speed, and setting the RPM curve to just skip that bit right out is a Godsend on the noise front.

          • fish99 says:

            Well, testing done, and Dying Light in 1080p certainly does push up to 3.5GB vram usage (plus over 7GB of ram), in fact it seems to get stuck there like the game doesn’t know there’s more vram available. Only tested for around 2 minutes, so I dunno what would happen after a few hours. No noticeable stuttering btw, but I have had issues with the game freezing for 3-5 seconds every so often.

            Clearly though people saying games never reach 3.5GB in 1080p are not correct.

            Lords of the Fallen gets up to 3.5GB vram too.

    • bv728 says:

      At reasonable resolutions, you’d need an absurd number of monitors to care in normal 2D use. I mean, ludicrous numbers. Silly, goofy numbers. Numbers of monitors that deserve adjectives and description. I mean, if you triple buffer all of them, you’re still over 100 monitors. (For reference, a single frame buffer for a 1920×1080 2D screen is about 8MB.)
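
      (The back-of-envelope maths, for anyone who wants to check that 8MB figure – assuming 32-bit colour and, say, the 970’s full-speed 3.5GB segment:)

          # One 2D framebuffer: width * height * 4 bytes (32-bit colour)
          frame_bytes = 1920 * 1080 * 4
          print("1080p framebuffer: %.1f MB" % (frame_bytes / 2**20))  # ~7.9 MB

          # Triple-buffered 1080p monitors that fit in the 970's fast 3.5GB segment
          fast_segment_bytes = 3.5 * 2**30
          print("Monitors: %d" % (fast_segment_bytes // (frame_bytes * 3)))  # 151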

      • Sakkura says:

        Graphics cards are for 3D graphics, and the framebuffer is only a very small part of what the graphics card stores in VRAM.

    • ResonanceCascade says:

      The “balls up” of the GTX 970 is overblown hogwash that should only worry people who fret about theoretical specs rather than real-world performance. In practice, the 970 is an extremely fast and relatively cool and low-power consuming card with a good price tag. It’s still the card to have.

      • sicemma says:

        Personally I still would’ve bought it and I like mine, but I feel for people who got told to go buy 3 of them for future-proof 4K (not that this was in any way good advice to begin with, but still), because that’s the case where you may well run into memory problems before you run out of any other performance metric. I really hope Nvidia works out something with their partners/retail to get people a refund or trade-up or whatever if they want one, and I can’t help but feel they need to do something to make it right with the people who keep their 970, too.

      • TacticalNuclearPenguin says:

        Actually it’s the other way around: all the damage control so far focuses on differences in average FPS, which are indeed minimal.

        Thing is, that measuring system is extremely unsuitable for properly gauging stuttering, and Nvidia knows it very well. If you’re used to fluctuating frames in the 30-40fps territory you probably wouldn’t notice such a thing; the real kicker comes when an otherwise smooth presentation gets ruined by intermittent frame drops.

        It’s pretty hard to support Nvidia in times like these, especially considering the (unconfirmed but sadly very likely) predictions about the new Titan X going for around 1300 USD. What’s next, the 980Ti as a slightly cut-down Titan in shader count with half the VRAM, for a whopping 900 USD?

        • fish99 says:

          Makes me wonder if the cards would actually work better with half of that final 0.5GB just disabled (thus ensuring every 0.25GB chip had its own L2 cache), rather than having half the bandwidth to that 0.5GB causing bottlenecks with texture swapping. Game engines are usually written to gracefully cope with whatever amount of vram is there, but expect it all to have the same performance.

          Of course nvidia would never implement that change since it would trigger more refunds, even if it improved performance.

          Btw nice to see a rational debate about the issue here, after reading the hysteria on the nvidia forums.

        • ResonanceCascade says:

          Maybe down the line when games get more demanding I will have to change my tune, but so far I have had no stuttering or frame drop issues at all (and I would definitely notice, since I’m doing a lot of gaming on the Rift DK2). The card runs like a dream: quiet, fast, and dependable. Totally worth the money, though the misleading advertising is a legitimate concern.

      • ghling says:

        I agree with you that the 970 is still a nice card to have and there is no real alternative currently. But this does not change the fact that Nvidia plainly lied about the specs (sorry, “had a miscommunication between tech and marketing”, sure). You see, the thing is not that the 970 is a bad card now, but selling a product with wrong specifications is simply something close to fraud, no matter if it only affects 512MB of RAM (that’s 1/8th nonetheless) or 2GB. And it doesn’t stop there: the whole “the 970 is the same as the 980, but way cheaper” advertising you could read everywhere is now proven wrong.
        Again, it’s not about 512MB of slower VRAM or a slower bus, it’s about Nvidia lying to everybody. What if next time this gets worse and it’s 1 or 2GB? If you start accepting such a thing as lying about specs, you can be pretty sure there will soon be no way of knowing if you get what you pay for.

  3. Baines says:

    Thanks to the design issue present in the GTX 970, one must now ask the question “Does performance drop when you cross a certain memory threshold?”

  4. vorador says:

    The best thing about the 960 is that the older models get a price drop.

    Anyways, I still rock a 660 Ti and can run anything just fine. Sure, I have to lower some settings on some of the latest games like Shadow of Mordor, but that’s the extent of it.

    So i will wait.

  5. barelyhomosapien says:

    Welp, my current 770 is still slightly better, which makes me happy. And with news that not all of the next DirectX features will run on current DX11 cards, I think I’ll just wait for a fully DX12-compatible card.

    • Optimaximal says:

      Nvidia have stated that all Maxwell cards will be fully DirectX 12 compliant… not that anyone will use it for 2-3 years.

      • TacticalNuclearPenguin says:

        As long as most cards are compatible with it, we might see faster adoption, given that this time the focus is more on performance.

        This time around the general idea is to have the highest number of cards compatible with the baseline features, and then limit the most exotic ones to the new models. I’m pretty optimistic this time around.

  6. Wisq says:

    I’m still running a pair of GTX 690s from 2012 — and interestingly, according to Tom’s Hardware video card charts, they’re still roughly on par with today’s GTX 980s which retail for ~$650.

    At the time, I had money to burn and wanted to build a crazy system — I knew (or I thought I knew) that it was a crazy extravagance and totally not worth the money. But seeing as how I’ve now had almost three solid years of play out of it and each card has only depreciated from $1k to $650 worth of video card, I’m actually not so sure.

    Certainly this would seem to fly in the face of the common wisdom that it’s better to buy lower-end more often. Yes, technically, buying a ~$200 card every year would still cost less, but at the cost of frequent replacements (time is money too) and a worse overall experience (if I’m flying high most of the time and only bottlenecking right at the very end of my long replacement cycle).

    • montorsi says:

      Amusing to look at the power draw on those cards versus the 980s. Nvidia made enormous progress over the span of two GPU generations.

    • Pesticide says:

      do u realize the power u used with those 2 in 12 months could have bought u a 980 most likely

      • Wisq says:

        Eh, not really. Power is still stupid-cheap. Plus it’s a nice office heater on -30ºC days like today.

    • TacticalNuclearPenguin says:

      “Certainly this would seem to fly in the face of the common wisdom that it’s better to buy lower-end more often.”

      It’s also some sort of misguided wisdom, since I don’t think it really happens that often. In my experience, those who buy the cheapest cards are also going to stick with them for a long time, while the spendy enthusiasts will go for 500-1000+ Euro a year on GPUs alone.

      With this in mind it’s always hard to argue and find common ground between these two camps. The cheap ones just don’t give a crap (or, if they actually do but can’t afford it, they cower in their fortress of denial), while the spendy ones are seen as madmen who always buy “useless expensive stuff” and inevitably end up buried beneath a mountain of insults, often related to spending their parents’ money.

    • Sleepymatt says:

      Arguably the sweet spot is buying mid-range every generation (not that I have done mind you). The reason is that high-end loses too much in depreciation ($700 in 3 years is really quite a bit!), and low end might not have a resale market at all (unless manufacturers drop the ball like with the 960). I have seen people price out their route from 570 to 970 and the resale they got on year old cards left them not a whole lot out of pocket while continuously being just off the leading edge (Titan ridiculousness aside) of performance.

      • Wisq says:

        Yeah, the fact that I have two of them is certainly taking its toll, depreciation-wise. But if I had bought just a single one, it’d be $350 in 3 years, which isn’t all that terrible — I definitely would not have gotten the same value out of ~$120 worth of video card each year, and anything cheaper than what I have probably would’ve already needed replacing by now.

  7. disorder says:

    Knowing in advance that Maxwell was imminent, I was nevertheless (and annoyingly) forced into buying a new GPU a few months ago, before its release. I chose a stock 780 primarily on the basis that it had the biggest and most reliable-looking heatsink/fan unit. A 250W GPU has been OK over autumn and winter, but will probably be less so over the coming spring and summer.

    Oh, performance? Still nothing around that would have tested my previous card (a 6870, I recall), if it hadn’t had the discourtesy to die on me with fan or related problems, like its previous generations. A pity it didn’t last at least this long. That’s one fancy-looking heatsink.

  8. Curry the Great says:

    So um, why didn’t they stick a bigger memory bus on this? Are they really expensive? I don’t really see a technological limitation that forces the downgrade. Why?

    • Optimaximal says:

      It’s about not cannibalising their own products – if they did [put a bigger memory bus onboard], it’s likely that nobody would buy the 970 or 980, because the 960 is power-frugal and extremely overclockable…

      • Curry the Great says:

        So basically, we need more competition to drive the prices down. Is that possible? I guess this industry is so high-tech it only left two competitors in the end. And with AMD not having a reply we get the short end of the stick.

        Is trying to screw us into buying the more expensive cards done for bigger profits or is it really needed to cover development costs? I wonder how much profit a company like AMD and Nvidia make on their graphics part.

        • Sakkura says:

          AMD is replying well enough. Their graphics cards are better value for money across most of the range from high-end to bargain bin, and that’s while they’re competing with their old architecture against Nvidia’s new architecture. In a few months, AMD will release a new series of cards with an architecture that will be at least somewhat new, and that should shake things up even more.

          • drewski says:

            I think the biggest problem for AMD remains the power draw. They can have all the fancy clocks and buses and RAMs they like but the lower power on the new NVidia architecture is going to keep them selling.

            I’d like AMD to be able to put up a stronger fight, but until they can at least get in NVidia’s power usage ballpark, they’re only really going to be options for people with money and heat dissipation to burn.

          • Sakkura says:

            GCN 1.0 was only a little less efficient than Kepler, and GCN 2.0 improves on it decently. Nvidia does not have a huge lead in efficiency when you compare same-generation cards.

            The situation today is like early 2012, when AMD had better performance and power efficiency than Nvidia, because the Geforce 500 series was all the latter had available.

          • drewski says:

            That might be relevant when AMD actually get around to releasing their Maxwell competitor, but until then, the R9s are getting smoked on power use.

    • Sakkura says:

      Bigger memory bus means higher manufacturing cost. The GTX 960 is probably really cheap to produce, which is part of why its $200 price tag is so disappointing. But it leaves Nvidia a lot of room for price cuts later on.

  9. Scytale says:

    I recently switched to a 970 and I’m loving it! Awesomely fast and silent! Shadow of Mordor in DSR 1440p Ultra with a smooth playable framerate.

  10. mattevansc3 says:

    Not sure if you are going off the RRP or an advised price for the article, but out in the wild you will struggle to find one of these for £150. There are few reference models, most being factory-overclocked models, and you are easily looking at £160-£190.
    Overclockers cheapest £159 link to overclockers.co.uk
    Scan’s cheapest is £160 link to scan.co.uk
    Ebuyer’s cheapest is £158 link to ebuyer.com

  11. aircool says:

    How does it compare to my stock 680 that’s been happily chugging away in my PC for the last few years?

  12. airmikee says:

    Great article at the right time. I’m still on a 560 GTX, runs the games I want to play without any problems, but it’s a few years old and I think I want to replace it before it actually dies. The 960 would be a tremendous upgrade for me, but if it drives the prices of older models down, I may go with something else. Just have to do some research when I’m ready to click a buy button.

  13. malkav11 says:

    Overkill today might be (some degree of) futureproofing tomorrow on system components, but overkill on high resolution monitors is the opposite of futureproofing – it’s substantially upping the level of hardware power required to maintain acceptable framerates and thus committing to spending more on every upgrade from here on out. Sure, sooner or later that technology is going to get cheap enough and standardized enough to become a comfortable default target the way 1080p is now, but that hardly means there’s a financial (or technological) advantage to investing in it now rather than then.

    They’re definitely shinier if you have the money for it, though.

    • TacticalNuclearPenguin says:

      1440p now is something like 1080p was in 2010: reliable enough, but it still needs the horsepower and no cheap GPU will do, not for the most power-hungry games at least.

      4K is still very far off, but if you are a house builder in The Sims and you have a decent GPU, I’d argue that building on a 40-incher at that resolution seems pretty epic.

      We’re getting there, slow and steady, but I agree that we’re still in the ballpark of high-end GPUs in this case. Then again, it’s fair to point out that Jeremy was expressly targeting 1080p territory.

  14. racccoon says:

    Even though I’ve upgraded to an i7 comp and 16GB of RAM thus far, I haven’t bought a new graphics card yet!
    I currently still have the nice n easy GeForce GTX 650, but the GTX 970 is my best-buy watch, as that is the better choice in my opinion. Once I see the price drop a few more quid or dollars I’ll buy – I don’t care where I get it from as long as it’s reasonable and cheap plus p&h. So it’s a waiting game for me.

  15. drewski says:

    The long wait for a low-power, 256-bit, 3GB, decently priced midranger continues.

    Hopefully we’ll get a 960Ti with an upgraded bus and more memory at some point in the future. Or new tech will make them drop the 970 into “people with other things to buy” spending range.

    • Perkelnik says:

      That’s my guess as well: a 960Ti with a wider memory bus (I presume 192-bit) and more memory.

      In any case, my 650Ti still serves me well and since GTA V and Witcher 3 are still months away, I can sit comfortably and wait for prices to drop :)

    • TacticalNuclearPenguin says:

      Given how memory modules are set up, with a 256-bit bus you only get even amounts like 2GB and 4GB. An upgrade on that front is surely possible, as some AMD cards have already doubled the count with beefier modules; Nvidia seemed to have planned the same for the 970 and 980, but that was scrapped at some point.

      But a hypothetical 256-bit 4GB 960Ti for 50-70 euro more than this, and maybe a couple more shaders, would be pretty much a killer. If that ever exists at that slightly raised budget, it might still be some decent value.
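
      (The even steps come from the plumbing: each GDDR5 chip presents a 32-bit interface, so the bus width fixes the chip count and total capacity moves in multiples of the chip size. A rough sketch, assuming the common chip densities:)

          # Each GDDR5 chip has a 32-bit interface, so bus width fixes the chip count
          def capacity_options(bus_bits, chip_gb=(0.25, 0.5, 1.0)):
              chips = bus_bits // 32
              return [chips * size for size in chip_gb]

          print("128-bit:", capacity_options(128))  # [1.0, 2.0, 4.0] -> the 960's 2GB
          print("192-bit:", capacity_options(192))  # [1.5, 3.0, 6.0]
          print("256-bit:", capacity_options(256))  # [2.0, 4.0, 8.0] -> 2 or 4, never 3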

  16. c-Row says:

    I have recently switched to a 960 and I am rather happy so far. I am still playing in 1920×1080 so I am not exactly the target demographic for anything beyond that model, mind you.

  17. WereCatf says:

    I was much more interested in the inclusion of a full HEVC encoder in hardware with GM206. Once Twitch and pals get around to adding HEVC streaming to their repertoire, the 960 will likely prove a cheap, solid device for streaming high-quality content. The previous GPUs in the Maxwell family, i.e. the 970 and 980, only support HEVC decoding in a mix of software and hardware, and no HEVC encoding at all in hardware.

  19. D70CW6 says:

    I have a 660 atm that I bought a year ago – will do me for another few years tbh, as I have a huge backlog of Steam games that I’m determined to whittle away at. Won’t be touching 2014 games for a couple of years.

  20. zat0ichi says:

    There has been a furore around 60fps being the target frame rate for games, versus “30fps being more cinematic”.

    I jumped on the the 970 bandwagon to power my 1080p rig.
    I can say that when running the latest AAA titles maxed out with a smattering of AA, I’m still not getting a rock solid 60fps (dips to 55).

    IMO the 970 is the definitive 1080p card even with its 3.5GB + 0.5GB VRAM.
    The 960 with 2GB VRAM is a very cynical product.

    “Moreover, if you bought a £150 card today and then another in two or three years, you will likely be better off with that second card than soldiering on with the elderly £300 / $400 card bought today.”

    I played the mid-range game from 460 to 970.
    £150 for the 460, then about £50 for each card after that, apart from the 970, which I had to put £150 of my own money in for.
    560Ti – 660 – 770 = £150

    So I’m looking at about £450 since summer 2010, plus 5 hours of selling and installing time.
    If I’d bought a GTX 480 I would not be in as good a gaming place.

  21. Ejia says:

    This sounds good to me, since I’m quite happy with 1080p, but in practice I’d rather spring for the 970. Or, perhaps, wait for the 3xx line and see how far nVidia’s willing to drop prices in response.

  22. zeekthegeek says:

    With that memory it is simply a straight-up pass for me. Memory is suuuper important with the way crossplatform stuff is being deployed – especially since they’re targeting systems with GPUs that can access a full 8GB of VRAM when necessary.

    • Sakkura says:

      Consoles can’t use 8GB of their unified memory as VRAM, some of it will always need to be used by the CPU.

    • KenTWOu says:

      They can’t use 8GB of VRAM when necessary, because the OS uses 3GB for its own needs.

  23. nekoneko says:

    So I’ve got a 660ti right now and can still run most games at medium-ish quality alright. Dying Light was making it choke until I turned off the shadows. Is this a good upgrade for me or should I wait for the inevitable 960ti?

  24. Love Albatross says:

    A question, for Jeremy or anyone. I’m really enamored of the new 21:9 monitors, particularly the AOC U3477PQU.

    However, what kind of performance would I get with a GTX 780, i5 2500K, 8GB RAM when playing at 3440×1440? My system is no slouch and can happily handle everything I’ve thrown at it at pretty much full detail at 1080p, but I’ve never had it hooked up to such a high res display before.

    • TacticalNuclearPenguin says:

      You’re looking at 5 megapixels with that. It’s not as heavy as the 8MP of 4K, but it’s still pretty big, and the 780 is a cut-down Titan, which itself had one shader cluster less than the 780Ti (which had only 3GB because of greedy marketing tactics).

      In comparison, I’m pushing 3.7MP at 1440p with my 780Ti and I already can’t use AA in many games. Ultra is doable, but sometimes I’m dipping a bit here and there. With that aspect ratio you have more or less the same pixel density as mine; after all, the extra dots are nothing more than added space on the sides.
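
      (Those megapixel figures are just width times height – a quick check:)

          # Pixel counts for the resolutions under discussion
          resolutions = [
              ("1080p", 1920, 1080),
              ("1440p", 2560, 1440),
              ("21:9 1440p", 3440, 1440),
              ("4K UHD", 3840, 2160),
          ]
          for name, w, h in resolutions:
              print("%s: %.1f MP" % (name, w * h / 1e6))
          # 2.1, 3.7, 5.0 and 8.3 MP respectively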

      If you’re going to upgrade soon however you might be onto something, or maybe you can just accept that you’re in for some initial struggle. Those 21:9 thingies look brilliant with the right games, just bear in mind that you’re going to get some extra IPS glow and that the usual reports on the interwebs claim that these things are more prone to develop extra backlight bleeding in some weeks, possibly due to the inherent weakness that comes from overstretching.

      The importance of this last point rests on your preferences. I personally hate IPS glow so much that I splurged the extra cash for a polarized screen; problem is, that only happens in professional products and the price is rather insane. I must admit, though, that the performance at least is wonderful.

  25. TacticalNuclearPenguin says:

    *wrong reply*

  26. mrhidley says:

    I’m going to agree with the article here. You can find the R9 280 or R9 285 for under £150 quite easily, and they’re considerably more appropriate cards if you ever have designs on going past 1080p. AMD really do win on bang for buck most of the time, and I speak as someone with a GTX 980. You can find an ex display R9 290x around £200, that’s a huge bargain.

  27. Person of Interest says:

    Last year I upgraded to a GTX 970 (from a Radeon HD 5850!) because I wanted to run 1080p @ 60 FPS without compromising on image quality. Then a used 2560×1600 monitor fell into my lap over the holidays. Oh well, back to making hard choices in the graphics options menus…

    Regarding the GTX 960: it doesn’t seem like a good deal. Its price/performance is hardly better than the GTX 970, and its performance/watt is worse. And it’s just not fun to have to dial down graphics settings on day one of a new card purchase. Plus you never know when a friend or relative might offer to sell you his/her “old” 1440p monitor for cheap.

  28. amateurviking says:

    Picked up a 970 whilst I was in the US just before Christmas, not expecting to have to change it for a good few years (I hope).

  29. Skategodindy says:

    This is one of the first honest reviews I’ve seen of this card. The first-day reviews didn’t even mention how the R9 285 was beating it in VRAM-intensive games, or that the R9 280X, which is the same price at many retailers, absolutely destroys it in every benchmark. Hell, they don’t even mention what other cards are in that price bracket, just talking about how it’s a better version/better value than the GTX 760.

    It honestly feels like those first-day reviewers got the cards early and didn’t want to upset nVidia. This seems especially true considering nobody was pointing out the 4GB VRAM issue on GTX 970 either.

  30. CookPassBabtridge says:

    As a sim enthusiast I was rather more interested in them bringing out the promised (?) 8GB 980, to see the impact it has for triple monitor and extensive modding / addons. I am quite interested in a super-wide instead, though. There just doesn’t seem to be a monitor that has it all yet.

    But yeah, cards.

  31. drvoke says:

    I’ve been running a GTX 275 since… 2008? I’ve only recently begun to have problems with frames on some games, having to turn graphics options down, etc. It struggles with Starpoint Gemini 2, but plays perfectly well with Saints Row 4, KSP, DCS A-10C, etc. I kind of have to laugh at calling the 7xx series “low tier, barely gaming capable” or whatever. If even something that many generations removed from my current card is considered not-so-great, is it even worth the money for me to upgrade at all? Are the newer cards just pieces of shit compared to the 2xx generation?

    Given my experience, how do I even interpret this column? It’s like it’s not giving me good or useful information at all, since apparently my views on what is gaming capable (i.e., it plays games at frame rates that aren’t painful for me to look at) are so alien and foreign to the writer’s views and experience. I just came into my tax refund and was planning on spending a bit of it on a video card upgrade; now I’m not so sure. Seems a bit of a waste to plop down that kind of cash on a marginal improvement?