Nvidia GTX 950: The Cheap GPU We Were Waiting For?
It's cheap. Will it keep you cheerful?
Monster GPUs, 4K screens, zillions of pixels pumped per picosecond. These things are exciting. But are they relevant to most of us? When a top graphics card costs over £500 and arguably has an optimal working life of about 18 months to two years, I'm not so sure. Either way, most of us simply don't buy that kind of clobber. Instead we buy things like Nvidia's new budget offering, the £120 / $160 GeForce GTX 950. And we buy them because, well, they're actually affordable. But what exactly is life like at the more prosaic end of the pixel-pumping spectrum? To find out, I've been slumming it with the new 950. This is what I have discovered.
First, let's deal with some speeds and feeds. You might expect, given the '950' moniker, that this is Nvidia's new entry-level graphics card. But not quite. The older GeForce GTX 750 and 750 Ti boards live on.
That's actually good news, because it suggests the 950 should be more than just a generational step up from the 750 boards. It sits a critical segment higher in the GPU pecking order.
The 750 was always impressive in terms of the performance it delivered given its meagre specification. But even at launch, I didn't think it was truly viable for gaming.
Anyway, here's how the new 950 lines up spec-wise. And remember, the 950 sports Nvidia's second-generation Maxwell technology. That means you can't simply look at the shader counts of older cards like the first-gen Maxwell GTX 750 Ti or the Kepler-based GTX 770 and draw simple conclusions, though up to a point that is possible with other Maxwell 2 cards like a 960 or a 970.
Zotac's AMP Edition GTX 950 is a little higher clocked than the norm, if you care about that kind of thing
On that note, the graphics chip inside the new 950 is indeed exactly the same silicon as that found in the existing GeForce GTX 960 board. It's known as GM206 in Nvidia's kerazee GPU codex. Of course, the 950 is cheaper than the 960, and that means Nvidia has turned bits off.
You still get 32 render outputs. But the texture units drop from 64 to 48 and the shader core count shrinks from 1,024 to 768. In practice, numbers like that aren't always a good guide to performance. But all other things being equal, the more fancy visual effects a given game piles on, the more the shader count in particular will limit performance.
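If you want a feel for the scale of those cuts, here's the back-of-the-envelope arithmetic, a quick Python sketch using nothing but the numbers above:

```python
# Back-of-the-envelope comparison of the GTX 950's cut-down GM206
# against the full-fat GTX 960 part, using the figures quoted above.
gtx_960 = {"render outputs": 32, "texture units": 64, "shader cores": 1024}
gtx_950 = {"render outputs": 32, "texture units": 48, "shader cores": 768}

for unit, full_count in gtx_960.items():
    cut = 1 - gtx_950[unit] / full_count
    print(f"{unit}: {gtx_950[unit]} vs {full_count} ({cut:.0%} fewer)")

# render outputs: 32 vs 32 (0% fewer)
# texture units: 48 vs 64 (25% fewer)
# shader cores: 768 vs 1024 (25% fewer)
```

In other words, roughly a quarter of the shading and texturing hardware is gone, while the back end that feeds the display is untouched.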
Meanwhile, the 128-bit memory bus remains and clockspeeds for both memory and GPU core haven't shifted much. Oh, and we're talking 2GB of graphics memory, which may or may not be an issue for you.
Anyway, re specifics: the card I have in for review is Zotac's AMP Edition GTX 950, which goes for an extra £20 / $30 or so and bumps the stock clockspeeds by about five per cent. Whichever 950 you go for, it'll be a pretty compact board, though it's worth noting that it does require a single six-pin power rail. You can't just run these boards off the PCI Express socket, sadly.
Note the six-pin power rail. You can't just run off PCI Express power
So that's the particulars squared away. What's this thing like to game on and generally live with? Inevitably, that's a complicated question. Part of the problem is how clever games have become. Many will autodetect your GPU and disable certain rendering features. That means hitting the 'ultra' button in the settings menu doesn't always do the same thing. It varies according to the card in question.
Much of the time, that makes sense. There's little point, for instance, in a game allowing you to enable settings that take the graphics memory usage well beyond the card's frame buffer. That simply always destroys frame rates.
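If you're wondering what that kind of load balancing actually looks like, here's a purely hypothetical Python sketch of the sort of logic involved: stepping a setting down until the estimated footprint fits the frame buffer. To be clear, the quality tiers, memory costs and function name here are invented for illustration, not lifted from any real game:

```python
# Purely hypothetical sketch: stepping texture quality down until the
# estimated memory footprint fits inside the card's frame buffer.
# The tiers, costs and names below are invented for illustration only.
TEXTURE_COST_MB = {"ultra": 2600, "high": 1400, "normal": 900, "low": 500}
OTHER_BUFFERS_MB = 500  # render targets, shadow maps etc. (made-up figure)

def pick_texture_quality(requested: str, vram_mb: int) -> str:
    order = ["ultra", "high", "normal", "low"]
    # Start from what the player asked for, then walk down the tiers
    # until the estimated total fits in the available video memory.
    for quality in order[order.index(requested):]:
        if TEXTURE_COST_MB[quality] + OTHER_BUFFERS_MB <= vram_mb:
            return quality
    return "low"

print(pick_texture_quality("ultra", 2048))  # 2GB card -> 'high'
print(pick_texture_quality("ultra", 4096))  # 4GB card -> 'ultra'
```

Real engines are far more granular about it, but the principle is the same: two cards can both say 'ultra' in the menu while actually rendering rather different things.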
It's a budget board but it's still nicely built
Of course, you can usually force settings via config files. But it's generally worth being aware how this stuff works.
With that in mind, let's talk GTA V. Here's a game that does quite a bit of this background load balancing in order to keep the memory footprint inside the available memory budget.
At a glance, GTA V running 950-stylee looks thoroughly familiar...
Max GTA V out via the in-game menu at 1,920 by 1,080 pixels with the 950 and the result is 1.6GB's worth of memory usage. That's far less than you'd see doing the same thing with, say, a 4GB card.
The good news is that it runs pretty nicely thus configured. It feels smooth and the input lag levels are low. Fire up Fraps and the frame counter reveals numbers typically in the mid to high 30s. In other words, just enough.
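For a bit of context on why the mid-to-high 30s counts as 'just enough', remember that a frame rate is just a frame time in disguise. The arithmetic looks like this:

```python
# A frame rate is a frame time in disguise: frame time (ms) = 1000 / fps.
for fps in (60, 35, 25):
    print(f"{fps}fps -> {1000 / fps:.1f}ms per frame")

# 60fps -> 16.7ms per frame
# 35fps -> 28.6ms per frame
# 25fps -> 40.0ms per frame
```

Drop into the mid 20s and each frame takes 40ms or more before any buffering is added on top, which is when things start to feel properly laggy.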
The catch to all this is the image quality. I'm pretty sure the texture quality has gone south versus a high-end card. What's more, the game has also definitely nixed some of the fancier shader effects. With a high-end card, you get bump-mapped sand on the beach. With the 950, it's just a flat texture. How much you care is up to you, but achieving playable frame rates does come at a cost.
Dimples in sand are not bump or 'normal' mapped. The humanity...
That said, you can bump the resolution up to 2,560 by 1,440 and not suffer much of a drop in frame rates. Impressive. However, the input lag then becomes pretty unpleasant. Less impressive.
Next up, Witcher 3. Hit the 'Ultra' button when running at 1080p and the GTX 950 isn't playable. Fraps will tell you the frame rates are in the mid 20s, but the lag is pretty horrendous and the overall experience is clunky and unpleasant.
However, step down a notch in the global settings from Ultra to High and the frame rates jump up into the 40s and the input lag largely disappears. What's more, as the screen shots below show, the step down in visual quality is relatively subtle. Nice.
Witcher 3 running in full reheat in 'Ultra' mode above. Plain old 'High' below. Yeah, it's not that dramatic a difference
Running at 2,560 by 1,440 pixels on High settings knocks the frame rate back to the mid 20s and our old friend input lag returns. Still, it's undeniably impressive to see a card like this coming close to playability at that resolution in a game as pretty as Witcher.
Next up is Total War: Rome II. Of all the games I tried, this was the best match for the 950. At 'Extreme' graphics settings and 1080p, Rome is genuinely very playable and comes without any obvious compromises in terms of image quality. Frame rates, for the record, are in the 30 to 40fps range.
Even better, Rome is actually playable at 2,560 by 1,440 on this board. It's still mostly smooth and largely lagless. In fact, the overall feel is not dramatically different from the MSI GTX 980 I've had installed for the last month or two.
Strategy games like Rome II are probably the best fit with a budget board like the 950
Of course, that reflects the fact that Rome is a relatively CPU-limited game. Moreover, some recent hardware glitches have seen me fall back temporarily to a six-core Intel Gulftown CPU from a few generations back, so that CPU limitation will be even more apparent.
Overall, then, the 950 gets a lot done with a little. That 128-bit memory bus will always make me feel a bit uncomfortable. Ditto the 2GB of memory. But there's also no denying that Nvidia clearly has some very clever data compression tech going on. The 950 does things that you simply wouldn't think possible when you look at the numbers on paper.
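To put a rough number on that discomfort, here's the peak bandwidth arithmetic. The 6.6Gbps and 7Gbps memory speeds are the typical published figures for these boards rather than anything I've measured on this particular card, so treat them as assumptions:

```python
# Rough peak memory bandwidth: bus width in bytes x effective data rate.
# The 6.6Gbps / 7.0Gbps figures are typical published GDDR5 speeds for
# these boards, not measurements from the card reviewed here.
def bandwidth_gb_per_s(bus_bits: int, gbps_effective: float) -> float:
    return bus_bits / 8 * gbps_effective

print(f"GTX 950 (128-bit @ 6.6Gbps): {bandwidth_gb_per_s(128, 6.6):.0f} GB/s")
print(f"GTX 980 (256-bit @ 7.0Gbps): {bandwidth_gb_per_s(256, 7.0):.0f} GB/s")

# GTX 950 (128-bit @ 6.6Gbps): 106 GB/s
# GTX 980 (256-bit @ 7.0Gbps): 224 GB/s
```

On paper, then, the 950 is feeding its shaders with less than half the bandwidth of the 980 I've been running, which is exactly why that compression tech has to earn its keep.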
Rome even runs pretty sweetly at 2,560 by 1,600 pixels
OK, you will need to keep expectations in check. You'll also need to get your hands dirty tweaking the settings in those games that don't do it for you, though Nvidia's GeForce Experience utility can help you with that.
But the simple answer is, yes, you can have a genuinely good gaming time with the new 950. I also think it's cheap enough that my default advice re stretching to a £200 / $250 board if you can doesn't necessarily apply. Yes, I still think you should stretch to that kind of price level if you are serious about your gaming. But if for whatever reason you are unwilling or unable, the good news is that you and the new GTX 950 will make a decent go of things.