Nvidia’s GTX 1070: The 1440p Graphics Card Of Choice?

Hello. Good evening. And graphics. After a brief excursion into the delights of HDR screens, it’s back to This Week in Graphics, in which I deliver my subjective, benchmarkless verdict some months behind almost everyone else in the Alpha Quadrant. Being first is so easy, so obvious, after all. This time around we’re filling in the final slot in Nvidia’s new Pascal family of GPUs. If you discount the crazy-money Titan X, at least. Yup, it’s the GeForce GTX 1070. As it happens, the 1070 neatly fills what is normally my favoured slot in the overall hierarchy of any given GPU family, namely one rung down from the top graphics chip that’s actually bought in significant volumes. Except, Nvidia’s Pascal family isn’t entirely normal…

The Nvidia GeForce GTX 1070, then. We’ve already touched base with the 1080 and the 1060. The GTX 1070 inevitably slots in between the two. But it’s much more closely related to the 1080: it’s based on the same GP104 GPU / chip / thingie.

That means, you could argue, it’s a proper performance graphics chip, with things like a decent 256-bit memory bus next to the 1060’s cheapo 192-bit affair. It also sports 8GB of memory versus the 1060’s 6GB and 3GB options.

Of course, it’s cheaper than the 1080, so something has to give. That something involves turning parts of the chip off. Specifically, the shader count drops from 2,560 on the 1080 to 1,920 for the 1070. The clock speeds are down a smidge, too.

All told, Nvidia reckons the computational clout drops from 8.9 TFLOPs to 6.5 TFLOPs. TFLOPs are, of course, a theoretical metric with naff all relevance to gaming. But the measure does give a rough idea of the overall scale of loss in the transition from 1080 to 1070. And it certainly looks like the sort of hit you might actually feel in games.
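
If you fancy checking my sums, the arithmetic behind those TFLOPs figures is trivial: shader count, times two (each shader can retire a fused multiply-add per clock, which counts as two floating point operations), times clock speed. A quick sketch, assuming Nvidia’s reference boost clocks of roughly 1,733MHz for the 1080 and 1,683MHz for the 1070 (partner boards like MSI’s clock higher):

    // tflops.cu -- back-of-the-envelope theoretical shader throughput.
    // Reference boost clocks assumed; partner boards clock higher.
    #include <cstdio>

    static double tflops(int shaders, double boost_mhz) {
        // One fused multiply-add (2 FLOPs) per shader per clock; MHz in, TFLOPs out.
        return shaders * 2.0 * boost_mhz / 1e6;
    }

    int main() {
        printf("GTX 1080: %.1f TFLOPs\n", tflops(2560, 1733.0));  // ~8.9
        printf("GTX 1070: %.1f TFLOPs\n", tflops(1920, 1683.0));  // ~6.5
        return 0;
    }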

In that sense, the 1070 is a little disappointing. Take my trusty AMD Radeon R9 290 as an example of a card that dropped into more or less the same category three years ago: on paper, it was much closer to its Radeon R9 290X sibling. Close enough, in fact, that you couldn’t tell the difference in games most of the time.

To put the difference into numbers, the 1080 packs about 37 per cent more TFLOPs than a 1070. A 290X cranks out just 17 per cent more TFLOPs than a 290. See what I mean?
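
The sums again, as a rough sketch (the Hawaii figures are AMD’s quoted peaks of roughly 5.6 TFLOPs for the 290X and 4.8 TFLOPs for the 290):

    // gap.cu -- relative TFLOPs gap between siblings in each family.
    #include <cstdio>

    static double pct_more(double big, double small) {
        return (big / small - 1.0) * 100.0;
    }

    int main() {
        printf("1080 over 1070: %.0f%% more TFLOPs\n", pct_more(8.9, 6.5));  // ~37%
        printf("290X over 290:  %.0f%% more TFLOPs\n", pct_more(5.6, 4.8));  // ~17%
        return 0;
    }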

Then factor in the 1070’s wallet-pillaging price, which starts around £350 in the UK and $390 Stateside (thank you, Brexit), and the appeal of the proposition hardly improves. But let’s not get ahead of ourselves. What is the 1070 like as a gaming weapon?

My muse for this assessment is MSI’s GeForce GTX 1070 GAMING X 8G, which is a pretty fancy-pants variant of the 1070 and not exactly an obvious value play at £480. The extra cash buys you about another 100MHz in stock clockspeeds, the promise of much more overclocking headroom thanks to improved power delivery and superior cooling, plus some very nice build quality. Ultimately, it’s a niche product aimed at a very particular kind of enthusiast. But it certainly won’t sell the 1070 short.

I retraced most of the same territory as I did with the GeForce GTX 1080, the cheaper 1060 and indeed AMD’s Radeon RX 480. Comparing my notes with those from the 1080 in particular, the good news is that you’d have a hard time picking the two cards apart.

Here’s what I said about the GTX 1080 while playing Witcher 3:

“As for Witcher 3, at first I thought the GTX 1080 had its measure at 2,560 by 1,440. But knock things down to ‘1080’ and there’s a tangible uptick in smoothness and response. It’s subtle, but it’s definitely there.

Speaking of response, that remains a relevant issue for the GTX 1080. There’s definitely noticeably more input lag running Witcher 3 at 4K than lower resolutions. Of course, some games, like Shadow of Mordor, simply have laggy interfaces at any resolution. But the new 1080 doesn’t banish input lag to history.”

And that’s completely true of the 1070, too. Of course, if the 1080 wasn’t quite a single-card 4K solution, the 1070 was never going to be either. But more important is that the 1070 feels subjectively every bit as effective as the 1080 as a card for gaming at the popular 2,560 by 1,440 pixel resolution.

That’s a general trend. The 1070 will handle relatively undemanding games like Shadow of Mordor at 4K very nicely indeed. Tougher titles run well if not always absolutely faultlessly at 2,560 by 1,440. I tried knocking down some settings from ultra to merely high in a few titles. But in truth the subjective impact in terms of both image quality and performance was minimal.

That puts those of you looking for a card to play games at 2,560 by 1,440 in a tricky spot. The Radeon RX 480 is priced aggressively, but it’s not quite the 2,560 by 1,440 killer I’d been hoping for. The 1070 hits the 1440p performance target but at a punitive price.

At something nearer £250 to £300, I’d be a lot happier getting behind the 1070. However, Nvidia has been gradually but relentlessly dragging prices up of late and I can’t quite stomach the £120 / $130 jump from the 1060 to the 1070. It’s enough to make a Radeon R9 Fury for £290 look pretty interesting.

Ultimately, however, my hopes are pinned on AMD’s upcoming Vega graphics family, hopefully out no later than early next year, which might inject a little competition into the 1070’s segment. As it stands, my overarching recommendation is hold rather than buy. 2016 has seen a pretty decent jump in performance thanks to a long overdue process shrink from 28nm to 14nm transistors. But it’s increasingly looking like we’ll have to wait until 2017 for the value half of the 14nm proposition to kick in.

  1. pfooti says:

    I agree with you – I’ve got a 1070 I bought about a month ago or so, an ASUS model (strix something or other, it was the least expensive on amazon at the time).

    I play games at 2560 x 1440 by default (I don’t have a 4k monitor), and on my earlier card (gtx 770) I had to tune it down to 1080 for some games to get above 30fps. My particular monitor (an asus, as it happens) does 1440p just fine, but the 1080p on it looks smeary (antialiased pixels and so forth).

    So now I’m perfectly happy playing pretty much my whole game library at 1440p. I usually run nigh-ultra settings on most games; sometimes I turn something down a bit if it’s dropping below 30fps. On most undemanding games, it’s well above 60fps.

    Great performance, and I feel like I “saved money” by not buying the GTX 1080. It’s still not particularly cheap, though.

    As an aside, the linux drivers for the 1070 actually seem more stable than the ones for the 770. Go figure.

  2. TillEulenspiegel says:

    The price really is painful. I’m seriously considering just buying a GTX 970 (€260, as opposed to €450 for a 1070) to hold me over for another few years, maybe until there exists a GPU which can perform at 4K and the inevitable high-res VR.

    • desirecampbell says:

      I’d suggest grabbing a 1060 instead of a 970. I can see 1060s for about the same as 970s, and they’re ever so slightly faster.

      link to uk.pcpartpicker.com

      performance at 1440p: [embedded video]

      • Unclepauly says:

        Yeah, I’m not watching a whole video just to find out if a card’s faster. I’d rather skim an article for 10 seconds to get the important numbers.

        • Ragnar says:

          10 second summary:

          1060 is faster than the 970 for about the same price, and about as fast as the 980.

          6GB model is ~10% faster than 3GB, which won’t make much difference at 1080p but could at 1440p or for newer games.

          MSI’s Twin Frozr and EVGA’s ACX 3 are the best and quietest cooling solutions for those with decent case airflow.

    • Ragnar says:

      1060 6GB models sell for the same price or less than the 970, and offer 980 level performance.

  3. FecesOfDeath says:

    It’s an excellent performer at 3440×1440 ultra/highest settings with AA turned off, though I think if you have a 144 Hz monitor the 1080 would be a better buy. The Gigabyte G1 model seems to be more power efficient than the MSI one in this article, as the Gigabyte only needs a single 8-pin connector to operate.

    • Ragnar says:

      They’re actually about the same power-wise; the second connector is mostly there to appeal to those who would scoff at buying a performance card with only one.

      MSI’s cooling solution is better and quieter than Gigabyte’s, and on par with or slightly better than EVGA’s ACX 3. I wouldn’t pay the premium for the Gaming X model, though, and would get a cheaper model instead. They all overclock to about the same anyway.

  4. BarneyL says:

    Where do the ultra-wide 2560×1080 monitors fit into this? Does the drop in pixel count put them in the sweet spot for the 1060/480 range, for those of us on a budget but hoping for more than standard HD?

  5. dangermouse76 says:

    So the GTX 1060 is nearly a 1440p killer? And in the £250-280 range in the UK about now.

    • Unclepauly says:

      Not with newer, more demanding games. 1440p is a bit too much for it.

    • PenguinJim says:

      1060 and 1070 Prices: Did Jeremy Laird Get Them Right?

      The 6GB 1060 is often a bit cheaper than £250 – it starts at £230 in the UK, with the 1070 starting from £350. As Jeremy says, a £120 price difference.

      The thing for me about the 6GB 1060 is that it’s 980 performance with an extra 2GB of VRAM. If you’ve been waiting since 2014, you’ve saved less than £100 for your 2+ year wait. It’s not, if you’ll forgive the pun, a game-changer.

      That extra £120 gets you more gaming performance than any GPU released before this year – indeed, any other commercially-available GPU in existence, bar the 1080, and that includes Titans. Exciting!

      It’s a shame that the prices look so poor in £ now, but hey. If we didn’t want it, we wouldn’t have voted for it, right?

  6. Banks says:

    I would upgrade to Vega next year, but I think we are still stuck in a transition where it’s better to wait a bit more, and this GTX 1070 proves it. I want 1440p and 144Hz, but a $500 GPU, or even the 1080, can’t give me both.

    The time will come.

  7. LearningToSmile says:

    I built a PC with a 1070 in it literally yesterday. Installed Witcher 3 to test it out, and it runs flawlessly on almost the highest possible settings (I think I had some headroom left to crank up the HairWorks-specific anti-aliasing but didn’t bother) at 1440p.

    I think the 1070 was the easiest choice of all the components in my PC. Anything lower from the current generation just wouldn’t cut it at 1440p, and the 1080 is just an awful proposition when it comes to price/performance. Maybe if I had a 144Hz display and was made of money I’d consider a 1080, but I was happy to put the spare cash into a beefier CPU (6700K) and fast memory instead.

  8. Baines says:

    I went with a 1060 for a PC that previously relied on integrated graphics, with the plan to see what the future holds. The 1070 just felt too expensive for what remains a fairly questionable period in graphics, where we don’t quite know how the whole Vulkan and DX12 things will fall out, nor do we really know how well current Nvidia cards will be suited for what comes along in comparison to AMD. And there is the increasing push towards 4k, and the specter of VR, and who knows what else.

    The 1080 is just too expensive in general. The 1060 gives decent returns for price, and in a year I can potentially still pass it on to someone else if I decide to upgrade.

  9. pillot says:

    Well, as someone who has one of the highest-end 1080s available (the hybrid), with an overclock, the 1080 is still not enough to completely blow all titles out of the park at 1440p. Playing Mankind Divided – not even maxed out, AA completely off – the fps does dip into the 40s. A 40-70 range is still perfectly playable; it’s just that when you have a monitor that can do 165Hz, 60fps stops being the benchmark of acceptability that it once was.

  10. Zenicetus says:

    Well, drat… I was hoping for a lower price.

    I’m keeping an eye out for an upgrade from my current GTX 970, not because I need more speed as such, but for a jump to 8GB video RAM. The flight sim I fly the most (X-Plane) could take advantage of that. But I think I need the price to come down closer to the $300 zone. This is about $100 too much for just a VRAM upgrade.

  11. Carra says:

    Bought a 1060 earlier this year. My 670 needed a replacement, so I was torn between a 1060 and a 1070. Since I expect to replace my entire system in two years, I went with the €120 cheaper option. It’s quite a bit cheaper and it also runs my games at 1440p at high settings. I’m happy.

  12. Risingson says:

    Wow, you guys must play a lot!

  13. Agnosticus says:

    I too was eyeing the GTX 1070 to get near my 1440p/144Hz target, even though I’m not a big Nvidia fan, but in the end it was too expensive, while not being that well optimised for DX12/Vulkan.

    So I went for a custom RX 480 (100 watts less than my old AMD!), often having to decide whether I want to run it at 1080p100+ or 1440p60. But as soon as Vega proves to be strong and affordable, I think I’ll trade the card in and get me a new one.

    EDIT: YAY! :D

  14. phuzz says:

    I tend to alternate who I buy my graphics cards from. At the moment I’ve been using an AMD card for a while, and I’ll probably be looking at nVidia in a few months, so I wanted to ask: how’s the driver situation? I hear nVidia’s drivers are a bit rubbish at the moment.

  15. Mechorpheus says:

    If you’re in the market for a 1070 and want to save a bit of cash, it might be worth looking for a 980 Ti. They’re going for ~£300 on eBay at least, and have much the same performance profile as the 1070, plus you get more out of them through overclocking (Maxwell parts tend to manage a bigger percentage OC than Pascal, in my experience at least).

    I’m still running one of those and it handles mostly everything I care to throw at it at 1440p ultra (I’ve only really had to drop settings in Witcher 3, but then who hasn’t?)

  16. Shiloh says:

    Eh, choices choices – I’m in the market for a new gaming rig this side of Christmas (or maybe early in the New Year depending on finances), but currently on the fence about building it myself or buying something off the peg.

    Been using an AMD card in my current machine but might switch back to Nvidia, I’ve no particular brand loyalty.

  17. C0llic says:

    What’s going on with GPUs right now is a little worrying. I’ve always been on team Nvidia – primarily because of their superior drivers, which always made their cards more competitive despite both companies being comparable for raw performance (often ATI equivalents were technically faster). But Nvidia has been dominant for a while now, and this feels like a price hike.

    I don’t want to see the market suffer due to complete dominance like it has with CPUs – it feels like that may be starting to happen. Please, please do something to shake them up with Vega, ATI.

    • Czrly says:

      I think nVidia are chasing a different rabbit, these days. They’re far more motivated by the potential profit in the data-science world and so their hardware’s greatest gains will be in CUDA performance, not in real-time trick-rendering. (And have no doubt that the games-world *is* trick-rendering. Disney and Pixar and other users of real rendering probably want more CUDA performance for their render-farms.)

      ATI and Intel are basically not in this race. Some might cite the existence of OpenCL as competition to CUDA but it isn’t. It isn’t as widely supported and the developer’s experience under OpenCL is very rough, compared to the tools and docs and material that nVidia give you for CUDA. nVidia cards can run OpenCL and AMD cards can’t run CUDA so any researcher who wants to run everything has to buy nVidia hardware, meaning that they’re more likely to target CUDA instead of OpenCL because doing otherwise would be like choosing a headache. (Papers have submission deadlines, you know.)

      nVidia aren’t being all that anti-competitive. They are helping the OpenCL community quite a bit and aren’t trying to lock their hardware to CUDA. They’re simply wooing the researchers with good tools and docs and stuff – quite honest if you ask me.

      All is not lost, however. All that game developers need to do is start exploiting the power of CUDA. This might mean a shift from pixel prettiness (more streaming multi-processors can help that, but that isn’t what they’re aimed at) towards artificial intelligence and other stuff.
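
      To give a flavour of what that looks like, here’s a toy CUDA kernel: the classic vector add. It’s nowhere near a real AI workload, obviously, but it shows how little ceremony nVidia’s tools demand:

        // vector_add.cu -- toy kernel: c[i] = a[i] + b[i] over a million floats.
        #include <cstdio>
        #include <cuda_runtime.h>

        __global__ void add(const float* a, const float* b, float* c, int n) {
            int i = blockIdx.x * blockDim.x + threadIdx.x;
            if (i < n) c[i] = a[i] + b[i];
        }

        int main() {
            const int n = 1 << 20;
            float *a, *b, *c;
            // Unified memory spares us explicit host/device copies.
            cudaMallocManaged(&a, n * sizeof(float));
            cudaMallocManaged(&b, n * sizeof(float));
            cudaMallocManaged(&c, n * sizeof(float));
            for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

            add<<<(n + 255) / 256, 256>>>(a, b, c, n);  // 4096 blocks of 256 threads
            cudaDeviceSynchronize();

            printf("c[0] = %.1f\n", c[0]);  // 3.0
            cudaFree(a); cudaFree(b); cudaFree(c);
            return 0;
        }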

      Personally, I’m right behind this. Games look “good enough” for me. When games play better, I’ll surely want them rendered in higher resolutions but I’d much rather see developers doing something with AI for now, allowing graphical fidelity to stagnate.

      In the cover-shooter I am currently slogging through, the enemy just charge forwards and grab cover and then intermittently pop up and spray bullets. There’s NO coordination in the squad, they do not try to lay down covering fire while one member charges or to flank my position or to do anything innovative. They’re bots, as primitive as can be, and the game-play is headshot-whack-a-mole. Sure, the cinematic sequences between cover-shooting bouts look absolutely smashing but, frankly, I’d rather have clever enemies to fight and text-dumps between bouts than THIS.

      • Baines says:

        Game developers taking advantage of CUDA could be a boost to games, but it isn’t going to happen because AMD doesn’t have CUDA support. Publishers would either be locking themselves to a single brand of graphics card, or they’d effectively be designing two versions of the game. Sure, we have some brand-focused development, but it is largely relegated to a few extra frames per second and non-gameplay graphical bells and whistles.

        It *might* have happened if it was AMD instead of Nvidia, or if consoles were all using Nvidia GPUs, because console-focused publishers would be looking for every edge in the console market without concern for what it meant for PC ports.

  18. Kamikaze-X says:

    1070 really shines when you get good cooling on it and overclock it.

    I have mine under water and I can run it at 2100MHz core and 9.8GHz memory all day long. Almost 1080 performance at that clock.

    I’ve never been disappointed with it. Great CUDA speeds for encoding as well.

  19. DEspresso says:

    Shame Time: I looked at the first picture and thought it was rendered…

  20. PenguinJim says:

    “At something nearer £250 to £300, I’d be a lot happier getting behind the 1070.”

    The British viewpoints have really amused me over the 1070. It was clearly going to be available from £299.99, but between the reveal and the pricing/release the people in the UK voted to leave the EU and bombed the GBP by 15-20% against the relevant currencies, resulting in a 1070 that can be picked up from £349.99 instead.

    But despite this being exactly what was voted for in the UK, somehow this has become Nvidia’s fault! You can see other comments on this article complaining about a “price hike” (although that may be due to their being very new to PC gaming, and not being aware that the 970 was the cheapest ever xx70 release – looking back at the 770, 670 and 570 provides a more suitable context for the 1070’s price).

    It’s hardly fair to use your own country’s currency as a criticism for a foreign product. The 1070 was mostly built by Taiwanese-owned companies for an American company. The prices in America (and Taiwan, funnily enough) are suitable – it’s a card that beats any other GPU (except the 1080) for a relatively good price.

    If you’re already chafing against the prices in this brave new world the UK has chosen for itself, now is the time to leave, before the GBP drops any lower. (May I suggest Canada or Australia, as their currencies have also, relatively speaking, fallen over the last six years, making it less of a psychological “loss” to transfer one’s assets across.)

    • Jeremy Laird says:

      Unfortunately, you have overlooked some fairly obvious facts. The 1080, 1070 and 1060 are each and every one of them more expensive than the boards they replace in both dollars and sterling. This is entirely consistent with my contention that Nvidia has been increasing prices. That’s because they have been increasing prices.

      Moreover, many companies elect to not automatically pass on fluctuations in foreign exchange to their customers. Oddly, they often elect not to do so when passing on the change would mean lower revenue. But some companies maintain pricing even at their own cost.

      The bottom line is that the new cards are more expensive than before in real terms and pretty much any currency. It’s that simple.

      • PenguinJim says:

        Is Jeremy Laird About to be Shown Some Obvious Facts He’s Overlooked?

        (Sorry, I find it difficult to resist a question-headline when commenting on an article with a question-headline! Although I’m choosing to eschew Betteridge, obviously! ;))

        Unfortunately, you have overlooked some fairly obvious facts. For starters, I assume you looked up the 960 launch price to compare it with the 6GB 1060 launch price? The 960 launch price was, of course, for the 2GB model – the 4GB variant didn’t actually get its own official MSRP when it finally launched, bizarrely, but it did launch at $50 more than the 2GB card. So, that’s a $250 (6GB) 1060 versus a $250 (4GB) 960. The 3GB 1060 has now released, at $200. Oh, the same as the 2GB 960 release price! I realise that it’s been a couple of years, and having two VRAM configurations can confuse some people’s understanding of prices, but hopefully you now see that the 1060 is the same US dollar price as its predecessor (which technically makes it cheaper, for the obvious reason! ;)).

        I pointed out that the 970 was the cheapest by far in the xx70 line-up in my previous comment, but I think it flew over your head, so I’ll repeat it here. GTX 570: $350, 670: $400, 770: $400, 970: $300, 1070: $380. It’s fair to say that the xx70 range has always been $300-400, certainly erring towards $400, which rather undermines both your “contention” and your “bottom line”, I’m afraid – it’s not that the 1070 is more expensive, but that the 970 was cheaper. Oddly so, it seemed at the time. And now we know the reason why. ;)

        But, yes, the 1080 is US$50 (9%) more than the 980 was at launch. You are absolutely correct there. It’s indisputable. Well done.

        “Moreover, many companies elect to not automatically pass on fluctuations in foreign exchange to their customers.”

        Good and correct choice of “many” instead of ‘most’ there – it certainly must be in the dozens. But the thing is, Nvidia are maintaining their pricing. All these cards released post-Brexit, with UK prices that reasonably accurately reflect the currency at the time of their UK launch.

        Or if you mean that they’re not maintaining their pricing based on their traditional tier pricing… well, again, they are, as I explained above. Well, except that extra $50 on the 1080. WORSE THAN ATTILA.

        “The bottom line is that the new cards are more expensive than before in real terms and pretty much any currency. It’s that simple.”

        Funnily enough, the £350 1070s we’ve had in the UK are the cheapest 1070s in the world, are they not? :D But I can certainly understand British people complaining about having the cheapest price. We are a nation of complainers, after all. We don’t need a rational reason, as you so aptly demonstrate.

        But the real bottom line is: you should get used to the new value of the pound. It’s your psychological values you’re projecting – not real-world prices. (Also, you should check your facts about previous graphics card prices before trying to correct someone! Or, if it’s an area you don’t understand, feel free to ask! We commenters wouldn’t be commenting unless we were happy to help. :))

        • Jeremy Laird says:

          A lot of words. But it’s actually really simple. The new cards are more expensive than the old cards. That’s true in the UK with our feeble Brexit pounds. It’s true in the US with the all powerful Trump dollar. That’s not a projection. It’s a fact. The cards are more expensive.

          I am not sure why pointing this out upsets you. But that is out of my control. Good luck!

          • PenguinJim says:

            Is Jeremy Laird Advising People to Buy GTX 960s Today?

            “A lot of words.”

            Errr… not really. I grew up with Mavis Beacon. Can’t you touch-type? Or is the rather vertiginous RPS comment formatting confusing you, perhaps? ;)

            “The new cards are more expensive than the old cards. That’s true in the UK with our feeble Brexit pounds. It’s true in the US with the all powerful[sic] Trump dollar. That’s not a projection. It’s a fact. The cards are more expensive.”

            Oh. OH. OOOOHHH. I am sorry. I made a mistake before. I thought you were comparing launch prices.

            Yes, you can get a 960 today far cheaper than a 1060 today. The same is true for the 970 vs the 1070. And again for the 980 vs the 1080.

            I’m not sure of the relevance of the point, given the enormous performance differences, but you’re right. The old, slower cards are cheaper today than the new, faster cards. And, hey, just check out those 8800GT prices!!! (That’s a reference to a slightly older graphics card.)

            So, basically, you’re advising people to buy 960s, 970s and 980s today, because they’re cheaper. Gotcha.

  21. xweiss23VT says:

    I have a 970 that runs everything smoothly in 1080p. I use my monitor (144 hz) for online FPS or my TV when relaxing. I purchased it for $270 around 1.5 years ago. With the rebate from the settlement, I feel it was a good investment for $220.

    I have been tempted to purchase one of the new architecture cards but my logic is as follows; what do you think?

    I would rather buy a 4K TV in a few years so I can get a top-brand set with proven HDR. So why not wait 2-3 years for the next architecture, or go SLI with the current one? New card + new TV = $1200+, I would think.
