Nvidia GeForce GTX 1080 review: A big leap, but not quite a 4K slayer

Nvidia’s GeForce GTX 1080 is no longer top dog in its GPU family – that honour now goes to the GTX 1080Ti and, of course, the frankly ridiculous Titan Xp. It has, however, come down quite dramatically in price since I first looked at it, arguably making it a better buy than ever before if you’re after a 4K-capable graphics card. The economically monikered MSI Gaming X 8G Twin Frozr VI pictured above, for instance, cost a wallet-breaking £695 a year ago. Now you can pick up one like Gigabyte’s equally succinct GeForce GTX 1080 Turbo OC for as little as £489 from Scan. The 1080Ti, on the other hand, has remained at a steady £700 since launch.

A no-brainer, right? Not quite, as there’s also the GTX 1070Ti to think about, which costs even less at around £420 and promises near-1080 performance. We’ll be taking a look at the 1070Ti shortly and will update this page with our findings. For now, though, I’ll turn my attention back to the regular GTX 1080.

What I’m mainly interested in here is whether the GTX 1080 feels like the double-generational jump I’ve mentioned previously. Is it truly that good? Of course, many flavours of Nvidia’s GeForce GTX 1080 wonder chipset are available, so it’s best to begin by clarifying the specifics of the card in question, the MSI Gaming X 8G. There are some hard points defined by Nvidia’s GP104 chip that lies at the heart of the GTX 1080 card / chipset / whatever.

An MSI Gaming X 8G Twin Frozr VI GeForce GTX 1080, yesterday

So, that’s things like 2,560 eye candy-creating shader cores, 160 texture units for, you know, texturing stuff, 64 render output units for spitting out finished pixels, a 256-bit memory bus and a healthy 8GB dollop of memory. All of that applies to any 1080.
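
For the spec-curious, that 256-bit bus translates into peak memory bandwidth easily enough. A quick back-of-the-envelope Python sketch – the 10Gbps effective GDDR5X data rate is Nvidia’s published reference figure, not something measured on this particular board:

    # Peak memory bandwidth for a reference GTX 1080.
    # The 10Gbps effective GDDR5X rate is Nvidia's published spec,
    # not a measurement from the MSI board reviewed here.
    bus_width_bits = 256    # memory bus width, per the spec above
    data_rate_gbps = 10     # effective transfers per pin, in Gbps

    bandwidth_gb_s = (bus_width_bits / 8) * data_rate_gbps
    print(f"Peak memory bandwidth: {bandwidth_gb_s:.0f} GB/s")  # 320 GB/s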

Beyond that, retail cards differ from Nvidia’s so-called reference design. The standard core clock speed and boost clock (the latter roughly equivalent to turbo mode in a CPU) are 1,607MHz and 1,733MHz. This MSI board tops out at 1,708MHz and 1,847MHz respectively.
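
Incidentally, boost behaviour means the clocks your card actually runs at under load can differ from the numbers on the box. If you fancy checking, Nvidia’s NVML library exposes them; here’s a minimal Python sketch, assuming the pynvml bindings are installed (pip install nvidia-ml-py):

    import pynvml

    # Read the current and rated-maximum graphics clocks via NVML.
    pynvml.nvmlInit()
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

    name = pynvml.nvmlDeviceGetName(handle)
    if isinstance(name, bytes):   # older pynvml versions return bytes
        name = name.decode()

    current_mhz = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
    max_mhz = pynvml.nvmlDeviceGetMaxClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
    print(f"{name}: {current_mhz}MHz now, {max_mhz}MHz rated maximum")

    pynvml.nvmlShutdown()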

The MSI also cranks the memory speeds up, but by such a tiny amount that I won’t bother with the details. Truth be told, the tweaks to the operating frequencies don’t amount to anything you’re ever going to feel in games. What you might notice is the cooling solution.

A standard Nvidia board has an enclosed or ducted impeller-type fan for pumping hot, GPU’ed air straight out of the chassis. Sounds like a good idea? Yup, but as it happens sound is the problem. That kind of cooling is relatively noisy. So MSI, like a lot of non-reference designs, has ditched all that in favour of larger and more conventional fans.

Custom cooling makes for silent running

In fact, MSI has rigged this board to power down the fans under low load, making it totally silent. Long story short, this kind of cooling setup typically makes for less din.

So that’s how the MSI Gaming X 8G stacks up against the Nvidia reference board. My yardstick for testing, however, will actually be a Sapphire AMD Radeon 290 board. Obviously the 290 is a fairly old card now, so this isn’t about direct comparisons for purchase. Instead I’m interested in both how the 1080 feels in isolation and also the question of whether it really does dramatically improve on the subjective experience of a high-end card from a couple of generations back.

For logistical reasons too abstruse to divulge, my virtual playgrounds in this case extend to Total War: Attila, Witcher 3, that bloody Mordor game and GTA V. As for graphics settings, the general rule is maxed out, but I’ve flipped a few knobs to ‘off’ that either can’t be used across both Nvidia and AMD cards (like Nvidia HairWorks) or that I summarily and unilaterally judge to be a waste of GPU cycles. Neither the game title choices nor the settings are scientific. That’s not the point. Complaints on a postcard, which will be filed in the circular receptacle beneath my desk with an autocratic flourish.

Two cards, one review…

Oh, and resolution-wise, I sniffed around each game with each card at 1,920×1,080, 2,560×1,440 and ye olde 4K, otherwise known as 3,840×2,160 pixels. Anyway, that’s how I’m rolling and the scene is set. But what did I learn?
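
Before we get to that, a quick sense of scale: per-pixel work grows roughly with pixel count, and the jump between those resolutions is bigger than the numbers suggest at a glance. A quick Python illustration:

    # Pixel counts for the three test resolutions. Raw per-pixel
    # shading work scales roughly with the total, which is why 4K
    # is so much more punishing than 1440p.
    resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
    base = 1920 * 1080
    for label, (w, h) in resolutions.items():
        print(f"{label}: {w * h:,} pixels ({w * h / base:.2f}x 1080p)")
    # 1080p: 1.00x, 1440p: 1.78x, 4K: 4.00x the pixels of 1080p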

First, the 1080 is not a universal 4K panacea. It batters Shadow of Mordor and its orcish malevolence into submission at 4K, no question. Ditto GTA V. Both feel super slick and super smooth on the 1080 at 4K. That’s not something you can say about the Radeon 290. Think just about playable but slightly juddery and you’ll have the right idea.

Then there’s Total War: Attila. At first, I thought the GTX 1080 had that nailed, too. Then I zoomed right in among the troops and surprisingly but undeniably noted that the buttery smoothness gave way to the unmistakable staccato that accompanies fewer frames being rendered.

Time to dust off that old Radeon 290…

The same goes for Witcher 3. In fact, the 1080 struggles just a little with Witcher generally. It’s playable, but not truly effortless. I don’t want to get bogged down with talk of frame rates, but if I had to guess, I’d put the GTX 1080 in the low 30s in Witcher with my 4K settings.

The 290, of course, is a total mess in Attila at 4K, thoroughly unpleasant and unplayable. It copes a bit better with Witcher, but 4K is frankly beyond it. So the new card is a big step forward. And yet not actually the 4K killer I’d hoped for.

The step down to 2,560×1,440 is where I was confident coming in that Nvidia’s new chip would render all comprehensively asunder. And yet the harsh truth is that it doesn’t. Not quite. Again, zoom in among the troops in Attila and a very slight drop-off can be felt. And before you blame that on CPU limitations: the drop-off doesn’t happen at 1,920×1,080, and a CPU bottleneck would bite just as hard at the lower resolution.

This is where even the mighty new GTX 1080 comes unstuck

As for Witcher 3, at first I thought the GTX 1080 had its measure at 2,560×1,440. But knock things down to ‘1080p’ and there’s a tangible uptick in smoothness and response. It’s subtle, but it’s definitely there. Speaking of response, that remains a relevant issue for the GTX 1080. There’s noticeably more input lag running Witcher 3 at 4K than at lower resolutions. Of course, some games, like Shadow of Mordor, simply have laggy interfaces at any resolution. But the new 1080 doesn’t banish input lag to history.
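
If you’re wondering why lower frame rates drag input lag along with them, the back-of-the-envelope maths is simple enough. Here’s a quick Python sketch, assuming a simplistic model where a couple of pre-rendered frames sit in a queue between your mouse and the screen – the queue depth is illustrative, not a measured figure:

    # Rough frame-time arithmetic: why low frame rates feel laggy.
    # Assumes a simplistic model where `queued_frames` pre-rendered
    # frames wait in the pipeline; the depth of 2 is illustrative,
    # not a measured figure for any of the games tested here.
    def queue_latency_ms(fps: float, queued_frames: int = 2) -> float:
        frame_time_ms = 1000.0 / fps        # duration of a single frame
        return queued_frames * frame_time_ms

    for fps in (30, 60, 120):
        print(f"{fps:>3} fps: ~{queue_latency_ms(fps):.0f}ms spent queuing")
    # Prints roughly: 30 fps ~67ms, 60 fps ~33ms, 120 fps ~17ms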

All of which makes the conclusion regarding Nvidia’s GTX 1080 simple enough. It’s a clear and substantial step forward, there’s absolutely no question about that. But it’s not the multi-generational / er-me-gerd leap I was hoping for, nor the final answer to the will-it-play-4K question.

The price certainly makes it more tempting than it was a year ago, but until we’ve taken a look at the 1070Ti, it’s hard to say for sure whether you should take the plunge right now. There’s also AMD’s new Vega 64 card to factor in, which, again, we’ll be looking at very soon. My advice would be to hold fire for now and check back in just a few days’ time.


6 Comments

  1. Komsomol says:

    I have a 970. I have been circling buying a new card like the 1080 for a while now. The only issue is that I game at 1920*1080 and it seems like total overkill unless I’m going straight into 4K. Which obviously means I would need to upgrade my monitor too (£££). Given this article, it seems that it’s not worth it now…

    Should I just upgrade to a 1070? Will it even make much difference gaming at 1920*1080? I want to max things out and my 970 has slowly been heading into medium-settings territory recently…

    • Der Zeitgeist says:

      I’m mostly in the same situation right now, with my 970 starting to struggle in some games at 1080p. I thought about getting something like a 1070Ti, but I guess that wouldn’t really be worth it this year, with the GPU prices as they are.

      I guess I’ll wait until next year, and then get a 1170 or whatever it will be called then, together with a new G-Sync screen, and finally make the jump to either 4K or at least 1440p.

    • Drib says:

      Same as both of you. I have a 970 that has been completely great for everything… up until about a year ago. Now and then I’ll get a little stutter in Total Warhammer, etc.

      But I only play at 1080p, and more recently whatever 2560×1080 is, for this wonky ultra-widescreen monitor I bought. So it strains a little, but is it really worth dropping $500+ on a shiny 1080 or what-have-you?

    • fray_bentos says:

      I would say that your upgrade path depends on your monitor, and corresponding upgrade plan for that. I used to have a 970 and a 1080p, 120Hz monitor. I upgraded to a 1070 and was maxing out the frame rates at 120fps in most games on max settings. The 1070 felt like overkill for 1080p gaming, and was a factor in persuading me to shell out for a 1440p 165Hz monitor. I value framerate and smoothness over eye candy (which is typically only noticeable in still shots). Thus, I target 1440p, 120fps by turning down settings in the occasional situations where I need to in very recent games (GPU hogs like PhysX, TressFX, higher AA settings). To my surprise, I have found that I can tell the difference between 144fps and 165fps (in slightly less demanding games from a few years ago) that have panning cameras (racing, FPS, 3rd person etc). The downside is that going from in-engine at 165fps to 60fps cut-scenes can be jarring, while 30fps cut-scenes now make my brain implode!

    • Stingy McDuck says:

      You can always downscale from a higher resolution down to 1080p. That’s a great form of antialiasing and a must for some games that use not-so-good temporal antialiasing solutions and end up looking kinda blurry at regular 1080p (like Dishonored 2). Besides, I wouldn’t expect a GTX 1080 to run most current games at 4K 60fps, let alone future games, so I don’t think having a 4K monitor is a must.

      If I were in your situation (which I am, because I also have a GTX 970 and plan to upgrade), I would wait until next year. Perhaps Nvidia will release a new line of products (Volta), which would leave you with a new GTX xx70 card that could perform similarly to a current GTX 1080Ti and should cost between $350 and $400.

  2. Raoul Duke says:

    The Witcher stuff sounds very odd. I have an HTPC with a very modest R9 380 in it, and it happily smashes out the Witcher 3 at 60+ fps at 1080p with most quality settings cranked up.

    So I am struggling to see how a vastly more powerful card like the 1080 would struggle even slightly with it at 1440p or even 4K. I suspect there’s more than just the power of the hardware at play there.
