Nvidia GeForce GTX 1080Ti review: A 4K monster that isn’t worth the extra cash

Nvidia GeForce GTX 1080Ti

As muscular as the GTX 1080 is, there have been some not-entirely-unwarranted grumbles about its underlying tech; specifically, that it’s basically a GTX 1070 with more of the GPU’s cores enabled. The GTX 1080Ti is a much bigger break from the rest of Nvidia’s 10-series, and a much more overtly ‘high-end’ card. It uses the bigger, beefier GP102 GPU, same as in the bonkers-expensive Titan X and Titan Xp, and wields 3584 processing cores to the GTX 1080’s 2560. Its 11GB of memory is the most you’ll find in a mainstream card, too.

Obviously, these upgrades put a proportionally larger dent in your finances. The MSI GeForce GTX 1080Ti Gaming X Trio I’ve been testing – with its factory overclock and custom triple-fan cooler – is £750, and generally the cheapest GTX 1080Ti I can find still asks for £698. With the GTX 1080 dropping as low as £500, this card needs to prove it’s not just a list of fancy-sounding specs.
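For a rough sense of how the asking price stacks up against the raw specs, here’s a minimal back-of-the-envelope sketch in Python. It uses only the prices and core counts quoted above – frame rates obviously don’t scale linearly with cores, so treat it as an illustration rather than a benchmark.

```python
# Rough ratio check using the figures quoted in this review:
# GTX 1080 at ~£500, cheapest GTX 1080Ti at ~£698.
cards = {
    "GTX 1080":   {"price_gbp": 500, "cuda_cores": 2560},
    "GTX 1080Ti": {"price_gbp": 698, "cuda_cores": 3584},
}

base = cards["GTX 1080"]
ti = cards["GTX 1080Ti"]

price_ratio = ti["price_gbp"] / base["price_gbp"]
core_ratio = ti["cuda_cores"] / base["cuda_cores"]

print(f"Price ratio: {price_ratio:.2f}x")  # ~1.40x
print(f"Core ratio:  {core_ratio:.2f}x")   # ~1.40x
```

On paper, then, you’re paying roughly 40% more for roughly 40% more cores; whether that turns into 40% more game is the question the rest of this review tries to answer.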

In Doom, at least, it’s definitely overkill as far as 1080p and 1440p are concerned. Both ran beautifully at top settings, and if there were any drops (during particle-rich explosions, say), I couldn’t tell just by watching. 4K does brilliantly too, the GTX 1080Ti having no trouble at all with chaotic brawls around large arenas.

Hitman also gets on very well with this card, though not as well as Doom – in fact, at 1080p, there wasn’t much visible difference between this and the GTX 1070Ti. The frame rate even had its worst drop in the very same place – a short foray onto a penthouse balcony. 1440p still looked great, but with negligible differences to report compared to 1080p.

Nvidia GeForce GTX 1080Ti rear

If that sounds a bit down, know that 4K runs like a relative dream. Funnily enough, I couldn’t tell the difference once again (besides sharpness) between this, 1440p and even 1080p without an fps counter – you could argue that there’s something a bit off about not getting bigger gains at lower resolutions, but it’s 4K we’re really here for, and good 4K is what you get.

Middle-earth: Shadow of War cuts a nice figure at all resolutions. It’s particularly consistent at 1080p and 1440p, while 4K takes a slight hit but still goes strong. Only in the most densely detailed (or NPC-stuffed) scenes did it appear to drop below the magic 60fps.

While Rise of the Tomb Raider was always smooth as well, it wasn’t always stable; at 1080p with Very High settings, it could switch from lush mega-slickness to more modest (but still good) performance. 1440p was a similar story, though it should be said that you probably won’t be able to tell without a high refresh rate monitor. 1440p is also the resolution where most cards start to stumble with this game, so the fact that the GTX 1080Ti never dropped too far is still quite impressive. Even 4K, which was clearly a tougher test, remained fairly silky.

Nvidia GeForce GTX 1080Ti side

So far, the GTX 1080Ti has mostly been walking it, but I was kind of expecting Total War: Warhammer II – a demanding bastard of an RTS – to trip it up. Not so. At 1080p, the battle map runs as well as you’d like, and although the campaign map initially looks like it’s going to give the card a harder time, it only does so briefly, and barely. 1440p was great stuff as well, again using all the highest settings.

4K, however, is where the GTX 1080Ti meets its match. The battle map doesn’t run too badly when viewed from the air, but zoom in on a unit and the frame rate collapses. The exact same thing happened on the GTX 1080, so it’s a shame the extra £200+ isn’t enough to rise above it. There was some micro-stuttering on the campaign map, too.

As usual, Wolfenstein II: The New Colossus ran vastly better. There was a single stutter at 1080p when we rounded a corner straight into a startled Nazi guard, but otherwise, everything is exquisite. 1440p remains excellent too, and when we recreated the corner meeting with the Nazi, that stutter didn’t recur. 4K is just dandy, too, with no big issues to speak of.

Similarly, The Witcher 3: Wild Hunt works very well with the GTX 1080Ti across all three resolutions. 1080p and 1440p are both splendid, to the point where this is the only game where this card outperforms the GTX 1080 by something approaching a significant margin. It can also just about handle maxed-out 4K, though since this was accompanied by the occasional fps hiccup, it’d be wiser to turn some stuff down.

Nvidia GeForce GTX 1080Ti ports

Assassin’s Creed Origins keeps its composure at 1080p, even in the typically GPU-punishing, densely-packed villages. 1440p is almost as good, though it suffers from occasional hesitations on the highest settings, and 4K takes a much harder hit, its diminished smoothness immediately visible. That said, it didn’t do as badly around populated areas as I was expecting.

All in all, this isn’t the massive generational leap that many may have hoped for, but it is an extremely potent graphics card. If money’s no object, then this is indeed the best you can buy for single-card 4K gaming. I can also attest to this particular MSI model being very well-made indeed. Strong backplate, sturdy plastics, an extra HDMI output for VR, RGB lighting – all the trimmings are here, and while the GP102 is a big ol’ GPU, the card as a whole is no longer and hardly wider than most GTX 1080s I’ve handled.

The exception, build-quality-wise, is that the fans can get pretty loud. It’s not that the decibels are actually all that high, but the whirring is relatively high-pitched, so it cuts through ambient and in-game noises with an unusual decisiveness.

The noise isn’t the big issue, though. While the GTX 1080Ti does edge ahead of the GTX 1080, it’s so much more expensive that mere edging doesn’t really seem enough. This is supposed to be the final word on 4K, but Warhammer II and The Witcher 3 – to name just two – show that it’s not.

I feel quite bad for saying this because gaming with it has been rather enjoyable, but most of the praise given above could equally apply to its non-Ti little brother. Even if the heart says to splurge, the head says to stick with the GTX 1080.

Check out our guide to the best graphics cards for more reviews and buying advice.

51 Comments

  1. Sakkura says:

    The GTX 1080 came out before the 1070. How can you fault the 1080 for having the same GP104 chip, just fully enabled? It makes no sense.

    And that kind of binning is completely normal. The RX 570 has the same chip as the RX 580, just cut down a bit. Even the GTX 1080 Ti has the same chip as the Titan Xp, just cut down a bit.

    • dirthurts says:

      Yeah I’m with you here. This is how all cards are done these days, basically. It doesn’t make sense to do it any other way.

    • po says:

      Reviewers need to stop making the assumption that the only difference between cards is how many cores are turned on, when the truth is that in the cheaper cards it may not even be possible to turn them on, because of manufacturing defects on the GPU die.

      The order of their naming makes perfect sense, when you understand that the first card, the GTX1080, has all of its cores enabled, because the manufacturing process for that generation of chip was at that point sufficiently refined, that they could reliably produce dies without defects; the second card, the GTX1070, used a more advanced chip generation, but couldn’t enable all of the cores on the die, as the manufacturing process was not yet refined enough for them to all work; and the GTX1080Ti is the result of that process being refined.

      • boosnie says:

        You’d be right if you weren’t wrong.
        The gpu die in all the 10xx family is GP104 except for the 1080Ti alone that mounts a GP102.
        The refinement process is not happening at this stage.
        All the family is built at 16nm and then gets binned for performance and defects.
        First choice silicon goes to 3rd party 1080 OCs, 2nd choice is reference 1080, 3rd choice goes 1070 (now 1070Ti) and so on.

        • frenchy2k1 says:

          Nope.
          GP102: Titan X (pascal), Titan Xp and GF1080Ti
          GP104: GF1080, GF1070 and now GF1070Ti
          GP106: GF1060s (all models)
          GP107: GF1050s (all models)
          GP108: GF1030
          link to en.wikipedia.org

          And of course, there is the big chip, GP100, used for HPC and some professional visualization.

          Each family is made of multiple chips, each chip is used in multiple products. Enabled features of each product have to do with targeted price point, manufacturing defects and product segmentation.

          • Landiss says:

            Would be relevant if anyone was talking about 1060 and worse cards :p.

          • boosnie says:

            You are right, but I was talking about the high-end spectrum.
            My point was more about the so-called “refinement process”.

  2. Drib says:

    This has been what I’ve seen in benchmarks too. It’s a bit better than the 1080, but costs like 150% of it, which isn’t nearly worth it.

    • LucidDreamer says:

      That’s BS. The 1080Ti is in many cases 20-30 percent faster than the 1080, that’s a fact. You obviously haven’t researched it enough, but just saw fringe cases of weak Tis.

      • Drib says:

        Cool story, bro. Like I said, it matches what I have seen in benchmarks, but of course I haven’t seen every comparative test. Besides, 20% is “a bit better”, like I said. And even a 30% increase at probably more than a 50% increase in cost is a bit of a shaky proposition unless I’m running multiple 4K monitors and need that kind of nonsense to back it.

        For what it’s worth, userbenchmarks shows about a 27% increase.
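        A quick sketch of that maths, taking the ~27% figure at face value alongside the £500/£698 prices quoted in the review above (a rough illustration with made-up helper names, not a fresh benchmark):

        ```python
        # Perf-per-pound sketch: ~27% more speed for ~40% more money.
        # Figures come from this thread and the review, not new tests.
        cards = {
            "GTX 1080":   {"price_gbp": 500, "relative_perf": 1.00},
            "GTX 1080Ti": {"price_gbp": 698, "relative_perf": 1.27},
        }

        def perf_per_pound(card):
            return card["relative_perf"] / card["price_gbp"]

        for name, card in cards.items():
            print(f"{name}: {perf_per_pound(card) * 1000:.2f} perf per £1000")
        # GTX 1080:   2.00
        # GTX 1080Ti: 1.82  (roughly 9% worse value per pound at these prices)
        ```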

      • Cederic says:

        The problem is that the 20-30% still doesn’t make it capable of high FPS 4K gaming, and the moment you drop to 1440p the extra 50% cost makes it a ludicrously silly purchase.

        When I get the same perceived and usable performance from a card half the price (i.e. using a 1070 for 1440p) then why on earth would I pay £750 for one of these?

        The GTX 1080ti is faster than the 1080, but it still isn’t worth it.

    • frenchy2k1 says:

      This is always the case in tech.
      There is a sweet spot of performance/$$, and the further away you get from it, the worse it gets: price increases exponentially for a linear performance gain. Bleeding edge and “extreme” cards or CPUs are a good example of this, and this card falls easily under that rule.

  3. hoho0482 says:

    I have a slightly lower-spec 1080Ti. Used for 2560×1080 X-Plane with high-res real-world textures (many many GBs’ worth) ripped from mapping sites. Still end up GPU-bound, but only by a smidge. Manages between 45 and 70fps at low-level flight…

  4. konondrum says:

    I really like that your game reviews lack scores – I can read and form my own judgements. But if this is the kind of thing that is going to pass as a hardware review…

    Maybe people who aren’t tech savvy find this kind of thing useful? But to me it’s meandering, uninformative and inexact. It’s very easy to get measurable data and compare it to other similar products, but this article makes no effort to do so.

    Perhaps the authors had CPU or other bottlenecks on their system? I have no idea, because I don’t even know what kind of CPU they have.

    • Drib says:

      There are comments like this on every hardware review RPS does.

      They do them the same way as games, subjective watching and no real numbers.

      That’s just how it is.

      • brucethemoose says:

        And the comments have a point.

        Hardware is far more utilitarian than a game. It has a job to do, and either does it well or doesn’t, which makes this kind of review far less useful.

        • konondrum says:

          Thank you brucethemoose.
          This is really what I’m getting at. I just don’t see what value this “review” gives at all. If there are readers here at RPS who find value in them, feel free to chime in.

          • Eponym says:

            Thank you both!

            RPS: these video card reviews are incredibly uninformed and hurt your brand. There are very real-world scenarios where the Ti is worth the extra cash. I run one on a 4K display that barely does 60fps in Middle-earth: Shadow of War – my viewing experience would suffer if I instead purchased a vanilla 1080. There are plenty of games like this, and some of us want a seamless experience with our expensive AF OLED 4K displays.

        • Beefsurgeon says:

          I find myself barely skimming these hardware reviews. Games are subjective art and require prose. Hardware requires numbers and graphs.

    • brucethemoose says:

      Yeah.

      At the very least, I came into the article expecting a more… eclectic and non-mainstream set of test games, I guess? Things like Star Citizen, for example.

      Instead, the test suite is pretty typical. This article doesn’t give me much more info than a couple of forum comments, the text is no quicker to read than detailed charts from more thorough sites, and the “seat of the pants” impression is far less detailed than a good YT review. I don’t see a benefit here.

  5. muro says:

    Witcher 3 works fine at the highest settings in 4K on a GTX 1070 too (finished it a few weeks back).

  6. geldonyetich says:

    I like my 1070Ti so far. It’s like a 1080 with a minor inferiority complex. Perhaps it suits me better than I might admit…

    • Drib says:

      I enjoy my socially awkward, undateable, gently aging GTX970.

      For some reason.

    • aircool says:

      Got a new PC with the ASUS GTX1070 Ti STRIX AG for £440. One click in the overclocking software that comes with it and BANG! just nudging 2000MHz without breaking a sweat.

      Not sure about it not breaking the motherboard, though – I had to make a small case mod to stop it sagging.

      • MattM says:

        I like having a 90deg rotated case for that reason.

      • geldonyetich says:

        I have the exact same model card and encountered a problem with some windows locking up when I bumped up the clock speed with GPU Tweak II.

        Brought the clock back down, uninstalled GPU Tweak II, and the problem went away. Wasn’t even hot when I had the problem.

        Point being that’s the thing about overclocking: your mileage may vary. I might have been able to get it to work with further tweaking, though. I’m inclined to blame the software, as there was no artifacting.

  7. K_Sezegedin says:

    I got a 1080Ti so I could supersample the living hell out of my VR applications.

    Meanwhile my monitor is an ancient HP LP2475w 1920×1200 60Hz, bwaha, oh well.

  8. syllopsium says:

    We need numbers. Without them hardware reviews are less than useless, and anyone dropping this amount of money on a graphics card without researching it with a fine-tooth comb is ill advised.

    Also, comparisons with other products, and exploring all the features, are useful for anyone who cannot check this themselves.

    There are several reasons for this, the main ones being subjectivity, and lack of experience.

    I remember a long forum thread about gaming on the Intel X3100 (a low power laptop IGP that is not designed for gaming). Eventually the thread was locked because people kept insisting a game was playable, when in reality it was never achieving more than 15fps. I’m sure they had fun, as did I when I played Wing Commander 2 on a 286, but that doesn’t mean the game was running at the rate it was supposed to.

    There need to be comparisons against other cards at this price point – I would usually choose Nvidia, but there’s no mention of whether AMD is now approaching competitiveness.

    It’s good that the additional HDMI port is mentioned, as that’s useful for VR setups, but VR is not tested. People may be assessing high end GPUs for VR suitability, and this is the sort of data expected at this price point.

    • frenchy2k1 says:

      To answer your question: No. AMD has nothing so far to compete with the 1080Ti. Vega64 trades blows with the 1080, but the 1080Ti is on its own…

  9. OmNomNom says:

    But really, if you’re doing single-card 4K (for gaming) then you’re doing it wrong. There isn’t a card that can handle the strain.

    I’d buy a single 1080Ti for a 1440p @ 120Hz+ experience. I wouldn’t even consider 4K for serious gaming with anything less than SLI.

    • d0x360 says:

      You’re kidding right? I can max most games at 4k60 on a 5820k@4.4ghz, 32 gigs DDR4 & my regular old single GTX 1080fe with +245 clock and +545 memory with voltage at 114%.

      Only a couple of newer games force me to lower settings – I’ll give examples.

      AC Origins. Possibly a CPU issue due to the double-layer DRM. I have to set the resolution slider to 80%, which is still well above 1440p.

      Shadow of War runs at 4k60 with the auto scaling set to 80, which means that if needed it will lower the resolution by 20%.
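      As a rough sanity check on that 80% figure – assuming the slider scales each axis, which is how most in-game resolution scalers behave (the helper below is purely illustrative):

      ```python
      # What an 80% resolution scale renders at, starting from native 4K.
      def scaled_resolution(width, height, scale):
          return int(width * scale), int(height * scale)

      w, h = scaled_resolution(3840, 2160, 0.8)
      print(f"4K at 80%: {w}x{h} ({w * h / 1e6:.1f} megapixels)")          # 3072x1728, ~5.3 MP
      print(f"1440p:     2560x1440 ({2560 * 1440 / 1e6:.1f} megapixels)")  # ~3.7 MP
      # So 80% of 4K is still comfortably more pixels than native 1440p.
      ```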

      And that’s it for new games. Here are some newer titles that run maxed out at 4k60 no problem

      Destiny 2, Wolfenstein 2, Doom, Battlefield 1, Battlefront 2, NFS Payback, Forza 7, Forza Horizon 3… basically anything released before September 2017 runs at 4k60 provided I overclock.

      So glad I only spent $500 on the 1080fe and didn’t wait for the inevitable and more expensive Ti. Next year when new cards are out I’ll upgrade again, and once again I’ll be able to play everything I own now at 4k60+ and everything I buy for the entire year as well, except for maybe a couple of titles UNLESS they use Vulkan or DX12. I wouldn’t include DX12, but I’m buying AMD next time.

      Nvidia handles both Vulkan and DX12 poorly and they also have issues with HDR when compared to AMD. I checked with a Vega 64 a couple weeks ago using Win10’s HDR mode. With nvidia the colors are…crap but on AMD they are fine. In games it’s a bit better on the Nvidia side but AMD is still outputting more accurate colors which I verified using a professional tv calibration tool that you place on the screen.

      TL;DR

      You are wrong, plain and simple but if you want to keep falling for that Nvidia price trap then by all means continue.

  10. BathsaltxAddict says:

    Can we have the rest of the build for the PC please?

  11. Kawkaw says:

    like some commenters pointed out, you’re talking about VR and not testing this card for VR.
    some VR apps/games require this card, and some apps like Oculus Medium struggle with 32GB RAM + a 1080Ti.
    so the ideal VR setup is going in this direction – 2x 1080Ti + 32GB or more RAM + the new Coffee Lakes.
    congrats, I’ve given you fodder to write a new article

    • Hammerstrike says:

      Multi-GPU configs are not supported in VR.

      • brucethemoose says:

        Ugh, I hate RPS comment link filtering.

        Search for “VR SLI”. It’s basically 1 GPU for each eye.

        • Asurmen says:

          Which exists with such limited support as to be useless at this point in time, so you’d be VR-proofing your computer for something that may never really be supported.

          Buy the second card once developers really start using it, and not before.

  12. Severn2j says:

    So, rather than spend on upgrading from a 1080 to a 1080Ti, is it better to spend it on a 2nd 1080 and go SLI? Would that even be faster than a single 1080Ti?

    • Sakkura says:

      SLI support in games is a very mixed bag, and it’s probably only going to get worse over time. If a 1080 isn’t fast enough for you, a 1080 Ti is likely the best choice. But if you can make do with the 1080 until a new generation of cards arrives, that will be even better.

    • OmNomNom says:

      Yes SLI is better when it works.

      Best-case SLI scaling is about 95%; best-case 1080Ti over a 1080 is about 30%.
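      Taking those best-case figures at face value (and remembering, as noted elsewhere in the thread, that SLI scaling only shows up in games that actually support it), the rough arithmetic looks like this:

      ```python
      # Rough best-case throughput comparison, using the figures in this thread.
      single_1080 = 1.00
      sli_1080s   = single_1080 * (1 + 0.95)  # ~95% scaling in a well-supported title
      single_ti   = single_1080 * 1.30        # ~30% faster than a single 1080

      print(f"Two GTX 1080s (best-case SLI): {sli_1080s:.2f}x a single 1080")
      print(f"One GTX 1080Ti:                {single_ti:.2f}x a single 1080")
      # With no SLI profile the second 1080 adds roughly nothing,
      # while the Ti's ~1.3x applies everywhere.
      ```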

  13. Screaming_Meat says:

    I think it’s worth remembering who these cards are for. Me, I game on PC for lots of reasons: flexibility, playing how I want to play, more games, mods, crazy peripherals. But also because it’s the only place you can choose to game with no compromises. I have two of these in my PC. I’m not rich, and I’m not daft – I just channel my spare income into this, my main hobby.

    Anyhow, you’ll definitely hit performance walls with one of these cards. Warhammer, sure. Maxed out at 4K one can’t handle it, two however… as I say, no compromises. I can’t recommend the 1080ti highly enough. Sell your kids like I did and get two.

    • OmNomNom says:

      Exactly, it’s a matter of what you funnel your cash into and if you’ve wasted money on kids or a ridiculous car or whatever.

      I’d have these in SLI if I didn’t already have OC 1080 SLI, but I tend to do alternate generations.

      • Screaming_Meat says:

        Some people think SLI is a pain. Yeah, it takes more tinkering, but I’m an enthusiast. I find that process fun and rewarding. It’s almost disappointing when something works perfectly right away. You know what I’m talking about, my SLI brother.

  14. mercyRPG says:

    Are owners of this card – Slaves to the Fan Noise – happy with the awful noise?

  15. Kasjer says:

    I think the problem with judging whether such a card is worth it for 4k gaming boils down to what results you expect. If you want to ramp up every setting to ultra and use GPU-taxing AA solutions, one 1080Ti is not enough to do a stable 60fps in all of the games, and two of them might do the job if SLI is properly supported by the title. But you can go down a notch with several settings, especially by choosing a less demanding AA method (as pixel density in 4k is so high, jaggies are barely noticeable anyway). You’ll still have visuals far exceeding what “4k” consoles deliver. In this scenario, the 1080 and 1080Ti are a sensible choice; you can also choose a top AMD card or even go down to a 1070Ti if you are willing to sacrifice a few more bells and whistles.

    But if you are looking for a piece of hardware that can take the ultra settings you use at 1080p and translate them to 4k, while delivering the same level of performance as your 1080p/1440p gaming card (970, 980, 980Ti, 1070 and so on), you’ll have to wait until the next generation of GPUs drops on the market, and in that context the 1080Ti is certainly not worth the asking price.

    • Screaming_Meat says:

      Yeah, I sort of agree, and sort of don’t. As you rightly point out, two of these allows ramping of everything to max in 4K. If that matters to you (I have two of these, so I guess it does to me), the card, or cards, are perfect. I think whether or not that’s worth the money is subjective. I don’t have a car (don’t need one, walk to work), no kids, good job. This is where I put my money. The problem with ‘waiting for the next card’ is that there will always be a next card. And when Volta comes out, if I upgrade I get a great deal of what I spent on the cards back via eBay. I think it’s a case of either being on that train or not, but only the individual can decide if it represents good value to them. All this ‘not worth it’ talk is silly.
