Is Nvidia’s New Titan X Uber-GPU Good Enough?

All in black: Nvidia's big new beastie

The same. But different. In a good way. That’s the take-home from the launch of Nvidia’s new Titan X graphics card. Yes, it’s another $1,000 graphics card and thus priced well beyond relevance for most of us. And yet it’s different enough, philosophically, from Nvidia’s previous big-dollar Titans to signal something that does matter to all of us. The focus with Titan X has moved back to pure gaming and away from doing other variously worthy and unworthy stuff on GPUs, like folding proteins or, I dunno, surveying for oil.

Generally, I was pretty pleased when I saw the specs for the new GeForce GTX Titan X board. But not actually surprised. I wasn’t surprised because it was already known that the GM200 graphics chip inside Titan X was an eight-billion-transistor effort made using ye olde 28nm silicon from manufacturer TSMC, if you care about such things.

That’s an insane number, the eight billion thing. But it’s not actually dramatically bigger than the previous-gen 7.1 billion transistor chip which, in various guises, powered the likes of the original Titan, the Titan Black and indeed the 780Ti.

In other words, eight billion transistors makes for an incredibly complex computer chip. And yet not an order of magnitude more complex, not a generational leap in complexity versus that 7.1 billion figure. In that context, something had to change beyond mere complexity if Titan X was to look sufficiently impressive in the inevitable post-launch benchmarkathon.

That something is twofold. Firstly, Titan X gets the same Maxwell graphics architecture as the likes of the GTX 980, 970 and 960.

It’s an architecture that undeniably delivers the best frame-rate-per-transistor ratio in the industry. It is, in short, almost definitely the best graphics tech for PC gaming currently available.

Titan X is jolly fast, but a GTX 970 still makes infinitely more sense

But it still isn’t quite enough on its own to ensure Titan X delivered a sufficient performance leap. So what Nvidia has done with GM200 is ditch all the fancy gubbins that boosts general-purpose (ie not graphics rendering) performance that went into the original Titan. Things like big fat registers and super-fast FP64 performance – the sort of stuff that contributes to general compute performance on a graphics chip.

The net result is eight billion transistors of pure gaming grunt. And that’s a joy because of what it says about PC gaming. It’s healthy enough for Nvidia to produce an absolutely traditional uber GPU that’s designed expressly to do one thing really, really well. Play PC games.

While the old Titan was a marvel of engineering, it was never a pure gaming product. Indeed, the GK110 chip that Titan was based on first appeared in one of Nvidia’s Tesla boards, which are so compute-centric they don’t even bother with video outputs. There will not, I suspect, be a Tesla board based on this new chip. Nvidia will have to cook up something more specific for that.

All of which means that Titan X keeps the ball rolling for gaming graphics in general. Clearly most of us can’t even consider buying one, but its very existence will help drag the entire market along that little bit, raise expectations of what is possible for PC graphics and encourage game developers to be that little bit more ambitious.

Oh, and the other thing Titan X does is deliver the closest thing yet to a single graphics card capable of 4K gaming without compromise. It’s not actually quite there. But with most games, most of the time, it looks like you get decent frame rates at 4K without needing to switch much of the eye candy off.

I’ve not personally gone all the way to benchmarking a Titan X. But I’ve seen a few running and they’re nice and quiet under load, and clearly kick out some serious frame rates. Worth the money? Not really, but still a no-brainer if you can easily afford one.

AMD’s replacement for the 290X is coming soon, but will it be enough?

Or two – it’s high-enders like this that actually make sense in SLI multi-card setups. After all, if SLI is borked with a given game, your fallback is still the fastest single card money can buy. So there’s no downside. Well, aside from the minor matter of the extra $1,000.

As for what AMD has in response, the much mooted Radeon R9 390X is due a little later this year, though it’s not clear exactly when. How it stacks up against the new Titan X will probably depend on two things. Firstly, is it made with those ancient 28nm transistors, or will it be the first performance graphics chip to move to newer production tech? And will it be yet another minor respin of AMD’s existing graphics tech, known as GCN or Graphics Core Next, or has AMD responded to the remarkable efficiency of Nvidia’s Maxwell tech?

If rumours that the 390X will be water cooled as standard are true, then it seems the 390X is likely 28nm (not actually AMD’s fault) and GCN only slightly revised. And that, I’m afraid, probably won’t be good enough. Even with some fancy new stacked memory technology.

Anyway, I would say I’m pleased to have managed to opine at length on the new Titan X without once mentioning its 3,072 shaders or 12GB of graphics memory. But I just have. So I can’t.

And for the record, UK pricing for the Titan X, which you can’t quite buy yet but can pre-order, kicks off at a little over £850. But then if the precise price makes any difference to you, then you, like me, probably can’t afford one.

27 Comments

  1. TacticalNuclearPenguin says:

    Still, a more “sensible” hypothetical 980ti that uses the same insane number of cores ( maybe a cluster less? ) that is priced at 650-ish ( or 700 with serious coolers ) and that has access to “only” 6GB of VRAM instead of 12 would be an absolutely instant buy for me.

    I’m still glad to see that the 1300 USD price speculation is untrue, but that is only because they stripped out some admittedly non-critical non-gaming features, but then that still leaves us with a 1K USD/EUR ( yay for wild conversion rates ) monster.

    I don’t know, such a monster is a hard thing to put into perspective, especially given the cooler matter. While I absolutely believe you that it runs proper and quiet, I’d easily spend 1K on an 850 Euro Titan + a 150 Euro waterblock, rather than 1K on that stupid blower that, no matter how good it runs, still feels stupid to me.

    I think I’ll sit this one out for at least 2-3 months and check what happens. Still, a Maxwell architecture that has around 50% more shaders is absolutely something that deserves a lot of money, I won’t deny that; certainly Nvidia has done some serious homework this time around for a 28nm architecture. If only I wasn’t in the process of replacing my old DSLR camera…

  2. sandineyes says:

    It’s an impressive card, but the pricing is insanity. The original Titan cards were able to justify themselves at that price because they were meant as an entry-level professional graphics card, but this is just a gaming card. Consider the last few high end single GPU launch prices from nVidia (in the US):
    GTX 680: $500
    GTX 780: $650
    GTX 780 Ti: $700

    Also, it is maddening that they refuse to allow these things to be sold with open-air coolers. They may have the best blower out there, but it is a complete waste to someone who has a case that can handle a much more silent and effective open-air cooler.

    • TacticalNuclearPenguin says:

      I guess this time around they are justified by the ability to push 20nm high-end performance on an otherwise “old” 28 nm chip.

      Actually, I think there’s nothing strange in that; they really pushed all there is to push. Still, obviously, the disadvantage falls on the customer here.

      If you buy this card, it’s because you’re prepared for a technological leap way ahead of its time. Ultimately, you still need to have enough income to justify it, which somehow still validates your point. Think of it another way: you might want to wait and see how AMD answer, but the truth is that since they don’t really have such an efficient architecture on their hands (not that we know of), they have two options:

      1) go 20nm, which is already invalidated as it just won’t happen.
      2) stay 28nm as well, but then if they want to match this thing they’ll have to resort to a super massive die size that, alone, will raise the price considerably, throwing them off their usual “bang for buck” approach.

      • TacticalNuclearPenguin says:

        But yeah, as I said in my above comment, I absolutely agree on the blower. It’s unacceptable. It might be the best blower in the universe. Hell, it might be the best one in the next 10 years. It’s still something that I wish to see addressed, so I hope they won’t limit it like the previous Titans.

    • Jeremy Laird says:

      Well, the GPU in Titan X is larger and thus ostensibly more expensive to produce. What’s more, because it’s a pure gaming GPU, none of that $1,000 is going on GPGPU features you, as a gamer, will never benefit from.

      So, I can’t see how for gamers it’s not better value than the original Titan at the same price. Obviously if you want a GPGPU card, that’s a whole different ballgame.

      And as mentioned, this GPU should form the basis of a very nice 980Ti, though even that will probably be priced into orbit. Initially, at least.

      • SuicideKing says:

        Only slight weirdness is that Nvidia sort of killed the Titan brand itself. The point of the Titan was FP64, “the big chip for gaming” was the x80 Ti.

        So according to Nvidia’s Kepler naming scheme, the Titan X should have been the 1080 Ti, with the cut down version being the 1080…

        Actually no: the 980 should have been the 880 and this should have been the 980 Ti, with cut GM200 being the 980.

        Wow they really confused their branding with GM204 and GM200, didn’t they?

        • SuicideKing says:

          No, wait. Traditionally, GM204 would have been the 960 Ti while GM200 would have been the 980. Or 880 Ti and 880. Whatever. Damn you, Nvidia. XD

  3. Rindan says:

    I really wish that there existed games that would make these things worthwhile… but there are none. Maybe once the wave of VR d00m hits around Christmas we will be able to justify this, but right now my far more modest GTX 770 is absurd overkill. Nothing makes my modest computer even flinch.

    I know the days of having to upgrade every 2 years were bad in a way… but they were also good because they meant we were actually pushing the limits. Nothing has made me sadder than the “next gen” consoles. I was hoping that we could finally move up a little, but the “next gen” consoles are just bad PCs.

    • Unclepauly says:

      If you are running at any res over 1080p, the 770 is underpowered in every other game coming out these days.

    • TacticalNuclearPenguin says:

      Oh there are, this card pretty much rides the GTA5/Witcher3 hype.

      But really, I’ve seen this kind of reasoning pop up for any new graphics card for 15 years now, and the answer is still the same. Progress might be admittedly slower nowadays, but there are still options. If you want more resolution/framerates/effects you still want the latest and the greatest, even on “just” 1440p. 4K is still too difficult to tame, although this one is making some decent progress and a good selection of games might just run at 60 fps indeed.

      Ultimately it’s down to your budget and your preferences. You might think that increasing your resolution beyond a point and asking for even more frames is madness; that’s fine, but there are still some people interested in just that and, for them, there NEVER will be a GPU for it. If said people ever get a GPU that does all they ask, they’ll simply increase their pixel count again and wait for a bigger card.

      • lanelor says:

        ’Tis so true! Still wondering why I got a second R9 280 while most AAA titles are struggling to properly use the first one. What I want from AMD & NV is to sit down with MS and major game devs and address their crappy coding and/or optimizing. Why bother with $60 titles that can’t run properly for months, not to mention CF/SLI?

  4. Siimon says:

    Anandtech has a great review of it, with FPS for 1440p and 4k resolutions at ultra and medium details. A bit disappointing that a $1000 card can’t do 4k properly. For ~$1000, two 980/Ti’s w/ 6GB would be a better upgrade… I want a card that can do 120/144fps at 1440p and 60fps at 4k using v.high/ultra settings on current gen games for $1000.

    • Cinek says:

      We’re still not there, I’m afraid. Probably gonna have to wait for post-R9 300 GPUs, as I doubt even the 300 series will be good enough. Of course SLI/Crossfire can bring enough power, but let’s face it – the amount of problems it brings doesn’t really make it a viable solution.

    • TacticalNuclearPenguin says:

      I’m willing to bet that two 980Tis would run more in the 1400-ish ballpark, but yeah, it might be worth waiting for them. The VRAM would still be plenty and, as long as there aren’t big cuts in the shader department and we can save 300+ euro, we might just have a winner.

  5. Tacroy says:

    Where the heck are the 8GB 900 series GPUs? Nvidia was teasing them back in October of last year and there’s still neither hide nor hair to be seen.

  6. Person of Interest says:

    Here’s my super scientific survey report from checking many reviews:

    Based on reviews from Tech Report, Anandtech, and TechPowerUp, the Titan X appears to be 50% (40%-60%) faster than GTX 970, and 30% (20%-40%) faster than a GTX 980. Anandtech’s benchmarks score towards the high end, Tech Report’s towards the low end, TechPowerUp in the middle. Tech Report may score low because their test machine uses an i7-5960X (8-core Haswell-E) and they make no mention of overclocking it past its base 3.0GHz.

    Power efficiency is between GTX 970 and GTX 980, and overclocking headroom is nearly as good (11%-25% across the five reviews I checked), with resulting performance scaling on par with other Maxwell cards.

    I suppose, if one was willing to pay $220 to go from GTX 970 to GTX 980 for 15-25% better performance, it might be equally (un)reasonable to pay $450 to go from GTX 980 to Titan X for 20-40% better performance.

    GTX 970 SLI will beat Titan X, but only on the few games which are SLI optimized. Although SLI was recently (and IMO rightly) panned by this column, I will pile on by linking to this post by a former Nvidia intern who reveals just how much hack-work goes into the drivers to help SLI perform well for select titles.

    • Subject 706 says:

      Simply panning SLI isn’t fair IMHO. Two-card SLI has come a long way, and is usually quite problem-free (but of course actual performance depends a great deal on specific game optimization). Beyond two GPUs is not really worth it.

    • SuicideKing says:

      I guess the best advice I would give anyone is to wait for the 390X to release, because if AMD price it well then Nvidia will have to drop prices across the board, going by whatever performance benchmarks have been leaked so far (which were accurate for the Titan X, btw).

      • Asurmen says:

        Not sure who will have to compete more on price, actually. Based on those benchmarks they’re a close tie. All Nvidia have to do is price near AMD and heavily push the power-efficiency angle, arguing that it will quickly work out cheaper to run than AMD. However, AMD can’t really afford another price war with Nvidia; they desperately need the highest profit per unit they can get. If AMD don’t produce something that outperforms Nvidia in every way by a good margin and gain some market share, I can’t see how they can stay in the business.

  8. DrManhatten says:

    “…like folding proteins or, I dunno, surveying for oil.” Or running ridiculously large neural networks that will eventually lead to an AGI which will save humanity from self destruction.