Stop That, It's Silly: Nvidia's New Titan X Graphics Card
Quantitative easing meets PC gaming
Sooner than anyone expected, Nvidia has rolled out its latest uber graphics card. It's the new Titan X. It's undoubtedly the fastest and bestest PC graphics board ever and probably by some margin. And it will cost you $1,200 and probably a similar post-VAT sterling figure back in the old, disintegrating empire. Call me a desiccated old cynic, but this is getting silly...
Before I invite a comment-thread flaming with my philosophical observations, let’s get a sense of Nvidia’s new pixel-pumping machine courtesy of the speeds and feeds.
The latest Titan X - for that is simply its name, there’s no ‘GeForce’ or ‘GTX’ branding this time - is part of the new 16nm Nvidia Pascal family and thus closely related to the GeForce GTX 1080 I had a sniff around a few weeks ago.
Where the mere £600 / $600 GTX 1080 has 2,560 pixel-prettifying shaders, the new Titan X has 3,584. Nvidia hasn’t dished out the details on some of the other specifics, including things like texture units and pixel outputs. But we do know it has a large if conventional 384-bit memory bus hooked up to 12GB of 10Gbps GDDR5X memory.
For context, the old GTX Titan X from the now defunct 28nm Maxwell family of chips rocked 3,072 shaders and 12GB of memory, albeit much slower 7Gbps memory. Oh, and Nvidia has bumped the GPU speed versus the old Titan by nearly 50 per cent to a peak clockspeed of 1,531MHz.
Beautiful board, beastly price...
When you throw numbers around like that, it all gets a bit baffling. So, for a rough feel of the raw computational impact of this new chip, try this for size. It’s good for 11 TFLOPS of simple number crunching prowess. One can argue the toss over the relevance of that figure for rendering and indeed playing games. But it’s still bloody impressive and not far off twice the 6.6 TFLOPS figure attained by the old Titan X, which was not exactly a slouch.
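For the curious, that 11 TFLOPS figure is just shader count times clockspeed times two operations per clock (the usual fused multiply-add accounting). Here’s a quick back-of-the-envelope check in Python; note the old Titan X’s roughly 1,075MHz boost clock is my assumption, not a figure quoted above.

```python
# Peak single-precision throughput: shaders x clock x 2 ops per clock (one fused multiply-add).
def peak_tflops(shaders, clock_mhz):
    return shaders * clock_mhz * 1e6 * 2 / 1e12

print(round(peak_tflops(3584, 1531), 1))  # new Titan X: ~11.0 TFLOPS
print(round(peak_tflops(3072, 1075), 1))  # old Titan X (assumed ~1,075MHz boost): ~6.6 TFLOPS
```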
As it happens, the question of whether this board is really for gamers or number crunching of a different kind is where things get complicated with the new Titan X. It seems that it’s based on an entirely new and separate GPU from the GP100 monster that Nvidia revealed back in April as part of its new Tesla P100 compute board.
If we are to believe Nvidia, and at this stage I wouldn’t entirely take everything at face value, the GP100 chip in that Tesla board is a 15-billion transistor hunk, whereas the new Titan X uses a hitherto unseen chip known as GP102 with 12 billion transistors. For the record, and at risk of getting swept away by a torrent of codenames, the GTX 1080 uses yet another chip called GP104 that clocks in around the seven billion mark.
The new Titan has 40 per cent more, er, rendery bits than the feeble GTX 1080
Anyway, the point is that at first glance the latest Titan looks like a pure graphics product without any of the compute-centric features of some earlier Titans. However, for the first time, Nvidia is touting this card’s INT8 performance, which is a measure of neural network or so-called deep learning performance and thus very much a non-graphics application. The messaging, then, is a little mixed - is this an out-and-out gaming card or something else?
Whatever it is, you’re getting 40 per cent more functional units than a GTX 1080 for 100 per cent more money. If that doesn’t sound like a great deal, the new Titan X also has a lower GPU clock than the 1080, so in at least some situations you won’t even get 40 per cent more performance.
The Titan X does have 50 per cent more memory bandwidth than the 1080. But however you slice it, the value proposition looks laughable. I doubt, for instance, that even this new Titan X will prove a total single-card solution for 4K gaming.
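If you want to sanity-check that bandwidth claim, it’s simply bus width times per-pin data rate. A minimal sketch follows; the GTX 1080’s 256-bit bus is my assumption here, as it isn’t quoted above.

```python
# Memory bandwidth = bus width (bits) x per-pin data rate (Gbps) / 8 bits per byte.
def bandwidth_gb_s(bus_bits, gbps_per_pin):
    return bus_bits * gbps_per_pin / 8

print(bandwidth_gb_s(384, 10))  # new Titan X: 480.0 GB/s
print(bandwidth_gb_s(256, 10))  # GTX 1080 (assumed 256-bit bus): 320.0 GB/s, i.e. 50 per cent less
```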
Oh my god, it's full of shaders
That’s even more true when you consider the likely manufacturing cost of the new GP102 chip. At 471mm2, it’s much smaller than the 601mm2 of the old Titan X. In fact it’s nearer in size to the GM204 chip in the old GeForce GTX 980, which comes in at 398mm2.
The point is that, generally, the bigger the chip, the more costly it is to produce. But in this instance, Nvidia has released a smaller chip and then ramped up the price. Actually, that’s exactly what it did with the GTX 1080, too. The chip in that measures just 314mm2. Admittedly, there will be variables with new processes, but the pricing of this new GPU family looks positively punitive to me.
At this point I was planning on penning a semi-serious dissertation about what I think is going wrong. It would involve near negative real-world interest rates, quantitative easing, epic inequality and how this thing strikes me as being the graphics card Donald Trump would sell you.
Instead I’ll recall how excited I used to be when a mega new GPU was launched. I remember when the GeForce 6800 Ultra was launched with 16 - yes, 16! - pixel pipes. Even then it was known Nvidia didn’t always play a straight bat, especially not after the partial-precision / four or eight pixels per clock / leaf-blower shenanigans of the FX 5800 series.
In the good old days, men were men and graphics cards had pixel pipes
But somehow it was so much easier to rejoice in the sheer technical majesty of a GPU capable of smoothly rendering Far Cry’s stunning vistas at 1,600 by 1,200 pixels. That was way back in 2004. Adjusted for inflation, at most that was a £600 graphics card.
There’s actually a parallel to be drawn with my other occupational muse, the car market, in which well-heeled punters beat each other to a metaphorical pulp to sign on for the latest limited-edition Porsche, paying 200,000 euros for cars that are so oversubscribed they’re worth three times as much on the open market the moment the first examples are delivered. It’s total madness, which is where all that cheap money I mentioned comes in.
But maybe I’m just a desiccated old hack. Maybe I should be celebrating the mere existence of masterworks like the Nvidia Titan X and Porsche 911R. On the other hand, surely there’s a point when the price gouging becomes so vulgar you just have to gag? I don’t know about you, but £1,200/$1,200 for a graphics card certainly sticks in my throat.
Or maybe cards like this just aren't relevant to PC gamers and are best ignored? The problem with that notion is that the whole market is being dragged upwards. The new high end is £1,200, the new enthusiast is £600 and the new mainstream is nigh-on £300. I have a hands-on with the new GTX 1060 in the works and a jolly nice mid-range board it is, too, in many ways more impressive than the GTX 1080. But it's 300 bleedin' pounds for the version I have and even the cheapest 1060s are well over £200.
AMD to the rescue with the Radeon RX 480? Hold that thought...