
Nvidia RTX 2080 and RTX 2080Ti: Turing's best features explained

There's a lot more to Nvidia's Turing GPU than just ray-tracing

The Nvidia GeForce RTX 2070, RTX 2080 and RTX 2080Ti

After all the rush and excitement of Nvidia’s Turing launch at Gamescom, there’s now less than a week to go before those shiny new GeForce RTX graphics cards are out in the wild. On September 20 and 27, both the RTX 2080 and RTX 2080Ti will replace the GTX 1080 and GTX 1080Ti as Nvidia’s top dog bestest best graphics cards – and the great thing is I can now finally tell you about all the other cool stuff they can do besides their fancy ray-tracing reflection gubbins.

You can now read my Nvidia GeForce RTX 2080 review-in-progress (and my Nvidia GeForce RTX 2080Ti review here) for a look at their performance figures, but rest assured, real-time ray tracing isn’t the only thing that makes these cards stand out. There’s DLSS, Nvidia’s nifty deep learning anti-aliasing tech that uses AI to get up to double the performance of a GTX 1080. There’s also variable rate shading, which lightens the load on the GPU even further by shading less important chunks of the environment at a lower rate and concentrating all the detail where it’s needed most, giving you even more performance and delicious frames per second. Then there’s the Nvidia Scanner for one-click overclocking, Ansel RTX for even prettier screenshots and a whole lot more – all explained here in (hopefully) non-techno jargon.

It’s all about speed, speed, speed on this here Turing train, but will it be enough to justify their huge cost? Here’s everything you need to know, including the RTX 2070, RTX 2080 and RTX 2080Ti’s price, specs and release date, as well as all the other tidbits I’ve gleaned from various developers and demo sessions. Let’s go.

Nvidia Turing RTX 2080 specs

Lying at the heart of the RTX 20-series is Nvidia’s Turing GPU. Borrowing much of the same fancy gubbins as Nvidia’s professional Quadro RTX cards, including their RT Cores (or ray tracing cores), Tensor Cores (which are all to do with AI and deep learning bits and pieces) and Streaming Multiprocessor (or SM, for short), the RTX line is as much about new technological features as it is about raw performance.

Accompanying the Turing GPU is a rather hefty dollop of GDDR6 RAM. The RTX 20-series will be the first graphics cards in the world to take advantage of this super-fast memory, beating AMD’s Navi GPUs well and truly to the punch. With at least 8GB of the stuff at their disposal, this should definitely help lift the cards’ respective speeds compared to their immediate predecessors. Here’s how they all shape up in handy table form:

  Nvidia GeForce RTX 2070 Nvidia GeForce RTX 2080 Nvidia GeForce RTX 2080Ti
CUDA Cores 2304 2944 4352
GDDR6 RAM 8GB 8GB 11GB
Memory Speed 14 Gbps 14 Gbps 14 Gbps
Ray Tracing 6 Giga Rays/sec 8 Giga Rays/sec 10 Giga Rays/sec
RTX operations 45 trillion/sec 60 trillion/sec 76 trillion/sec
Power 175W 215W 250W

I’m not going to pretend to know what all these extra metrics actually mean, like Giga Rays and RTX (ray tracing) operations, because even Nvidia have admitted they had to invent new ones for Turing because it’s just so gosh-darned powerful compared to the GTX 10-series’ Pascal GPU. As a point of comparison, though, the Nvidia GeForce GTX 1080Ti can only do 1.21 Giga Rays/second. That makes the RTX 2080Ti around 8x more powerful in the old ray-tracing department.
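If you fancy sanity-checking that claim, the sum is simple enough to do yourself. Here’s a tiny Python snippet, using only the figures Nvidia have quoted above, that works out the ratio:

```python
# Back-of-the-envelope maths behind the "around 8x" claim, using Nvidia's own figures.
rtx_2080ti_gigarays = 10.0   # Giga Rays/sec, from the spec table above
gtx_1080ti_gigarays = 1.21   # Giga Rays/sec, Nvidia's quoted figure for Pascal

print(f"RTX 2080Ti vs GTX 1080Ti: {rtx_2080ti_gigarays / gtx_1080ti_gigarays:.1f}x")  # ~8.3x
```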

In terms of display outputs, each card will also support 8K resolutions at 60Hz with HDR enabled on a single screen, no less, thanks to their DisplayPort 1.4a-ready ports, as well as 4K at 60Hz via their HDMI 2.0b outputs. They’re also ready to support VirtualLink, the fancy new USB-C tech. This will be of particular import for VR users, as it means the card’s USB-C output will be able to deliver all the power, display and data you need to run a VR headset from a single connector.

Nvidia are also hoping their Turing cards will be an appealing prospect for streamers, as they’ll be able to encode 8K video at 30fps in HDR in real time for the first time, as well as offer up to 25% bitrate savings when encoding with HEVC and up to 15% with H.264. 4K streaming should become a lot more practical too, with Nvidia claiming you’ll get better performance than a typical x264 encoding setup: when streaming to Twitch at 4K using a 40K bitrate, they’re quoting 1% dropped frames compared to 90% via x264, and 1% CPU utilization compared to x264’s 73%.

Hope you’ve got a big-ass case to fit one of these things in…

Nvidia Turing RTX 2080 ray-tracing: what is it and why should I care?

Essentially a piece of really, really fancy light tech that makes shadows and reflections behave just how they do in real life, ray tracing is a big part of what makes Turing so special. It’s in the name of the graphics card, after all (RTX), and it’s not surprising Nvidia have focused almost all their attention on it since their initial announcement.

It’s not proper, proper ray tracing like you get in Pixar films, all told, but Nvidia’s AI boffins have pretty much given us the next best thing – as my interview with Metro Exodus rendering programmer Ben Archard can attest. It’s worth pointing out, though, that it’s still an extra feature developers will have to support rather than an inherent part of the card. Still, if even a fraction of games end up looking as good as Nvidia’s Sol ray tracing demo below, I’ll be very happy indeed.
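For the curious, the basic idea is simple enough to sketch in a few lines of code. The toy Python below is nothing like Turing’s hardware-accelerated hybrid pipeline – it’s just one hand-rolled ray fired from a camera at a single sphere, plus a second ‘shadow ray’ fired towards a light to see whether the hit point is lit – but it’s the same fundamental trick, repeated billions of times per second:

```python
import math

def ray_sphere_hit(origin, direction, centre, radius):
    """Distance along the ray to the nearest sphere hit, or None if it misses."""
    oc = [o - c for o, c in zip(origin, centre)]
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c              # 'a' is 1 because direction is normalised
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

def normalise(v):
    length = math.sqrt(sum(x * x for x in v))
    return [x / length for x in v]

# Scene: one sphere in front of the camera, one light up and to the right.
sphere_centre, sphere_radius = [0.0, 0.0, -3.0], 1.0
light_pos = [2.0, 2.0, 0.0]

# Primary ray: fired from the camera straight down the -Z axis.
camera, view_dir = [0.0, 0.0, 0.0], [0.0, 0.0, -1.0]
t = ray_sphere_hit(camera, view_dir, sphere_centre, sphere_radius)

if t is None:
    print("ray missed everything")
else:
    hit = [camera[i] + view_dir[i] * t for i in range(3)]
    # Shadow ray: from the hit point towards the light. If something blocks it,
    # the point is in shadow (here the only possible blocker is the sphere itself).
    to_light = normalise([light_pos[i] - hit[i] for i in range(3)])
    nudged = [hit[i] + to_light[i] * 1e-4 for i in range(3)]   # step off the surface
    blocked = ray_sphere_hit(nudged, to_light, sphere_centre, sphere_radius)
    print("hit the sphere at", [round(x, 3) for x in hit], "- in shadow:", blocked is not None)
```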

At the moment, there are just a handful of confirmed games that will be supporting the RTX 20-series’ ray tracing abilities, but the demos I’ve seen so far have all been rather tasty looking, from Shadow of the Tomb Raider’s lively Day of the Dead scene to Battlefield V’s exploding tank fire reflected in the eyes of your enemies.

That said, with the Metro Exodus team aiming for 60fps at just 1080p with all of its ray tracing tech turned on, there are still a lot of questions that need to be answered regarding how much of an impact ray-tracing will have on a game’s overall performance.

Nvidia Turing RTX cool feature #1: DLSS – what the heck is it and why is it important?

Fortunately, Nvidia’s crammed a lot of other performance-boosting bits of AI tech inside their Turing RTX cards that will hopefully negate any potential ray-tracing impact if developers take advantage of them. Indeed, for all the attention given to ray-tracing over the last month or so, it isn’t actually nearly as interesting as all the other bits and pieces Nvidia’s Turing architecture offers.

Chief among them is DLSS, or Deep Learning Super Sampling. Nvidia teased this feature back at Gamescom with a rather vague but intriguing RTX 2080 vs GTX 1080 performance graph, showing that games with DLSS enabled on an RTX 2080 will be able to deliver up to double the speed of a GTX 1080 at 4K. Now I can tell you how it actually works.

It’s unclear what settings Nvidia used to get these performance stats, and what kind of frame rates that equates to, but what’s really astonishing is the speed boost provided by Turing’s other big feature, Deep Learning Super Sampling (the light green bar)

DLSS is pretty neat. In essence, it’s a super efficient form of anti-aliasing (AA) and uses Turing’s AI and deep-learning-stuffed Tensor Cores to help take some of the strain off the GPU. AA, in case you’re unfamiliar with the term, is a technique game makers use to make lines and edges appear smooth and natural instead of a million jagged pixels.

It’s particularly important when playing games at 4K, as more common AA techniques, such as temporal anti-aliasing (TAA), can often soften images a bit too much (negating that lovely sharp feeling 4K should bring) as well as get things a bit wrong when scenes are happening at speed. Whether you’ll notice any of this without the aid of a freeze-frame camera device, of course, is up for debate.
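To show what anti-aliasing is actually doing under the hood, here’s a toy Python example of plain old supersampling (just for illustration – this is not Nvidia’s DLSS algorithm): each pixel gets sampled several times at slightly offset positions and the results are averaged, so a hard edge comes out as a smooth gradient rather than a staircase.

```python
# Toy supersampling: not DLSS, just the general idea of spending extra samples
# per pixel to smooth out jagged edges.
def coverage(x, y):
    """The 'scene': everything below the diagonal y = x is ink (1.0), the rest is background (0.0)."""
    return 1.0 if y < x else 0.0

def render(width, height, samples_per_axis=1):
    image = []
    for py in range(height):
        row = []
        for px in range(width):
            total = 0.0
            # Average an N x N grid of sub-samples spread across the pixel.
            for sy in range(samples_per_axis):
                for sx in range(samples_per_axis):
                    x = px + (sx + 0.5) / samples_per_axis
                    y = py + (sy + 0.5) / samples_per_axis
                    total += coverage(x, y)
            row.append(total / samples_per_axis ** 2)
        image.append(row)
    return image

aliased = render(8, 8, samples_per_axis=1)   # one sample per pixel: hard 0 or 1 everywhere
smooth  = render(8, 8, samples_per_axis=4)   # 16 samples per pixel: edge pixels get in-between values
print(aliased[3][3], smooth[3][3])           # the pixel sitting on the edge goes from 0.0 to 0.375
```

The catch, of course, is that taking 16 samples per pixel costs roughly 16 times the shading work, which is exactly the sort of strain DLSS is designed to relieve.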

Once Turing’s AI smarts have been trained with a special neural network dedicated to that particular game (in this case, Epic’s taxing Infiltrator demo), DLSS games won’t just produce a more accurate picture, but they’ll also buy you extra frames in the process.

But! DLSS isn’t just about making images look better. It’s about making images look better and boosting the frame rate right up as well. By getting the Tensor Cores’ AI to generate those super high resolution images instead of the GPU’s regular shaders, DLSS frees up the rest of the GPU to concentrate on what it does best – pushing out all those lovely frames at delicious speeds.
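As a crude illustration of where a near-2x gain could come from, here’s a little cost model in Python. Every number in it is made up purely for the sake of the example (they’re not Nvidia’s figures), and it simply assumes the expensive shader work happens on fewer pixels before the AI fills in the full 4K image:

```python
# A crude, made-up cost model of why shading fewer pixels and upscaling can be faster.
# None of these numbers are Nvidia's; they're purely for illustration.
NATIVE_4K = 3840 * 2160
INTERNAL  = 2560 * 1440              # hypothetical lower internal rendering resolution

shade_cost_per_pixel   = 1.0         # arbitrary units of regular shader work
upscale_cost_per_pixel = 0.1         # assumed cost of the AI upscale on the Tensor Cores

native_cost = NATIVE_4K * shade_cost_per_pixel
dlss_style_cost = INTERNAL * shade_cost_per_pixel + NATIVE_4K * upscale_cost_per_pixel

print(f"native 4K shading work:  {native_cost:,.0f} units")
print(f"render low + AI upscale: {dlss_style_cost:,.0f} units")
print(f"potential speed-up:      {native_cost / dlss_style_cost:.2f}x")
```

Swap in different assumed costs and the answer changes, which is presumably why Nvidia’s own graph only promises “up to” double the speed.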

Much like ray-tracing, however, DLSS is a separate feature that not only needs developer implementation, but input from Nvidia as well as they’re the ones that need to train their clever AI boffin machines on each individual game in the first place to make sure they’re doing everything properly. From what I’ve seen so far, though, DLSS is impressive stuff that makes a good first impression. If there’s enough uptake from developers to support it, it could be a major string to Turing’s bow and offer a substantial advantage over whatever AMD’s cooking up.

Nvidia Turing RTX cool feature #2: Variable rate shading

A bit of a nerdy one this, so bear with me. I promise it’s cool. Another nifty bit of AI tech, Turing’s variable rate shading can provide another performance boost by concentrating the detail of a scene where it’s needed most (in the centre, say, where you’re focusing most of your attention), and reducing the level of detail elsewhere, such as things on the periphery or simpler bits of geometry in the foreground.

To give an example, imagine a scene from Forza Horizon 4. The car’s in the middle, you’ve got that wonderful British countryside stretching out in front of you, fields whizzing by to your right and left, and the rest of the road trailing behind you.

The blue sections require the most detail, but the purple and red areas of the scene are less important, so Turing can shade them at a lower rate on the fly to get more performance

Naturally, the car and horizon are the main focus of your attention, so these bits will be given the most detail. The fields to the side are less important, so they might have half as much shading given to them. The road, meanwhile, is definitely not where your attention should be focused in this high-octane racer, so that might receive half as much detail again. The end result is a scene that not only looks nice and sharp where it counts, but delivers a higher frame rate than if everything was being rendered with the same amount of detail.
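If you like, you can think of it as the GPU keeping a little importance map over the screen and picking a coarser shading rate for the boring tiles. The Python below is just a toy version of that bookkeeping – the tile size, shading rates and importance values are all invented, and this is not how the hardware is actually programmed – but it shows why the frame rate goes up:

```python
# Toy model of variable rate shading: split the screen into tiles, score each tile's
# importance, and shade the less important ones at a coarser rate. Tile size, rates
# and the importance map are all invented purely for illustration.
TILE = 16  # pixels per tile edge

def shading_rate(importance):
    """Map a 0..1 importance score to how many pixels share one shading result."""
    if importance > 0.75:
        return (1, 1)    # full rate: one shade per pixel (the car, the horizon)
    if importance > 0.4:
        return (2, 2)    # one shade per 2x2 block (fields whizzing past)
    return (4, 4)        # one shade per 4x4 block (the road trailing behind you)

def shaded_work(width, height, importance_of_tile):
    full, actual = 0, 0
    for ty in range(height // TILE):
        for tx in range(width // TILE):
            rx, ry = shading_rate(importance_of_tile(tx, ty))
            full += TILE * TILE
            actual += (TILE // rx) * (TILE // ry)
    return actual, full

# Pretend the middle third of a 4K screen is important and the rest less so.
def importance(tx, ty, tiles_x=3840 // TILE):
    return 1.0 if tiles_x // 3 <= tx < 2 * tiles_x // 3 else 0.5

actual, full = shaded_work(3840, 2160, importance)
print(f"shading invocations: {actual:,} of {full:,} ({100 * actual / full:.0f}%)")
```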

If you’re worried that this might make games look a bit inconsistent, fear not. Another example Nvidia showed me was the submarine area from Wolfenstein II: The New Colossus. Here, variable rate shading was being used to intelligently assess which bits of the scene to shade properly without a loss of overall quality.

Nvidia's adaptive shading techniques applied to Wolfenstein II

I played a live demo of this tech at Gamescom using the same scene in Wolfenstein II, and I couldn’t spot the difference there either

Technically called ‘content adaptive shading’, this was arguably one of the most impressive demos I saw during Nvidia’s entire presentation, and the difference between having it on and off was almost imperceptible. There was a vague sort of rippling effect present when I was standing stock still surveying the centre of the main control room, but it completely disappeared the moment I started to move. What’s more, the frame rate leapt by 15-20fps when I had it turned on, which is pretty impressive for nigh-on identical image quality.

Nvidia Turing RTX cool feature #3: Nvidia Scanner’s one-click overclocking

I’ll keep this brief because I know I’ve been rabbiting on for a while now, but this new software API should make it easier than ever before to get the best out of your new Turing graphics card with the absolute minimum of effort. It’s going to be embedded into all your typical third-party overclocking apps, like EVGA’s Precision software, and this one-click program will essentially run a mathematical test over the course of around 20 minutes, working its way through the card’s performance curve and applying its results automatically without you having to do anything at all.
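In spirit, it’s the same loop an enthusiast would do by hand, just automated. The sketch below is emphatically not Nvidia’s Scanner API – the function names and the stress test are placeholders I’ve made up – but it captures the general shape of the process: nudge the clocks up, test for errors, keep the highest step that stays stable.

```python
# Placeholder sketch of an automated overclock scan. This is NOT Nvidia Scanner's
# actual API; apply_offset() and run_stress_test() stand in for whatever the real
# tool does internally.
def apply_offset(mhz):
    print(f"applying +{mhz} MHz core offset (placeholder)")

def run_stress_test():
    """Stand-in for the scanner's maths workload; a real test would return False
    on crashes, artefacts or arithmetic errors."""
    return True

best_stable = 0
for offset in range(0, 201, 25):          # try +0 to +200 MHz in 25 MHz steps
    apply_offset(offset)
    if run_stress_test():
        best_stable = offset               # this step passed, remember it
    else:
        break                              # first failure: stop and fall back

apply_offset(best_stable)
print(f"keeping +{best_stable} MHz as the highest stable offset")
```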

If you’ve ever wanted to get into overclocking but balk at the idea of spending hours fiddling around in your PC’s BIOS settings, this will be music to your ears.

And finally, 32 words on RTX cool feature #4: Ansel RTX

Ansel, but with even more ray-tracing than you’d get in regular ray-tracing gameplay. That’s 32x shadow samples, 40x reflection samples, 12x ambient occlusion, and 10x refractions per pixel. Because Ansel. The end.

Nvidia Turing RTX prices and release date

So how much will these magic graphics cards cost and when do they come out again? Here’s all the info you need in another handy table:

Graphics card Release date Price
RTX 2070 October 17 From $499
RTX 2070 Founders Edition October 17 £579 / $599
RTX 2080 September 20 From £715 / $750
RTX 2080 Founders Edition September 20 £749 / $799
RTX 2080Ti September 27 From £1049 / $1150
RTX 2080Ti Founders Edition September 27 £1099 / $1199

Until I can talk proper performance figures next week, however, it’s difficult to say whether the RTX 20-series will be worth all that money. All of Turing’s AI know-how is undeniably impressive, and it could spell huge gains in performance that would go a long way in justifying their equally huge increase in price compared to Nvidia’s previous generation of GTX cards.

Then again, while Nvidia have confirmed that some of the features described above are getting bundled into Microsoft’s DirectX API, thus ensuring a greater chance of them becoming more widespread going forward, others like DLSS require a bit more work, not least from Nvidia themselves. It will of course be in Nvidia’s best interests to get as many games supporting their tech as possible, but for me there are still too many ‘ifs’ and ‘whens’ right now to even begin making a confident buying decision.

It’s also worth bearing in mind that all of these cards are really only being targeted at those who play games in 4K. Of the figures Nvidia have talked about so far, all of them have been for 4K, so it’s more than likely that this is where we’ll see the biggest leap in overall performance. If you’re only playing games at 1920×1080 or 2560×1440, upgrading to an RTX card will probably be complete overkill for you.

Indeed, the RTX 2080Ti has been described to me as a card that’s only worth shelling out for if you have a 4K monitor with a 144Hz refresh rate. Just let that sink in for a moment. 4K. 144Hz. Most people won’t need anything near that powerful, so unless you’re thinking about upgrading your entire PC system to support 4K, these cards may well not be for you at the moment.

Next week, however, all will be revealed. Stay tuned.
