
Nvidia DLDSR tested: better visuals and better performance than DSR

How Deep Learning Dynamic Super Resolution looks and runs on RTX graphics cards

As if ray tracing and DLSS weren’t big enough bonuses to owning a GeForce RTX graphics card, Nvidia has just dropped another toy in the chest: Deep Learning Dynamic Super Resolution, or DLDSR. It’s essentially an AI-fuelled upgrade to Nvidia’s DSR downsampling tool, aiming to more intelligently render the frames of your games so that they appear more detailed – without the same performance loss that comes with standard DSR. It’s an intriguing new feature that could make some of the best graphics cards even better, and I’ve been trying it out to see if it performs as effectively as Nvidia claims.

In games, downsampling is the practice of rendering an image at a higher resolution than the display can natively show, then scaling that image back down to fit the screen’s native resolution. This taxes your GPU harder, in the same way that rendering a native 4K image takes more horsepower than 1080p, but in exchange it can make games look noticeably sharper.
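To put rough numbers on that cost, here’s a quick back-of-the-envelope sketch (plain Python, purely illustrative) of how much extra work a 4K-sized downsample asks of the GPU compared to native 1080p:

```python
# Rough illustration of why downsampling costs GPU time:
# the workload scales with the render resolution, not the display resolution.
native = (1920, 1080)        # what the monitor actually shows
downsampled = (3840, 2160)   # what a 4x downsample renders internally

native_pixels = native[0] * native[1]              # 2,073,600
rendered_pixels = downsampled[0] * downsampled[1]  # 8,294,400

print(f"Downsampling shades {rendered_pixels / native_pixels:.1f}x the pixels of native 1080p")
# -> Downsampling shades 4.0x the pixels of native 1080p
```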


For years, DSR has been making it easy to enable downsampling at the driver level, and now DLDSR looks to take the next step: using the Tensor cores on RTX cards to provide the same fidelity enhancements, without losing as many frames per second.

For reference, I tested DLDSR on a GeForce RTX 3070, alongside an Intel Core i5-11600K and 16GB of RAM. Just to reiterate, you will need an RTX-branded GPU to use DLDSR (unlike DSR, which works on all recent and semi-recent Nvidia GPUs), though since it operates at the driver level it should work with most games out of the box. Out of the .exe. Whatever.

Since DLDSR launched in an Nvidia Game Ready Driver with God of War optimisations, I decided to focus my testing on the adventures of ol’ Grumpy Beard himself. And straight away, it confirmed one similarity between DSR and DLDSR: neither is particularly fond of borderless windowed mode. There’s an easy workaround, though: once DLDSR is enabled in Nvidia Control Panel, you can set your monitor to the higher render resolution in Windows Display settings. Just as with DSR, this should let you select that target resolution in a game’s own display settings, even when it’s running in borderless mode.

In announcing DLDSR, Nvidia cited a Prey comparison that put the new tech on an equal performance footing with native rendering, with standard DSR far behind. In God of War, this turned out half true: my 1080p, Ultra-quality benchmark averaged 62fps on the DLDSR 2.25x setting, a considerable drop from my native resolution result of 87fps. The supposedly equivalent DSR 4x setting, however, only produced 48fps, so DLDSR does indeed go easier on the GPU.

Visually, too, DLDSR 2.25x not only looks sharper than native, but to my eyes beats out DSR 4x as well. Take a peek at the comparison below: you can see how DSR tidies up details like the shape of the glowing chain links, or the grass just to the right of Kratos’s unkempt jaw. But DLDSR improves this further by adding definition to the Nordic wall pattern on the left and sharpening up the ground textures in general.

A God of War graphics comparison image showing DSR 4x on the left versus native rendering on the right.
Left: Ultra quality, DSR 4x. Right: Ultra quality, native 1920x1080
A God of War graphics comparison image showing DLDSR on the left versus native rendering on the right.
Left: Ultra quality, DLDSR 2.25x. Right: Ultra quality, native 1920x1080

What’s impressive here is that DLDSR 2.25x, as you might have guessed from the numbers, isn’t even rendering at as high a resolution as DSR 4x. With a native 1080p monitor, DLDSR 2.25x will render at 2880x1620, while DSR 4x goes the full 4K at 3840x2160. The difference lies in the algorithm: DLDSR has enough machine learning smarts that it doesn’t need as many input pixels to produce a similar, or in this case slightly better, final image. And since fewer pixels need to be pushed, overall performance is higher. Is it a night-and-day difference in visual quality between this and DSR? Not really – I’m sure you noticed those comparison shots needed to be zoomed in to highlight the changes – but better looks plus better performance equals a clear upgrade for RTX users.
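If you want to sanity-check those resolutions yourself, the DSR/DLDSR factor refers to total pixel count, so each axis scales by the square root of that factor. A quick sketch (illustrative Python; the function name is just mine):

```python
import math

def render_resolution(native_w, native_h, factor):
    # DSR/DLDSR factors (2.25x, 4x) multiply the total pixel count,
    # so each axis scales by the square root of the factor.
    scale = math.sqrt(factor)
    return round(native_w * scale), round(native_h * scale)

print(render_resolution(1920, 1080, 2.25))  # (2880, 1620) - DLDSR 2.25x
print(render_resolution(1920, 1080, 4.0))   # (3840, 2160) - DSR 4x

# DLDSR 2.25x pushes 2880 * 1620 = 4,665,600 pixels, roughly 56% of
# DSR 4x's 8,294,400 - hence the smaller frame rate hit.
```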

There’s also one other addition that can soften the frame rate blow even further: DLSS. Now, this won’t be available in every game, but you can use both it and DLDSR simultaneously. And while the mathematics of upscaling and downsampling each frame are a bit much for my humanities graduate brain, I do know that DLSS on its Balanced setting helped get 79fps out of DLDSR 2.25x. That’s nearly 20fps more than when using the default anti-aliasing, and less than 10fps below the all-native performance.
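For what it’s worth, here’s my rough mental model of how the two stack, sketched in Python. Treat the DLSS Balanced render scale as an assumption on my part (around 58% per axis is the commonly cited figure), not something Nvidia confirms for this game:

```python
# A hedged sketch of how DLSS and DLDSR stack at 1080p, purely illustrative.
# Assumption: DLSS Balanced renders at roughly 58% of the output resolution
# per axis - a commonly cited figure, not one verified here for God of War.
DLSS_BALANCED_SCALE = 0.58

dldsr_target = (2880, 1620)   # DLDSR 2.25x output on a 1080p monitor
native = (1920, 1080)         # what the display finally shows

internal = (round(dldsr_target[0] * DLSS_BALANCED_SCALE),
            round(dldsr_target[1] * DLSS_BALANCED_SCALE))

print(internal)
# Roughly (1670, 940): the GPU shades fewer pixels than native 1080p,
# DLSS upscales the result to 2880x1620, and DLDSR then downsamples
# that back to 1920x1080 for the display.
```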

A God of War graphics comparison image showing DLDSR on the left versus a combination of DLDSR and DLSS on the right.
Left: Ultra quality, DLDSR 2.25x. Right: Ultra quality, DLDSR 2.25x, DLSS Balanced

Sticking with the PlayStation port theme, I also tried DLDSR out on Horizon Zero Dawn, which simply listed the higher resolution in its display menu without any Windows settings tweakery. Compared to stock 1080p, at which the built-in benchmark averaged 109fps on Ultimate quality, DSR 4x tanked the frame rate much as it did in God of War, dropping to 52fps. Switching to DLDSR was enough to bump this back up to 83fps, and adding DLSS to the mix scored 104fps – only 5fps below native rendering.

These are good results from a speed standpoint, though I found the fidelity differences even harder to notice than in God of War. In the comparison below you can make out a touch of extra definition on the dusty ground texture, and maybe a teensy bit on parts of Aloy’s armour, but it’s proper squinty stuff.

A Horizon Zero Dawn graphics comparison image showing DLSS/DLDSR on the left versus native rendering on the right.
Left: Ultimate quality, DLDSR 2.25x, DLSS Balanced. Right: Ultimate quality, native 1080p

This does raise the question of whether enabling DLDSR is really the right play as a matter of course. I’ll admit that, while it works very well in God of War, I could personally live with leaving it off – especially since that would save me from tinkering with Windows Display settings every time I wanted to play.

But then, I’m almost always more inclined towards higher frame rates than having the crispest, sharpest, most high-tech visuals possible. If you’re a fidelity nut and you own an RTX graphics card, I’d actually recommend giving it a try – especially in games that support DLSS, as these come much closer to fulfilling Nvidia’s promise of higher detail at a lower performance cost.
