
Nvidia DLSS analysis: how AI tech can make PC games run 40 per cent faster

But does image quality hold up?

What if PC hardware manufacturers fully embraced the kind of smart upscaling technologies now commonplace on consoles? It's a topic I've explored in the past, but with Nvidia's new deep learning super-sampling - DLSS - we have a reconstruction technology with full hardware acceleration, producing some remarkable results. Indeed, based on a Final Fantasy 15 demo we've had access to, DLSS is increasing performance by 40 per cent and in some respects, it's actually improving image quality.

So how does it work? At the Gamescom reveal for RTX technology, Nvidia big boss Jen-Hsun Huang talked about how deep learning technology - bread and butter for the new tensor cores inside Turing - could 'infer' more detail from any given image through learned experience of looking at similar images. Translated to DLSS, Nvidia's internal super-computer - dubbed Saturn 5 - analyses extremely high detail game images, producing an algorithm just a few megabytes in size that is downloaded via a driver update to an RTX card.
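To make that pipeline concrete, here's a minimal sketch of the training idea in Python - the architecture, loss and tensor sizes are our own illustrative assumptions, not Nvidia's actual network:

```python
# A minimal sketch of the training concept as Nvidia describes it - NOT the
# real DLSS network: a small convolutional model learns to map lower
# resolution renders to 64x super-sampled references. All sizes are assumed.
import torch
import torch.nn as nn
import torch.nn.functional as F

class UpscaleNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(3, 32, 3, padding=1)
        self.conv2 = nn.Conv2d(32, 32, 3, padding=1)
        self.conv3 = nn.Conv2d(32, 3, 3, padding=1)

    def forward(self, x):
        # Upsample 1440p -> 4K (a 1.5x factor), then let the convolutions
        # 'infer' the detail they learned to reconstruct during training.
        x = F.interpolate(x, scale_factor=1.5, mode='bilinear', align_corners=False)
        x = F.relu(self.conv1(x))
        x = F.relu(self.conv2(x))
        return self.conv3(x)

net = UpscaleNet()
optimiser = torch.optim.Adam(net.parameters(), lr=1e-4)

# Stand-in tensors: a low-res render and its super-sampled ground truth
# (scaled down here so the example runs anywhere).
low_res = torch.rand(1, 3, 144, 256)
reference = torch.rand(1, 3, 216, 384)

loss = F.mse_loss(net(low_res), reference)
loss.backward()
optimiser.step()
# The trained weights - 'just a few megabytes' - are what ships via the driver.
```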

The game itself is rendered at a lower resolution and, just like the image enhancement techniques that work so well elsewhere in deep learning, DLSS works to produce higher resolution imagery. We're pretty sure there's a bit more going on here than Nvidia is telling us. For starters, DLSS relies on titles that use temporal anti-aliasing (which, to be fair, covers pretty much every major modern game engine these days). This suggests that DLSS pulls info from prior frames to help with its reconstruction, over and above whatever it 'infers' via its algorithm.
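To illustrate the principle - and this is a toy model of our own, not Nvidia's code - a TAA-style accumulator reprojects the previous frame's history via motion vectors and blends in the current frame's samples:

```python
# A toy illustration of temporal accumulation (our assumption about what
# 'pulling info from prior frames' involves - not Nvidia's implementation).
import numpy as np

def temporal_accumulate(history, current, motion, alpha=0.1):
    """Reproject last frame's result using per-pixel motion vectors,
    then exponentially blend in this frame's samples."""
    h, w = current.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    # Fetch each pixel from where it was last frame, clamped to the screen.
    src_y = np.clip(ys - np.rint(motion[..., 1]).astype(int), 0, h - 1)
    src_x = np.clip(xs - np.rint(motion[..., 0]).astype(int), 0, w - 1)
    reprojected = history[src_y, src_x]
    return (1.0 - alpha) * reprojected + alpha * current

# On a scene cut there is no usable history, so the first frame is just the
# raw, lower-resolution render - the loophole that lets us pixel-count DLSS.
frame = np.random.rand(270, 480, 3).astype(np.float32)
motion = np.zeros((270, 480, 2), dtype=np.float32)
result = temporal_accumulate(frame.copy(), frame, motion)
```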

We know that this is the case because, much like the temporal injection jittering technique seen in titles such as Spider-Man on PlayStation 4, on every scene cut there is no data from the prior frame the algorithm can work with. This leaves us with one frame of an untreated image, and that means that - yes - DLSS can be pixel-counted. We only have 4K demos to work with, but the lower base resolution Nvidia refers to is confirmed at 1440p. This massively reduces the shading power required to produce the base frame, then DLSS steps in to reconstruct the image. It does a remarkable job bearing in mind it only has around 44 per cent of a full 4K image to work with - we have a bunch of comparison images on this page and you can draw your own conclusions.
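For the record, the arithmetic behind that 44 per cent figure is simple pixel-counting:

```python
# Checking the 'around 44 per cent' figure: pixels in a 1440p base frame
# versus a full 4K output.
base = 2560 * 1440      # 3,686,400 pixels rendered
output = 3840 * 2160    # 8,294,400 pixels presented
print(f"{base / output:.1%}")  # 44.4%
```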

Video: [4K] Nvidia RTX DLSS Analysis: AI Tech Boosts Performance by 40% - But What About Image Quality?
Having taken a look at a brace of compelling demos, Rich, John and Alex discuss DLSS in depth.

So, let's return to that bold claim that Final Fantasy 15 runs more smoothly with DLSS and in some respects offers an increase to image quality. First of all, the performance metrics are confirmed - as you'll see in the benchmark widgets on this page - but what about image quality? DLSS bases its 'knowledge' of the game on a series of super high quality 64x super-sampled images fed into the Saturn 5 hardware, but the fact is that the game we actually get to play uses one of the blurriest forms of temporal anti-aliasing we've seen. It holds up at higher resolutions, but DLSS makes no use of this form of TAA at all, instead reconstructing using a very different technique. The quality of DLSS versus the inadequacies of the title's native TAA makes for sometimes stark differences, with DLSS capable of delivering more detail in some scenarios while losing some in others.

DLSS also offers clear advantages over checkerboarding - the more usual form of reconstruction seen on consoles - in that its artefacts present differently, with none of checkerboarding's characteristic stippling. On top of that, DLSS also cleans up some of TAA's more problematic issues. Looking at Final Fantasy 15, it is clear that the default AA solution struggles to cope with the mesh-like effect seen on character hair. Remarkably, DLSS copes better, transparencies are processed more effectively and in many cases, DLSS resolves more texture surface detail overall.

The Epic Infiltrator demo follows a predetermined course every time it runs, so in theory, one might suspect that the AI algorithm would be able to 'learn' more quickly and present a flawless result. Similarly, much of the Final Fantasy benchmark also delivers very similar imagery from one run to the next. However, while most of the benchmark runs on a predetermined route, there is an area of combat that is dynamic and varies significantly from run to run - and the good news is that DLSS still holds up just as well here. On top of that, the version of DLSS we've seen concentrates on performance alone - another iteration of the tech focuses on image quality. Let's just say we can't wait to see that in action.

DLSS
TAA
When comparing DLSS against a native 4K image with temporal anti-aliasing, the results seem mixed initially. Ghosting is eliminated, in-surface aliasing reduced and transparencies slightly improved but it's clear that the overall pixel count is reduced.
DLSS
TAA
When viewing a long-distance shot, however, the two stack up favourably. Foliage appears slightly sharper with TAA but the overall image looks remarkably close when using DLSS.
DLSS
TAA
In this example, DLSS offers a noticeably cleaner and slightly sharper image overall though some details, such as the Regalia text on the rear of the car, appear sharper when using TAA at 4K.
DLSS
TAA
DLSS stacks up well in close-ups too. Edges are a tad softer but remain clear and clean.
DLSS
TAA
Hair rendering appears cleaner with DLSS enabled but trees exhibit an almost sharpened appearance over the standard TAA. Specular aliasing remains an issue in both instances.
DLSS
TAA
This final shot was snapped one frame after a camera cut and it reveals the raw, underlying image. In the case of TAA, the image is extremely sharp but exhibits noticeable aliasing. There is some sort of smoothing occurring on the DLSS side, but pixel-counting confirms 1440p. However, do note that this is an untreated DLSS image and not representative of the in-game look.

While Final Fantasy is the focus here, Nvidia also released the Epic Infiltrator demo shown at Gamescom 2018. It came in just a day before the embargo lifted, so our time with it has been limited, but we're seeing the same base 1440p resolution and very similar performance uplifts. The challenge in equalling or bettering TAA is somewhat more pronounced here because Unreal Engine's temporal solution is massively improved over what's delivered in Final Fantasy 15. Also muddying the waters in terms of image comparisons is that Epic's demo is very heavy on post-process effects. DLSS still holds up, though - it still cleans up ghosting and other temporal artefacts seen with TAA.

The Infiltrator demo also serves to highlight that the performance boost offered by DLSS is not always uniform - it's not a straight 35-40 per cent uplift throughout. The demo features a number of close-up scenes that stress the GPU via an insanely expensive depth of field effect that almost certainly causes extreme bandwidth issues for the hardware. However, because the base resolution is so much lower, the bandwidth 'crash' via DLSS is far less pronounced. On a particular close-up, the DLSS result sees the scene play out over three times faster than the standard, native resolution TAA version of the same content.
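To see why the worst case can outstrip the average uplift, consider the raw pixel maths - a simplification of our own, not a profile of the actual demo:

```python
# A simplified model of why the worst case diverges from the average uplift.
# Assume the depth of field gather costs scale with pixels shaded - real
# bandwidth behaviour at native 4K is likely even less favourable, which
# would widen the gap towards the 3x result we observed.
def relative_dof_cost(width, height):
    return width * height

native_4k = relative_dof_cost(3840, 2160)
dlss_base = relative_dof_cost(2560, 1440)
print(f"native 4K costs {native_4k / dlss_base:.2f}x the DLSS base frame")  # 2.25x
```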

But has Nvidia truly managed to equal native quality? For the most part, it passes muster: inaccuracies and detail issues are only truly noticeable when conducting direct side-by-side comparisons - though we did note that in the demo's climactic zoom-out to the high detail city, the lower base resolution does have an impact on the quality of the final image. Is it likely to distract a user actually playing the game rather than poring over stills? Highly unlikely.

DLSS
TAA
This initial scene is close on first examination, but the blue-ish texture across the centre of the room exhibits a moiré pattern when using DLSS. Still, in motion, the two appear very similar - and of course, DLSS has a 35-40 per cent performance uptick.
DLSS
TAA
Close-up shots reveal a slight increase in sub-pixel detail when using the default TAA but, in practice, it appears virtually identical. This sort of post-heavy image is a perfect fit for DLSS.
DLSS
TAA
This frame was taken immediately following a camera cut and reveals the difference between the two. Once again, the DLSS image resolves to 1440p while the TAA image is clearly native 4K. But DLSS is not fully active here - check out the next shot to see the difference the effect makes.
DLSS
TAA
If you move ahead a touch, both images now exhibit the full benefits of the selected techniques. Edges appear much cleaner in both and DLSS comes much closer to matching the TAA side. Both rely on information from previous frames to produce the best image.

DLSS is attracting a lot of developer support and it's not difficult to see why - the performance uplift alone effectively lets your RTX 2080 perform faster than an RTX 2080 Ti (not running DLSS, of course). Gen-on-gen, you're looking at almost double the performance. For 4K gaming especially, the boons are difficult to ignore. Take Shadow of the Tomb Raider, for example - you're close to running the game at 4K60 locked with RTX 2080 Ti. DLSS support is coming and in theory, the 2080 should be able to do the same thing - if not slightly faster. Meanwhile, with ray tracing performance a cause for concern, DLSS offers a handy technology for developers to bring RT to their titles while still delivering higher resolutions.

But we do need to temper expectations to a certain degree. For starters, although the demos are compelling, we've not had a chance to actually play a game with the effect in play. This is pretty crucial! Secondly, we've seen 1440p DLSS deployed on the Star Wars Reflections ray tracing demo, but right now we've not ascertained the lower base resolution used there. Just how well does the algorithm hold up for, say, 1080p? We've seen a consistent 40 per cent uptick in performance in the supplied demos, but does that also apply to the upcoming RTX 2070? If so, potentially that would allow the new card to deliver 4K DLSS performance in line with GTX 1080 Ti and RTX 2080 - and even the possibilities for a prospective RTX 2060 are compelling.

Of course, in line with all of the other cool features in the Turing architecture (and there are many), the success of DLSS is entirely dependent on developer support. What we've seen of the quality level so far is very promising, and initial take-up certainly looks strong. But the challenge facing Nvidia is onerous - specifically, persuading developers to provide continued support for features that will only benefit an initially small sector of the market. And to really make RTX work, to keep hardware shifting, and for users to feel that they're getting value from their expensive new kit, the pressure is on for Turing-specific features to be incorporated in as many high profile titles as possible. Based on what we've seen so far from ray tracing and DLSS, the benefits for gaming here are tremendous and we'll be tracking progress for RTX support very closely in the months to come.