
Nvidia DLSS in Nioh 2: the most demanding challenge yet for AI upscaling?

Deep-learning super-sampling vs native resolution rendering.

Nvidia's DLSS has gradually evolved into one of the most exciting technological innovations in the PC space. The idea is remarkably straightforward: the GPU renders at a lower native resolution, then an AI algorithm takes that frame and intelligently upscales it to a much higher pixel count. There's an instant performance win but, remarkably, also a quality advantage up against native resolution rendering. In the past, we've wondered whether this quality win comes down to mitigating the artefacts of temporal anti-aliasing - TAA - but the recent arrival of a DLSS upgrade to Nioh 2 provides us with an interesting test case. Nioh 2's basic rendering features barely any anti-aliasing at all - it's pretty much as raw as raw can be. So the question is: can DLSS retain its performance advantage and still deliver an actual increase in image quality up against native resolution rendering? Remarkably, the answer is yes.

DLSS was - and essentially still is - a replacement for TAA. Temporal anti-aliasing effectively takes information from prior frames and integrates it into the current one, typically using motion vectors to map where pixels from prior frames would sit in the frame being rendered. In best-case scenarios, it effectively improves image quality, and it is certainly the AA method of choice in modern gaming. But it can have its negative points: ghosting and added blur foremost amongst them. DLSS does have commonalities with TAA, which is why it is generally considered to be a replacement - it too requires motion vector data to reconstruct its image. DLSS performance mode reconstructs from just 25 per cent of the native pixel count - so a 4K DLSS image is built from a 1080p frame. Meanwhile, at the other end of the scale is DLSS quality mode, which in this example would be generated from a 1440p frame. Balanced is the other major mode, sitting somewhere between the two.
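The relationship between the modes above can be expressed as simple per-axis scale factors. As a rough sketch - these ratios match the figures quoted here (half-resolution per axis for performance, two-thirds for quality), while the balanced figure is an approximation and individual games may vary:

```python
# Per-axis DLSS render scale factors - an approximation based on the
# figures in the article; the balanced ratio in particular is assumed.
DLSS_SCALE = {
    "quality": 2 / 3,      # 4K output -> 2560x1440 internal
    "balanced": 0.58,      # 4K output -> roughly 2227x1253 internal
    "performance": 0.5,    # 4K output -> 1920x1080 internal (25% of pixels)
}

def internal_resolution(out_w, out_h, mode):
    """Return the internal (pre-upscale) resolution for a DLSS mode."""
    scale = DLSS_SCALE[mode]
    return round(out_w * scale), round(out_h * scale)

print(internal_resolution(3840, 2160, "performance"))  # (1920, 1080)
print(internal_resolution(3840, 2160, "quality"))      # (2560, 1440)
```

Note that performance mode halves each axis, which is why it works out to a quarter of the total pixel count.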

As for performance advantages, in the case of Nioh 2, the effect is extraordinary. Nvidia's RTX 2060 is the least capable desktop GPU with DLSS functionality, and at a 4K output, performance mode offers over 50 per cent extra frame-rate, up against around 32 per cent for quality mode. Perhaps the best use for this GPU is 1440p output, where DLSS quality mode ensures you're essentially always running above 60fps. Meanwhile, at the absolute top-end, RTX 3080 and RTX 3090 deliver 4K gaming at 100fps and higher - an extraordinary experience on a suitable screen. But since Nioh 2 does not use TAA, does image quality hold up?

A detailed breakdown of DLSS in Nioh 2 - can AI upscaling really match or surpass native resolution rendering without TAA?

Most of our comparisons are 4K in nature, stacking native resolution rendering up against DLSS quality mode, and for the most part, the news is very good. The harsh serrated edges of the native presentation are gone, while detail is (with one exception) retained. The image quality boost is mostly explained by the fact that DLSS is an effective anti-aliasing solution, while Nioh 2's existing post-processing is basic to almost non-existent. It's even more noticeable in motion: with no effective AA solution, there is obvious shimmer as gameplay rolls out in the standard presentation, while DLSS provides temporal consistency from one frame to the next, meaning much reduced shimmer. DLSS is effectively delivering what all supported titles since Wolfenstein Youngblood have offered: better than native quality in most scenarios. By comparing against a very raw image, we've ruled out the DLSS advantage as being exclusive to TAA presentations - it works for games with next to no AA as well.

So, to put it simply, Nioh 2 is another game where - to coin a phrase - you'd be nuts not to use DLSS. However, one or two issues present themselves upon closer inspection. There can be very slight ghosting on moving foliage, while I noted a strange artefact in the game's depth of field effect (albeit seemingly limited to the title screen). But perhaps the biggest issue is that far-off textures can present at a lower resolution than native rendering. The former artefacts would require further retooling of the DLSS algorithm, but thankfully the more noticeable texture issue can be patched - and even mitigated by the user right now.

It all comes down to mip-map selection. A texture in a game has many lower resolution versions of itself called mip-maps. They exist for many reasons, one of which is to reduce shimmer and aliasing at a distance. Put simply, rendering a high detail texture into a small pixel area creates noticeable shimmer - there's more detail than the available pixels can deliver, resulting in sparkly effects and flicker. The solution is to swap in a lower quality texture - a lower resolution mip-map. As resolution scales higher, more pixels are available to present the detail, so higher quality mip-maps are used. However, in Nioh 2 at least, mip-map selection is based on DLSS's internal resolution, not its output resolution, resulting in lower visible texture detail on far-off artwork. Thankfully, as the video on this page shows, dipping into Nvidia Profile Inspector and switching the mip-map bias to a negative value essentially solves the problem.
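One common way to derive that negative bias - our sketch of the general technique rather than Nioh 2's specific fix - is to offset texture LOD by the base-2 logarithm of the ratio between internal and output resolution, so mip selection behaves as if the game were rendering at the output resolution:

```python
import math

# Hedged sketch: a commonly used rule for upscalers is to bias texture
# LOD by log2(render_width / output_width), a negative value that pulls
# in sharper mip levels. Whether any given game should use exactly this
# formula is an assumption on our part.
def mip_lod_bias(render_w, output_w):
    """Negative LOD bias so mip selection matches the output resolution."""
    return math.log2(render_w / output_w)

# Quality mode at a 4K output renders internally at 2560 pixels wide:
print(round(mip_lod_bias(2560, 3840), 2))  # -0.58
```

That result lands close to the -0.5 LOD bias used in the comparison shots on this page, which may explain why that value works well in practice.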

4K Native
4K DLSS Quality Mode
At normal screen distance, the largest difference between DLSS and native is the improved anti-aliasing present in DLSS.
4K Native
4K DLSS Quality Mode
4K DLSS Balanced Mode
4K DLSS Performance Mode
Decreasing from DLSS quality to lower precision modes sees a less perfect image reconstruction.
4K Native
4K DLSS Quality Mode (Default)
4K DLSS Quality Mode (-0.5 LOD Bias)
DLSS seems to use improper mip-maps by default, leading to less detailed textures at a distance than native. This can be fixed with driver tweaks to LOD bias.
1080p Native
1080p DLSS Quality Mode
1080p DLSS Performance Mode
Even at 1080p the reconstruction is rather faithful. Quality mode is upscaled from 720p, performance mode from 540p!

So how should DLSS be deployed? This applies to all titles using the technology, not just Nioh 2, but effectively, the higher your output resolution, the less of a win you get from DLSS's higher quality modes - in our opinion at least. If you're gaming at 4K, DLSS performance mode rendering natively at 1080p makes a lot of sense. At the other end of the scale, DLSS with a 1080p output resolution still looks OK in DLSS performance mode, but we'd recommend quality here instead to retain that 'better than native' effect. If there's the sense that the image is 'blurry' at 1080p or indeed any output resolution, we'd recommend investigating the negative mip-map bias selection via Nvidia Profile Inspector - but certainly in the case of Nioh 2 and Cyberpunk 2077 (to name two examples) it may make sense for the developer to patch in more appropriate mip-map selections to deliver the required look.
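The rule of thumb above can be distilled into a simple heuristic - to be clear, this is our own reading of the advice, not an Nvidia recommendation, and the 1440p middle ground is a judgment call:

```python
# Our own heuristic, distilled from the deployment advice above:
# the higher the output resolution, the more aggressive a DLSS mode
# you can get away with. The 1440p tier is an assumption on our part.
def suggest_dlss_mode(output_height):
    """Suggest a DLSS mode for a given output resolution (in lines)."""
    if output_height >= 2160:
        return "performance"   # 4K output: a 1080p internal res holds up
    if output_height >= 1440:
        return "balanced"      # middle ground (our assumption)
    return "quality"           # 1080p output: keep the internal res high

print(suggest_dlss_mode(2160))  # performance
print(suggest_dlss_mode(1080))  # quality
```

As always, if the result looks soft, the negative LOD bias tweak described earlier is worth trying alongside whichever mode you pick.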

Ultimately, minus a few sticking points, the Nioh 2 experience demonstrates that DLSS can look like native rendering without the vast majority of TAA's issues, while at the same time delivering better anti-aliasing - and its accelerating effect on performance is as profound as it has ever been. It's worth keeping all of this in context too: AI is becoming a more important aspect of technology across the board, but DLSS is just the beginning of its impact on gaming - and you can be sure that Nvidia won't be stopping there.
