
DF Weekly: The goalposts have shifted again in assessing console image quality

Why pixel counting never went away.

Every week, I like to choose one specific topic to discuss from the latest edition of DF Direct Weekly - and there's certainly an embarrassment of riches to choose from in another vast episode. We sat down to film our 141st show the morning after The Game Awards, and despite the usual downplaying of expectations from Geoff Keighley, the 'wurld prm'ears' came thick and fast. For this blog, though, it's actually a supporter question that prompts the discussion. DF Supporter Julian Sniter asked us whether upscaling technologies like AMD's FSR and Epic's TSR are making it harder for us to 'count pixels' and ascertain native resolutions. The short answer is yes; the longer answer is that upscaling technologies have evolved at least twice since we first started image quality testing - and pixel counting is more difficult now than it has ever been.

It's been a couple of years now since I made it a personal mission to downplay - if not eliminate completely - the practice of revealing the native rendering resolution of any given game. Back in the Xbox 360 and PlayStation 3 era, the target resolution was typically 720p and upscaling from beneath that could have a significant impact on image quality. However, moving into the PS4 Pro/Xbox One X era, the output target moved to native 4K and a whole host of intriguing upscaling technologies came into play - the most popular being checkerboard rendering (as seen in the likes of Horizon Zero Dawn, Days Gone and many others) and temporal super-sampling/jittering (Marvel's Spider-Man and For Honor being two great examples). We often refer to that latter tech as TAAU - a combination of upscaling and temporal anti-aliasing.
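To make the checkerboard concept concrete, here's a toy sketch - not any shipping engine's implementation, just the basic pattern: each frame shades only half the pixels in an alternating checkerboard, and the gaps are filled from the previously reconstructed frame.

```python
# Toy checkerboard rendering sketch - illustrative only, not a real
# engine's reconstruction pass (those also use motion vectors and ID
# buffers to reject stale history).
import numpy as np

def checkerboard_reconstruct(curr_half, prev_full, frame_parity):
    """curr_half: (h, w) image where only this frame's checkerboard cells
    were actually shaded. prev_full: the last reconstructed frame.
    Returns a full-resolution frame at roughly half the shading cost."""
    h, w = curr_half.shape
    yy, xx = np.indices((h, w))
    mine = (yy + xx) % 2 == frame_parity       # cells shaded this frame
    return np.where(mine, curr_half, prev_full)  # holes filled from history
```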

With a few obvious outliers (The Avengers used CBR on PS5 but TAAU on Xbox), the direction of travel from developers was to favour TAAU, and at that point the argument to kill pixel counts once and for all seemed unassailable. Identifiable upscaling artefacts mostly disappeared in favour of clarity: the higher the internal resolution, the less blurry the image - and the more pixels you started with, the smaller the returns from upscaling further. There are many great examples of 'native' 1440p resolutions that could convincingly pass for a 4K image - or at least still looked great on a 4K screen. We worked with some fantastically talented people and produced some interesting examples of a 'clarity index': it worked by comparing TAAU upscaling to native resolution rendering between 1080p and 2160p, effectively replacing the pixel count altogether with a quality comparison based on what the human eye actually sees.
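The exact methodology behind that clarity index was never published, but a minimal sketch of the concept might look like the following - assuming access to ground-truth native renders across a resolution ladder, which is exactly why a PC version was needed.

```python
# Hypothetical 'clarity index' sketch: rather than counting pixels, ask
# which ground-truth native resolution the test image most resembles.
# Mean squared error stands in for whatever perceptual comparison the
# real tooling used - an assumption on our part.
import numpy as np

def clarity_index(test_image, native_ladder):
    """test_image: (h, w) greyscale capture under inspection.
    native_ladder: dict mapping native heights (e.g. 1080..2160) to
    ground-truth renders presented at the same output size.
    Returns the native height whose render the test most closely matches -
    a 'perceived resolution' rather than a raw pixel count."""
    errors = {res: np.mean((test_image - ref) ** 2)
              for res, ref in native_ladder.items()}
    return min(errors, key=errors.get)
```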

Rich Leadbetter, Alex Battaglia and Oliver Mackenzie take to the mics this week for DF Direct Weekly #141. Watch on YouTube

  • 0:00:48 News 01: The Game Awards - Senua's Saga: Hellblade 2
  • 0:08:51 OD
  • 0:17:48 No Rest for the Wicked, Monster Hunter Wilds
  • 0:26:58 Marvel's Blade, Light No Fire
  • 0:39:13 Black Myth: Wukong, Persona 3 Reload, Metaphor Refantazio, Skull and Bones
  • 0:51:58 News 02: Avatar: Frontiers of Pandora released!
  • 1:01:58 Supporter Q1: How do you pixel count games with upsampling?
  • 1:09:06 Supporter Q2: Do modern games avoid physics-based gameplay because of processing constraints or game design?
  • 1:12:19 Supporter Q3: Does the addition of path tracing to Call of Duty’s lobbies suggest a wider implementation later on?
  • 1:16:07 Supporter Q4: Should the console makers release jailbreak patches for older consoles?
  • 1:19:38 Supporter Q5: Should GPU manufacturers introduce features from Special K into their drivers?
  • 1:24:25 Supporter Q6: Can you pixel count The Game Awards?

TAAU was a fantastic technology for its time and still holds up well today - Insomniac still uses its own version, ITGI. Pinpointing the native resolution has academic interest, but when each frame features injected detail from several prior frames, 1440p - or whatever the base figure is - becomes so much more. Another great example is Infinity Ward's upscaler in Call of Duty. A couple of years back, at the team's Polish studio, I got to see native resolution rendering versus the final output in the PS4 version of Vanguard, and I was amazed at the difference. And prototype tooling based on Warzone replays once again allowed us to programmatically quantify the improved image quality on Xbox One X versus PlayStation 4 Pro.
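A toy illustration of why the base figure understates what TAAU delivers - this is not Insomniac's or Infinity Ward's code, just the general accumulation idea: each jittered low-resolution frame is blended into a persistent history buffer, so the output carries detail from several prior frames.

```python
# Toy TAAU accumulation sketch. A real implementation reprojects history
# via motion vectors and filters the samples; np.roll stands in for that.
import numpy as np

def taau_accumulate(low_res_frames, jitters, scale, alpha=0.1):
    """low_res_frames: list of (h, w) arrays rendered with sub-pixel jitter.
    jitters: matching list of (dy, dx) offsets in output-pixel units.
    Returns an (h*scale, w*scale) image accumulated from history."""
    h, w = low_res_frames[0].shape
    history = np.zeros((h * scale, w * scale))
    for frame, (dy, dx) in zip(low_res_frames, jitters):
        # Nearest-neighbour upsample of the current frame, shifted by its
        # jitter so successive frames land on different output pixels.
        up = np.roll(frame.repeat(scale, 0).repeat(scale, 1), (dy, dx), (0, 1))
        # Exponential blend: new detail flows in, old detail persists.
        history = (1 - alpha) * history + alpha * up
    return history
```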

And yet, a couple of years on, Digital Foundry is still pixel-counting the old-fashioned way. And to answer supporter Julian Sniter's question: yes, it's a whole lot harder than it used to be. Games with basic anti-aliasing - MSAA or even FXAA - could be relatively easy to count: you find an edge, pinpoint the 'actual' pixels within the 'output' pixels of the final image, and the ratio between the two defines the native rendering resolution. TAAU made things a lot more difficult because those edges became harder and harder to find. Not only that, but you could be looking at two pixel counts - the internal resolution and whatever output resolution the game is attempting to resolve.
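In code terms, the classic counting method boils down to something like this hypothetical sketch, assuming a clean aliased edge extracted from a screenshot - the stair steps along the edge reveal how many native pixels were stretched across the output.

```python
# Minimal sketch of classic pixel counting: count the distinct 'stair
# steps' along an aliased edge, then scale the output resolution by the
# ratio of steps to output pixels. Assumes no temporal reconstruction.
import numpy as np

def estimate_internal_width(edge_rows, output_width):
    """edge_rows: for each output-pixel column along a near-horizontal
    edge, the row index where the edge sits (from a thresholded capture).
    Returns an estimate of the internal render width."""
    # Each change in row index marks the start of a new stair step.
    steps = 1 + np.count_nonzero(np.diff(edge_rows))
    # 'steps' native pixels span len(edge_rows) output pixels.
    return output_width * steps / len(edge_rows)

# Hypothetical edge sampled across 16 output columns showing 12 steps:
# a 12/16 ratio implies ~1440 internal width on a 1920-wide output.
edge = np.array([0, 0, 1, 2, 3, 3, 4, 5, 6, 6, 7, 8, 9, 9, 10, 11])
print(round(estimate_internal_width(edge, 1920)))  # -> 1440
```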

Even so, the 'clarity index' was looking good as a replacement - but doubt began to creep in as soon as we took a look at Nvidia's DLSS. Remedy's Control features a pretty basic form of TAA with plenty of noise in it, yet Nvidia's DLSS, even at a lower rendering resolution, exhibits less noise and delivers an arguably more pleasing image overall. A clarity index doesn't work when comparing different upscaling methods and, more than that, the exercise revealed that noise and artefacting essentially generate 'false positive' detail. This demonstrated that any replacement for pixel-counting would rely on the exact same upscaling technique being used across the versions under comparison - something that was not always the case. And if there was no PC version at all to generate ground truth 1080p and 2160p images, we'd be stuck anyway.

AMD's FSR 2 complicated matters still further. In theory, it's a significantly superior upscaler to TAAU - which explains why so many developers have ditched older techniques in its favour (Cyberpunk 2077 and The Witcher 3 being two good examples). However, in attempting to reconstruct a native image from a much lower resolution, FSR 2 creates its own range of artefacts - ghosting, fizzle and more. At this point, we've reached the conclusion that there's no programmatic way to express image quality. FSR 2's fizzling in particular simply tells an algorithm looking to gauge clarity that there's more detail - and perhaps there is, but it's unwanted detail. For now, a subjective breakdown of image quality and, yes, pixel counting is the best way of explaining image quality, especially in an era where 4K consoles are using FSR 2 to upscale from resolutions as low as 720p. The picture can look poor, and the base resolution helps explain why.
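A quick sketch shows why fizzle defeats a naive clarity metric: variance of the Laplacian - a standard sharpness measure - actually increases when shimmer-like noise is added to a frame, scoring the artefacting as extra 'detail'.

```python
# Demonstrates the 'false positive' problem: adding frame-to-frame noise
# raises a standard sharpness metric even though the detail is unwanted.
import numpy as np

def laplacian_variance(img):
    # 4-neighbour Laplacian as a crude high-frequency detail measure.
    lap = (-4 * img[1:-1, 1:-1] + img[:-2, 1:-1] + img[2:, 1:-1]
           + img[1:-1, :-2] + img[1:-1, 2:])
    return lap.var()

rng = np.random.default_rng(0)
clean = np.tile(np.linspace(0, 1, 256), (256, 1))   # smooth gradient 'frame'
fizzle = clean + rng.normal(0, 0.02, clean.shape)   # same frame + shimmer
print(laplacian_variance(clean), laplacian_variance(fizzle))  # fizzle 'wins'
```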

Is there any route forward in eliminating pixel counts for good? We were recently pointed towards a machine learning model used by Netflix and others to judge video encoding quality. You feed it the same content across various encodes and each run is given a score between 0 and 1. One of our supporters did a lot of work in attempting to use this model to compare DLSS and FSR 2, and at first the results looked interesting: DLSS always scored higher than FSR 2, which matched our expectations, bearing in mind that Nvidia's upscaler tends to deliver significantly higher quality. But the more content we tested, the less convinced we were by the results.
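The description closely matches Netflix's VMAF metric (which conventionally reports on a 0-100 scale, easily normalised to 0-1). For anyone wanting to replicate the supporter's experiment, a hedged sketch using ffmpeg's libvmaf filter might look like this - assuming an ffmpeg build compiled with libvmaf, and with placeholder file names.

```python
# Scoring an upscaled capture against a reference with VMAF via ffmpeg.
# Requires ffmpeg built with --enable-libvmaf; the JSON layout below
# follows libvmaf v2's log format.
import json
import subprocess

def vmaf_score(distorted, reference, log="vmaf.json"):
    subprocess.run([
        "ffmpeg", "-i", distorted, "-i", reference,
        "-lavfi", f"libvmaf=log_fmt=json:log_path={log}",
        "-f", "null", "-",
    ], check=True)
    with open(log) as f:
        return json.load(f)["pooled_metrics"]["vmaf"]["mean"]

# Hypothetical usage: vmaf_score("fsr2_capture.mp4", "native_4k.mp4")
```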

It turns out that using a video-based model just wasn't a good fit. However, the way this encoder model was put together was intriguing: essentially, it was trained by showing people various video comparisons and asking them to score which they thought looked better. It's likely that thousands of comparisons like this were done and the model took shape from there. So perhaps something similar could be done by comparing the various upscaling techniques available today? Perhaps in the future, AI will solve the pixel-counting problem once and for all, but for now, we're back at square one. It's all about a subjective analysis, better informed by figuring out the base resolution the old-fashioned way. Pixel counts themselves don't mean much, but they definitely help to explain some of the poorer examples of image quality we've seen in recent times.
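As a sketch of how that kind of training works - assuming a Bradley-Terry-style pairwise model, which is our guess rather than Netflix's published recipe - per-clip quality scores can be learned directly from head-to-head human judgements:

```python
# Learning quality scores from pairwise human preferences: a toy
# Bradley-Terry-style fit, not the actual VMAF training pipeline.
import numpy as np

def fit_scores(pairs, n_items, lr=0.1, epochs=500):
    """pairs: list of (winner, loser) index tuples from viewer comparisons.
    Returns one score per item such that sigmoid(s_w - s_l) predicts how
    often the winner is preferred."""
    s = np.zeros(n_items)
    for _ in range(epochs):
        for w, l in pairs:
            p = 1 / (1 + np.exp(s[l] - s[w]))  # P(winner preferred)
            s[w] += lr * (1 - p)               # push winner's score up
            s[l] -= lr * (1 - p)               # and the loser's down
    return s

# Hypothetical: clips 0=DLSS, 1=FSR 2, 2=native judged in head-to-heads.
print(fit_scores([(0, 1), (2, 1), (2, 0), (0, 1)], 3))
```

The same approach could, in principle, be pointed at upscaler comparisons rather than encodes - which is the open question the Direct leaves us with.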
