Whether we like it or not - and let's face facts, it's probably the latter - 8K display technology is coming. 8K support is baked into the upcoming HDMI 2.1 standard, and in fact, if you live in the US and have $5000 to spare, you can buy a quad-UHD screen right now: Dell's 32-inch UP3218K. The question is: using today's top-tier graphics technology, is 8K PC gaming viable? We broke out a pair of Asus Strix GTX 1080 Tis, ran them in SLI and gave it a try. The results were unpredictable and unstable, but at the same time quite awe-inspiring.
After all, an 8K screen is effectively equivalent to a 2x2 arrangement of ultra HD displays, representing an immense 7680x4320 resolution. To put that into perspective, it's also equivalent to 16 full HD screens lined up in a 4x4 arrangement. To make life a little more complicated, we tested at full 8K, specifically 8192x4320. This is mostly down to the fact that we didn't have a native 8K screen to hand, meaning we would be using Nvidia's Dynamic Super Resolution (DSR) technology to render internally at 8K before downsampling to our screen's maximum supported resolution: 4096x2160.
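The pixel arithmetic above can be sanity-checked in a few lines of Python - a quick illustration of just how large these numbers get, not anything from our test setup:

```python
# Pixel counts for the resolutions discussed in the article.
def pixels(width, height):
    return width * height

uhd_8k  = pixels(7680, 4320)  # consumer 8K: 33,177,600 pixels
uhd_4k  = pixels(3840, 2160)  # ultra HD:     8,294,400 pixels
full_hd = pixels(1920, 1080)  # full HD:      2,073,600 pixels
full_8k = pixels(8192, 4320)  # "full" 8K we tested via DSR
full_4k = pixels(4096, 2160)  # our screen's maximum resolution

print(uhd_8k // uhd_4k)    # 4  -> a 2x2 grid of ultra HD screens
print(uhd_8k // full_hd)   # 16 -> a 4x4 grid of full HD screens
print(full_8k // full_4k)  # 4  -> DSR renders 4x the native pixel count
```

In other words, every frame at 8K asks the GPUs to shade four times as many pixels as 4K - which is exactly where the pixel-throughput wall described below comes from.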
Going into our tests, we weren't hugely optimistic. Linus Tech Tips has a great 8K video worth a watch, where two Titan X Pascals struggle to run Crysis at medium settings at 30-40fps. And that kind of makes sense: Nvidia's top-end GPUs can run games at 4K at 60fps, but settings tweaks are usually required to get the job done - and here we're asking two cards to render four times as many pixels. In short, there's bound to be a big hit to performance. But what we found was that, for the most part, the usual rules of settings tweaking don't apply at 8K - often, the ceiling is sheer pixel throughput, meaning that some settings can be pushed up to high or even ultra levels with little additional cost.
Take Crysis 3, for example: that 30fps can be maintained at the game's very high preset, with just the shading and shadow settings requiring tweaks downwards. Set everything to high and you're at 40fps. But it's the sheer pixel throughput that is the defining, limiting factor here: retain high settings and drop resolution to 7094x3741 (3x our baseline 4K pixel count) and we're at 50-60fps. That's effectively the same kind of performance you can expect from one GTX 1080 Ti running at 4K.
And 7K turns out to be the charm for Battlefield 1 too. 8K at ultra settings sees a visually spectacular presentation, marred by frame-rate drops down to 30fps and lower. Dropping just one setting - post-processing - adds 20fps to the tally, while other tweaks downwards have no impact on performance at all. Again, it's 7K that saves the day, where we could run a taxing campaign stage beautifully at 60fps.
Rise of the Tomb Raider struggles to maintain 60fps at 4K with a single GTX 1080 Ti, but a series of strategic settings choices allows us to run this title at 30-35fps at 8K, where it still looks spectacular - but it's here that we reach another physical limit of current GPU technology: VRAM. We could only achieve solid performance by running the game's console-level texture assets, not the 4K artwork. The GTX 1080 Ti's 11GB simply isn't enough; judged by today's standards, 16GB looks like the sweet spot. Certainly, that's what Watch Dogs 2's VRAM indicator reckoned would be required to get the job done - we had some initial success with that one, but settings exploration just caused multiple crashes, something we had to contend with throughout the entire testing experience.
And general instability was a constant companion, whether it was occasional graphical glitches, full-on crashing or bizarre slowdown. Metal Gear Solid 5 can run at a beautiful 60fps at native 8K with a mixture of high and extra high settings - until an enemy opens fire, at which point frame-rate can crash as low as 10fps. And then there are the titles with engines that simply don't support multiple GPUs, leaving us with 20-30fps gameplay in the likes of Just Cause 3 and Call of Duty: Infinite Warfare. SLI is also incompatible with Watch Dogs 2's temporal filtering (aka checkerboarding), meaning that our efforts to test 'next-gen upscaling' to 8K were stymied.
So, what's the takeaway here? The fact is that pixel density, even on our office's 58-inch Panasonic 4K TV, is already so high that making out pixel structure on the screen is challenging enough. For our money, any 8K screen would need to be absolutely huge - to the point where projection or possibly even Total Recall-style integration into the wall would make sense. Practically though, the implications for a great VR experience are more interesting, as this is an area where pixel density can have a profound impact on the quality of the experience.
We'll return to GTX 1080 Ti SLI for a deeper look soon, but fundamentally, SLI - when it works, of course - opens up a window to the type of performance we can expect from the next generation of top-tier GPUs. But 8K presents challenges beyond pure compute power. Pixel throughput proved to be our biggest limiting factor, suggesting that a big boost to render back-ends would be needed to really get the job done. On top of that, 16GB of VRAM is a must. But the fact that we got close to 60fps gameplay in some demanding titles is heartening. The success we had with 7K in Crysis 3 and Battlefield 1 suggests that three-screen 4K surround could be viable here - but the key takeaway is that it's all about thoughtful settings management.
Whacking up everything to ultra is a recipe for disaster, and by extension, there's a strong case to be made that developers need to do a better job of telling us what individual presets do, what the hit to GPU resources can be, and to what extent visual quality is improved. And regardless of the resolution you choose or the hardware you own, that kind of information is priceless in getting a great PC gaming experience.