
Digital Foundry vs. 4K gaming

How well can games run on the new ultra-HD standard - and are next-gen consoles invited to the party?

If CES 2013 is anything to go by, it looks like living room TVs are set to veer off in two different technological directions: screens are going to get thinner with the introduction of cutting-edge OLED panels, while some displays are going to get bigger. Much bigger. This physical increase in the size of HDTVs will be matched by an appropriate boost in resolution, with standard 1080p seeing a 2x boost in both directions, resulting in the emergence of a new 3840x2160 "ultra-HD" 4K standard.

With Sony at the forefront of this new wave in displays, rumours are already circulating that the next-generation PlayStation (codenamed Orbis) will feature some level of 4K support, and of course we know that Polyphony Digital has also ported across a small array of Gran Turismo 5 Prologue content onto a 4K-compatible set-up powered by four PS3s operating in parallel, each rendering a quarter of the display at 1080p. All of which leads us to wonder: just how much 3D rendering power is required to produce a compelling 4K experience, how do cutting-edge games look in ultra-HD, and will the next-gen consoles have the horsepower to run advanced titles at this resolution?

We already have some experience of ultra-resolution gaming of course, having played a range of releases on the Retina MacBook Pro, with its frankly beautiful 2880x1800 15-inch display. Cutting-edge titles like Battlefield 3 and Crysis 2 were off the table even at the lowest settings, but Skyrim and Batman: Arkham City were playable, albeit with big cuts to the quality settings. However, the Retina MacBook Pro is a laptop with relatively little rendering power behind it - next-gen demos such as Unreal Engine 4 and Square Enix's Agni's Philosophy have all been powered by the combined might of Intel Core i7 CPUs and the NVIDIA GeForce GTX 680 graphics card. So we decided to follow suit with a similar set-up.

Sony wowed CES attendees with this 56-inch ultra-HD prototype, which combined 4K resolution with a state-of-the-art OLED panel. While this exact model may not make it to market, 4K is the manufacturers' next major technological gambit.

Maybe things have changed with this new wave of ultra-HD displays, but prior to CES, actually displaying 4K on an existing display was a bit of a challenge. It can be done via HDMI, but the bandwidth ceiling ensures that the refresh rate is limited to 30Hz - not really a problem for gaming since 95 per cent of console titles run at 30FPS, but we'll need to wait for the upcoming HDMI 2.0 standard to achieve 4K resolution at a 60Hz refresh. Until then, 4K gameplay at 60 frames per second can be achieved, but it's a complex set-up involving four digital video inputs, each addressing a quarter of the screen - just like the Polyphony GT5 4K demo. We'd imagine that AMD Eyefinity and its NVIDIA alternative could accommodate this, but we wanted to capture our work too, so we opted for the single connection approach, hooking up our PC to our in-development next-gen capture card.
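A quick back-of-the-envelope calculation shows why the current HDMI standard tops out at 30Hz for 4K. The figures below are our own arithmetic on active pixels only (real links also carry blanking intervals, which add further overhead), and the ~8.16 Gbit/s payload figure assumes HDMI 1.4's 10.2 Gbit/s TMDS ceiling with 8b/10b encoding:

```python
# Back-of-the-envelope HDMI bandwidth check for 4K video.
# Active pixels only - blanking intervals add further overhead.

def video_gbps(width, height, fps, bits_per_pixel=24):
    """Raw uncompressed video data rate in gigabits per second."""
    return width * height * fps * bits_per_pixel / 1e9

# HDMI 1.4: ~10.2 Gbit/s TMDS, ~8.16 Gbit/s of actual video data
# after 8b/10b encoding overhead (assumed figures).
hdmi_1_4_payload = 8.16

for hz in (30, 60):
    rate = video_gbps(3840, 2160, hz)
    verdict = "fits within" if rate <= hdmi_1_4_payload else "exceeds"
    print(f"4K @ {hz}Hz: {rate:.2f} Gbit/s - {verdict} ~{hdmi_1_4_payload} Gbit/s")
```

At 30Hz the ~5.97 Gbit/s stream squeezes through; at 60Hz the ~11.94 Gbit/s requirement blows past the link, hence the four-input workaround.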

The results are surprising. On a PC you can build yourself for around £800, 4K is viable, and it can look beautiful. While we still despair at developers and publishers releasing ultra-resolution screenshots in their press materials that are hardly indicative of the final 720p releases, 4K is the "bullshot" dream made playable - extreme pixel counts rendered in real-time. It's not without its faults, but there are moments when everything comes together, where we get the Dead End Thrills experience in a fully interactive form.

We aimed high with our first test: DICE's Battlefield 3. Now, we know that the GTX 680 can produce some excellent results here. In our review of the hardware, we found that we could get something approaching a 1080p60 experience on ultra settings. While resolution quadruples at 4K, our 30FPS ceiling means that the jump to full 3840x2160 wouldn't be quite so extreme. While we couldn't get ultra settings to work (we actually received DirectX out-of-memory errors), the high quality level proved to be playable.

"Battlefield 3 at high settings offers similar performance at 4K to the console versions on our PC, with further tweaks required to sustain a consistent frame-rate."

YouTube video: Battlefield 3 PC at 4K Resolution Frame-Rate Tests
Performance analysis of Battlefield 3 running with v-sync engaged on high settings, with resolution set to 3840x2160 - consumer-standard 4K. We uploaded a full 4K encode to YouTube, which you can view in a somewhat bit-rate-starved form via the 'original' quality option.

By default, we miss out on BF3's showstopping 4x MSAA, available only on ultra settings, but in truth we didn't feel too bereft by its loss. Post-process anti-aliasing has some real issues at 720p, but the higher up the resolution chain you go, the less noticeable they become. At 3840x2160, it's fair to say that post-AA's annoying pixel-popping artefacts are a complete irrelevance.

That's not to say that our first test was a complete success, however. The stark, pristine nature of the visuals looked a little "boxy", while low-polygon objects and basic textures could look pretty rough, but the bottom line is that we're playing the game at a level that DICE never really anticipated. You'll also note that we had some frame-rate issues too - advanced effects work could see palpable drops in response and there was definitely a console-like feel to the controls (in all of our tests we played with a 360 pad to mimic a living room experience). The chances are that by adjusting a few settings down to the medium level, we could achieve a more consistent frame-rate without missing out on too much bling. In a console-specific scenario, developers would have the luxury of coding to a fixed platform and could optimise on a level-by-level, scene-by-scene basis. But the overall takeaway from this initial test was positive - we were gaming at 4K at high quality settings and the experience was more than playable enough.

The same can't be said for Crytek's visual masterpiece, Crysis 2. In DirectX 11 mode, we lost our consistent 30FPS simply by moving from the high settings to very high (that's from low to medium by any other game's definition), while extreme and ultra were completely off the table, frame-rate collapsing whenever effects work dominated the scene. That's not to say that we couldn't enjoy a superb experience though. In DX9 mode, anything above the very-high quality setting was too much, but we could retain consistent performance at very-high, and the high-resolution textures remained in play. The overall impression was even more striking than Battlefield 3: this game reached a whole new level at 4K, with an astonishing wealth of detail in the visuals and effects work, along with motion blur doing an excellent job of mitigating the traditional 30FPS judder.

"Take your pick: Crysis 2 at high/DX11 or very high/DX9 both offer a stunning, smooth 4K experience on our Core i7/GTX 680 gaming PC."

YouTube video: Crysis 2 PC at 4K Resolution Frame-Rate Tests
Crysis 2 in DX9 form on very-high settings offers up a visually sumptuous experience that doesn't deviate much from the 4K standard's target 30Hz update. Uploaded in native 4K - be sure to check out the higher quality encodes on offer.

On top of that it was really rare for us to find much in the way of sub-par texture work and in terms of overall detailing, only foliage and trees felt like a bit of a let-down - and that was mostly down to a lack of geometry on these elements, which really stood out at this extreme resolution. Indeed, overall detailing was so fine that in some places we actually had difficulty capturing footage in real time - by using mathematically lossless compression we could acquire video and stream it to a Samsung 840 SSD at around 180MB/s. Crysis 2's detail level was a compression nightmare in certain areas, with bandwidth spikes well in excess of the drive's maximum 260MB/s sequential write speed.
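To put those capture numbers in context, here's our own rough arithmetic (the 180MB/s and 260MB/s figures are from our testing above; the raw rate is simple maths): uncompressed 4K at 30FPS and 24 bits per pixel is around 746MB/s, so even a roughly 4:1 lossless compression ratio sits comfortably on the SSD - until a detail-heavy scene compresses badly and the data rate spikes past the drive's ceiling.

```python
# Rough data-rate arithmetic for capturing 4K30 video losslessly.

def raw_mb_per_s(width, height, fps, bytes_per_pixel=3):
    """Uncompressed video data rate in MB/s (decimal megabytes)."""
    return width * height * fps * bytes_per_pixel / 1e6

raw = raw_mb_per_s(3840, 2160, 30)   # ~746 MB/s uncompressed
typical_compressed = 180             # MB/s observed average with a lossless codec
drive_limit = 260                    # MB/s SSD sequential write ceiling

print(f"Raw 4K30 at 24bpp: {raw:.0f} MB/s")
print(f"Implied lossless compression ratio: {raw / typical_compressed:.1f}:1")
# Detail-heavy scenes compress far worse, so momentary spikes can
# exceed drive_limit even when the average sits well below it.
```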

Overall we came away rather happy with general performance here - the game looked fantastic and by sticking to the very-high graphical quality setting, we only witnessed a handful of dropped frames, none of which had any impact on the integrity of the overall experience.

Next up, Criterion Games' Need for Speed: Most Wanted. We were keen to give this one a go for a handful of reasons - firstly, the developer's lighting and effects work is exemplary and we were keen to see how this would scale up, and secondly because real-life car models tend to look absolutely fantastic when running at extreme resolutions. Finally, we were really impressed with Dead End Thrills' efforts here and were wondering what we could achieve playing at an even higher resolution.

We went into this one with some trepidation, as the current build of Most Wanted does have some performance issues on PC - to the point where we could not achieve 1080p60 gameplay on a GTX 670 with all settings maxed when we put together our last Need for Speed Face-Off. However, on the slightly faster GTX 680, we could run the game at something approaching a locked 30 frames per second with all settings ramped up to their limits, with only the ambient occlusion setting pared back to the medium level. We were particularly satisfied that we could also invoke the maximum geometry setting in order to really get the most out of the detailed model work, and this really pays off at 4K.

"We could afford lavish quality settings with Need for Speed: Most Wanted and enjoy 4K gaming with the same 30FPS update as the current-gen console versions."

YouTube video: Need for Speed: Most Wanted at 4K Resolution Frame-Rate Tests
Need for Speed: Most Wanted - it's been slated by many for its performance issues on PC, but we were able to get an excellent 4K experience on our i7/GTX 680 set-up with only a tiny, imperceptible quality downgrade.

Crash sequences and certain scenarios involving effects work can see the frame-rate drop, but overall performance is pretty consistent, certainly standing up very well to the Xbox 360 and PlayStation 3 versions. However, in common with the rest of the games we tested, the 30Hz limitation of the current 4K standard is something of an issue: PC offers the ability to run at 2.5K - 2560x1440 at 16:9 or 2560x1600 in a 16:10 configuration - at full 60Hz, and there's a strong argument that this is the best all-round package, offering an immediately noticeable boost to display resolution without compromising refresh.

Our final test sees us returning to the Retina MacBook's finest hour - Batman: Arkham City. If we could achieve reasonable performance on a mobile graphics core, we should expect much, much more from the Core i7 desktop chip in combination with the GTX 680 - and the hardware delivered. The performance-sapping DX11 modes were off the menu, and once again we opted to swap out bandwidth-intensive multi-sample anti-aliasing for FXAA on its high quality level, but even with PhysX enabled at its normal level, we could enjoy gameplay that delivered what amounts to a locked 30 frames per second. Our feeling is that we could have pushed the boat out still further given more time, but we were fairly happy with the results as they were - many Unreal Engine 3 games have a vast amount of intricate high-frequency detail in their artwork, Arkham City being a classic case in point. 4K offers up the pixel density to render it all without distracting specular aliasing.

At its best, Arkham City looks simply magnificent when offered such a large canvas on which to display its wares, but of all the games we tested at 4K, it was also here that we discovered the most obvious drawbacks of running assets at display modes they were never designed for. Despite some artwork upgrades over the console versions, close-up camera viewpoints on Batman and his supporting cast and environmental detail reveal textures that are simply too low-res to look effective when rendered in ultra-HD.

"Batman: Arkham City can look stunning at 4K but also demonstrates that current-gen texture assets and geometry can look pretty poor up-close."

YouTube video: Batman: Arkham City at 4K Resolution Frame-Rate Tests
Batman: Arkham City could be run at 2880x1800 on the Retina MacBook Pro with a relatively meagre GT 650M GPU. We could ramp up the settings massively on our test rig and still get a locked 30FPS during gameplay. Apologies for the lack of in-game audio - it corrupted on our test set-up, so enjoy some excerpts of the brilliant soundtrack in its stead.

The big question: will next-gen consoles support 4K?

In the cold light of day, we see 4K development on next-gen mirroring what happened with 1080p support on the current-gen consoles. Developers en masse decided that 720p resolution provided the best balance between detail and effects work, with very few actually targeting full HD, since its requirements simply couldn't be met by the GPUs in the Xbox 360 and PlayStation 3. Looking back, we only saw a handful of AAA titles that actively targeted 1080p and even then, the most advanced games only achieved it with significant compromises - Gran Turismo 5's 1280x1080 native resolution being a case in point.

The challenges of developing for 4K are perhaps even more pronounced in comparison to the current-gen 720p/1080p divide. Full HD is 2.25x the pixel count of standard 720p, while 4K quadruples the 1080p pixel count. We were able to get some surprisingly good results from our chosen PC releases, but it's worth remembering that these are current-gen titles at their hearts - we would expect more from games developed with newer hardware in mind.
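The resolution jumps above are straightforward to verify with a few lines of arithmetic (simple pixel-count maths, not figures from any external source):

```python
# Pixel-count comparison behind the 720p/1080p/4K resolution jumps.

resolutions = {
    "720p":  (1280, 720),
    "1080p": (1920, 1080),
    "4K":    (3840, 2160),
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}

print(f"1080p vs 720p: {pixels['1080p'] / pixels['720p']:.2f}x")  # 2.25x
print(f"4K vs 1080p:   {pixels['4K'] / pixels['1080p']:.2f}x")    # 4.00x
print(f"4K vs 720p:    {pixels['4K'] / pixels['720p']:.2f}x")     # 9.00x
```

That 9x gap between 720p - the de facto current-gen rendering target - and 4K is the heart of the challenge for next-gen hardware.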

We may see some showcase games designed around 4K, but our guess is that they will be few and far between. Our information from developer sources suggests that even Sony - with much to gain in promoting 4K gameplay bearing in mind its upcoming range of screens - is making no attempt to evangelise the new display format to third-party developers, with 1080p the target resolution for Orbis titles. As both the next-gen PlayStation and Microsoft's Durango have a hell of a lot in common from a technological standpoint (more on that soon), we'd also venture to suggest that 1080p is the target for this year's new Xbox too.

With 1080p so firmly established as a display standard, we see 4K as a niche market and it's one that can be serviced beautifully by PC hardware, where you can sink in as much money as you want to get optimal results. PC is also where there are more practical budget options - a wave of 27-inch Korean IPS monitors with 2560x1440 resolutions using similar display tech to high-end Dell and Apple displays is providing an interesting desktop alternative for those interested in ultra-resolution gaming, something we hope to look at in an upcoming feature. 2.5K may sound like quite a downgrade from full-fat 4K, but in a desktop environment, with the screen right in front of the user, we'd still expect some beautiful results.

It's fair to say that the response to the 4K emphasis at CES has been somewhat negative though. Displays are an area where people expect their technology to last five or even ten years, and after the failure of 3D to gain traction many view the arrival of a new standard with some suspicion - a pointless upgrade, if you will. However, we see things differently - we see 4K as an option, just as 1080p was when PlayStation 3 launched back in 2006. Nobody is forcing us to upgrade and the arrival of the next-gen consoles will only strengthen the grip of full HD as the de facto standard. But for those with the money to burn, who are ready to customise their gaming kit to meet the phenomenal rendering challenge, the overall conclusion we draw from our testing is that 4K really does have the potential to be quite special...