
LucasArts' 60FPS Force Unleashed II tech demo

A frame-rate upscaler that really works? Digital Foundry investigates.

The simple way to interpolate the intermediate image is to take the previous frame and filter it based on a "halfway backward" rendition of the next frame's velocity map.
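To make the idea a little more concrete, here's a minimal CPU-side sketch of that interpolation step. It assumes a per-pixel 2D velocity buffer stored with the next frame and uses simple point sampling; the structure names and helper functions are illustrative, not anything from Andreev's actual implementation.

```cpp
#include <algorithm>
#include <vector>

struct Vec2  { float x, y; };
struct Color { float r, g, b; };

// Simple image wrapper with a clamped texel fetch.
struct Image {
    int width = 0, height = 0;
    std::vector<Color> texels;
    Color fetch(int x, int y) const {
        x = std::clamp(x, 0, width - 1);
        y = std::clamp(y, 0, height - 1);
        return texels[y * width + x];
    }
};

// Build the in-between frame: for each pixel, step half of the next frame's
// velocity backwards and sample the previous frame there.
Image interpolateHalfway(const Image& prevFrame, const std::vector<Vec2>& nextVelocity)
{
    Image out;
    out.width  = prevFrame.width;
    out.height = prevFrame.height;
    out.texels.resize(out.width * out.height);

    for (int y = 0; y < out.height; ++y) {
        for (int x = 0; x < out.width; ++x) {
            const Vec2 v = nextVelocity[y * out.width + x];
            // "Halfway backward": only half the motion has happened at the
            // intermediate point in time, so look up the previous frame at -v/2.
            const int sx = static_cast<int>(x - 0.5f * v.x);
            const int sy = static_cast<int>(y - 0.5f * v.y);
            out.texels[y * out.width + x] = prevFrame.fetch(sx, sy);
        }
    }
    return out;
}
```

In the real renderer this runs as a full-screen pass on the GPU (or, on PS3, across the SPUs), but the principle is the same: every interpolated pixel is fetched from the previous frame, offset by half the motion that will have occurred by the next frame.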

This kind of interpolation produces some artifacting. Static geometry (environments, for example) generates a little of it, but those errors are barely apparent to the human eye. Dynamic geometry - characters and objects - is where the real issues appear. Andreev's ingenious solution on Xbox 360 is to down-sample the last frame to 640x360 and filter it in three passes to remove the characters completely, eliminating the most obvious artifacts. In a worst-case scenario in a game with a third-person camera like The Force Unleashed II, the overall impression will be of a 60FPS game running with 30FPS animation (a bit like the original Halo on PC, if you can remember back that far...)
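Andreev's slides spell out the exact passes; the sketch below just shows the general idea, under the assumption that a mask flags character pixels (written, say, via stencil while the characters are rendered) and that background colour is simply grown over those pixels across a few passes. The function name and fill strategy are ours, not LucasArts' shipped code.

```cpp
#include <cstdint>
#include <utility>
#include <vector>

// One texel of the 640x360 down-sampled previous frame.
struct Rgb { float r, g, b; };

// Paint background colour over character pixels so the interpolation step never
// drags character texels around. 'isCharacter' flags dynamic geometry; at
// 640x360 a handful of dilation-style passes is enough to cover the characters.
void removeCharacters(std::vector<Rgb>& frame, std::vector<std::uint8_t> isCharacter,
                      int width, int height, int passes = 3)
{
    for (int pass = 0; pass < passes; ++pass) {
        std::vector<std::uint8_t> stillMasked = isCharacter;
        for (int y = 0; y < height; ++y) {
            for (int x = 0; x < width; ++x) {
                const int idx = y * width + x;
                if (!isCharacter[idx]) continue;            // already background
                float r = 0.0f, g = 0.0f, b = 0.0f;
                int n = 0;
                for (int dy = -1; dy <= 1; ++dy) {          // average unmasked neighbours
                    for (int dx = -1; dx <= 1; ++dx) {
                        const int nx = x + dx, ny = y + dy;
                        if (nx < 0 || ny < 0 || nx >= width || ny >= height) continue;
                        const int nidx = ny * width + nx;
                        if (isCharacter[nidx]) continue;
                        r += frame[nidx].r; g += frame[nidx].g; b += frame[nidx].b;
                        ++n;
                    }
                }
                if (n > 0) {
                    frame[idx] = { r / n, g / n, b / n };
                    stillMasked[idx] = 0;                   // filled with background this pass
                }
            }
        }
        isCharacter = std::move(stillMasked);
    }
}
```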

On the left we see an untouched frame, and next to it artifacts found in the prototype of LucasArts' frame-rate upscaler. The image on the right shows an interpolated frame from a more advanced stage of development. Any low-resolution elements are only apparent every other frame, with the human eye's blind spot helping to blend them away.

The challenge from here is queuing up the flip so that the new image is displayed at the right time - no problem at all on either console if your game is locked at 30FPS. According to Andreev, if the game drops below 30FPS the PS3 is still able to carry out the "flip" at the right point between the two frames. However, on Xbox 360, Microsoft's TCRs - the technical rules which dictate what you can and can't do with its hardware - insist that all calls to the graphics hardware go through Microsoft's own APIs, and the 360's DirectX implementation offers no equivalent mechanism.
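At a locked 30FPS the cadence itself is straightforward: each rendered frame produces two flips, one for the interpolated in-between image and one for the real frame, each landing on a 60Hz vblank. The toy simulation below illustrates that timing, using sleeps and prints in place of real vblank-synchronised flips through a platform swap-chain - purely illustrative, not console code.

```cpp
#include <chrono>
#include <cstdio>
#include <thread>

// Toy simulation of the presentation cadence at a locked 30FPS: every rendered
// frame is shown for one 60Hz interval, preceded by the interpolated in-between
// image for the other interval.
int main()
{
    using clock = std::chrono::steady_clock;
    const auto interval = std::chrono::microseconds(16667); // one 60Hz vblank period

    auto next = clock::now();
    for (int frame = 1; frame <= 4; ++frame) {
        // ...render real frame N and build the interpolated frame here (~33ms budget)...

        next += interval;
        std::this_thread::sleep_until(next);
        std::printf("flip: interpolated frame between %d and %d\n", frame - 1, frame);

        next += interval;
        std::this_thread::sleep_until(next);
        std::printf("flip: real frame %d\n", frame);
    }
    return 0;
}
```

The hard part, as Andreev describes, is keeping that second flip correctly placed once the renderer misses its 33ms budget - which is exactly where the 360's API gets in the way.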

Andreev's solution for presenting the interpolated image in a sub-30FPS scenario involves talking directly to the hardware and bypassing the API, violating Microsoft's sinister-sounding "TCR #12" - in place to make sure that all games will work on all revisions of Xbox 360 past, present and future. There's nothing to stop Microsoft adding the required functionality to future revisions of its development tools, but it is an interesting reminder - if not a direct example - of how strict adherence to the 360's console-optimised DirectX layer may be holding more adventurous 360 developers back, something Digital Foundry has discussed in the past.

Talking about his sub-30FPS fallback mechanism for his frame-rate upscaling system, Dmitry Andreev laments that the "Direct3D API provides almost no control over the hardware which really sucks and makes implementation of rather basic things quite complicated."

However, the available demo looks pretty astonishing for a proof of concept, and according to Andreev, the bump from 30FPS to an interpolated 60FPS is indeed "free", in that the motion blur code it replaces is more "expensive" - taking up more system resources - than his frame-rate upscaler.

According to his figures, The Force Unleashed II's motion blur eats up 2.2ms of resources on Xbox 360 (give or take 0.4ms), while the five-SPU-powered PS3 version is much faster at 1.4ms (give or take 0.5ms). Compare this with the frame-rate upscaler, which runs at 1.5ms on 360, and 1.4ms on PS3 (again parallelised over five SPUs).

So, bearing in mind how exciting this technique is and how dramatic the results are, can we expect to see the frame-rate upscaler in effect in Star Wars: The Force Unleashed II?

"Regarding TFU2, no, we are not shipping with it as much as I would like to," Andreev tells us.

"Very early in engineering production (art pre-production) we decided not to change the workflow that can affect art and design. So early on we decided to go with 30FPS motion blur as it took our rendering team of three engineers more than a year to focus on performance and PlayStation3 SPU work."

More than that, while the technique looks impressive enough retrofitted into an existing game engine, Andreev reckons that the system could really come into its own if it is integrated into development at the earliest possible stages.

"This technique makes production of 60FPS games a lot easier, but it works best when you think about it during game design. Art, VFX, animations, etc... Production of native 60FPS games is almost unreachable, that's how hard that is, but this thing brings it much closer to 30FPS production."

While there is some disappointment at not seeing this technique in the final, shipping game, what we've seen of the motion blur system in the Xbox 360 proof of concept looks good, and based on info revealed at the SIGGRAPH talk, the PS3 version should have a significant quality advantage.

"It is worth saying that motion blur makes a huge difference when compared to running with no motion blur," Andreev points out in his presentation.

"But 60FPS rendering brings it to a different level. When running at 60FPS we can get away without the motion blur. It could still be used as an effect for things that are moving at non eye-trackable speeds."

While the system may not immediately be used to double 30FPS frame-rates, variants of the technique can be used to bring about other enhancements to image quality - and indeed, we can expect to see something of this at work in the forthcoming Crysis 2, if Crytek's recent presentation is anything to go by. Here we see how a different implementation of much the same principle smooths out aliasing issues (not just edge-based ones) in the far distance of the scene.

CryEngine 3 anti-aliasing demo.

In addition to a more common edge-detect/blur for close-up objects, they use a pixel re-projection approach for anti-aliasing far-away elements of the scene. They render the current frame, then using the last frame's camera they reverse-project each pixel into the screen space of the last frame. They then compare the current depth value with the last frame's depth value and, if they're similar, blend the two colour buffers together.
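Here's a rough CPU-side sketch of that re-projection test, assuming row-major matrices, depth values in the [0,1] range, nearest-neighbour lookups into last frame's buffers and a straight 50/50 blend. The function name, depth tolerance and blend weight are placeholders of ours; Crytek's actual shader will differ in the details.

```cpp
#include <cmath>
#include <vector>

struct Vec4  { float x, y, z, w; };
struct Color { float r, g, b; };
struct Mat4  { float m[4][4]; };          // row-major, multiplies a column vector

Vec4 mul(const Mat4& a, const Vec4& v) {
    return {
        a.m[0][0]*v.x + a.m[0][1]*v.y + a.m[0][2]*v.z + a.m[0][3]*v.w,
        a.m[1][0]*v.x + a.m[1][1]*v.y + a.m[1][2]*v.z + a.m[1][3]*v.w,
        a.m[2][0]*v.x + a.m[2][1]*v.y + a.m[2][2]*v.z + a.m[2][3]*v.w,
        a.m[3][0]*v.x + a.m[3][1]*v.y + a.m[3][2]*v.z + a.m[3][3]*v.w };
}

// Temporal re-projection pass: every current pixel is unprojected to a world
// position, projected again with last frame's camera, and blended with the
// colour it had there if the two depth values agree (i.e. it is the same
// surface and not a disocclusion).
void reprojectionAA(std::vector<Color>& currColor, const std::vector<float>& currDepth,
                    const std::vector<Color>& prevColor, const std::vector<float>& prevDepth,
                    const Mat4& currViewProjInverse, const Mat4& prevViewProj,
                    int width, int height, float depthTolerance = 0.01f)
{
    for (int y = 0; y < height; ++y) {
        for (int x = 0; x < width; ++x) {
            const int idx = y * width + x;

            // Pixel -> clip space of the current frame (depth in [0,1]).
            const Vec4 clip = { (x + 0.5f) / width  * 2.0f - 1.0f,
                                1.0f - (y + 0.5f) / height * 2.0f,
                                currDepth[idx], 1.0f };

            // Unproject to world space, then project with last frame's camera.
            const Vec4 world = mul(currViewProjInverse, clip);
            const Vec4 prevClip = mul(prevViewProj, { world.x / world.w, world.y / world.w,
                                                      world.z / world.w, 1.0f });
            if (prevClip.w <= 0.0f) continue;               // behind last frame's camera

            const float px = ( prevClip.x / prevClip.w * 0.5f + 0.5f) * width;
            const float py = (-prevClip.y / prevClip.w * 0.5f + 0.5f) * height;
            const int ix = static_cast<int>(px), iy = static_cast<int>(py);
            if (ix < 0 || iy < 0 || ix >= width || iy >= height) continue;  // off screen last frame

            const int pidx = iy * width + ix;
            const float reprojectedDepth = prevClip.z / prevClip.w;

            // Only blend when the depths agree; otherwise keep the current pixel.
            if (std::fabs(reprojectedDepth - prevDepth[pidx]) < depthTolerance) {
                currColor[idx] = { 0.5f * (currColor[idx].r + prevColor[pidx].r),
                                   0.5f * (currColor[idx].g + prevColor[pidx].g),
                                   0.5f * (currColor[idx].b + prevColor[pidx].b) };
            }
        }
    }
}
```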

Combining this with some per-frame jittering gives them an additional layer of anti-aliasing, and it's Crytek's belief that the overall result is better than Sony's extremely impressive MLAA solution which is now available for all developers to insert into their games as part of the PS3 SDK. Over and above this particular application, pixel re-projection could also have a part to play in producing a low cost (in terms of system resources) approach to stereoscopic 3D. Indeed, RedLynx's Sebastian Aaltonen, key tech guy behind Trials HD, has talked to us about something along these lines that he has experimented with in the past.

It's been almost five years since the Xbox 360 arrived in the hands of gamers, and while we're unlikely to see much in the way of new paradigms for rendering on the current generation machines, this technique is yet another example of how the technology can be pushed to new levels of performance that could never have been anticipated at launch. Enhancements in image quality aren't being achieved by brute force - indeed, you can almost describe techniques like this and Sony's MLAA as ingenious "hacks" that use developer cunning to produce often radical results.

With platform holders, publishers and developers alike hoping to get at least two to three more years out of the current gen platforms, innovation like this is hugely welcome - and equally, it's great to see game developers willing to share their techniques in forums like GDC and SIGGRAPH, leading to a commonwealth of knowledge that improves the overall standards of the games we play.