
Digital Foundry vs. Unreal Engine 4

Video, screenshots and analysis of Epic's brand new next-gen engine.

At E3 this year, Epic delivered something few others dared to offer - a vision of the future in gaming, a look ahead to the graphical possibilities afforded by the next generation of console hardware. We've seen it in motion and today, on this page, you will too. Unreal Engine 4 may not be entirely what you expect, but it is very real, obviously ambitious and, in many ways, rather spectacular.

When we think of what the next generation represents from a rendering perspective, we look at PC graphics hardware for our lead: increased detail through tessellation, GPU compute shaders, enhanced post-processing effects. We look towards current iterations of technologies like Frostbite 2 and CryEngine 3 that bridge the gap between the HD consoles and cutting-edge PC rendering. The thing is, as Epic has demonstrated with its Samaritan demo, the existing Unreal Engine 3 can do that too - we should fully expect to see plenty of cross-generational titles running on PS3, 360, PC and next-gen consoles using Epic's existing middleware.

This Unreal Engine 4 demo is something else. The basic principles it is built upon strongly suggest that games based on this platform simply couldn't be achieved on current-gen hardware without fundamental compromise. In the here and now, most games use a mixture of static lighting - pre-calculated and "baked" into the environments - and dynamic light sources. The global illumination system employed by UE4 is all real-time, all the time: no faking, no baking - and the level of fidelity in the simulation is an obvious step beyond what is possible on existing console hardware.

"The real-time rendering focus allows developers to make changes to their game while it is actually running - something that wasn't possible with UE3."

The Elemental demo running in real-time on Unreal Engine 4. We hand-encoded each version of the video to provide optimal quality whether you're watching in SD or 720p high-def.

"There's no static lighting at all in this demonstration. We've actually removed the ability to bake down lightmaps," reveals Alan Willard, senior technical artist at Epic Games.

"Everything you're seeing is the result of lights I could find, select, move and completely change the look of at the exact same speed you're seeing right now. There's no re-calculation time for moving lights around, it's just a part of how the engine renders each scene."

This presents other advantages too. Everything is generated in real-time to the point where the entire game runs within the actual editor. Tweaks to game code are compiled in the background while the game continues to run. We've seen this sort of thing before in CryEngine 3 (which can also run the same code on multiple platforms simultaneously) and we've spoken to other developers that have their own real-time editing workflows, but this is new ground for Unreal Engine.

Certainly, the fidelity of the lighting model and the relationship between all objects in the scene is remarkable. Textures aren't just textures, they're full-fledged materials with unique properties which define how they are lit and how they interact with the rest of the world. As the Epic man puts it:

"Material defines how light interacts, how it is illuminated and also how it bounces light in both diffuse and specular, so I get full colour bounces off every object."

Willard picks up the hammer wielded by the Elemental Knight in the UE4 demo.

"We also support materials emitting light so this hammer emits light based on its actual temperature," he says.

"So the hotter the hammer, the brighter the light - and it's all based on the material and the surfaces. There's no light buried in the hammer, it's all completely real-time off of the surfaces themselves."

"Everything within the game world is lit accurately with respect to all the available light sources in the area, also factoring in light bouncing off other objects."

Dynamic Materials, Indirect Lighting and Particle Effects

Depending on the material, all objects reflect their surroundings according to how they are lit. Light can be both direct and indirect - it bounces off objects and even passes through them. Game designers can adjust the qualities of the materials, changing how much light passes through an object and how it is filtered. Absolutely everything is rendered in real-time - in terms of both light and shadow.

The materials themselves are dynamic too: as you'll see in the tools and features video on this page, the completely deferred nature of the new engine also extends to the implementation of deferred decals. Willard picks up a wet sphere and moves it around the room.

"So this sphere will drop a number of wet decals on the ground, which redefine not only the diffuse component of the ground but also its specular, roughness as well as the normals that are on the surface," he says.

"What this means is that I can have a complex surface that reflects in real-time all of the changes and can be changed by gameplay or anything else the designer chooses to do."

Light doesn't just illuminate objects and cast shadows. Just as in real life, it bounces. Willard points to a red carpet in a new room, adjusting the time of day so more sunlight enters - the net result being that the walls gradually pick up a red tint as more light bounces onto the surroundings. Epic is using a voxel-based approach to indirect lighting - somewhat intensive in terms of RAM, but with decent performance on high-end hardware at least. It appears a little reminiscent of the light propagation volumes developed by Crytek for its own CryEngine 3 middleware.

"Any change in the material, any change in the environment is reflected not only in the way that light illuminates something but also how light bounces off and affects the rest of the world," Willard explains.

"Lighting, indirect lighting, shadowing, particles, post processing effects - Unreal Engine 4 produces some phenomenal results on high-end PC hardware, but can next-gen console hardware handle it?"

Alan Willard presents the major new features of Unreal Engine 4 and demonstrates many of them in a slightly cut-down version of the presentation he gave at E3. In many ways, this talk is somewhat more 'illuminating' about the capabilities of the tech than the Elemental demo itself and it's a 'must watch' for anyone curious about how UE4 achieves its effects.

Particles have also seen an enormous improvement over UE3: smoke particles have volume, they cast shadows and are illuminated by the sun, by other dynamic light sources and even by sunlight reflected from the surrounding environment. The UE4 demo features a GPU particle simulation in which over a million particles are rendered in real-time, with interactive vector displacement fields determining their behaviour.
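The heart of such a simulation is a very simple per-particle update. The sketch below writes it as a plain CPU loop for clarity - in the demo this work runs on the GPU across more than a million particles - and the grid-based field lookup is an assumption on our part:

```cpp
// Per-particle update driven by a vector displacement field, written as a
// CPU loop for clarity; the real simulation runs this on the GPU.
#include <vector>

struct Float3 { float x, y, z; };

struct Particle { Float3 pos; Float3 vel; };

// Assumed trivial field lookup: nearest cell of a 3D grid of vectors.
struct VectorField {
    int n;                          // grid resolution per axis
    float cellSize;
    std::vector<Float3> v;          // n*n*n displacement vectors

    Float3 sample(const Float3& p) const {
        auto clampIdx = [&](float c) {
            int i = static_cast<int>(c / cellSize);
            return i < 0 ? 0 : (i >= n ? n - 1 : i);
        };
        int x = clampIdx(p.x), y = clampIdx(p.y), z = clampIdx(p.z);
        return v[(z * n + y) * n + x];
    }
};

void update(std::vector<Particle>& particles, const VectorField& field, float dt)
{
    for (Particle& p : particles) {
        Float3 d = field.sample(p.pos);      // local displacement/force
        p.vel.x += d.x * dt; p.vel.y += d.y * dt; p.vel.z += d.z * dt;
        p.pos.x += p.vel.x * dt; p.pos.y += p.vel.y * dt; p.pos.z += p.vel.z * dt;
    }
}
```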

Post-Processing and the Console Challenge

Post-processing effects also reach a new level of fidelity compared to current-gen standards: in the UE4 demo Willard shows off eye adaptation - a higher-precision, more physically correct version of an effect we've seen in current-gen titles, simulating the eye adjusting to sudden changes in light. Even an old favourite - lens flare - gets a new per-pixel lighting makeover. Inevitably, depth of field is covered too, with a high-quality implementation.
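Eye adaptation is commonly implemented as an auto-exposure loop. The following is a generic sketch of that idea rather than Epic's specific approach - measure the frame's average luminance, then ease the exposure towards a mid-grey target over time (the adaptation rate below is an assumed value):

```cpp
// Generic auto-exposure sketch (not Epic's exact approach): each frame,
// measure the scene's average luminance and ease the exposure towards the
// value that would map it to a mid grey.
#include <cmath>

struct EyeAdaptation {
    float exposure = 1.0f;          // current multiplier applied to the frame
    float adaptSpeed = 1.5f;        // assumed rate; bigger = faster adjustment

    void update(float averageLuminance, float dt) {
        const float midGrey = 0.18f;
        float target = midGrey / std::fmax(averageLuminance, 1e-4f);
        // Exponential ease so stepping from a dark interior into sunlight
        // takes a moment to settle, much as the eye does.
        exposure += (target - exposure) * (1.0f - std::exp(-adaptSpeed * dt));
    }
};
```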

"A lot of what we see that convinces your eye that something you see is real is a lot to do with post-processing and we've been spending a lot of time on that," says Alan Willard.

"As close as you get to reality it is always going to be a game and there are trade-offs you're going to make. So our big push is to give us much control as possible… to put it into the hands of the developers so if you choose to make a game that has realistic eye adaptation, that we have the tools available for you.

"But if you want to make something much more cartoony, you're not locked into 'well we did it this way so you're stuck doing that'. We tend to spend a lot of time making tools as broadly powerful as we can. There's a lot of things - motion blur, eye adaptation, lens flares - that are designed to bring us closer to cinematic photorealism rather than looking like an actual photograph, and we'll be continuing to evolve the engine on these lines for quite some time."

"Epic says that it doesn't know the final specs of the next-generation consoles and suggests that trade-offs may be required to translate this tech demo into something that can be used in-game."

So if this is a tech demo, just how much of it will we see in actual next-gen titles? The UE4 demo is running on PC, specifically an Intel Core i7 processor with an NVIDIA GTX 680 and 16GB of RAM - what Epic terms a standard development box. This is almost certainly considerably beyond the base hardware of both Orbis and Durango, but factoring in the advantages of a fixed hardware platform with dedicated APIs, the gap narrows.

"Obviously we don't know what the final specs are for the next-generation consoles and I'm sure we'll have to make trade-offs to put a final quality game onto whatever comes out," says Alan Willard.

"We have a pretty good history of making our tech demos look like what our final games are. Gears started off as a tech demo years ago at E3 in 2004 or so. We certainly don't try to fake what we're capable of doing. Obviously the engine is very new, we're still exploring what we can do with it and as more details come out on what the next generation hardware is, we'll have better ideas on what our final trade-offs will be. We're still waiting to find out ourselves."

We can't help but feel that Epic is perhaps playing with us just a little here. Bearing in mind the realities of modern GPU design (they can take years to architect and get into production) and the projected Q4 2013 release dates, Orbis and Durango are almost certainly in the final phases of development. As a major stakeholder in the games industry via its successful middleware business, and factoring in the company's previous input into the design of the Xbox 360, Epic must surely possess a rather good grasp of what these machines are capable of. This perhaps makes the UE4 demo even more exciting: what we're seeing here is Epic's vision of the fundamental building blocks that will underpin a whole generation of next-gen titles.