Tech Interview: Halo: Reach
An exclusive, in-depth chat with Bungie on the making of Reach.
Stop us if we're getting too technical... on the other hand, don't bother! When the opportunity arose to talk tech with Bungie on any topic of our choosing, let's just say that we didn't hold back. And luckily for us, neither did the studio. What we have here is a titanic 6,000-word insight into the technical make-up of the Xbox 360's biggest exclusive of the year.
In this piece, we go into depth on a vast array of technical topics: we talk about the deferred rendering solutions employed by the Halo games, the enhancements made for Reach, anti-aliasing, atmospherics and alpha, plus we talk about how high-dynamic range lighting is handled in the new game.
It's fair to say that this is pretty high-level stuff, but our other discussions on performance, co-op/split-screen, artificial intelligence, animation, motion capture, the return of the Elites, plus the process by which Bungie polishes its game before release should be accessible to all. Also, we get to the bottom of the mystery of the deleted campaign level that had the player in control of a Covenant Scarab...
Joining us for this interview are graphics engineer Chris Tchou, character designer Chris Opdahl, community/writer Eric Osborne, senior animation lead Richard Lico and character engineer Max Dyckhoff. Thanks to each and every one of them for putting so much time and effort into what is one of the most expansive and comprehensive tech interviews we've ever published here at Digital Foundry.
Halo 3 and Halo 3: ODST used a 'semi'-deferred two-pass rendering approach, except for small decoration objects like grass and pebbles which were forward-rendered in a single pass for speed. Semi-deferred rendering allowed us to have cheap decals, but we didn't use it for deferred lighting; the lighting was rendered in the second pass over the geometry so we could have complex light-map lighting and nice-looking metallic shininess (i.e. something better than Phong specular). For Halo: Reach, we rebuilt the deferred buffers so they could better approximate our specular models, which let us use fast deferred dynamic lights everywhere without losing shininess.
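Bungie doesn't spell out its buffer layout here, but the two-pass idea itself can be sketched: a geometry pass writes per-pixel surface attributes to intermediate buffers, then a lighting pass shades from those buffers alone. This is a minimal illustration with a made-up G-buffer layout and a simple Lambert light, not Reach's actual formats or specular model:

```python
# Minimal two-pass (deferred-style) shading sketch for a handful of pixels.
# The G-buffer layout and lighting model are illustrative placeholders.

def geometry_pass(scene_pixels):
    """Pass 1: rasterize geometry, writing surface attributes to a G-buffer."""
    gbuffer = []
    for px in scene_pixels:
        gbuffer.append({
            "albedo": px["albedo"],   # base surface color
            "normal": px["normal"],   # surface normal (unit vector)
            "depth":  px["depth"],    # scene depth, usable by decals/lights
        })
    return gbuffer

def lighting_pass(gbuffer, light_dir, light_color):
    """Pass 2: shade every pixel from the G-buffer alone -- no geometry needed."""
    shaded = []
    for g in gbuffer:
        n_dot_l = max(0.0, sum(n * l for n, l in zip(g["normal"], light_dir)))
        shaded.append(tuple(a * c * n_dot_l
                            for a, c in zip(g["albedo"], light_color)))
    return shaded
```

The point of the split is that pass 2's cost depends only on pixel count and light count, not on scene complexity, which is what makes "deferred dynamic lights everywhere" affordable.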
On top of that, we also built a system to determine when objects were not taking advantage of the deferred path (i.e. they had no decals or complex deferred lights touching them) and switch those objects on the fly to the faster one-pass forward rendering. Yaohua Hu also spent a lot of time researching an improved light-map representation (it's better than spherical harmonics!) that gives us the same support for area light sources, improved contrast, fewer artifacts, a smaller memory footprint and much better performance. This helped to free up a lot of GPU time to use for the dynamic deferred lights and other graphical goodies.
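The per-object path switch described above amounts to a simple decision: if nothing that needs the deferred buffers touches an object, render it in one forward pass. A hypothetical sketch of that heuristic — the function and its criteria are illustrative, not Bungie's actual code:

```python
# Hypothetical per-object render-path selection, as described in the interview:
# objects untouched by decals or complex deferred lights fall back to the
# cheaper single-pass forward path. `intersects` is an assumed overlap test.

def choose_render_path(obj, decals, deferred_lights, intersects):
    """Return 'forward' when the deferred pass would buy this object nothing."""
    touched_by_decal = any(intersects(obj, d) for d in decals)
    touched_by_light = any(intersects(obj, l) for l in deferred_lights)
    return "deferred" if (touched_by_decal or touched_by_light) else "forward"
```

Because the inputs (active decals, light volumes) change frame to frame, the choice can be re-evaluated on the fly, exactly as the quote describes.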
We built a particle system to handle the specific case of numerous small transient particles - basically rock chips, dirt puffs, rain drops, splashes, sparks and that kind of thing. I am presenting it in more detail at the next GDC, but the neat part is that it can handle tens of thousands of collisions/bounces each frame by reading the depth and normal buffers, and the whole thing takes less than 0.3ms (about 1/100th of a frame); which looks pretty good compared to the seven standard particle collisions per frame allowed by the effects budget.
The new particle system allowed the effect artists to use huge numbers of these small colliding particles in their effects without worrying about performance at all. Oh and it's used for the rain too: if you watch the rain in slow-motion in the theatre mode you can follow a single rain drop as it falls until it splashes onto something!
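The trick described above — colliding particles against the frame's own depth and normal buffers rather than against world geometry — can be sketched in a few lines. This is an illustrative simplification (the projection into screen space is assumed to have already happened, and the buffer contents are placeholders), not Bungie's GPU implementation:

```python
# Illustrative screen-space particle collision: a particle that has moved
# behind the depth stored at its pixel is bounced off the stored normal.

def reflect(velocity, normal, restitution=0.5):
    """Reflect a velocity vector about a surface normal, losing some energy."""
    v_dot_n = sum(v * n for v, n in zip(velocity, normal))
    return tuple(restitution * (v - 2.0 * v_dot_n * n)
                 for v, n in zip(velocity, normal))

def collide_particles(particles, depth_buffer, normal_buffer):
    """Bounce any particle that has passed the scene depth at its pixel."""
    for p in particles:
        x, y = p["pixel"]                      # screen-space position (given)
        if p["depth"] >= depth_buffer[y][x]:   # behind the visible surface?
            p["velocity"] = reflect(p["velocity"], normal_buffer[y][x])
    return particles
```

Because the test is just two buffer reads per particle, the cost scales to tens of thousands of particles — which is why it runs in a fraction of a millisecond where traditional per-particle geometry raycasts would not.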
We use a single 7e3 buffer for our final render target in Reach. This results in a more limited HDR (about 8x over the white point, as opposed to 128x in Halo 3) but is much faster for transparents and post-processing. In practice, the difference between 8x and 128x HDR is slight - the main thing you may notice is that the bloom around bright areas loses its color more often, desaturating to white.
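For context, 7e3 is the Xbox 360's 10-bit-per-channel floating-point render-target format: each color channel stores a 7-bit mantissa and a 3-bit exponent. A sketch of the usual decoding, assuming an exponent bias of 3 (which yields the format's commonly cited [0, 31.875] range — where the white point sits inside that range is an exposure choice in the engine):

```python
def decode_7e3(e, m):
    """Decode one 7e3 color channel: 3-bit exponent e, 7-bit mantissa m.
    Assumes an exponent bias of 3; e == 0 is treated as a denormal."""
    assert 0 <= e <= 7 and 0 <= m <= 127
    if e == 0:
        return (m / 128.0) * 2.0 ** (1 - 3)      # denormals: no implicit 1
    return (1.0 + m / 128.0) * 2.0 ** (e - 3)    # normals: implicit leading 1
```

The largest encodable value is `decode_7e3(7, 127) == 31.875`, a far narrower range than Halo 3's dual-buffer scheme — hence the reduced HDR headroom the interview describes.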
And yes, a single 7e3 buffer gives us more available EDRAM for the final lighting pass, but the render resolution is still limited by the three buffers used in the main deferred pass. Halo 3's resolution was more constrained because we had to reserve some EDRAM for dynamic shadows during the lighting pass, alongside the two HDR buffers and a depth buffer. With a single 7e3 buffer there's plenty of room left over for the shadows, so the resolution limit comes only from the three buffers used during the deferred pass.
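The arithmetic behind that limit is easy to check. The 360 has 10MB of EDRAM, and a 32-bit surface at Reach's widely reported 1152x720 resolution takes about 3.16MB, so three such surfaces just squeeze into a single EDRAM tile — which three surfaces at full 1280x720 would not. A quick verification, assuming 4 bytes per pixel per surface:

```python
EDRAM_BYTES = 10 * 1024 * 1024  # the Xbox 360's 10 MB of EDRAM

def surfaces_bytes(width, height, count, bytes_per_pixel=4):
    """Total EDRAM footprint of `count` 32-bpp render targets."""
    return width * height * bytes_per_pixel * count

# Three 32-bpp surfaces at 1152x720 fit in one EDRAM tile (~9.49 MB)...
assert surfaces_bytes(1152, 720, 3) <= EDRAM_BYTES
# ...while the same three surfaces at 1280x720 would not (~10.55 MB).
assert surfaces_bytes(1280, 720, 3) > EDRAM_BYTES
```

Exceeding the 10MB budget forces predicated tiling — exactly the latency and geometry-resubmission cost discussed in the next answer.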
Multiple tiles are problematic: they either add too much controller latency (because predicated tiling delays the GPU start), or they result in too many passes over the geometry, eating up large amounts of CPU (basically rendering everything twice). Another factor is the 360's DAC, which has the super fancy up-sampling filter that hides any artifacts - we actually ran user tests on various resolutions and no one could tell the difference! So we chose to take the additional performance and reduced controller latency over the nearly imperceptible resolution increase.