Tech Interview: LittleBigPlanet 2 • Page 2

Alex Evans on building a platform for games.

Digital Foundry: Screenshots of LBP2 show remarkable improvements to an already-convincing lighting model, with realistic ambient occlusion and soft shadowing. The original Temple backdrop has been shown with shadowing on the elephant statues in the new engine. How has the lighting model changed for LBP2? For example, have you added a shadow technique that improves the backgrounds, and can you now shadow point lights, something that was missing in LBP?

Alex Evans: The irradiance slices technique I presented at SIGGRAPH 2007 wasn't actually used in that form in LBP, in the end. However, for the record, it does support (shadowless) point lights natively. For LBP1, I actually moved to something a bit more 'deferred' (see my SIGGRAPH 2009 talk) - I believe it would now be called something like 'light pre-pass rendering' - but the details are not that interesting. However, the idea of volume-based lighting remained at the back of my mind because it is so neatly uniform.

For LBP2, it's been brought back: every frame, I dynamically 'voxelise' the whole visible scene, and then 'splat' light into that. Because the geometry of the whole scene is now in a volume texture, sampling for occlusion information just turns into volume texture lookups, which the RSX, in this case, is good at.

That now means that in LBP2, we have fun things like real 'world space' ambient occlusion, soft skylight shadows, and also shadows on every point light in the scene, without having to render shadowmaps for each.

The whole system is 'uniformly slow' in the sense that, apart from the very cheap per-light splatting into the volume, the cost of lighting and shading is fixed regardless of the number of lights.

The downside, other than time per frame, is that the volume is relatively low-resolution - something like 160x90x16 - so the shadows are quite fuzzy and soft. But the resulting volumetric god rays, and the improved 'chiaroscuro' [use of light and shade], are worth it! Oh, and it also means the engine is no longer 'deferred' in any sense - being a traditional forward renderer makes alpha/transparency easy to do again, without special code paths.
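Alex's description - voxelise the visible scene each frame, splat light into the volume, then read occlusion back as volume lookups - can be sketched in miniature. Everything below (the tiny grid, the dict-based volume, the function names and the fixed-step ray march) is an illustrative assumption of mine, not Media Molecule's actual code:

```python
# Toy occupancy volume and shadow lookup, a sketch of the general idea only.

def voxelise(occupied_cells):
    """Build an occupancy volume from a set of solid cell coordinates."""
    return {cell: 1.0 for cell in occupied_cells}

def occlusion(vol, point, light, steps=16):
    """March from `point` toward `light`, accumulating occupancy.

    Returns a shadow factor in [0, 1]: 1.0 is fully lit, 0.0 fully blocked.
    Because the scene geometry already lives in the volume, shadowing each
    point light is just a series of volume lookups; no per-light shadow map
    has to be rendered.
    """
    px, py, pz = point
    lx, ly, lz = light
    transmittance = 1.0
    for i in range(1, steps + 1):
        t = i / steps
        cell = (round(px + (lx - px) * t),
                round(py + (ly - py) * t),
                round(pz + (lz - pz) * t))
        transmittance *= 1.0 - vol.get(cell, 0.0)
    return transmittance

vol = voxelise({(4, 4, 2), (5, 4, 2)})           # a small wall of solid voxels
lit = occlusion(vol, (0, 0, 2), (8, 8, 0))       # ray passes beside the wall
shadowed = occlusion(vol, (0, 4, 2), (8, 4, 2))  # ray passes through the wall
```

Note that the per-point cost depends only on the march length, never on how many lights or occluders the scene contains - which is the 'uniformly slow' property described above.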

Anton's also thrown in a really nice pre-computed GI solution for the backgrounds, and it's not conventional shadow-casting at all - it's a kind of compressed lightmap that allows you to move the sun around, wrapped over the background.
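One plausible shape for a 'compressed lightmap that allows you to move the sun around' - purely a guess at the general idea, not Media Molecule's actual format - is to bake irradiance per texel for a handful of sun directions and blend between the nearest bakes at runtime:

```python
import math

# Illustrative assumption: four baked sun directions, evenly spaced in angle.
BAKED_SUN_ANGLES = [0.0, math.pi / 2, math.pi, 3 * math.pi / 2]

def relight(texel_bakes, sun_angle):
    """Blend a texel's baked irradiance values for an arbitrary sun angle.

    texel_bakes holds one baked irradiance value per entry in
    BAKED_SUN_ANGLES; we linearly interpolate between the two nearest bakes.
    """
    n = len(BAKED_SUN_ANGLES)
    span = 2 * math.pi / n
    a = sun_angle % (2 * math.pi)
    i = int(a // span)
    t = (a - i * span) / span
    return texel_bakes[i] * (1 - t) + texel_bakes[(i + 1) % n] * t

# A texel fully lit with the sun at angle 0, fully shadowed at angle pi:
bakes = [1.0, 0.5, 0.0, 0.5]
noon = relight(bakes, 0.0)      # 1.0
dusk = relight(bakes, math.pi)  # 0.0
```

The appeal of a scheme like this is that the expensive visibility computation happens offline, while the runtime cost is a couple of texture fetches and a lerp.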

Digital Foundry: The use of SPUs in achieving phenomenal performance is well-documented. There's a direct bus linking RSX to the Cell. What advantages does this bring to the table and how do you leverage it in your games?

Alex Evans: Crikey, that's a specific question! To be honest we've approached it very much from the point of view of 'try stuff out and see if it goes fast enough'. The RSX is an odd beast in that sometimes it can surprise you in how fast it chews through things - perhaps it's the bus - and sometimes its performance just 'drops off a cliff'.

Every GPU has its foibles - and with the PS3, we didn't take a particularly scientific or analytic approach. We just threw lots of pasta at the wall and some of it stuck.

Digital Foundry: From a technical perspective, what were the key points of your LBP post-mortem once the game had shipped? What did you perceive to be the strengths and the weaknesses of the engine and how did this inform your intentions for the sequel? What lessons were learned and how has that affected the LBP2 engine design?

Alex Evans: 'Engine' means a lot of different things to different people. I'm a graphics guy, Dave did the physics, Paul and Luke worry about the scripting language, the UGC machinery, the DLC process, the resource management. All of these things were overhauled for LBP2, so it was really a process of cleaning and improving. We've released over 100 packs of DLC since launch, and as a studio it was a really interesting and difficult process to learn how to juggle multiple sub-projects within our team.

Martin, one of our producers, really did an amazing job - but we still ended up with a certain amount of fragmented attention on the team, at one point juggling four 'live' branches of the same codebase. Something that's easy for some, but not what we'd planned for.

In terms of the graphics engine, transparency was the most requested feature - and that motivated the switch back from deferred to forward rendering. The engine is still a very compact piece of code - probably because it's really just Anton (and previously me) working on it - I love the fact it still fits in a couple of source files and a few SPU jobs! All of the material shaders in LBP are procedurally generated with a few parameters, so it's a testament to the artists that they get so much from so little.

Constraints are good - and as an engine coder, if you give people too many 'knobs' they end up spending their whole lives tweaking them. Instead, we've got a constrained system and a demanding art department who really know how to milk it.

It's a lovely area of the code to hack on, because you can literally hack on one shader template and know that you can really shape the artistic look of the whole game from that one place.

The flipside is that we have a lot of old, important content to support - namely the millions of levels - and some of the seemingly tiny, relatively arbitrary or ill-considered choices, like the way we generate, name and store materials (in a massive flat directory, now with tens of thousands of files - oops!), really hurt us now.

We discovered that SVN ['Apache Subversion', a development revision management system] has lots of O(N^2) algorithms in it, where N is the number of files in a given directory - so our check-in and check-out times have been ballooning. It's always those kinds of things that end up sucking up time, rather than the fun part of actually messing with 'looks'.
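A common way out of the flat-directory trap - an assumption on my part, not anything Media Molecule has said they did - is to shard files into subdirectories by a hash prefix, much as git stores its objects. With 256 two-character shards, tens of thousands of files become a few hundred per directory, shrinking the N in those O(N^2) algorithms:

```python
import hashlib

def sharded_path(name, shard_chars=2):
    """Map a flat filename into a hashed subdirectory, e.g. 'a3/wood_01.mat'.

    The shard is the first couple of hex digits of a hash of the name, so
    placement is deterministic and roughly uniform across 256 buckets.
    """
    digest = hashlib.sha1(name.encode("utf-8")).hexdigest()
    return f"{digest[:shard_chars]}/{name}"

path = sharded_path("wood_01.mat")  # shard prefix depends on the name's hash
```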
