Tech Interview: Blur • Page 4

Tech wizards Steven Tovey and Charlie Birtwistle bring Bizarre's latest into focus.

Digital Foundry: Your rear-view mirror is quite unlike most other driving games'. For a start, you've actually got one! Not only that but it doesn't seem to do any of the usual technical shortcuts: it's big, has tons of detail and is fundamental to the gameplay. A technical nightmare, surely...

Charlie Birtwistle: From working on the PGR games we knew that rendering things like dynamic environment maps and the rear-view mirror were among the most expensive operations in the renderer. The cost of the additional scene graph traversal and rendering really hit us on both CPU and GPU performance.

The multi-threaded renderer in Horizon totally solved our CPU performance problems - the mirror is rendered on an SPU/core in parallel with the environment maps, road/water reflections and so on, so it's pretty cheap from a CPU point of view. Solving GPU performance was simply a case of building scalability into the renderer, so in the mirror view LOD distances are shortened to reduce vertex count, the pixel shaders are simplified, and some objects are omitted entirely.
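The per-view scalability Birtwistle describes - shortened LOD distances, simplified shaders, omitted small objects - can be sketched as a set of per-view render settings. This is a minimal illustration, not Bizarre's actual code; all names and thresholds here are hypothetical:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ViewSettings:
    """Hypothetical per-view rendering budget."""
    lod_distance_scale: float  # < 1.0 shortens LOD switch distances
    use_simple_shaders: bool   # swap in cheaper pixel shaders
    min_object_size: float     # cull objects smaller than this (world units)

MAIN_VIEW   = ViewSettings(lod_distance_scale=1.0,  use_simple_shaders=False, min_object_size=0.0)
MIRROR_VIEW = ViewSettings(lod_distance_scale=0.25, use_simple_shaders=True,  min_object_size=1.0)

def select_lod(distance: float, lod_distances: list, view: ViewSettings) -> int:
    """Pick an LOD index for an object; the mirror view scales the switch
    distances down, so lower-poly models kick in much sooner."""
    scaled = [d * view.lod_distance_scale for d in lod_distances]
    for i, threshold in enumerate(scaled):
        if distance < threshold:
            return i
    return len(scaled)  # beyond the last threshold: lowest-detail LOD

# At 30 units the main view still uses LOD 1; in the mirror, with its
# thresholds scaled to [5, 15], the same object drops to LOD 2.
assert select_lod(30.0, [20.0, 60.0], MAIN_VIEW) == 1
assert select_lod(30.0, [20.0, 60.0], MIRROR_VIEW) == 2
```

The point of a data-driven settings object like this is that the same scene traversal code serves every view; only the budget changes.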

The main point of the mirror is seeing the other cars behind you, and because a car obviously can't be in front of you and behind you at the same time, we're never rendering more than 20 cars even with the rear-view mirror.
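The counting argument here is that the front view and the mirror view partition the field between them, so the combined car count never exceeds the number of cars in the race. A real engine would use per-view frustum culling; this toy sketch (hypothetical, 2D positions only) just illustrates the partition:

```python
def split_by_view(cars, player_pos, player_forward):
    """Partition car positions into front-view and mirror-view sets using
    the sign of the dot product with the player's forward vector. Each
    car lands in exactly one set, so the two views together never render
    more cars than are in the race."""
    front, mirror = [], []
    for pos in cars:
        rel = (pos[0] - player_pos[0], pos[1] - player_pos[1])
        ahead = rel[0] * player_forward[0] + rel[1] * player_forward[1] >= 0
        (front if ahead else mirror).append(pos)
    return front, mirror

# 19 opponents scattered along the track: the two views split them,
# and the totals always sum to the field size.
opponents = [(float(i), float(i % 7) - 3.0) for i in range(-9, 10)]
front, mirror = split_by_view(opponents, (0.0, 0.0), (1.0, 0.0))
assert len(front) + len(mirror) == len(opponents)
```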

Steven Tovey: On PS3, the mirror (and real-time reflections) are rendered in parallel with the lighting on SPUs, so it was actually beneficial for that purpose to have the extra views!

Digital Foundry: Can you talk us through your photo mode? There's a strong sense of community throughout the game and it even extends to sharing your in-game screenshots.

Steven Tovey: From a technical point of view, photo mode does some very interesting image processing to achieve high-quality motion blur and bokeh, ultimately producing an effective 100x multisampled image.

One of the big engineering challenges of photo mode was that it couldn't really add to the memory footprint of the game, as it runs during a standard in-game pause. Given this, it had to perform its accumulation in a very resource-constrained fashion, which meant doing some neat tricks to accumulate using a small 32-bit per-pixel buffer.
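The interview doesn't reveal Bizarre's actual trick, but one standard way to accumulate many jittered samples without a wide accumulation buffer is an incremental running mean, where the stored value always stays in the same numeric range as a single sample. A minimal sketch of that idea, per pixel channel:

```python
def running_mean_accumulate(samples):
    """Accumulate N samples with the incremental mean update
    avg += (x - avg) / n.  The stored value never exceeds the range of
    one sample, so no wide (e.g. 64-bit sum) buffer is required -
    one reason a small 32-bit per-pixel buffer can suffice."""
    avg = 0.0
    for n, x in enumerate(samples, start=1):
        avg += (x - avg) / n
    return avg

# 100 jittered 'samples' of one pixel channel converge to the true mean,
# emulating a 100x multisampled result.
samples = [0.5 + ((i * 37) % 100 - 50) / 1000.0 for i in range(100)]
assert abs(running_mean_accumulate(samples) - sum(samples) / 100) < 1e-9
```

In a real 32-bit RGBA buffer each channel would also be quantised to 8 bits per update, which adds dithering and precision concerns this sketch doesn't cover.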

On a somewhat related note, there could be an interesting avenue of research here in utilising multiple GPUs in SLI.

In addition to our original Face-Off coverage, we have more PS3 Blur performance analysis right here. This video, taken from different stages, demonstrates just how absolutely solid the game is - the performance level in these stages actually exceeds what we saw in the original Face-Off analysis video.

Digital Foundry: There's something of an ongoing debate between PS3 and 360 owners about the quality of the online experience based on the underlying infrastructure. This is something that interests us because there's no actual way for the end-user to do like-for-like measurements. Is your netcode effectively identical cross-platform, or do you need to tailor it to each specific platform?

Steven Tovey: For Blur, the game team use a mixture of Demonware and some platform-specific APIs to give the player the most lag-free experience possible. Integration with Twitter and Facebook is also shared across all platforms through the Demonware library.

Digital Foundry: With Blur now shipping, have you done the requisite post-mortem on the game yet? What are the key lessons you've learned from a technical perspective in creating this game and what would you like to concentrate on improving in the future?

Charlie Birtwistle: One thing we'd definitely like to do is have a DirectX 11 renderer for the PC version, so really high-end machines can get the proportionally better performance and visuals they are capable of. We'd also like to look at improving our tools and pipelines, so it's easier for the artists to put more cool stuff into their environments more quickly than they could previously.

Steven Tovey: As a PS3 guy, I think there's a lot of potential for improvements in the future. I'm very excited by the work the God of War 3 guys have done with morphological anti-aliasing and think there are some other techniques in that vein that could be worth investigating.

There are some specific ideas we're playing around with at the moment, which should help to accelerate the RSX on our forthcoming titles, but nothing I can really talk about just yet.

Digital Foundry: We're just a few months away from celebrating the fifth birthday of the Xbox 360. At this point in the lifecycle of PS1 and PS2 we were looking at new hardware round about now, with design and technical creativity shifting away to new platforms. Not so this time. What are your views from a technical perspective on the "10-year lifespan" for PS3 and 360? Were these machines genuinely ahead of their time, or are they becoming tired now?

Charlie Birtwistle: I'd say the 10-year lifespan is pretty accurate. I think it's only in the last couple of years that multi-platform games have come anywhere near pushing the machines to their limits from a CPU perspective.

I suppose the sticking point is the GPUs, which are now fairly dated compared to modern parts. But you can always do more clever things on the CPU to improve GPU performance, such as offloading work to the CPU as we do with Blur's dynamic lighting, or coming up with more advanced and efficient LOD systems, so you get better visual fidelity out of the same number of triangles and pixels passing through the GPU.

In other words, there's a lot more to come from this generation of hardware, and I'd take with a pinch of salt any developer who claims they are totally maxing out the hardware.
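One common form of the CPU-assists-GPU idea Birtwistle mentions is light binning: the CPU assigns each dynamic light to the screen tiles it touches, so the GPU shades each tile against a short light list rather than every light in the scene. The interview doesn't describe Blur's actual lighting pipeline, so this is a generic, hypothetical sketch of the technique:

```python
def bin_lights_to_tiles(lights, tiles_x, tiles_y, screen_w, screen_h):
    """CPU-side light binning.  'lights' are (x, y, radius) circles in
    screen space; each light index is appended to every tile its radius
    overlaps.  Returns one light-index list per tile, row-major."""
    tile_w = screen_w / tiles_x
    tile_h = screen_h / tiles_y
    tiles = [[] for _ in range(tiles_x * tiles_y)]
    for idx, (x, y, radius) in enumerate(lights):
        x0 = max(0, int((x - radius) / tile_w))
        x1 = min(tiles_x - 1, int((x + radius) / tile_w))
        y0 = max(0, int((y - radius) / tile_h))
        y1 = min(tiles_y - 1, int((y + radius) / tile_h))
        for ty in range(y0, y1 + 1):
            for tx in range(x0, x1 + 1):
                tiles[ty * tiles_x + tx].append(idx)
    return tiles

# A small light near the top-left of a 400x400 screen, 4x4 tile grid:
# only the first tile receives it, so the GPU skips it everywhere else.
tiles = bin_lights_to_tiles([(50.0, 50.0, 10.0)], 4, 4, 400.0, 400.0)
assert tiles[0] == [0]
assert all(not t for t in tiles[1:])
```

The win is that a cheap CPU pass over tens of lights removes per-pixel work for thousands of pixels, which is exactly the kind of trade that keeps an ageing GPU competitive.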

Steven Tovey: There's little doubt that both the Cell and the 360 GPU were ahead of their time, but I think there's a great understanding now of what developing for these architectures entails.

Speaking about PS3 in particular, I think in the beginning various people did a great job of selling the whole "PS3 is hard" thing to everyone and that scared a lot of people, but the problem with developing for Cell is just the problem with writing multi-core systems in general. It's not really that much harder than coding for any other multi-core architecture when you get down to it.

I think once you accept the hardware for what it is and design around it you're not going to have too much trouble. If you try to drag a memory-unaware, single-threaded engine kicking and screaming to multi-core, then I think you're in for a rough ride whatever you do. I think that's what a lot of developers found early on, which is why it's taken a while for the general standard to progress to where we are now.

Taking the decision to start from scratch with the technology at Bizarre was a good thing that has allowed us to construct a solid foundation that we can build on for the rest of this hardware cycle.

I think PS3 certainly still has some secrets left to discover. There are some great ideas kicking around about new ways to push the platform even harder, and I'm personally looking forward to finding out what we can get from the hardware for future titles here at Bizarre.

Blur is out now for PC, PS3 and Xbox 360 and you can read our review and face-off elsewhere on the site.
