
The Red Faction Tech Interview: Part Two

In the first part of the Volition tech interview, we talked with associate producer Sean Kennedy and senior programmers Eric Arnold and Dave Baranec about a diverse range of topics associated with Red Faction: Guerrilla, the destruction model and the move to an open world being the key issues. In this concluding segment we're interested in a broader range of subjects, including the physics, the lauded multiplayer aspects of the game and, of course, the forthcoming DLC.

Digital Foundry: To what level of precision do you calculate the physics for the destruction in this game? There has to be some cut-off point where the additional calculations wouldn't be noticed by the human eye. I'm curious where the line falls between total realism and what you might call "smoke and mirrors".
Eric Arnold

There is definitely a diminishing returns curve here; the problem is that what's noticeable to the average player is a moving target based on what is happening in the game at any particular moment. Knocking a hole through a wall right in front of your face is a completely different problem from a two-storey office building collapsing in on itself. We had to handle both, and everything in between, without slowing the game down or pulling the player out of the fiction we created. As a result we have a number of systems that constantly monitor performance of the engine and change settings in real time to keep performance up while making the game look as good as possible. Basically we are always getting as close to real as possible.
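Arnold doesn't go into the specifics of those monitoring systems, but the general idea can be sketched in a few lines of C++: watch the recent frame-time average and dial destruction detail up or down to stay inside the frame budget. Every name and threshold below is hypothetical rather than taken from Volition's engine.

```cpp
// Hypothetical sketch of runtime performance scaling: average recent frame
// times and trade physics detail (debris counts, simulation distance) for
// frame rate. Names and thresholds are illustrative, not Volition's code.
#include <algorithm>
#include <deque>
#include <numeric>

struct DestructionSettings {
    int   maxActiveDebris   = 600;   // rigid bodies allowed to stay simulated
    float debrisLifetimeSec = 12.0f; // how long chunks persist before despawn
    float simRadiusMeters   = 80.0f; // radius around the player that simulates
};

class PerformanceGovernor {
public:
    // Call once per frame with the measured frame time in milliseconds.
    void Update(float frameMs, DestructionSettings& s) {
        history_.push_back(frameMs);
        if (history_.size() > kWindow) history_.pop_front();

        const float avg = std::accumulate(history_.begin(), history_.end(), 0.0f)
                          / static_cast<float>(history_.size());

        if (avg > kBudgetMs * 1.05f) {        // running hot: shed work
            s.maxActiveDebris   = std::max(150, s.maxActiveDebris - 25);
            s.debrisLifetimeSec = std::max(4.0f, s.debrisLifetimeSec - 0.5f);
            s.simRadiusMeters   = std::max(40.0f, s.simRadiusMeters - 2.0f);
        } else if (avg < kBudgetMs * 0.90f) { // headroom: restore quality
            s.maxActiveDebris   = std::min(600, s.maxActiveDebris + 10);
            s.debrisLifetimeSec = std::min(12.0f, s.debrisLifetimeSec + 0.25f);
            s.simRadiusMeters   = std::min(80.0f, s.simRadiusMeters + 1.0f);
        }
    }

private:
    static constexpr float  kBudgetMs = 33.3f; // 30fps target
    static constexpr size_t kWindow   = 30;    // frames averaged
    std::deque<float> history_;
};
```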

Digital Foundry: In what way do you tinker with the mathematical precision of the physics to make a more crowd-pleasing effect? Is pure realism in itself a bit too boring for a video game?
Eric Arnold

We spent more than half the time tweaking settings and turning knobs to get destruction to feel right. For the most part we stayed as close as possible to reality mainly so things react the way the player expects, but the rule was "this is a game, fun trumps correctness!" The largest example is probably the sledgehammer. Not even the world's strongest man could tear through a wall or send a bad guy sailing the way you can in the game, but that doesn't matter because it feels good and is a whole lot of fun. If we insisted on realism the player would spend half an hour chipping away at a wall to make a small hole (or more likely give up after a few swings because it is boring).

Dave Baranec

One of my favourite phrases about game development is "we're not making simulations, we're making games". This is often used to chide a young programmer who is trying to get too fancy or complex with a new piece of code. A corollary is "perception is everything". Now, RFG is definitely violating this law a bit – to get a simulation that looks and feels as realistic as ours does, you have to go in and do some real simulation-y work. There's just no way around it. But as with many things in game development, our physical model is a very rough approximation of reality. Civil engineers use something called matrix finite-element analysis to examine the true forces acting on a complex structure. It's very formal, expensive, but ultimately unnecessary for a game. So, we came up with some approximations that don't look much like the real thing under the hood. What was important was to get a bunch of eye-pleasing objects flying around and crashing into each other in believable ways. Better to have a game than a mathematically correct simulation that takes 30 minutes to render a frame.
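Baranec doesn't describe the approximation itself, so the following is only a guess at the flavour of shortcut he means: a cheap connectivity pass that treats a building as a graph of chunks and lets anything that loses its path to the ground fall, in place of a full finite-element solve. The names and structure are illustrative and not necessarily how RFG does it.

```cpp
// Sketch of one common cheap alternative to finite-element analysis: flood-fill
// support from grounded chunks and turn anything unreached into falling debris.
// All names are hypothetical.
#include <queue>
#include <vector>

struct Chunk {
    std::vector<int> neighbours; // indices of touching chunks
    bool grounded  = false;      // rests on terrain / foundation
    bool destroyed = false;      // already blown out by the player
    bool supported = false;      // filled in by the flood fill below
};

// Flood-fill outward from grounded chunks; anything left unsupported should be
// handed to the rigid-body simulation as free-falling debris.
void UpdateSupport(std::vector<Chunk>& chunks) {
    std::queue<int> open;
    for (size_t i = 0; i < chunks.size(); ++i) {
        chunks[i].supported = false;
        if (chunks[i].grounded && !chunks[i].destroyed) {
            chunks[i].supported = true;
            open.push(static_cast<int>(i));
        }
    }
    while (!open.empty()) {
        const int cur = open.front();
        open.pop();
        for (int n : chunks[cur].neighbours) {
            if (!chunks[n].supported && !chunks[n].destroyed) {
                chunks[n].supported = true;
                open.push(n);
            }
        }
    }
    // Chunks with supported == false are detached: spawn debris bodies here.
}
```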

And yes, there are some flat out hacks to do some crowd-pleasing things. For example, the nature of the system is such that it is easy for us to run a 3D plane through a structure and break it along that surface. If you watch large buildings coming down, every once in a while you'll see one "crack" in half down the middle. That's just a little touch we added using the splitting planes. Game tech is definitely part science and part art. I will say though that we were pleasantly surprised how much we didn't have to do this. The amount of amazing emergent physics you get just by letting the core system run was impressive.
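The splitting-plane trick Baranec describes can be pictured with a small sketch: classify each chunk of a structure by which side of a plane its centre falls on, then sever the joints that straddle it so the two halves come down separately. The types below are hypothetical and far simpler than anything in a shipping engine.

```cpp
// Illustrative splitting-plane pass: partition chunk centres by signed distance
// to a plane. The straddling joints would then be broken so each half collapses
// on its own. Types and names are invented for the example.
#include <vector>

struct Vec3 { float x, y, z; };

struct SplitPlane {
    Vec3  normal;   // assumed unit length
    float distance; // plane equation: dot(normal, p) = distance
};

inline float SignedDistance(const SplitPlane& plane, const Vec3& p) {
    return plane.normal.x * p.x + plane.normal.y * p.y + plane.normal.z * p.z
           - plane.distance;
}

void SplitStructure(const SplitPlane& plane,
                    const std::vector<Vec3>& chunkCentres,
                    std::vector<int>& frontHalf,
                    std::vector<int>& backHalf) {
    for (size_t i = 0; i < chunkCentres.size(); ++i) {
        if (SignedDistance(plane, chunkCentres[i]) >= 0.0f)
            frontHalf.push_back(static_cast<int>(i));
        else
            backHalf.push_back(static_cast<int>(i));
    }
}
```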

Digital Foundry: Pre-RFG, just about the only game we can think of that aspires to this level of destruction is Criterion's Black. While many elements of Black have made their way into the next gen, the wholesale destruction didn't... until RFG came along and took it to a whole new level. Is there any particular reason you can think of that developers have shied away from this? The biggest bangs seemed to be reserved for cut-scenes in the current generation.
Eric Arnold

That's an easy one: it is freaking hard! Not only do you have to spend a whole lot of time to create the technology (notice the five-year development cycle here? Ouch!), but it also creates problems for every discipline on the game. Rendering guys have to deal with way more stuff to put on the screen and make look pretty, AI guys and designers have to deal with the level constantly changing, sound people have to create assets for exponentially more interactions, and then if you want online play you have to come up with a way to synchronize all of it. That's not to mention the memory and processing time that large-scale destruction chews up. It is not a feature that can be dropped into an existing game; it has to be planned for up front.

Dave Baranec

I would bet that many developers have tried, only to shy away in horror. The problem with a system like this is that it touches absolutely everything else in the game. It makes rendering tremendously more difficult. It makes level design extremely hard. It causes memory usage for what appear to be simple structures to be staggeringly high. So if you want a whole-hog destruction system like we have, you had better be prepared to pay for it with a ton of effort and sacrifice. You have to be targeting a game where destruction is the game, otherwise you're paying an awfully high price. It seems to me that most games are aiming at something else entirely.

Digital Foundry: Performance between Xbox 360 and PlayStation 3 is close in this project, yet we're dealing with two entirely different architectures. What is your approach to cross-platform development, in particular with regard to utilising the many processors you have at your fingertips?
Eric Arnold

This was one of our top priorities. We knew from the start it would be cross-platform even though the PS3 development hardware was still years off when we started. Once we got it and had the game running, the two machines moved in lockstep to the end of the project. We even went as far as cutting optimizations because they would only work on one platform. For our own sanity we tried to keep the internals as close as possible, but with the SPUs we had to do custom solutions at times for performance reasons. For destruction, I had to physically remove functionality and code from Havok in order to make room for our system on the SPUs. Given how hard we are pushing the systems I'm still impressed that we were able to make them virtually identical; it certainly took a lot of hard work from some very smart people on the team to get there.

Dave Baranec

Generally speaking, we like to keep both platforms developing in parallel. You can't really afford to let one platform slip behind, because then it becomes difficult to make predictions about overall performance and features. Which then impacts the ability to create assets, which then impacts the schedule, which then impacts the budget, etc. It is especially critical for the tech end of things to keep the platforms in line and provide sophisticated tools so that content creators don't have to do N times as much work (where N = the number of platforms).

In terms of hardware specifics, the multiprocessing setups on the 360 and the PS3 are significantly different. At some point, you simply have to diverge big pieces of code to efficiently deal with this. For us, the two big areas here were physics and rendering. Both areas had high-level cross-platform frameworks, but when you get to the most performance-oriented areas, they did diverge into totally platform-specific code. Thankfully, the volume of code this represents relative to the overall codebase is very small.
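As a rough illustration of that "shared framework, divergent hot path" arrangement, the sketch below compiles one cross-platform entry point with a per-platform inner loop selected by preprocessor macros. The macros, types and stub bodies are invented for the example and are not Volition's code.

```cpp
// Sketch of a shared framework whose performance-critical inner loop diverges
// per platform at compile time. Everything here is a hypothetical stand-in.
#include <vector>

struct DebrisBody {
    float posY = 0.0f;
    float velY = 0.0f;
};

// Platform-specific inner loops. In a real codebase these would live in
// separate files (e.g. SPU jobs on PS3, thread-pool work on 360); here they
// are trivial stubs so the example is self-contained.
#if defined(TARGET_PS3)
static void IntegrateBatch(std::vector<DebrisBody>& bodies, float dt) {
    // Would build and kick SPU jobs; stubbed as a plain loop here.
    for (auto& b : bodies) { b.velY -= 9.81f * dt; b.posY += b.velY * dt; }
}
#elif defined(TARGET_XENON)
static void IntegrateBatch(std::vector<DebrisBody>& bodies, float dt) {
    // Would fan work out across the 360's hardware threads; stubbed here.
    for (auto& b : bodies) { b.velY -= 9.81f * dt; b.posY += b.velY * dt; }
}
#else
static void IntegrateBatch(std::vector<DebrisBody>& bodies, float dt) {
    // Reference/tools path.
    for (auto& b : bodies) { b.velY -= 9.81f * dt; b.posY += b.velY * dt; }
}
#endif

// The cross-platform entry point the rest of the engine calls.
void StepDebris(std::vector<DebrisBody>& bodies, float dt) {
    IntegrateBatch(bodies, dt);
}
```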