Digital Foundry: You're not shipping with the frame-rate upscaler in Force Unleashed II despite the pretty stunning demo. You've mentioned that it's preferable to build your art around this technique - how so? Would it be preferable to render some elements, for example animation, at 60Hz?
Dmitry Andreev: That's true, as very early in pre-production we decided to lock our feature set and not introduce extra changes to art and design production, so that when the artists hit production they had a very solid set of features they could rely on and didn't have to worry about anything. They did have to tweak the motion blur in some cases, though, and design a little bit around it.
In the case of 60Hz it would be better to design things around it to get a better effect, as well as to handle some artefacts on both the art and the design side. Let me explain with some examples.
For instance, when you know that every other frame is interpolated and its visuals are taken from the previous one, you want to avoid any very significant twists or directional changes between those frames. The in-between frame would not have the right data to be built from, so one solution would be not to allow such quick changes in gameplay, by changing things more gradually. Or you detect the change and omit the interpolation for that particular character, leaving it as is for just that one frame.
If the character jumps quickly and does all those other crazy things, you could track it with the camera a little bit and try to anticipate what we are most likely going to see.
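The detect-and-skip idea above can be sketched roughly as follows. This is a toy illustration in Python, not the game's actual code: the turn-angle threshold, the halfway blend, and the hold-the-frame fallback are all assumptions made for the sake of the example.

```python
import math

def should_interpolate(prev_vel, curr_vel, max_turn_deg=60.0):
    """Decide whether a character's in-between frame can be interpolated.

    If the velocity direction changes too sharply between two real frames,
    the interpolated frame has no good data to be built from, so we skip
    interpolation for that character on that frame.
    (The 60-degree threshold is an illustrative assumption.)
    """
    px, py = prev_vel
    cx, cy = curr_vel
    pl = math.hypot(px, py)
    cl = math.hypot(cx, cy)
    # Near-zero velocities: nothing meaningful to compare, just interpolate.
    if pl < 1e-6 or cl < 1e-6:
        return True
    cos_a = max(-1.0, min(1.0, (px * cx + py * cy) / (pl * cl)))
    turn = math.degrees(math.acos(cos_a))
    return turn <= max_turn_deg

def character_midframe(prev_pos, curr_pos, prev_vel, curr_vel):
    """Position used for the character in the synthesized in-between frame."""
    if should_interpolate(prev_vel, curr_vel):
        # Plain halfway interpolation between the two real frames.
        return ((prev_pos[0] + curr_pos[0]) * 0.5,
                (prev_pos[1] + curr_pos[1]) * 0.5)
    # Sharp direction change: leave the character as is for this one frame.
    return curr_pos
```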
Other things related to design involve the use of alpha blending and the HUD. In the demo you've seen, the lightsabers were disabled, as was the HUD. It is not really a technical issue, as it could be worked out by re-rendering them and so on, but then other kinds of questions arise. What about the sorting and stuff like that? Again, it's manageable.
But the easiest solution is just not to do it. Is the HUD a problem? Yes. OK, no HUD. Lightsabers? Well, yeah. OK, no lightsabers. But we cannot go too crazy with this by removing shadows, for example. Instead, we can say that we'll place lights and shadow sources so that it is not a problem - so you can't get close enough to a light that the shadow starts moving too fast across the floor. I am not saying those things are not solvable, but the point is that they might affect the design and art, and you have to think about them.
In the demo you can see the rain. It is alpha blended, but it doesn't seem to cause any problems. On the other hand, those little splashes on the character look a little funky. Easy - remove them.
Digital Foundry: Crytek seem to be using a variant of this re-projection principle with their temporal AA in CryEngine 3. Does your DLAA anti-aliasing system in The Force Unleashed II tie in with your work here? If not, how is it different? Can the velocity buffer be re-used for other purposes?
Dmitry Andreev: No, our anti-aliasing solution doesn't use the re-projection, but it is very similar to the interpolation technique in terms of simplicity. It's all about what you can do not by going around and "googling" things, but by looking at a problem from a different perspective. I can't say more about it at this point.
One thing related to the interpolation, though, is to make sure it works with the anti-aliasing. Very often in games I see that people don't do anything about it, and once you start moving, the anti-aliasing is gone, especially with motion blur, so one should take it into consideration.
The use of the velocity buffer is only limited by your imagination. I know it sounds a little blurry, but that's how it is. Even though we don't use the interpolation in The Force Unleashed II, most of the stuff described in the presentation is used in production in some way or another. Most of it. I really want people to understand things before or while using them. Knowing something and understanding it are two different things. This is what I have learned.
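As one concrete example of what a velocity buffer enables, here is a minimal sketch of screen-space re-projection, the building block behind both the frame interpolation and motion blur discussed here. Python with flat lists standing in for GPU buffers; the keep-old-value fallback for off-screen sources is an illustrative assumption (a real renderer would mask or re-render those pixels).

```python
def reproject(prev_frame, velocity, width, height):
    """Build an approximation of the current frame from the previous one.

    prev_frame: flat list of pixel values, row-major, width*height long.
    velocity:   flat list of (dx, dy) per-pixel screen-space motion vectors,
                i.e. how far each pixel moved since the previous frame.
    """
    out = list(prev_frame)
    for y in range(height):
        for x in range(width):
            i = y * width + x
            dx, dy = velocity[i]
            # Fetch where this pixel came from in the previous frame.
            sx, sy = int(round(x - dx)), int(round(y - dy))
            if 0 <= sx < width and 0 <= sy < height:
                out[i] = prev_frame[sy * width + sx]
            # Else: source is off-screen; keep the old value as a crude fallback.
    return out
```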
Digital Foundry: Are there any potential applications here for stereoscopic 3D rendering?
Dmitry Andreev: I've been playing with 3D re-projection a little bit. By the way, this is where a slightly modified version of the character removal could be used to fix up the issues of re-projection, but that is less efficient compared to when it is used in motion. And of course, when it is depth-based 3D [like TriOviz and potentially Crysis 2's implementation - Ed] you have the same issues with transparency.
In some cases it works, in others it doesn't. So it is still best to design your game around 3D technology from the outset. I have not tried it yet, but I think it is possible to do frame-rate up-conversion together with 3D depth re-projection. It may work, but I would expect a higher performance and memory hit.
Digital Foundry: The presentation has stirred up a lot of interest from members of the development community we've talked to. Can you think of any situations where the technique could be deployed now? Who do you think would be first to market with a game based on this idea?
Dmitry Andreev: Well, I think that for EA running at 60FPS is very important, especially for EA Sports. And I think that most of those sports games are a lot easier to make work with this technique, as some of their engines are based on forward rendering. So I would try to render the environment, like stadiums and tracks, at 30FPS with all the characters running at a real 60FPS.
In terms of "when", I think that games already running at 60 will stick to a real 60FPS, and titles that are in production and running at 30FPS will most likely keep it that way. But the ones that are in pre-production or early in production may try this out, which might take at least a year. I would not be surprised if EA uses this sometime soon, but the real utilisation might come from more tech-oriented companies like Naughty Dog. We will see.