An impressive lighting model is one thing, but light needs to be accompanied by shadow in order to carry off a realistic look. Both HD consoles on the market appear to struggle with truly convincing shadows.
"I don't think we do anything unusual here," Shishkovtsov says. "On 360 we first render the traditional depth from light point of view, then convert it into a ESM (exponential shadow map) representation while gauss-blurring it at the same time. Later during the lighting we do one bilinear lookup to get percentage in shadow.
"The end result: we avoid any jittering, noise, stipple-patterns or many (costly) look-ups to filter shadow to get something what at least remotely looks like a shadow. Of course the 10MB eDRAM on 360 slightly limits the resolution of shadow maps, which can be noticed sometimes when the light source moves... We use that space for shadow mapping only twice during a frame."
The 4A engine also includes custom anti-aliasing solutions. Developers are finding that the MSAA hardware within the 360 GPU can be repurposed for other tasks, but reducing edge-aliasing and shimmer remains an important aspect of overall image quality.
"The 360 was running deferred rotated grid super-sampling for the last two years, but later we switched it to use analytical anti-aliasing (AAA)," reveals Shishkovtsov. "That gave us back around 11MB of memory and dropped AA GPU load from a variable 2.5-3.0 ms to constant 1.4ms. The quality is quite comparable. The AAA works slightly different from how you assume. It doesn't have explicit edge detection.
"The closest explanation of the technique I can imagine would be that the shader internally doubles the resolution of the picture using pattern/shape detection (similar to morphological AA) and then scales it back to original resolution producing the anti-aliased version. Because the window of pattern detection is fixed and rather small in GPU implementation, the quality is slightly worse for near-vertical or near-horizontal edges than for example MLAA."
Another key element of the 4A tech is the artificial intelligence of the NPCs. Impressive graphics don't count for much if your gameplay opponents exhibit poor intelligence.
"Each AI character in the game has feelings: vision, hearing and hit reaction. The vision model is pretty much close to reality: NPCs have a 120 degrees visibility cone and see those in the centre of the cone more clearly, also illumination and speed of the target is taken into account. For instance, a moving object is seen more clearly in the darkness than standing one. Also a 'look closely' effect is implemented. There are different levels of alertness: light disturbance, light alert, alert, uber-alert, danger."
The sound model for the AI is intriguing. The 4A engine attempts to emulate a real perception of hearing by drawing out variables from elsewhere in the game design.
"Each sound in the game has its own 'AI mark'... shooting sounds are marked 'combat.shot'," Shishkovtsov explains. "For this mark, hearing distance is, for example 50 metres, which is quite a lot. But using the renderer's portals/sectors the system hearing handler determines 'virtual distance', taking into account walls and corridors.
"So an NPC on the other side of the wall will never hear what's going on here, because while the 'straight line' distance is only five metres, the 'virtual distance' using a sound path along the wall results in a 60-metre distance."
Hit reactions and perception of objects in the view of the NPC are also processed. If the AI recognises a grenade, it'll try to make its escape.
"The next layer is used to sort out this basic information and decide, what is the most important for NPC right now," continues Shishkovtsov. "Different levels of feeling are connected to different types of behaviour. For instance typical behavior for a 'light disturbance' is saying something like 'who's there?' and looking closer, whereas for the 'uber-alert' it’s going out for a full search.
"And of course, designers have full control over everything, so they can still make NPCs stand still or play funny animations even when a nuclear bomb is dropped nearby if it suits the scene."
For a fledgling game engine, 4A does an impressive job of utilising the Xbox 360 hardware, pumping out visuals quite unlike anything else seen on the system. While the console perhaps has too many first-person shooters, the tech combined with the distinctly Eastern European art direction has resulted in a title that looks and feels different from the Unreal Engine norm. It's interesting to see how the team's "coding to the metal" mentality has been applied to the consoles.
"The 360 GPU is a different beast. Compared to today's high-end PC hardware it is 5-10 times slower depending on what you do," says Shishkovtsov. "But performance of hardware is only one side of equation. Because we as programmers can optimise for the specific GPU we can reach nearly 100 per cent utilisation of all the sub-units.
"That's just not possible on a PC. In addition to this we can do dirty MSAA tricks, like treating some surfaces as multi-sampled (for example hi-stencil masking the light-influence does that), or rendering multi-sampled shadowmaps, and then sampling correct sub-pixel values because we know exactly what pattern and what positions sub-samples have, etc."
It's this approach that will see the Xbox 360 and PlayStation 3 far outlive the shelf lives of their individual processing components.
"The majority of our Metro 2033 game runs at 40 to 50 frames per second, if we disable v-sync on 360," says Shishkovtsov. "The majority of the levels have more than 100MB heap space left unused. That means we under-utilised the hardware a bit."
The complete transcript of our interview with 4A's Oles Shishkovtsov will be published next week. There's a wealth of cool stuff in there, including a direct comparison between the 360's Xenon CPU and the latest Intel i7 architecture. Plus: more information on 4A's HDR lighting solution, the in-game AI, the utilisation of PhysX and much more.