The making of Forza Horizon 2
Digital Foundry goes head-to-head with Playground Games.
It's the tech interview we've been chasing down for a while now - especially in the wake of that superb playable demo, released a few weeks back. Forza Horizon 2 improves on the raw, unadulterated fun of its predecessor - an intoxicating fusion of arcade-style gameplay based on the intricate Forza Motorsport simulation - and melds it with some of the most impressive rendering technology yet seen on the eighth generation of games consoles.
We've interviewed developer Playground Games before of course, when the original Forza Horizon debuted towards the end of the Xbox 360's lifecycle. It remains a remarkable story - the creation of a brand new studio located thousands of miles away from the Forza development mother base, immediately handed the keys to one of the most valuable franchises in console gaming. It was a calculated gamble by Microsoft and Turn 10, yielding instantly impressive results.
The story this time is very different. There's a sense of confidence and accomplishment about this new Forza sequel. Playground Games is established, and it has the benefit of handing in its first Xbox One title almost a full year after the hardware shipped, while also benefiting from experience with the platform inherited from Turn 10. And that's where our interview begins.
Many thanks to Playground's Alan Roberts (technical director), Matt Craven (chief engineer), Andy Sage (lead rendering engineer) and John Longcroft-Neal (senior rendering engineer) for indulging us with this avalanche of technical info.
When making our initial plans it was certainly fairly tricky to assess exactly where we'd end up from a performance standpoint. However, we were working closely with the Turn 10 team and had early access to a lot of platform information so we could extrapolate to a fair degree of accuracy where we would end up.
We created a whole bunch of calculations to estimate final hardware performance and some benchmark assets. As each new hardware and software revision became available we tested our assumptions and began to outline our budgets. The upshot of this was that we were not exactly settled on our final processes and budgets as we would have been had we created a same-platform sequel, but our pre-production estimates were very close.
This is all part of the fun of working with cutting-edge technology, and makes big-budget game development challenging and interesting.
It was clearly a generational leap from the Xbox 360, both in terms of power and the feature set. It was very obvious right away that we'd be able to make a great game and realise lots of the ambitions we had. You can see just how much of an improvement the game is from the original in every area, and the hardware is a big part of that.
We gained a good amount of early experience of the kit by working with Turn 10 on Forza Motorsport 5. They were preparing to ship the game at the time, so this gave us insight into how shipping a game on the new hardware would differ from Xbox 360, which was very useful for our own pre-production planning.
Right from the start we looked to what the Turn 10 guys were doing with FM5 and took full advantage of the great work they did to move onto a new hardware generation. We continually integrated their tech until they finished FM5 so we could continue to benefit from the optimisations and bug fixes they had worked on. There is a lot of bespoke work required to turn a Motorsport title into a Horizon title, but the engine is a really solid foundation that both teams continue to enhance and build on together.
We've implemented a feature set that we're really proud of. We sat down once we started Horizon 2 and figured out how the hardware and platform could be exploited in each area. Once you start off with that mind-set some great features start to emerge and it becomes more about how to find time to implement them all.
For example, we wanted to make access to online seamless and instant, eliminating traditional lobbies. This touched most areas of the tech base and needed some careful planning to implement it within our project constraints.
Horizon has always been about open world, but we've been able to push it further this time around. The new console and our experience from the previous game have really helped us open out the world in a way that makes sense. On Xbox 360 there were a few technical factors that made it difficult to open the world out in quite the way we wanted without sacrificing quality elsewhere, so we made a deliberate choice. These factors are now gone on Xbox One and the extra sense of freedom is great as a result.
We run the same physics engine that can be found in FM5, so it has all the same depth and sophistication. Our handling model is designed to be more forgiving and our car handling designers do an excellent job of setting up the cars to ensure that driving around our world is a fun experience. With the addition of dynamic weather we had to make some changes to enable the player to feel a change in handling when driving in the wet, but we still kept it accessible and fun.
We considered using this approach during pre-production, but decided that it would be too restrictive given our world size and the limits on how much data can be put on the physical media. In the end we feel that the approach we took was a better fit for the project, allowing highly detailed and varied texturing, with massive vistas across the open world.
We were already used to streaming the world in Horizon from DVD, so the fact we're now streaming from HDD was a huge benefit to us as the bandwidth available to the game is much higher. Of course our asset sizes have increased significantly since the previous generation, but we already had the tools in place to duplicate data and optimise streaming bandwidth to a specific target. Having a HDD as standard in the box was a huge win for us.
Most of the extra memory goes into increasing the quality level of the assets and coping with the larger and more detailed world this time around. Texture fidelity has to be much higher at 1080p, and in addition we have many more complex materials that require extra textures compared to Xbox 360. For instance, due to using physically-based rendering, a large portion of our environment materials have independent reflectivity and roughness control via texture maps, allowing the artists complete flexibility. In addition, extra memory goes on more complex model geometry and higher model density.
We inherited a great starting point with the engine from Turn 10, which incorporated PBR from the outset. We took this and identified the areas that we'd need to develop for the new features used in Horizon 2, and aimed to carry the same level of consistency and fidelity from that engine into the new dynamic time-of-day and open-world scenario. This had several major impacts on the project, both in terms of asset production and system development. On the asset side, we had to bring everyone on the art team up to speed on the changes in the authoring process required compared to previous non-PBR production.
In terms of system development, we continued the attention to physically-based rendering into all of the new features that we developed for Horizon 2. For instance, this involved simulating the physical interaction of water with the materials used in the game to support our weather features. We also took a physically based approach to features such as the sky simulation, which accurately models the interaction of light with particles in the atmosphere to produce realistic results.
Definitely, and this was an area that proved challenging due to the sheer volume of assets that needed producing and the need to maintain a high level of consistency in all areas. We're fortunate in having a great level of sharing with Turn 10, particularly as far as the car assets are concerned. The environment assets required a much higher level of workflow change, which was exacerbated by the quality changes required when going from the Xbox 360 to Xbox One.
In some ways the opposite is true. Having a set of great-looking initial assets allows us to have a baseline for our rendering setup even though the feature sets are different between the two games. That said, we do have a specific feature set for Horizon that requires bespoke work on each car, for example to support wet weather and night lighting setup, which requires additional resource, particularly in terms of tools and pipeline setup.
The Forward+ technique we use is probably better described as 'Clustered Forward+'. All Forward+ techniques revolve around splitting up the screen into a regular grid of sub-rectangles (typically 32x32 pixels), then finding out which lights potentially affect each sub-rectangle before you start rendering. During surface shading, you load up the reduced light list for that sub-rectangle and process only those lights. The goal is to avoid processing lights that have no effect on the surfaces in the sub-rectangle.
The standard Forward+ technique uses a depth texture of the scene to cull lights from the list. There are two issues with this approach: firstly, you need to render the depth texture as a pre-pass before the main scene in order to create the light lists; secondly, semi-transparent surfaces cannot render to the depth pre-pass.
Clustered Forward+ avoids the need for a depth pre-pass altogether by calculating light lists at multiple depths for each sub-rectangle and using the most appropriate cluster during surface shading. We generate the light cluster data all on the GPU using Compute shaders and this is done for any rendered view that requires lights.
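The clustering described above can be sketched on the CPU for illustration. This is not Playground's implementation (which builds the lists in GPU compute shaders); the tile size, exponential depth slicing and the simple sphere-versus-tile distance test are all assumptions:

```python
# Minimal CPU sketch of Clustered Forward+ light-list building.
# Tile size, slice count and the culling test are illustrative
# assumptions; the real work happens in compute shaders on the GPU.
TILE = 32          # pixels per tile in x and y
DEPTH_SLICES = 16  # clusters along the view-space z axis
NEAR, FAR = 0.1, 1000.0

def slice_bounds(s):
    """Exponential depth slicing: near/far bounds of slice s."""
    z0 = NEAR * (FAR / NEAR) ** (s / DEPTH_SLICES)
    z1 = NEAR * (FAR / NEAR) ** ((s + 1) / DEPTH_SLICES)
    return z0, z1

def build_light_lists(lights, screen_w, screen_h):
    """lights: list of (x, y, z, radius) in view space, with a toy
    projection that maps view-space x,y directly to pixels."""
    tiles_x = (screen_w + TILE - 1) // TILE
    tiles_y = (screen_h + TILE - 1) // TILE
    clusters = {}
    for li, (lx, ly, lz, r) in enumerate(lights):
        for s in range(DEPTH_SLICES):
            z0, z1 = slice_bounds(s)
            if lz + r < z0 or lz - r > z1:
                continue  # sphere misses this depth slice entirely
            for ty in range(tiles_y):
                for tx in range(tiles_x):
                    # Clamp the light centre to the tile rectangle,
                    # then test against the sphere radius.
                    cx = min(max(lx, tx * TILE), tx * TILE + TILE)
                    cy = min(max(ly, ty * TILE), ty * TILE + TILE)
                    if (cx - lx) ** 2 + (cy - ly) ** 2 <= r * r:
                        clusters.setdefault((tx, ty, s), []).append(li)
    return clusters
```

Each cluster key `(tile_x, tile_y, slice)` then maps to the short light list a pixel shader would walk during surface shading.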
The first advantage of Forward+ for us is that it just works with MSAA, at any level, whereas deferred techniques struggle to maintain decent anti-aliasing. Secondly, you get the other benefits of forward shading, such as support for complex material types like carbon fibre and car paint that are difficult to achieve using deferred techniques. We found that we could easily 'plug in' Forward+ to the existing shaders, which were already designed for forward rendering. The advantages of the Clustered approach to Forward+ for us were that semi-transparent surfaces did not need special consideration and, most importantly, we did not need to render a depth pre-pass.
It does seem sometimes that we are going in a different direction to the majority of games that are pushing for deferred rendering with FXAA or other combined anti-aliasing approaches. For us, image quality is highly important, and we found that we can achieve that best with the benefits of Forward+ lighting combined with MSAA. In terms of resource allocation, this is factored in during the planning phase of the project to make sure we spend the right amount of time balancing the cost versus quality trade-offs involved. In some respects this decision gives us more flexibility, as we don't have to spend time investing in custom anti-aliasing solutions as some other approaches might require.
We use alpha to coverage on grass, but this comes with a performance trade-off: the pixel shader runs in full before the Z test. Due to the density of foliage we were aiming for, and the fact that we had a much more consistent lighting model across all of the shaders used in the game, we decided that alpha testing for some foliage types was a better use of resources.
We were actually running with EQAA for a large part of the project, but we found that our best cost/quality trade-off was firmly with 4x MSAA. Due to the way the GPU compresses the MSAA data, it's usually not 4x the bandwidth for 4x MSAA, and with a forward-only rendering path, the GPU is rarely bottlenecked by bandwidth, especially if the majority of the render target data is in ESRAM. We found that EQAA with a lower number of depth samples than 4x MSAA improves performance, but also lowers image quality because the hardware can't properly resolve certain polygon edge configurations. In a racing game, where the focal point is often in the mid to far distance, this difference can be crucial.
We scheduled specific optimisation phases during the project development, where we looked at improvements that would allow us to achieve the solid 30fps that we were aiming for. We analysed ESRAM usage during each of these phases and determined what gains could be made, in particular optimising which render targets were in ESRAM throughout the frame. This involved careful analysis of which targets were required at which point in the frame to allow us to maximise the amount of ESRAM resident resources at any one time. It was also necessary to have a good understanding of what the bottlenecks were for each rendering stage so we could target ESRAM optimisations for systems that benefited from the additional bandwidth. ESRAM is pretty easy to manage, mainly because you don't need to resolve textures to read them and you can make textures partially resident.
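As a toy illustration of the lifetime analysis described above: two render targets whose frame lifetimes don't overlap can alias the same ESRAM range. The greedy first-fit scheme and the stage numbering here are hypothetical, not Playground's actual allocator:

```python
# Toy model of aliasing render targets into a 32MB ESRAM budget:
# two targets can share space if their frame lifetimes don't overlap.
ESRAM_MB = 32

def schedule(targets):
    """targets: list of (name, size_mb, first_stage, last_stage).
    Greedy first-fit: returns names kept resident in ESRAM; the
    rest fall back to DRAM. Purely illustrative."""
    slots = []  # each slot: (size_mb, [(first, last), ...])
    resident = []
    used = 0
    for name, size, first, last in sorted(targets, key=lambda t: -t[1]):
        placed = False
        for ssize, lives in slots:
            # Alias into an existing slot only if this target's
            # lifetime is disjoint from every occupant's lifetime.
            if size <= ssize and all(last < f or first > l for f, l in lives):
                lives.append((first, last))
                resident.append(name)
                placed = True
                break
        if not placed and used + size <= ESRAM_MB:
            slots.append((size, [(first, last)]))
            used += size
            resident.append(name)
    return resident
```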
The main issue was that at any time the environment could become wet. This meant that we had to incorporate the concept of wetness into our physically-based rendering solution. So all of our materials have the ability to have a thin film of water applied on top so as to bounce light as a real wet surface would. We also had to create systems that could control the underlying shading model in a realistic manner depending on the environment state and evolve this over time (for instance, with the same level of surface water, the visuals have to look different depending on whether they are getting wetter or drying out, as this is what happens in the real world).
Additionally, we had to spend a lot of time implementing a real time reflection system to allow the light to be reflected accurately on these wet surfaces. As we knew that wet weather was a big feature for the game, we made sure we addressed these systems during pre-production to determine the best approach in each case, and to allow us to more accurately define where the frame-time would be spent.
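A minimal sketch of the 'thin film of water' idea: wetness darkens porous albedo as water soaks in and pulls roughness toward that of a smooth water surface. The blend curves and constants are assumptions for illustration, not the actual shading model:

```python
# Illustrative sketch of layering a thin water film over a
# physically-based material; the curves and constants are
# assumptions, not Playground's actual model.
def apply_wetness(albedo, roughness, porosity, wetness):
    """albedo: (r, g, b) in [0,1]; roughness, porosity, wetness
    in [0,1]. Returns the adjusted (albedo, roughness)."""
    # Porous surfaces (dirt, brick) darken as water soaks in.
    absorbed = min(wetness * (1.0 + porosity), 1.0)
    darken = 1.0 - 0.4 * absorbed * porosity
    wet_albedo = tuple(c * darken for c in albedo)
    # A water film is very smooth, so roughness falls toward that
    # of water as the film builds up.
    water_roughness = 0.05
    wet_roughness = roughness + (water_roughness - roughness) * wetness
    return wet_albedo, wet_roughness
```

Driving the `wetness` input up while raining and decaying it slowly afterwards gives the wetting-versus-drying asymmetry the interview describes.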
It's surprising how much time is needed in day-to-day work to keep multiple platforms compiling, running and at the same time trying to push boundaries on each. Additionally, most of the work estimates have to go up slightly and some features need to be implemented more than once to play to the strengths of each platform. In some cases you have to settle for an inferior approach that works on all platforms due to time constraints.
Working on a single platform takes all of those worries away. Your feature list can get longer and this makes designers and artists much happier! Having your engineers code right down to the hardware is great too. We're able to really leverage the platform in ways that would be difficult without that focus. Our use of cloud computing is a really good example of that.
Bringing Drivatar technology across from FM5 meant it was simple for us to activate it for Horizon 2. Once we did this and the team trained up their Drivatars we saw an instant change to the behaviour of the opponents - our races suddenly felt more alive. We then enhanced the system to handle the open-world environment, so Drivatars could drive off-road and take short-cuts. The first time I saw a Drivatar leave the road and head cross-country was a magical moment for me.
The Drivatars system continues to evolve and has done since the launch of FM5. The FM5 team have updated the systems post-launch and Drivatars continue to learn. The Drivatars in FM5 today, both individually and as a system, drive very differently than they did in the first several months. Thanks to sharing this technology across both games we were able to bring your FM5 Drivatar (and those of your friends) into Horizon 2 at launch.
The simulation runs at 360Hz so we maintain the same level of detail as found in FM5.
We had good experience to draw on from Horizon for this. We keep the controller input and physics calculations as late as possible in the frame in order to minimise the lag. We also run the audio on a separate core and keep that as closely in sync with the rendering as we can in order to help reduce the perception of lag.
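The two ideas above (a 360Hz fixed-step simulation inside a 30fps frame, with input sampled as late as possible) can be sketched as follows; the function names are hypothetical:

```python
# Hypothetical frame loop showing 360Hz fixed-step physics inside a
# 30fps render frame, with input sampled as late as possible.
PHYSICS_HZ = 360
FRAME_HZ = 30
SUBSTEPS = PHYSICS_HZ // FRAME_HZ  # 12 physics ticks per rendered frame
DT = 1.0 / PHYSICS_HZ

def run_frame(state, poll_input, simulate, render):
    # Latency-insensitive work (streaming, AI, audio mixing) would go
    # first; the controller is read immediately before simulating, so
    # the pad-to-screen delay is as short as the frame allows.
    controls = poll_input()
    for _ in range(SUBSTEPS):
        state = simulate(state, controls, DT)
    render(state)
    return state
```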
Microsoft have a dedicated team in Reading who created numerous automated stress-tests which could test and report on performance across the whole world. Using a bank of kits they were able to profile the whole environment every day and give us immediate feedback on areas of the world that weren't in budget. Our tools allow us to reproduce the exact scenario they tested and investigate the issue quickly.
As this is extremely important to us, we spend a lot of resources ensuring that systems are in place to monitor, assess, and feedback performance information to the relevant team members. Initially this begins with making sure that we can automate almost all aspects of the game to run stress-testing scenarios without human interaction. We then make sure that these automated systems comprehensively cover all scenarios that the user will encounter when playing the game, paying particular attention to worst cases. These systems are then continuously run on dedicated servers to produce up-to-date information on the performance of the game during development. The final stage is presenting this information in a way that is accessible to all of the team members involved so that they can take the necessary action to make sure that the given budgets are met at all times.
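The kind of budget report such an automated system might produce can be sketched very simply; the region names and data layout here are invented for illustration:

```python
# Sketch of an automated budget report: flag world regions whose
# captured frame times break the 30fps target. Data layout invented.
BUDGET_MS = 1000.0 / 30.0  # ~33.3ms per frame for a solid 30fps

def over_budget(captures):
    """captures: dict of region -> list of frame times in ms.
    Returns regions whose worst captured frame exceeded the budget."""
    return sorted(r for r, times in captures.items()
                  if max(times) > BUDGET_MS)
```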
We actually made the demo for more traditional reasons. We believed that we'd made a really great game and we wanted to put the game in people's hands so they could get excited about it. We wanted to make a really generous demo, including online and social features, so people could get a taste of the full experience. We were absolutely bowled over by the response we got from it and we're seeing that carry through to the full game, which is hugely rewarding for everyone on the team.
The freedom the game gives you, not just in where you drive but what you do - there's so much choice for the player at any given moment.
For me it is seeing the sheer amount of amazing stuff the community are able to do with the game. Some of that really takes the dev team by surprise. The photography thread on NeoGAF has some simply stunning shots on it. And you never quite know what cool things you are going to find people doing if you join an Online Free Roam session.
Digital Foundry specialises in technical analysis of gaming hardware and software, using state-of-the-art capture systems and bespoke software to show you how well games and hardware run, visualising precisely what they're capable of. In order to show you what 4K gaming actually looks like we needed to build our own platform to supply high quality 4K video for offline viewing. So we did.
Our videos are multi-gigabyte files and we've chosen a high quality provider to ensure fast downloads. However, that bandwidth isn't free and so we charge a small monthly subscription fee of £4.50. We think it's a small price to pay for unlimited access to top-tier quality encodes of our content. Thank you.