This week, Microsoft re-released The Witcher 2: Assassins of Kings via backward compatibility, and for a limited time, the game is available to download for free for Xbox 360 and Xbox One owners in selected regions. It's a welcome gift and well worth checking out on either console, especially as it runs rather well on the new Microsoft console. With that in mind, we decided to republish our article on the game's technological genesis, first published in May 2012.
The story behind The Witcher 2 is almost as epic as the plot of the hugely successful RPG itself - a tale of how an independent Polish studio created one of the most technologically advanced PC games in existence and then somehow converted it across to the Xbox 360, retaining and indeed adding to the original gameplay and bringing across the vast majority of the game's phenomenal visuals.
However, the tale really begins during the development of the game's predecessor - CD Projekt RED's debut outing for super-powered monster-slayer Geralt of Rivia. In producing this new RPG, the studio turned to an existing engine to provide the technological underpinnings - BioWare's Aurora - but as development continued, it became clear that while accomplished enough in its own right, the bought-in platform wasn't entirely the right fit for the team.
"Aurora is a great engine, well-suited to BioWare-like RPGs. The thing is, in many aspects, The Witcher isn't like any BioWare game. The differences forced us to make significant changes to the engine," says CD Projekt RED senior engine programmer Tomek Wójcik.
"While working on The Witcher 1, we quickly got to a point where the technology started to limit the creativity of our designers and artists. They wanted features that weren't so easy to implement in Aurora, and they wanted a lot of them. While finishing the game, we - the programmers - finally reached a conclusion: it would be much easier to make all those features, if we only had our own tech."
Other considerations also made their way into the decision-making process. While The Witcher 2 launched as PC exclusive, the team always had its sights set on multiple platforms.
"We always wanted to release the game on consoles. We found it very hard to do with Aurora, which was very much a PC-only game engine. We definitely needed something else to work well on consoles. Developing the REDengine simply gave us full, unlimited control over what the technology is capable of," Wójcik adds.
The team is hugely modest about the scale of its achievement in creating the REDengine, but in the current era of games development, the notion of an independent developer producing state-of-the-art technology that compares favourably against industry heavyweights like Unreal Engine, id Tech and CryEngine 3 is simply astonishing. From a visual arts perspective, CD Projekt RED hit a home run its first time at bat.
"There are many secrets behind the REDengine. I think the people who created it are the most important element: talented and ambitious folk of all kinds - programmers, artists, designers - who put in a lot of effort to create a technology that matches today's industry leaders," enthuses senior producer Grzesiek Rdzany.
"The second secret is tight co-operation between engine programmers and other developers, which allowed the engine devs to create a tool that embodied their game concept. It's worth mentioning that we remained pragmatic during the development process. If there was a solution that met our expectations, we didn't develop our own. That's why we used middleware like Havok for physics, Scaleform GFx for UI or FMOD for audio."
It's a pragmatism shared by just about every technological innovator working in video games - Unreal Engine integrates the same middleware too, and it's very rare indeed for a game to ship without some kind of assistance from an established tech vendor - it saves time, money and manpower.
Also semi-miraculous is the fact that The Witcher 2 and the REDengine were developed in tandem. In theory, this is a developmental nightmare for the programmers and artists as the capabilities of the underlying technology will be changing under their feet while they are working on gameplay and creating expensive art assets.
"Some basic elements had to be made before the work on W2 had begun in full scale. But the bulk of the work was done at the same time as the game," reveals Rdzany, before going on to explain that the parallel development did have some advantages.
"On one hand it brought some complications (due to temporary instabilities of the engine) but on the other hand this allowed us to modify the code to our needs and requirements. This way we could create tools designed for a title like The Witcher 2."
Pushing Back the Boundaries of PC Visuals
The final game remains one of the most advanced titles available on the PC, with the REDengine featuring a state-of-the-art line-up of visual effects work, along with some settings designed very much with future PC graphics hardware in mind. During our tech analysis, we found that even the most powerful single-chip GPU on the planet - NVIDIA's GTX 680 - couldn't sustain 720p60 with all settings at max. The culprit was The Witcher 2's uber-sampling mode, which had to be disabled to maintain frame-rate - a shame, because it completely eliminates aliasing and texture filtering issues, adding greatly to image quality.
"Uber-sampling is a feature inspired by how some ray-tracing renderers work, as well as some tricks known in photography," explains senior programmer Bartek Wroński, comparing its approach to super-sampling - the notion of rendering at a much higher resolution and then downscaling to native res.
It's a trick most often used to eliminate jaggies and aliasing from press screenshots, and only very rarely ends up actually being used in-game.
"It is basically 'in-place super-sampling' - we perform multiple rendering passes with small, sub-pixel jitters and texture mip-bias, and then combine them in one buffer," Wroński continues.
"This way, it doesn't require as much memory as traditional super-sampling, and quality is a bit better, giving a distinctive, extremely detailed look. It wasn't designed as a feature for current mid-spec graphics cards, though, rather as a possibility for future players that will be refreshing Witcher 2 in the next few years to play a still-amazing-looking game."
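Wroński's description amounts to averaging several jittered full-frame renders into a single buffer. The Python sketch below illustrates that idea with a toy scene function; the resolution, pass count and jitter scheme are purely illustrative, not CD Projekt RED's actual implementation.

```python
import random

# Toy "scene": a continuous intensity function the renderer samples.
def scene(x, y):
    return (x * x + y * y) % 1.0

WIDTH, HEIGHT, PASSES = 4, 4, 8

def render_pass(jitter_x, jitter_y):
    """One full-resolution pass, sampled at a sub-pixel offset."""
    return [[scene(px + jitter_x, py + jitter_y)
             for px in range(WIDTH)] for py in range(HEIGHT)]

# "In-place super-sampling": accumulate jittered passes into ONE buffer,
# rather than allocating a PASSES-times-larger image and downscaling it.
random.seed(42)
accum = [[0.0] * WIDTH for _ in range(HEIGHT)]
for _ in range(PASSES):
    jx, jy = random.random(), random.random()  # sub-pixel jitter
    frame = render_pass(jx, jy)
    for y in range(HEIGHT):
        for x in range(WIDTH):
            accum[y][x] += frame[y][x] / PASSES  # running average

print(accum)
```

Memory stays at one frame's worth no matter how many passes run, which is the advantage over rendering at N-times resolution and downscaling.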
Other effects also served to push a great many PC graphics cards to their limits, and while The Witcher 2 is perfectly playable even on entry-level gaming hardware, it truly is a thing of beauty with all effects present and correct.
"Bokeh depth-of-field was one of the most costly post-processes we implemented. It was inspired by point-based rendering techniques and the way a real depth-of-field effect works in lenses - by not gathering, but actually scattering blurred points in three depth slices," shares Wroński proudly.
"This way it was very expensive (especially as it required blending in 16-bit floating point) and was probably overkill for some graphics cards. For Xbox we had to drop it and go with standard Gaussian-blur depth-of-field, unfortunately. Still, we think such an effect is a must for next-generation consoles, giving games a filmic look and feel."
It's a rare admission from the team that some graphical features in the PC version of The Witcher 2 were simply too processor-intensive to make the transition over to the Xbox. CD Projekt RED is extremely proud of the fact that the 360 version of the game retains the vast majority of the visual impact of the PC game.
Where direct conversion of existing code wouldn't work, the team rewrote it from scratch to benefit from the Microsoft platform's unique strengths. In several cases there's a strong argument that the console version actually looks more pleasing than the no-holds-barred PC original.
"This is our first Xbox 360 title, so we concentrated on quality and made almost no compromises while adapting the game," says Lucjan Więcek, lead level artist.
"Because we had more time for development, we decided to add some features to the game that we hadn't had time to put there in the first place. We decided that the key elements we created had to be a part of the Enhanced Edition."
Console Development of The Witcher 2 Begins...
But the decision was made early on that this beefed-up version of the game would be exclusive to the Xbox 360, with Sony's PlayStation 3 architecture not being a good match for The Witcher 2's underlying tech. Executive producer John Mamais explains why:
"Simply put, our engine architecture was more suited to an Xbox 360 adaptation. Preparing two versions at the same time would result in dividing the programming team, and most likely would have doubled the time it took to deliver the 360 version at the quality we achieved. We simply didn't have the manpower to achieve this in the given time," he says.
One of Microsoft's key advantages across this generation has been the core similarities between the technological make-up of the standard PC and the Xbox 360 console. The CPU handles game logic and feeds the graphics core - there's no need to hive off GPU tasks back to the main processor as is the case with advanced PS3 development. There's also the commonality of the DirectX graphics API that links the computer and console platforms.
"The two APIs are pretty close, but there are fundamental differences. The Xbox 360 API has more low-level access to resources and features of the GPU, and it has to deal with managing the eDRAM," reveals senior engine programmer, Balázs Török.
"So taking the first steps of moving from a PC DX9 implementation is easier thanks to the API similarities, but utilising the GPU fully is only possible through the low-level access, which meant we had to change the resource management, streaming, and even our rendering pipeline. After resolving the problems caused by the API change, the work was mainly optimising the performance of the different rendering systems using the new features as best as we could."
The Witcher 2 is no mere port of the PC version of the game. There are fundamental differences to the way the game is rendered, which helps explain why the two releases are so different in a great many ways.
"Actually, we had to rewrite most of our shaders and materials and redo the whole post-process pipeline. When the adaptation work started, we had about 5fps in most locations, meaning that the scene was rendering in 200ms instead of the desired 30ms," says senior programmer Bartek Wroński, revealing that the resultant work achieved more than just mere optimisation, but also some better-looking effects.
"Fortunately, Xbox 360 is closed architecture, and we had quite a lot of time for the adaptation/porting process, so we could find most bottlenecks and fix them. Some of the effects had to be totally redesigned to work in, for example, half resolution, like particles and transparencies or SSAO (for which we used a totally new, lower-quality but very cheap algorithm). Some post-process shaders were just rewritten to be optimised without sacrificing quality, sometimes even enhancing it."
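The half-resolution trick Wroński mentions pays off quadratically: halving both axes quarters the pixels a full-screen effect must shade, before the (much smaller) cost of upsampling the result back. A back-of-the-envelope check, with an illustrative resolution:

```python
# Shading cost of a full-screen effect scales with the pixel count, so
# rendering at half resolution in each axis quarters the work.
def effect_cost(width, height, scale=1.0):
    return int(width * scale) * int(height * scale)

full = effect_cost(1280, 672)        # full-resolution pass
half = effect_cost(1280, 672, 0.5)   # half-resolution pass (both axes)
print(full // half)  # 4x fewer pixels to shade
```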
Battling Console Memory Constraints
Core to the adaptation effort was the battle against the lack of RAM in the Xbox 360. The Microsoft console hails from an era when memory was a relatively expensive commodity - a far cry from today, when RAM is so cheap that a minimum of 4GB is commonplace even on entry-level PCs.
"The game must also fit into a half gig of memory, with all the code, the assets, and the Xbox OS. That requires splitting the game world into smaller pieces, which can be streamed in and out while playing," says senior engine programmer Tomek Wójcik, assessing the scale of the challenge.
"The problem is that if you want your assets to look really good, that makes them heavy. A lot of data is being read from the DVD or HDD in the background, and nobody wants to wait for it! I think that was the major challenge - to split the world into pieces big enough to create the proper player experience, and small enough to load quickly, fit into the memory, and run at 30fps at the same time. Of course, it also required optimising the engine to make it load the data as quickly as possible, process only what's visible on the screen at the moment (when possible), and do all the programmer's magic to optimise it for the specific hardware of the Xbox, but without proper world-partitioning we wouldn't have achieved any of that."
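The world-partitioning Wójcik describes can be sketched as a simple grid-cell streamer: keep the cells around the player resident, queue background reads for cells coming into range, and free the rest. The cell size, radius and function names below are hypothetical, purely to illustrate the bookkeeping, and bear no relation to the REDengine's actual scheme.

```python
CELL_SIZE = 64.0   # world units per cell (illustrative)
RADIUS = 1         # keep a (2*RADIUS+1)^2 block of cells loaded

def cell_of(x, y):
    return (int(x // CELL_SIZE), int(y // CELL_SIZE))

def wanted_cells(player_x, player_y):
    cx, cy = cell_of(player_x, player_y)
    return {(cx + dx, cy + dy)
            for dx in range(-RADIUS, RADIUS + 1)
            for dy in range(-RADIUS, RADIUS + 1)}

def update_streaming(loaded, player_x, player_y):
    wanted = wanted_cells(player_x, player_y)
    to_load = wanted - loaded    # queue background reads from disc/HDD
    to_unload = loaded - wanted  # free memory held by distant chunks
    return wanted, to_load, to_unload

loaded = set()
loaded, to_load, _ = update_streaming(loaded, 10.0, 10.0)
print(sorted(to_load))  # the initial 3x3 block of cells around the player
```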
The advantage of running the game on the console is that every unit has the same performance level, so frame-rate was targeted at 30fps, and tools were created to ensure that the game remained within a set series of parameters.
"On a closed, very well defined platform like Xbox 360, it is crucial to have everything counted, measured and budgeted where possible. For instance, when some artist exceeded one of the limits, he quickly got notified by a nasty red bar, complaining that he's over budget," reveals Wójcik.
"It's not easy to say the exact number for a certain limit - especially when your game is not finished yet - but having some limits (even not very well defined) really made our lives simpler. Finally, QA had it in hand all the time - they were monitoring the situation and sounded the alarm every time something slowed down the game. Optimisation is a constant job - it's so easy to ruin the performance - so sustaining a good performance level all the time is a job for a lot of people. Had we stopped optimising at any point, we wouldn't have been able to get the game to run at 30fps at the end."
Even with the approach to performance finalised and the continued optimisation effort that carried on throughout the 11-month development period, the CD Projekt RED team also found themselves facing some unexpected challenges. The way that the Xbox 360 handles colour is very different to PC, operating with significantly lower precision. Simply porting across existing assets and high dynamic range lighting didn't produce the results the team wanted.
"Hitting 30fps in 90 per cent of the game was a really big challenge that we finally managed to achieve, but still, the biggest problem for a very long time was colour precision. We used 10-bit render targets instead of 16 for HDR, and it was a huge problem for our artists; they were complaining and crying all the time about what we had done with their beautiful game," recalls Bartek Wroński.
"Xbox 360 has its own specific output colour/gamma curve, different from PCs, and TV sets have totally different colour rendering from monitors designed for gamers and office work. We had lots of issues like overexposure, ugly colour-banding in shadows, strange colours and wrong gamma."
To overcome the issues, the programming team stepped in to assist the artists with some much needed modifications.
"We helped artists and implemented colour pre-scale that helps to fight banding, changed the tone-mapping process to work differently with a more intuitive set of parameters, and designed a special filmic-like colour curve that looked just fine on most TVs," Wroński continues, pointing out that the decision helped immeasurably with the final presentation of the Xbox 360 game.
"I think it was worth giving lots of attention, as lots of gamers and reviewers say that they prefer the colours and lighting on the Xbox 360 version of our game."
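To illustrate the two problems in play - a filmic-style tone curve, and the banding that 10-bit quantisation produces in dark gradients - here is a small Python sketch. The curve used is the well-known Hejl/Burgess-Dawson filmic approximation, an assumption standing in for CD Projekt RED's unpublished curve; the sample counts are arbitrary.

```python
def filmic(x):
    # Illustrative filmic S-curve (Hejl/Burgess-Dawson approximation);
    # the actual curve CD Projekt RED shipped is not public.
    x = max(0.0, x - 0.004)
    return (x * (6.2 * x + 0.5)) / (x * (6.2 * x + 1.7) + 0.06)

def quantize(v, bits):
    # Snap a [0,1] value to the nearest step representable at `bits` precision.
    levels = (1 << bits) - 1
    return round(v * levels) / levels

# Banding: count the distinct output levels across a dark gradient (shadows).
samples = [i / 4096 * 0.1 for i in range(4097)]
levels_10 = len({quantize(filmic(v), 10) for v in samples})
levels_16 = len({quantize(filmic(v), 16) for v in samples})
print(levels_10, levels_16)  # far fewer steps at 10 bits => visible banding
```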
In our tech analysis of the console game we reckoned that the lighting was one of the stand-out successes of The Witcher 2 conversion, looking more natural and organic compared to what we thought was a relatively harsh approach in the PC game. There was also a sense that the lighting was more physically correct - sometimes we wondered just where the light sources were in the original version. It turns out that the changes were technology-led rather than part of a planned revision by the art team, which partly explains why they were not rolled back into the PC game.
"This was a matter of technology. What we used on the PC wasn't as good on the Xbox. So we devoted a year of work to redesign the lighting," affirms lead level artist Lucjan Więcek.
"It is really important to give scenes a proper atmosphere, so we rethought many decisions, and although the first-phase changes were dictated by technology, the end result satisfied our artistic needs."
More Than Just a Port
It wasn't just framebuffer considerations that saw a change in the way The Witcher 2 looks - the basic principles behind its rendering engine were also changed considerably. CD Projekt RED utilises deferred rendering technology: here, the entire frame is broken down into several buffers that each describe particular surface properties in the scene and are then combined (or used in other post-processing effects) for the final image. Compare this to a more traditional multi-pass forward renderer, which considers the scene multiple times depending on the number of lights, looking at each light that affects each object on-screen - an approach that grows ever more expensive as the number of lights increases.
The PC and Xbox 360 versions actually use two very different approaches to this deferred technique, driven by the performance considerations of the less-powerful console.
"On Xbox we decided to go fully deferred, while on PC we used partial pre-pass - for skin shaders we have different lighting models (two specular values) and two passes that provided screen space subsurface scattering. We light an object and then apply albedo and specularity. Unfortunately we dropped it for Xbox, having to shrink down g-buffer due to optimisations," senior programmer Bartek Wroński explains.
Another complicating factor is the limited amount of ultra-fast eDRAM attached directly to the Xenos graphics core in the Microsoft console - 10MB in total. Fitting all of the different deferred rendering elements into that space was challenging. CD Projekt RED's approach was to cut the native resolution down to 1280x672 (93 per cent of a standard 720p framebuffer) and then scale it vertically. The alternative would have been to sacrifice performance to 'tiling' - a process where the screen is divided into chunks that are rendered in turn and resolved out to main RAM.
"We did fully deferred rendering, so we needed lots of memory for multiple render targets during g-buffer surfaces filling - we wanted to avoid tiling, which is an expensive operation of rendering parts of the screen that don't fit in eDRAM, and resolving them to system memory tile by tile," recalls Bartek Wroński.
"At first we thought it wouldn't be possible, but we worked out a solution for it: we compressed from three g-buffer surfaces plus depth to two g-buffer surfaces - for example, by packing two values on one byte. Together with lowering the screen resolution, we were able to get rid of any operation requiring tiling, and it really sped things up. Another obvious reason to lower screen resolution is performance - we gained a few milliseconds on pixel shaders, fill-rate and post-processes."
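The numbers work out neatly under some illustrative assumptions (32 bits per pixel per surface, plus one 32-bit depth surface - the actual formats aren't public): three g-buffer surfaces plus depth at full 1280x720 overflow the 10MB of eDRAM, while two packed surfaces plus depth at 1280x672 just squeeze in. A Python sanity check, including the kind of two-values-per-byte packing Wroński describes:

```python
EDRAM = 10 * 1024 * 1024   # 10MB of eDRAM on Xenos
BYTES_PER_SURFACE = 4      # assumption: 32 bits per pixel per surface

def framebuffer_bytes(width, height, gbuffer_surfaces):
    # g-buffer surfaces plus one depth surface, all 32bpp (assumption)
    return width * height * BYTES_PER_SURFACE * (gbuffer_surfaces + 1)

before = framebuffer_bytes(1280, 720, 3)  # three surfaces + depth at 720p
after = framebuffer_bytes(1280, 672, 2)   # two packed surfaces + depth, reduced res
print(before > EDRAM, after <= EDRAM)

# Packing two 4-bit values into one byte, as the quote describes:
def pack(a, b):    # a, b in 0..15
    return (a << 4) | b

def unpack(p):
    return (p >> 4) & 0xF, p & 0xF
```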
Moving to hardware-driven 2x multi-sampling anti-aliasing (MSAA) would not only have necessitated performance-sapping tiling, but would also have doubled the amount of RAM required. Instead, the developer opted for NVIDIA's FXAA technology - a more refined version of which was already being used in the original PC version.
"Yes, we used NVIDIA's FXAA. It is very simple to implement at any development stage, very cheap and worked just fine," confirms Wroński.
"On Xbox 360 we deliberately lowered its effect to avoid over-blurriness, already associated with lower texture quality and lower screen resolution."
Another challenge facing the team was the relative weakness of the Xbox 360's Xenon CPU. It's a tri-core processor that runs at 3.2GHz, each core featuring two hardware threads. It sounds a lot more powerful than it actually is - a Microsoft GameFest presentation by Bruce Dawson (now at Valve) reveals just how hard it is to eke out maximum performance from the PowerPC-derived technology, to the point where, for all its high-end clock speed, the 360 typically requires five processor cycles to run just one instruction.
"We had some difficulties with the CPU side. There were a few functions in the code that were particularly inefficient on the Xbox 360 because of load-hit-store problems and L2 cache misses," confirms senior engine programmer, Balázs Török.
"We had no problem with these on the PC because of different architectures. We had to go through each and every such function and re-factor them to utilise the CPU better. This is especially important for some functions that are called hundreds of times in a frame."
But the oppressive RAM limit in the Microsoft console continued to cause issues for the team.
"Quite a few systems had to be redesigned to make everything fit in 512 megs. We started with the worst offenders (e.g. animations, navigation meshes) and worked our way till we could run the game on the 1GB devkit and then continued optimising almost to the end of development," Török reveals.
"We did higher-level optimisations, as well. Splitting levels into smaller areas, limiting the number of objects simulated at a time and other more-or-less straightforward optimisations allowed us to fit within memory and performance constraints."
The effort paid off. Microsoft may not have the same focus on pushing technological boundaries on its console as Sony does with its much larger wealth of first-party studios, but The Witcher 2 is a game that arguably pushes the Xbox 360 hardware more than the vast majority of the console's first party titles. The notion that so much work was done in just 11 months is breathtaking.
"The work we put into adapting the game really paid off in a big way. The game looks great on Xbox 360 even though it was initially designed for high-end PCs. It took a lot of effort from coders, artists and QA to achieve this rather herculean feat and maintain consistent 30fps performance," enthuses executive producer John Mamais.
"If we'd had more time, we could have redesigned the GUI from the ground up for console, making it slightly less complicated and feel more natural. We did a lot of work on the GUI, but it needed more time and it's different from the PC one. Next time we will put more emphasis on this aspect for sure."
The Future of CD Projekt RED
So where next for the Polish developer? Is the REDengine as versatile as middleware such as Unreal Engine and CryEngine, and could the developer branch out into new genres?
"REDengine is technology meant for developing great RPGs, and it's good at that. It supports the creation of immersive, mature, non-linear stories; dynamic gameplay; realistic, vast game worlds and everything that makes The Witcher 2 one of the greatest RPGs I've ever played - in my personal opinion," says Tomek Wójcik.
"But to answer your question - it might not be adaptable enough to power a great racing game, for example, but it's definitely capable of creating great RPGs of any kind."
Work continues on refining the engine, but the team is tight-lipped about what new features we should expect from the technology.
"This is very complicated, because the REDengine already has a lot of tech innovations that allow it to create complex RPGs. But their scale is not everything - we try to make really breathtaking visuals in our titles," explains Lucjan Więcek.
"So we add such elements to the engine that allow us to create our visions. These should attract gamers, but also other developers. That's why I cannot give you a detailed answer until the engine is fully developed and available for everyone."
The company does plan to branch out though. The age of the PC exclusive - for CD Projekt RED at least - is over, with executive producer John Mamais confirming that "we're going multi-platform and will support DX11".
While the company obviously won't go on the record about any kind of next-gen console spec, we were curious about what they would want to see from new hardware.
"Generally speaking, our games always have a lot of content and detail, so we always need more of everything - more CPU and GPU power and more memory," shares senior engine programmer Balázs Török.
"More specifically, the thing that constrained us the most was the disc access speed; since the Xbox 360 doesn't guarantee a built-in HDD we can't rely on installing the game, so we hope to see better access speeds or at least a lot of memory."
At the end of the day, though, hardware choices are out of developers' hands - the team simply has to adapt to the fixed platforms it is presented with, and the better those platforms are supported by the console manufacturers, the faster developers will be able to extract features and performance from them.
"My personal opinion is that every developer can accept the technical constraints of a system, so even if we don't get everything we want in the new hardware, we need really good tools; we need good tools for every stage of development, for creating new features, for debugging and for optimising," Török concludes.
In the meantime, CD Projekt RED continues to support The Witcher 2 and strives to satisfy its ever-expanding community. The company's approach of free game updates and its rejection of DRM anti-piracy measures have provided it with a wealth of goodwill from PC gamers.
"We create and deliver DLC as a way to retain our loyal fan base and don't want to alienate them by nickel-and-diming them every time we release something new," explains executive producer John Mamais.
"Hopefully we can also pick up some new fans along the way with such a philosophy. The community is very important to the continuing success of a game development studio, and we always try to listen to our fan feedback."
Mamais also hopes that the studio's noted dislike of DRM helps reduce the impact of piracy.
"I think we've gained lots of respect in the gaming community because of it and hopefully that's mitigating some of the piracy," he says.
"When W2 pirates openly converse on forums they are often lambasted by other would-be pirates because of our policy - look at the comments on 4chan, where pirates were getting trolled for trying to download our game. To some extent, that's evidence that our way is not only right, but actually makes an impact. We need folks to buy the game so we can earn enough cash to make the next one - but customers should feel that they want to buy it. That's why we put so much care into our community."