It's often been said that one of the many advantages of working on console is that you have a fixed set of hardware to work with, that you can "write to the metal" and code to the "bleeding edge" of the spec. However, our sources suggest that this simply isn't an option for Xbox 360 developers. Microsoft doesn't allow it.

Suspicions were first aroused by a tweet from EA Vancouver's Jim Hejl, who revealed that addressing the Xenos GPU on 360 means going through the DirectX APIs, which in turn incurs a cost in CPU resources. Hejl later wrote in a follow-up message that he'd written his own API for manual control of the GPU ring buffer, incurring little or no hit to the main CPU.

"Cert would hate it tho," he added mysteriously.

According to other development sources, what that means in real terms is that circumventing the standard APIs would see a submitted game fail Microsoft's strict certification guidelines, which state that all GPU calls need to be routed through DirectX. Compare and contrast with PS3 development, where writing your own command buffers and addressing the RSX chip directly through the LibGCM interface is pretty much the standard way of talking to the hardware.

So in practical terms, what does this actually mean? First of all, in many situations the CPU usage we're talking about here is not negligible, and in some cases it could be the dividing line between a solid and an inconsistent frame rate. Secondly, having control of your own command buffers, as you do on PS3, offers a lot more flexibility.

Let's say you're rendering a forest. Chances are that most of the same trees are going to be rendered from one frame to the next, with maybe a 10 per cent variance as the view changes. With Xbox 360, you'll be calling most of the same functions with the same parameters for each frame, while DirectX translates those into command buffers and feeds them to the GPU. With the LibGCM/PS3 approach though, you could build buffers for a set amount of trees across several groups. A simple check could discern if those buffers are still relevant for the next frame, and if so, they can be re-used, whereas with DirectX (and OpenGL for that matter) they will be regenerated anew for each frame.
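The caching idea described above can be sketched in a few lines of C++. This is purely illustrative, not LibGCM or DirectX code: the `CommandBufferCache`, `TreeGroup` and `stateHash` names are all hypothetical, and the "command buffer" here is just a stand-in payload. The point is the check-then-reuse logic: rebuild a group's buffer only when its state hash has changed since the previous frame.

```cpp
#include <cstdint>
#include <unordered_map>
#include <vector>

// Stand-in for a block of raw GPU commands.
struct CommandBuffer {
    std::vector<uint32_t> words;
};

// Hypothetical per-group render state: a hash of the visible trees,
// their transforms and materials.
struct TreeGroup {
    uint64_t stateHash;
};

class CommandBufferCache {
public:
    // Returns a buffer for this group, rebuilding only if the group's
    // state has changed since the buffer was last generated.
    const CommandBuffer& get(int groupId, const TreeGroup& group) {
        auto it = cache_.find(groupId);
        if (it != cache_.end() && it->second.hash == group.stateHash) {
            ++hits_;
            return it->second.buffer;  // reuse: no CPU spent rebuilding
        }
        ++misses_;
        Entry& e = cache_[groupId];
        e.hash = group.stateHash;
        e.buffer = build(group);       // the expensive path
        return e.buffer;
    }

    int hits() const { return hits_; }
    int misses() const { return misses_; }

private:
    struct Entry {
        uint64_t hash = 0;
        CommandBuffer buffer;
    };

    static CommandBuffer build(const TreeGroup& group) {
        // In a real renderer this would emit draw commands for every
        // tree in the group; here we just fake a payload.
        return CommandBuffer{{static_cast<uint32_t>(group.stateHash)}};
    }

    std::unordered_map<int, Entry> cache_;
    int hits_ = 0;
    int misses_ = 0;
};
```

With a roughly 10 per cent frame-to-frame variance, most `get` calls hit the cache and skip the rebuild entirely. This is exactly the kind of scheme that direct command-buffer access permits and that routing every call through DirectX on 360 rules out.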

Sounds cool, eh? Well, in this specific case there are both advantages and disadvantages. First of all, it's going to require a lot of memory, which is in short supply on any console, especially the PS3. Secondly, some might call it a somewhat "cheap" optimisation: it'll raise your maximum frame rate, but won't do anything for the minimum, where optimisation is needed most.

But the point is that this is just one example, and there are many cases where having the choice is a seriously useful option to have in the development toolkit.

If all of this sounds pretty harsh on Microsoft, it's worth noting that there are excellent reasons for standing by this requirement. It means that the platform holder can update the system software and revise the hardware specification while ensuring that all games, past, present and future, will work on every iteration of the console.

More than that, in the here and now, it can easily be argued that the implementation of DirectX is a key reason why the Xbox 360's tools and development environment are considered to be generally excellent. Not only that, but game makers are familiar with the standard, and code is easily portable to and from PC. Firm adherence to DirectX is also good news for gamers: it makes full backwards compatibility on Xbox Next a far more realistic prospect...


About the author

Richard Leadbetter

Technology Editor, Digital Foundry

Rich has been a games journalist since the days of 16-bit and specialises in technical analysis. He's commonly known around Eurogamer as the Blacksmith of the Future.


