
Lost Planet: DirectX 9 vs. 10

We check out the differences between the two PC demos.

Oh, dear. This isn't right, this isn't right at all. With Vista leaving gamers at best perplexed and at worst furious, getting out a game that actually used its trumpeted but unproven new DirectX 10 graphics, and that looked jawdroppingly, heartbreakingly, console-shamingly blinkin' gorgeous with it, was incredibly important. As DX10's leading lights, Crysis and Alan Wake, are still trying to work out what weekends they've got free to finally come visit, it's fallen to a port of Xbox 360 shooter Lost Planet to trailblaze the future of gaming graphics on PC.

As you're probably aware, two demos have been released - one in traditional old DirectX 9, good for Windowses XP and Vista alike, and one in fancy-pants, Vista-only DirectX 10. A fine opportunity to demonstrate all the unparalleled gimmicks DX10 brings to the table, no? Unified and geometry shaders promise a mighty performance hike.

The move away from heavyweight API object overhead means the old limits on how many different objects - from characters to vegetation to slavering snow-monsters - can be shown at once are removed. Support for vastly improved polygon counts means the age of boxy background scenery with token shiny surface effects is over. Virtualised graphics memory means larger textures, better textures, more unique textures on-screen at once. Or, at least, that's what was supposed to happen. Hopefully, soon it will - but it certainly hasn't in Lost Planet.
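To make that overhead point concrete, here's a minimal C++ sketch of the Direct3D 10 pattern - a sketch only, assuming a valid ID3D10Device with geometry and shaders already bound elsewhere; the mesh and instance count are illustrative, not anything Capcom has said about Lost Planet's renderer. DX10 bakes render state into immutable objects and lets one instanced draw stand in for thousands of individual DX9-era submissions:

#include <d3d10.h>

// Minimal sketch, not Lost Planet's code: draw 4,096 copies of one mesh.
// Assumes 'device' is a valid ID3D10Device and that the vertex buffer,
// input layout and shaders have already been bound elsewhere.
void DrawSnowfieldRocks(ID3D10Device* device, UINT vertsPerRock)
{
    // Render state lives in an immutable state object. Created here for
    // brevity; a real renderer would build it once at load time.
    D3D10_RASTERIZER_DESC rd = {};
    rd.FillMode = D3D10_FILL_SOLID;
    rd.CullMode = D3D10_CULL_BACK;
    rd.DepthClipEnable = TRUE;

    ID3D10RasterizerState* state = nullptr;
    if (FAILED(device->CreateRasterizerState(&rd, &state)))
        return;

    // One cheap call applies the whole state block, where DX9 wanted a
    // stream of individually validated SetRenderState calls.
    device->RSSetState(state);

    // One draw call submits 4,096 instances. The old "how many objects
    // on screen" ceiling was largely a ceiling on how many of these
    // submissions the CPU could afford per frame.
    device->DrawInstanced(vertsPerRock, 4096, 0, 0);

    state->Release();
}

The specific numbers don't matter - the point is that the per-object CPU cost DX9 imposed is exactly what DX10 was designed to strangle, which is why those old object-count limits should, in theory, melt away.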

Outdoor snow effects on DirectX 10. Swirly, no?

Using a DirectX 10 graphics card, a splendid GeForce 8800 GTS 640MB kindly lent to us by Foxconn for this test, we played the hell out of both versions of the demo. The differences? While there are some pretty impressive effects that the card handled with aplomb in both demos, so long as a couple of graphical settings were dropped slightly, it takes an eagle eye to spot the variations. Are those shadows a bit softer? Is that bug's skin reflecting more light? Was there an extra snowflake there? Am I staring at these demos so intently that I'm losing my mind, seeing invisible snow, phantom lights, hallucinated haze? And how can a planet be lost, anyway? It's not like it can go anywhere...

There are certainly some refinements, but few that you'd notice whilst actually playing the game. The most obvious are the enhanced motion blur effects, which admittedly do a decent job of making monsters look less like clusters of polygons and more like the fast-moving, organic horrors they're intended to be. Other differences, like minutely improved detail on monsters' skin and glass, are only really apparent through close, nay anal, screenshot study. Essentially, play on DirectX 9 - i.e. Windows XP and/or an older graphics card - and you're not really going to be missing anything.

And the same scene on DirectX 9. On close inspection, the big beast isn't /quite/ as distorted by the snow haze, but you can't tell that when it's moving.

However, that's not a reason to write off DirectX 10. As a game made for the Xbox 360, the GPU of which is neither DirectX 9 nor 10, but a custom chip that shares some features with both (primarily the former), Lost Planet simply wasn't born a DX10 game, no matter what it calls itself. It's got a few DX10 knobs on, sure, but basically it's a DX9 game in a fancy hat. We won't see the real money shots until games built for DX10 from the ground up arrive - Crysis and Alan Wake, specifically. DirectX 10 still has everything to prove - Lost Planet isn't a proper test, sadly.