Crysis 1080p60 Grail Quest Update

So, a while back I bought myself an nVidia GTX295 graphics card (the world’s most powerful GPU). The aim here was to get Crysis running at maximum settings at 1080p60. Crytek’s most ambitious FPS remains one of the most technically challenging games ever devised, and early benchmarks seemed to indicate that the GTX295 could run the DX9 version at maximum everything at the current top resolution and refresh rate of ‘living room’ HDTV technology.

Initial tests indicated that performance was far from ideal, hovering anywhere between 30-40FPS. The config used? Q6600 at 8×333MHz for 2.66GHz (a mild overclock over the stock 2.4GHz, with the higher FSB giving a big boost in bandwidth), 4GB of 1066MHz DDR2, GTX295, Vista Ultimate 32-bit and DX9 Crysis Warhead. Settings were at Enthusiast for everything, with v-sync and 2xAA turned on. Deselecting the latter two didn't seem to affect results, but the notion of a £400 GPU producing screen-tear and 'jaggies' is hugely abhorrent, so they're staying on.
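For anyone wondering what that "8×333MHz" shorthand actually buys you, here's a rough sketch of the arithmetic, assuming the Q6600's stock 9×266MHz configuration as the baseline and the standard quad-pumped, 64-bit FSB maths. The figures are illustrative back-of-the-envelope numbers, not measurements.

```python
# Core 2 Quad clockspeed and FSB bandwidth arithmetic (illustrative sketch).
# Assumptions: stock Q6600 runs at 9x266MHz, the overclock here is 8x333MHz,
# and the FSB is quad-pumped with a 64-bit (8-byte) data path.

def core_clock_ghz(multiplier, fsb_mhz):
    """Effective CPU clock in GHz."""
    return multiplier * fsb_mhz / 1000.0

def fsb_bandwidth_gbs(fsb_mhz, pumped=4, bus_bytes=8):
    """Peak FSB bandwidth in GB/s (quad-pumped, 64-bit bus)."""
    return fsb_mhz * pumped * bus_bytes / 1000.0

configs = {"stock": (9, 266), "overclocked": (8, 333)}

for label, (mult, fsb) in configs.items():
    print(f"{label}: {core_clock_ghz(mult, fsb):.2f}GHz core, "
          f"{fsb_bandwidth_gbs(fsb):.1f}GB/s peak FSB bandwidth")

# Roughly: stock ~2.4GHz core with ~8.5GB/s of FSB bandwidth, versus
# ~2.66GHz core with ~10.7GB/s when overclocked to 8x333MHz.
```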

A quick analysis of Crysis Warhead’s Airfield level, captured at 1080p60 via Digital Foundry TrueHD.

The video is unedited and a first run-through of the level, so expect some clowning around, particularly with that tyre-less jeep. Regardless, the bottom line is clear: the grail seems as elusive as ever.

So, as indicated in the Twitter feed, it's time to move to i7 and perhaps look into settings tweaks in order to creep closer to the elusive 1080p60. But even here, I am not expecting massively improved results. Check out this HD4870 X2 vs GTX295 test on YouTube, where an i7 rig only manages an average of 40FPS and a minimum of 21.4FPS.
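To put those figures in context, here's the frame-time arithmetic behind the 1080p60 target, sketched out below. The 40FPS and 21.4FPS numbers are the ones quoted from that benchmark; everything else is simply the maths of a 60Hz refresh.

```python
# Frame-time budget for 1080p60 versus the benchmark figures quoted above.
# 60FPS demands a new frame every 16.7ms; anything slower either tears or,
# with v-sync engaged, falls back to a lower, uneven update rate.

TARGET_FPS = 60.0
BUDGET_MS = 1000.0 / TARGET_FPS  # 16.7ms per frame

for label, fps in (("i7 rig average", 40.0), ("i7 rig minimum", 21.4)):
    frame_ms = 1000.0 / fps
    shortfall = frame_ms / BUDGET_MS
    print(f"{label}: {frame_ms:.1f}ms per frame, "
          f"{shortfall:.1f}x slower than the 16.7ms 1080p60 budget")

# i7 rig average: 25.0ms per frame, 1.5x slower than the budget
# i7 rig minimum: 46.7ms per frame, 2.8x slower than the budget
```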

Where next on the Grail Quest? Now we move into the realm of tweaked configs (which can bring about some extraordinary gains in performance), plus, of course, running the whole set-up with the new Intel i7 CPU, where an overclock from the default 2.66GHz up to 3.33GHz is just one BIOS tweak away. Additionally, the chip's onboard memory controller should iron out any kinks caused by the game's physics.
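As for that "one BIOS tweak": the post doesn't spell out the exact setting, but assuming a stock i7 920 (20× multiplier, 133MHz base clock), the usual route is simply raising the base clock. A hypothetical sketch of the numbers:

```python
# Core i7 920 overclock arithmetic (illustrative; the exact BIOS route is
# an assumption - stock 20x multiplier with a raised base clock).

MULTIPLIER = 20          # assumed stock i7 920 multiplier
STOCK_BCLK_MHZ = 133     # assumed stock base clock
TARGET_GHZ = 3.33

stock_ghz = MULTIPLIER * STOCK_BCLK_MHZ / 1000.0   # ~2.66GHz
needed_bclk = TARGET_GHZ * 1000.0 / MULTIPLIER     # ~166-167MHz

print(f"stock: {MULTIPLIER} x {STOCK_BCLK_MHZ}MHz = {stock_ghz:.2f}GHz")
print(f"target: raise base clock to ~{needed_bclk:.0f}MHz for {TARGET_GHZ}GHz")
```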

So why pursue this quest at all? The reason is pretty straightforward… this blog is all about gaming performance. If we have reached the point where we can run this game maxed out to the full potential of current HDTV technology, then we've reached a technological milestone that should be celebrated. But I'm willing to bet that even this leap still wouldn't be enough to run GTA IV on PC at the same performance level...

So, why no benchmarks at DX10? Here's the thing. The GTX295 seems intent on running at 50Hz in DX10 from the DVI port, and at 24Hz via HDMI. DX10 Crysis also introduces corruption on my Dell 2405FPW monitor, presumably down to the frequency shift. Maybe it's a problem with the EDID on the TrueHD card (the digital handshake between card and source, if you will), but if so, how come every other 1080p60 source, including DX9 Crysis, works just fine?
