At its best, rendering technology doesn't just make a game look great and run smoothly - it has an intimate relationship with gameplay, and it's for that reason that Crytek's The Hunt: Showdown is well worth checking out. In this fascinating multiplayer first-person survival horror shooter, CryEngine is used for more than just window dressing. Yes, it looks beautiful and well up to triple-A standards, but the technology is fundamental in creating some of The Hunt's most impactful moments, as well as establishing its unique atmosphere.
Anthem's return at E3 2018 was just as spectacular as its debut a year earlier, and last week, Electronic Arts posted a full 20-minute gameplay video of the latest demo - with developer commentary, no less. Presented in full 4K, the firm shared a source-quality version of the asset with us, allowing us to take a closer look at the game's Frostbite foundations - and compared to other EA titles using the tech, we're witnessing a use of the engine quite unlike anything we've seen before. Developer BioWare isn't talking about performance targets, but the sheer intensity of detail and the integrity of the open world suggest that this will likely be a 30fps game on consoles - and an interesting counterpoint to the 60fps heroics of Battlefront and Battlefield.
Players of the recent Battlefield 5 alpha have been witness to quite a treat. Building on DICE's excellent work in BF1 and Battlefront 2, we're looking at an exceptionally handsome game that, small bugs aside, almost feels like the finished article. It's visually outstanding, in fact - the only disappointment, if you can call it that, being that the signs point towards an evolution of the Battlefield formula and its Frostbite engine, as opposed to a full-on next-gen revolution.
It's been ten years since Crysis first released on PC. In 2007, it pushed real-time rendering to new heights and spawned the memetic phrase, "but can it run Crysis?". No game before had pushed hardware and engine technology so hard, and none has since. In fact, combine the latest and greatest Intel Core i7-8700K overclocked to 5.0GHz with an Nvidia Titan Xp and there will still be areas of the game that drop beneath 60fps - even at 1080p. For its own very specific reasons, Crysis is still more than capable of melting the most modern, top-end PCs, but regardless, it remains a phenomenal technological achievement. It deserves a remaster at the very least, but a franchise of this standing really deserves a full next-gen sequel, with state-of-the-art rendering and back-to-basics gameplay.
Have you ever loaded up a new PC title, run the in-game benchmark and tweaked settings for optimal performance, only to discover that actual gameplay throws up much lower frame-rates, intrusive stutter or worse? It's a particular frustration for us here at Digital Foundry, and it leads to a couple of very obvious questions: firstly, if benchmark modes are not indicative of real-life performance, what use are they? And secondly, if their use is limited, how representative of real-life gaming are the graphics card reviews that use them, including ours?
Under Ubisoft's stewardship, the Far Cry franchise is now celebrating its 10th year - a full decade of series entries and offshoots that have seen the focus of both gameplay and technology shift dramatically. And this has led to some interesting YouTube offerings from Mark Brown and CrowbCat, showing what look like substantial engine downgrades over the years. So what's going on here? Has the massive increase in processing power provided by the current-gen consoles been matched with a simplification in aspects of the technology?