The Microsoft console turns off v-sync and goes all out to render as many frames as possible. The result is virtually ever-present tearing: a mammoth 61 per cent of the console's 60Hz output consists of torn frames in these tests, and controller response varies dramatically depending on frame-rate.
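That 61 per cent figure comes from per-frame analysis of the captured output. A minimal sketch of the calculation, assuming one boolean tear flag per captured frame (the capture data below is invented for illustration, not taken from the game):

```python
# Sketch: derive a torn-frame percentage from a 60Hz capture,
# one flag per frame, True where a scan-line discontinuity (tear)
# was detected in that frame.

def torn_frame_percentage(tear_flags):
    """Return the share of captured frames containing a tear, as a percentage."""
    if not tear_flags:
        return 0.0
    return 100.0 * sum(tear_flags) / len(tear_flags)

# Hypothetical one-second capture at 60Hz: 37 torn frames out of 60.
flags = [True] * 37 + [False] * 23
print(round(torn_frame_percentage(flags), 1))  # → 61.7
```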
PlayStation 3 is very bizarre indeed. For the most part it is still v-synced (a virtually unnoticeable 0.1 per cent of torn frames here, all of them lurking near the bottom of the screen, almost impossible to see), but the massive variation in the frame-rate introduces some horrible judder and a quite off-putting variance in controller response.
It's difficult to recommend the unlocked frame-rate on either console. The games are designed around 30FPS, and the impact on image quality on both systems is considerable. However, at least it feels as though there is some advantage to using it on Xbox 360 - controller response overall feels crisper, even if the tearing is truly horrific.
Moving back to the prescribed locked frame-rate modes, initially it seems that we're looking at a like-for-like experience. Lessons have clearly been learned from the lamentable PS3 rendition of the original BioShock, which was considerably inferior to the 360 build by just about all measurable technical criteria.
For starters, the sub-HD resolution of PS3 BioShock is gone: both versions run at native 720p, and neither features anti-aliasing. The blur filter added to PS3 BioShock (and removed in a subsequent patch) thankfully does not return. And, as we've seen, while frame-rate isn't spectacular, it feels more refined than in the last game.
The big visual differentiator between the two games comes down to the handling of transparent "alpha" textures. These eat up bandwidth and fill-rate on the consoles, and as regular Digital Foundry readers will know, the 10MB of eDRAM attached directly to the Xbox 360's Xenos GPU can give the Microsoft console a very real advantage here.
A very common solution on PS3 is to reduce the resolution of these effects: Killzone 2, for example, scales them up from a quarter-resolution buffer, but adds multisample anti-aliasing (MSAA) to smooth off the edges. For effects that are on-screen for a split second (for example, explosions) it's very hard for the human eye to notice much difference: it's a massive bandwidth-saver, with little impact on overall image quality.
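The arithmetic behind the bandwidth saving is straightforward: "quarter resolution" means half the width and half the height, so each layer of alpha overdraw shades only a quarter of the pixels. A rough back-of-the-envelope sketch at 720p (these are simple pixel counts, not measurements from either game):

```python
# Sketch: fill-rate cost of a full-resolution vs quarter-resolution
# alpha buffer at 720p. Halving both dimensions quarters the number
# of pixels shaded per layer of transparent overdraw.

FULL_W, FULL_H = 1280, 720                        # native 720p render target
QUARTER_W, QUARTER_H = FULL_W // 2, FULL_H // 2   # "quarter res" = half width, half height

full_pixels = FULL_W * FULL_H          # pixels shaded per alpha layer at full res
quarter_pixels = QUARTER_W * QUARTER_H # pixels shaded per alpha layer at quarter res

print(full_pixels, quarter_pixels, full_pixels / quarter_pixels)
# → 921600 230400 4.0
```

The quarter-resolution result is then scaled back up to 720p and composited over the scene, which is where the visible blockiness on water and fire comes from.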
BioShock 2 employs the same trick with its transparencies (without the MSAA). A massive amount of the game's alpha textures are rendered to a quarter-resolution buffer, which would be fine, except for one problem: these aren't on-screen for a split second; they are there a lot of the time. All of the water, particle and fire effects in BioShock 2 are rendered in this way, meaning that depending on the scene, some or even all of the screen is being generated at quarter-HD resolution.
Even some of the neon decals have a quarter-res effect on them that stays constant no matter how far away you are from them - the upshot being that, weirdly, the further away you move from the texture, the more obvious it becomes. To one extent or another, the sub-HD elements are with you for much of the game. After all, Rapture is an undersea city springing a hell of a lot of leaks: water is everywhere. It's as much a part of BioShock 2's signature look as the art deco architectural style. Even transparent items such as EVE hypos exhibit the effect.
What's odd is that this can result in some very weird and unattractive effects, and I'm not sure whether they are artifacts of this decision or simply bugs in the game: the final shot of the vanquished Big Sister reminds me somewhat of the low-res texture bug with the Big Daddies in the PS3 version of the first BioShock. Not pretty, and it's difficult to understand why it's happening.