
Media Manipulation: the "Bullshot" phenomenon

How game-makers are embellishing promo screens and vids, and why it has to stop.

Image credit: Eurogamer

Target renders. Bullshots. Pre-renders. Grading. Post-processing. A whole new terminology has built up within the games community to describe the ways and means by which game-makers are creating promotional material that may or may not actually look like the product we'll eventually be playing on our consoles and computers. Where did it begin, why are they doing it, and in the internet age where any kind of fakery and shenanigans is swiftly jumped upon, shouldn't they really be stopping it?

Of course, the truth is that massaging and manipulation of media assets isn't anything new. It can easily be argued that the situation used to be a whole lot worse: the old 8-bit home computer arcade conversions back in the eighties were often promoted with screenshots taken from the coin-op source, where "night and day" doesn't even begin to describe the gulf in visual quality. However, the popularity of the "bullshot", as it is now popularly known, really kicked off in the PlayStation era - and originally, I suspect, with the best of intentions.

Game visuals, when captured via frame grabbers, or dumped from the video RAM of the host consoles, are stark, digitally perfect representations of the game as the computer "sees" it, and somewhat removed from how it would actually have looked on displays of the time. Even the best, most precise progressive-scan CRT displays have a tendency to smooth off edges in gameplay, whereas the common-or-garden TV adds a whole lot of blur, all of which would have been taken into account by the artists of the original games.

Casting back to my past life as editor of games magazines including Mean Machines, we actually opted to stay away from frame grabbers for as long as we could (until the bean counters killed the photography budget), simply because photographing CRT screens from within a dark room produced coverage that more accurately reflected how the games would be seen, and played, on our readers' TVs. Even in the here and now, emulator coders are working on bespoke upscaling algorithms to make the games of yesteryear look closer to their original presentation on our relatively ultra-resolution modern displays.

The needs of the games media were a crucial factor in the rise of the bullshot, especially as the print media became of prime importance in marketing a videogame back in the mid-nineties. Whereas screen resolution was typically 72dpi (dots per inch), magazine production operated at anything up to 300dpi. Games could look a bit rubbish as a consequence, and the developers took pains to address that. With the shift to 3D, game-makers came up with more ingenious solutions, producing what would eventually become the standard bullshot.

The usual process is to capture an in-game scene, internally re-render it at a much higher resolution, then scale the result back down. It's still a game-engine shot, and it's usually supplied at the game's actual resolution, but it looks more natural, less artificial and blocky: good for making your games look decent in the press of the era. With a change of camera perspective and some additional effects, the game visuals themselves can be blown up large for full-page artwork, packshots and marketing use. To this day, the basic principles have not really changed much at all; we just get to see more interesting variations of the technique at work. The real difference these days is that seemingly everyone is perfectly happy to release screenshots, and sometimes even complete video trailers, that feature very little actual gameplay.
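The re-render-and-downscale trick is essentially supersampling: averaging blocks of high-resolution pixels into one output pixel softens hard edges and hides aliasing. Here's a minimal, purely illustrative Python sketch of the idea - the greyscale "framebuffer", the 4x factor and the plain box filter are my own assumptions, not any studio's actual pipeline:

```python
# Illustrative sketch of the basic bullshot technique: render at a multiple
# of the target resolution, then average each block of pixels back down.

def downscale(frame, factor):
    """Box-filter a greyscale 'framebuffer' (list of rows) by an integer factor."""
    h, w = len(frame), len(frame[0])
    out = []
    for y in range(0, h, factor):
        row = []
        for x in range(0, w, factor):
            block = [frame[y + dy][x + dx]
                     for dy in range(factor) for dx in range(factor)]
            row.append(sum(block) // len(block))  # averaging softens hard edges
        out.append(row)
    return out

# A hard black/white vertical edge in an 8x8 'high-res render'...
hires = [[255 if x >= 3 else 0 for x in range(8)] for y in range(8)]
# ...scaled down 4x to a 2x2 image at 'game' resolution.
lores = downscale(hires, 4)
```

A real pipeline works on full-colour frames and may use a fancier filter than a box average, but the effect is the same: the hard edge averages out into a softer grey transition that looks far less "digital" in print.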

As we stumble into the HD era, the fact is that the times they are a-changing and it can easily be argued that the usefulness of the bullshot in the games media has reached its natural conclusion. All it exists for now is to make screenshots and game trailers look artificially superior to the product you'll be playing at home. It's now reached and surpassed the point of misrepresentation and it really has to stop, especially since the two main reasons the practice kicked off in the first place are now all but irrelevant.

Firstly, the tide is turning away from CRT screens, and high-definition flatscreens are swiftly becoming the standard, certainly for the enthusiast gamers that devour the latest media. The notion that game artists are designing the assets to accommodate an inherently blurry display is effectively a thing of the past. The move to HD sees television technology moving to pin-sharp displays, and the rise of technologies like DVI and HDMI means that digitally lossless images are being transmitted to the display and reproduced with stunning clarity. Framebuffer shots exactly like those we use in the Eurogamer face-off features are - byte-for-byte - identical to what your display is handling.

Secondly, much as it pains me to say it as someone who worked with three generations of console over 15 years in the games print media, the fact is that this particular part of the industry is swiftly becoming a thing of the past - certainly in terms of readership figures. Eurogamer isn't the biggest online portal in the world, but I'd be willing to bet that its readership and overall reach far exceed even the most popular remaining print titles. Any excuse there may have been for massaging game images to look good on paper is no longer relevant in the digital age, where the vast majority of the audience will be viewing the assets on a PC screen - with the strong possibility that it will be the very same display used for actual gameplay.

But regardless of the arguments, in the here and now, the use of massaged media is effectively the standard, and virtually everyone is in on the game. Even the world's most technically proficient game-makers - industry leaders in graphical and gameplay innovation - seem shy to release actual screenshots of their forthcoming games, preferring instead to unleash super-scaled bullshots, or enhanced videos.

Take, for example, the standard bearers in graphical realism on console, the literally incomparable Polyphony Digital. E3 played host to a couple of videos from the developer designed to showcase Gran Turismo on PS3 and PSP. The thing is, neither of them was really a true indication of the quality of graphics you'll be seeing on your console. You can argue that they were effectively "mood" pieces, designed to make an impact at a big industry event, but awesome artistic merits aside, the trailers and some of the associated shots were still some way removed from the actual game they are designed to showcase.

One of the most exciting of E3 2009's trailers, but is it representative of the actual game?

Polyphony Digital typically embellish the base gameplay visuals with additional graphical bling for their replay modes (hence the drop from 60FPS to 30FPS in GT5 Prologue), and it's from here that the raw assets are usually derived for their trailer work.

However, what we are seeing in Polyphony's trailers are intricately directed and rendered images where the samples used to create the motion blur are massively increased compared to in-game video, giving an ultra-realistic feeling of movement you won't see in the game. Any visual deficiencies that may be seen in actual gameplay (such as "jumps" in LODs as objects move closer to the viewer) are effortlessly removed. By rendering the video internally at an impossibly high resolution, the maximum LOD models are automatically invoked, high frequency shimmering on texture detail is smoothed away, and any artifacts linked to alpha textures and specular shine - not to mention "teh jaggies" - are disposed of as a matter of course.
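The high-sample-count motion blur described above can be sketched as temporal accumulation: render the scene at many sub-frame positions and average the results. This toy Python example - a one-dimensional "scanline" with a single moving pixel, entirely my own illustrative setup and nothing to do with Polyphony's actual renderer - shows why an offline render with dozens of samples produces a smooth streak where a real-time budget of a couple of samples leaves discrete ghost images:

```python
# Toy model of motion blur by temporal sample accumulation.

def render(position, width=12):
    """Draw a 1-pixel-wide object on a 1-D greyscale scanline."""
    return [255 if x == int(position) else 0 for x in range(width)]

def motion_blur(start, end, samples, width=12):
    """Average `samples` renders of the object moving from start to end."""
    acc = [0] * width
    for i in range(samples):
        t = i / (samples - 1)  # sub-frame time in [0, 1]
        frame = render(start + t * (end - start), width)
        acc = [a + f for a, f in zip(acc, frame)]
    return [a // samples for a in acc]  # averaged trail

realtime = motion_blur(2, 6, samples=2)   # few samples: discrete ghosts
offline  = motion_blur(2, 6, samples=64)  # many samples: a smooth streak
```

With two samples the object appears as two separated bright ghosts; with 64, its energy is spread across every pixel along the path, which is exactly the continuous, film-like blur a directed trailer can afford and a 60FPS game engine usually can't.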