
No Silver Lining

Cloud gaming continues to fascinate - but the economic arguments don't add up.

First published in our sister-site's widely-read weekly newsletter, the Editorial is a weekly dissection of an issue weighing on the minds of the people at the top of the games business. It appears on Eurogamer after it goes out to newsletter subscribers.

Throughout the late seventies and the eighties, a revolution took place in digital technology - one which overthrew the entire model of computing as it had been, and replaced it with the one we recognise today. Out, for the most part, went the heavy-duty mainframe systems and the dumb terminals which allowed users to interact with them. In came microcomputers - systems boasting their own processing power and storage, capable of communicating over a network but by no means dependent upon one for their life support.

It's no coincidence that the same era saw the emergence of videogames as a pastime, and as an industry. Although games did exist for mainframe systems - some of which created ground rules for interactive entertainment that are still followed to this day - they were never going to be the basis for a thriving industry. The arrival of microcomputers heralded the era when computers entered the home as well as the corporate office, and the era in which users could (to some degree) run their own software, rather than relying on a systems administrator to install it on a centralised machine.

Ever since that point, progress in computing - first followed by game development, and in later years actually driven to some degree by the demands of gaming - has been pretty straightforward. Computing power and storage have risen. Prices and sizes have fallen. Such progress has opened up startling new possibilities undreamed of when the mainframe first fell from supremacy; the wafer-thin iPod Touch which sits in my pocket is a more capable computing device than any gargantuan mainframe of the 80s.

However, even if today's computing devices are more and more independently powerful, the irony is that the way we use them has changed dramatically - making them, in some regards, every bit as reliant upon network access as their dumb terminal forebears. Sure, your computer and your smartphone can do all sorts of incredibly clever and powerful things - but the real reason why most of us are glued to them for so much of the day is because of their ability to access information, media and communications from the Internet. Turn off the network pipe, and most users will find themselves staring at a fabulously powerful piece of hardware that's suddenly become entirely useless.

That's why, as storage and bandwidth costs have plummeted, the dream of the Cloud has become a reality. If the computer relies so heavily on the network for its functionality, then why not start to reverse the process that led to the domination of the microcomputer in the first place?

Why not start to hand back some of the role of the computer, specifically in terms of storage and retrieval of data, to the "mainframe" - but in this instance, not a big machine in the corridor down the hall, but an array of countless servers distributed across data centres all over the world? The advantages are many, at least in theory - no more worrying about backups, easy access to your data no matter where you are and which computer you're using, the ability to upgrade your storage space by clicking a button rather than running out to buy a new hard drive.

Just as videogames followed computers off the mainframe architecture and thrived in the era of the microcomputer, the buzz around the concept of the Cloud has plenty of people excited to see what videogames can do with this new paradigm. "Cloud gaming" has been a topic for gaming's chattering classes at conferences of all sorts for a couple of years now - met at first with derision as a pipe dream, then more recently with interest, as technical tests seemed to show that it could and would work in a future nearer than most people expected.

The concept is, on the surface, just as beguiling as using the Cloud for other ends. It means you can have a thin client in the home - a dumb box that's cheap and lacks the power required to run top-end games itself, but is easily capable of streaming them from a data centre. You'd never need to upgrade it. You could buy games with a click and play them instantly - and your entire game collection would follow you around to any networked device that could access it.

In recent months, limited launches for services like the hugely ambitious OnLive and the (rather less ambitious and perhaps more realistic) Gaikai have convinced some doubters that this could become a reality. Enough heads have been turned that even GameStop has started looking hopeful about using technology such as this as part of its plan to evolve into a new kind of company before the physical games retail ship sinks underneath it. Some publishers, meanwhile, talk enthusiastically if still cautiously about the idea of a world free of piracy - and free of awkward retail channels whose competitive marketplace drives their software prices down so much, so soon.