Sony and the future of VR

Digital Foundry talks Morpheus with Sony's Shuhei Yoshida and Richard Marks.

For Digital Foundry, GDC 2015 ends as it began - in the company of Sony's president of worldwide studios, Shuhei Yoshida, and getting to grips with the firm's Project Morpheus VR demos. It's been several days since we first went hands-on with the London Heist experience, during which time we've enjoyed Crytek's Back to Dinosaur Island and Weta Digital's Thief in the Shadows demos on Oculus Rift, and of course Valve's immense HTC Vive-powered 'VR cave'. We've experienced high-end VR from the best in the business, running on seriously powerful PC hardware. But what's clear is that, thanks to Project Morpheus, console gamers aren't going to miss out either.

What's become evident over this GDC is that virtual reality isn't just a technological arms race, with hardware specs as the defining factor in the quality of the experience. Even before we get to the challenges facing gameplay in the VR world, the biggest obstacle will be getting consumers to try the headsets and instantly get the intended effect - whether they're wearing glasses or not, and preferably without feeling ill. On a basic ergonomic level, Sony seems well ahead of the competition: the original Morpheus prototype was good, and the second-gen headset is easily the closest thing to a viable, consumer-friendly piece of kit we've seen at GDC this year.

"We were happy with last year's model, but when we went to events, to games shows and such we saw that people didn't understand how to put it on, so the hardware designers really, really focused on making that easy," says Shuhei Yoshida. "We really, really want people to be able to buy, open the box and use it without anyone helping, so now we have this one unified band solution. Now we're pretty happy."

"I think our first prototype had too much persistence in the screen and you got a blur effect when you rotated and there was too much latency as well," adds Sony's R&D chief Richard Marks. "And the update rate as well - when you're at 60, some people can feel the flicker if it's not refreshing fast enough."

And that's where the second-gen Morpheus prototype really impresses. Sony has taken bold decisions in ensuring that its headset is not just competitive with the likes of Oculus Rift and HTC Vive, but is capable enough to work effectively for the entire lifecycle of the PlayStation 4.

"Because we're a console, right?" says Yoshida. "It's a console not a PC, we have to make it right on the first time so people can use it for many years.... Our model is to produce great hardware that lasts for years, and we can cost-reduce that over the years."

And that means going for a full 120Hz display, taking Morpheus ahead of the 90Hz HTC Vive we saw demonstrated to spectacular effect this week. It's an ambitious move bearing in mind the hardware constraints of the PS4, and one made for the best of reasons.

"120Hz is a pretty easy conversion from 60, which is what most traditional games are and also you want to have as smooth an experience as you can when you rotate your head - and it's really important to have a low latency," explains Marks.

"We think that 120 gives us that, and there's an OLED panel that we really like that can do that, so that's why we picked it - it's really well matched to the PS4. Right now it'll be a challenge. Most games will go 60 and get reprojected to 120, but we're hoping that game developers will be able to push and hit native 120."

Reprojection is a crucial element in making VR smooth and responsive. In the PC space it's known as asynchronous timewarp, but the tech is much the same on Morpheus. Consistency in motion and refresh is required to eliminate judder and the motion sickness it causes. Reprojection guarantees a new image on every refresh, even if the latest frame in the render queue isn't yet complete - the last one is simply remapped to the latest motion coordinates available from the HMD, giving the illusion of smoother movement. But there's more to it than simply filling in the blanks caused by dropped frames: it's the key to allowing complex content rendered at 60Hz to drive the 120Hz display, and it even makes native 120Hz gaming look better too.
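
To make the idea concrete, here's a minimal sketch of rotation-only reprojection - an illustration of the general technique rather than Sony's actual system software. The pinhole intrinsics K, the yaw/pitch-only head pose and the simple frame queue are all assumptions for the example; a real implementation runs on the GPU, handles roll, positional movement and lens distortion, and uses the HMD's full tracking data.

```python
# Minimal sketch of rotational reprojection ("asynchronous timewarp") - illustrative only.
import numpy as np
import cv2  # used only for the final image warp


def rotation_from_yaw_pitch(yaw, pitch):
    """Head orientation as a 3x3 rotation matrix (roll omitted to keep the sketch short)."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    yaw_m = np.array([[cy, 0.0, sy], [0.0, 1.0, 0.0], [-sy, 0.0, cy]])
    pitch_m = np.array([[1.0, 0.0, 0.0], [0.0, cp, -sp], [0.0, sp, cp]])
    return yaw_m @ pitch_m


def reproject(frame, K, pose_rendered, pose_latest):
    """Warp a completed frame, rendered at pose_rendered, so it matches the newest
    head pose sampled just before scan-out. Correct for rotation only - compensating
    for head translation as well would need per-pixel depth."""
    r_delta = rotation_from_yaw_pitch(*pose_latest) @ rotation_from_yaw_pitch(*pose_rendered).T
    H = K @ r_delta @ np.linalg.inv(K)  # homography between the two camera orientations
    h, w = frame.shape[:2]
    return cv2.warpPerspective(frame, H, (w, h))


def next_scanout(render_queue, last_shown, K, latest_pose):
    """Every 120Hz refresh gets an image: a freshly finished frame if one arrived in
    time, otherwise the previous frame remapped to the latest pose - so a 60Hz game
    still presents something new on every refresh."""
    if render_queue:
        frame, pose = render_queue.pop(0)  # (image, head pose it was rendered with)
    else:
        frame, pose = last_shown           # renderer missed this refresh: reuse the last frame
    return reproject(frame, K, pose, latest_pose), (frame, pose)
```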

"There are two uses for it. One is to fill in the frames if you go from 60 to 120 - you need to fill in the middle frame," explains Richard Marks. "And the other one... well, even if you're rendering at the full 120, you can get the most recent data and use that on the latest frame."

"Yeah, our native 120 demo from Japan Studio runs like that," adds Shuhei Yoshida. "The programmer who did it explained to me that's good to have reprojection all the time, even if you're running at native 120. Sometimes a frame will be dropped, but using reprojection it's still very smooth, and it's always taking the latest data."

The AR bots from augmented reality title The Playroom return in this native 120Hz Morpheus demo. The controller is mapped into the virtual world, making for an intriguing mixture of both AR and VR elements - an approach we didn't see in any of the other VR demos at GDC 2015.

But how demanding is the reprojection technology itself? Does the need to drive 120Hz eat into the resources available to the game?

"It's very short. It's done in the system software we have, a version that just does it for you. It runs right at the very end, just before the frame is going to be displayed. It interrupts the GPU and does this little bit of work. I don't know the exact timing of it, but it's very small. The impact of adding that in is not something that our people are worried about."

"The programmer who did the Japan Studio game said that this system doesn't take much power away from the game. He said it's easy," says Yoshida. "By the way, you've been counting frames for many years. Did you notice the difference between the native 120 demo and the others?"

I reply that 60Hz content has a very slight ghosting effect to it that I didn't see on the 120Hz demo. I explain it's a trade I'm happy to make if it means we get VR titles as visually rich as the London Heist.

"Another thing with the 120 is that it's easy to go to 60 for the television set to display the social screen," says Marks.

That's a crucial difference between Morpheus and the PC VR systems: this is a solution built for a console audience with the living room in mind. Sony's tech aims to invite an audience in and get them involved in gameplay. As Shuhei Yoshida explains, it's a feature born of collaboration between hardware, R&D and in-house game developers - an important advantage Sony has over its competitors:

"Like you're probably aware, we have a cross-functional, international team: the hardware guys in Japan, the R&D teams and game teams globally... That undistorted, regular image - the idea came from the game teams. Our teams wanted to create social game not unlike something similar Nintendo is doing with Wii U. So one person might be wearing the headset, while another person might be participating, watching on the large-screen TV. So that idea came from the game teams and the hardware team implemented it."

Indeed, Sony's entire push into the world of VR is a direct result of the ideas generated by the R&D and game development teams. The rise of Oculus and PC VR actually happened in parallel with internal demos and discussions at Sony.

"At last year's presentation I showed myself wearing a handmade headset in 2010," explains Yoshida. "That's was actually the year we launched PS Move, and our game teams used that with a movie headset and created virtual reality with PS3. The teams were saying that 'we want to do this, we need to do this'. The R&D teams were doing the same thing... the company realised that PS3 didn't have the power to deliver, but it was something we could do with PS4."

The Deep returns for a second GDC outing, with new aquatic species and a renewed focus on engendering a sense of scale. It's using Sony's reprojection technology, effectively upscaling frame-rate from 60fps to 120fps to match the second-gen prototype's new display.

The sheer wealth of experience across disciplines may well be the key advantage Sony holds over the competition - staff across the world well versed in producing great games and quality hardware. In his talk at GDC this week, John Carmack admitted that Oculus hadn't fully got to grips with the thorny issue of what kind of controller should be used for VR. Valve and HTC have developed their own twin-wand set-up, which in many ways feels very similar to Sony's PS Move controllers - celebrating their fifth birthday this year.

"Yeah, it was ahead of its time," smiles Marks. "A bit. Maybe."

Sony had created a controller with full 3D positional tracking, but Move's enviable capabilities were left mostly untapped. It was a 3D controller in a 2D world - until now.

"Our game teams really struggled to use the accurate positional tracking in a conventional TV game," says Shuhei Yoshida. "Sometimes it made it harder for people to play because we designed, like, a bowling game where you can move like this [shimmies in seat] to throw the ball, like in a bowling alley. We were very excited that we could do that, but the people didn't know. In the end, you might as well use motion to throw the ball and the people create the image of throwing the ball perfectly, in their head. It's really difficult to use the 3D positional tracking."

"Yeah. I mean, you have the 3D motion you're doing here, but it's being shown on a 2D television," adds Marks. "You have to have a pretty good understanding of the mapping from 3D to 2D to be able to use that effectively and it is challenging for a lot of people to understand. But in VR it's not challenging at all - it's very natural."

So what is the ultimate VR controller? I wonder if it may well be the Minority Report glove. After all, as Marks himself said so many years ago, Isaac Asimov's theory was that our fingers and hands offer the highest level of control bandwidth available.

"If Kinect 2 can handle fingers and latency can be much reduced, that could maybe work, unless there's a technical reason..." muses Yoshida.

"Well, they have occlusion problems too. They can get fingers when they're easy to get, but when they're hard to get, they don't get the information," replies Marks. "I think with VR, it seems great that you can see your hands in VR but when you go to pick something up, if it's not really in your hand... well, if you lean down to touch a desk and it's not really there, that's really weird."

Japan Studio's second demo sees the return of The Playroom's AR bots, enjoying life in a doll's house-style construction in front of and around you. It's an interesting exercise in a detail-rich environment, with much to explore and discover simply by looking around. Interaction is limited to looming in close to the bots and giving them a little fright.

This makes Sony's native 120Hz Japan Studio demo all the more intriguing. In part, it's a fascinating combination of VR and AR - two closely related technologies integrated into one really compelling experience. Control is achieved via the standard DualShock 4 pad, accurately mapped, tracked and rendered within the VR world, embellished with AR features like a pop-up antenna and button annotations, while the touchpad springs open to release the mini-robots into the virtual world. Something you're holding - something with weight, crucial to the interactive experience - is represented in the VR world, adding to the immersion in a way that the HTC Vive and Move demos can't quite match.

But what's really exciting about VR is that, now the core technological problems - low-latency tracking, high refresh-rate displays - are largely solved, the challenges have moved on into the realm of gameplay. A giant reset button has been pressed on how we interact with and explore game environments. It's notable that the new Oculus and Sony demos essentially keep you static in VR space, with only limited travel. Valve's 'VR cave' play space is exciting, but liberating and constrictive at the same time - do you have the physical area to implement it in your home? What if gameplay demands literally have you walking into walls?

"There's definitely a problem where that will happen. You design your game and people wander around and hit the TV or something!" smiles Yoshida.

"The game design is going to have to be very well thought out. The biggest design challenge for me as a gamer - as a consumer - for a future VR game is how to solve the right analogue stick issue, as I call it: camera rotation. I want a great shooter, action-adventure type of experience and it works if you make it like a shooting gallery, like the Heist demo.

"But if you try to get people to walk around in 3D and rotate the camera, that creates a serious issue," he continues. "Have you tried Alien: Isolation? They just converted the same game, a first-person shooter to Oculus. The mood and the suspense are great but once you start rotating the camera with a right analogue stick, it makes you feel sick. For that, we still don't have a solution. So I asked the same question to John Carmack. He came to try our demos and he said that it's a 'difficult issue' [laughs]."

The London Heist demo is the closest thing Sony displayed to an actual game. A masterpiece in immersion, its only real limitation is that you're effectively standing on the spot for the duration. Traversal through the VR world is a major challenge that nobody seems to have cracked yet.

"It's a crazy idea we've been talking about but if you translate movement, if you step forward, you can teleport a step forward. You can actually teleport rotation, so you can turn 90 or 180 instantly," adds Marks. "It's disconcerting but I think gamers could get used to it. You could flip 180 with a button press and then you're facing backward. You don't get sick from a 180 instant flip.

"John actually said that the beauty of the standalone GearVR [the Samsung mobile VR solution, co-developed with Oculus] is that there's no cable, right? He said that we can use a swivel chair so people can rotate without having to stand up and walk [laughs]."

From our perspective, this is what's most exciting about the arrival of virtual reality. Existing gaming paradigms only partially translate into the VR world, demanding innovative solutions and - perhaps, if we're lucky - entirely new game types. Valve, Oculus and Sony all have the power to create immense virtual worlds, but there are fundamental challenges inherent in how we are actually going to interact with them.

At the end of last year we discussed how the next-gen consoles have raised the bar visually while gameplay remains based on the same core principles. Basic game ports into VR will produce some interesting results, but fundamentally the new 3D worlds will demand a fresh level of imagination and innovation. It genuinely feels like we're moving into a pioneering era with everything to play for; there's the potential - even the necessity - for a flood of new ideas in the VR space, at a time when publishers necessarily play it safer in the triple-A market.

And whether it's Oculus, Valve or Sony, there's a basic understanding here that we're onto something special, leading to a collaborative spirit that pretty much sums up the entire ethos of GDC - a coming together of industry professionals, keen to share with each other for the common good.

"Oculus created excitement amongst developers, and lots of the experiments done by PC developers using Oculus is almost like helping to prototype games for Morpheus," says Shuhei Yoshida.

"We're very friendly with the Oculus guys, sharing opinions and inviting each other to show the latest demos... Some technical things they do before us, and other things we do before them. So for both companies, from a management standpoint, we are helping each other to get the engineers to work harder by creating healthy competition. Engineers are very honest people - when they see someone achieving something, not just talking about it, they're like - OK, we should do better."
