Digital Foundry: Can you tell us about the LBP stereoscopic 3D concept video we were shown at Evolution Studios and the future of 3D in LBP? The demo was a clear demonstration of how the tech could benefit your system of "layers" in the environment, but at the same time, there's a sense that your performance is already at the edge and there wouldn't be the overhead for a stereo 3D output.
Alex Evans: Well, I really love how the 3D tests made in LBP(2) actually make the game better to play - it differentiates the play area from the background, and you can stay focused. Nothing to announce today, and yes, it is tough to find the GPU time, and also the dev time to get every cool feature done, but - it would be cool, wouldn't it?
Digital Foundry: Given the nature of the camera in LBP, the game seems ideally suited to a streaming model. For example, the game SWIV on the Amiga provided 40 minutes of non-stop scrolling over ever-varying terrain and with ever-changing enemies. LBP2 could create a similar top-down shooter but would be limited in its population of assets. Are there technical or design issues that prevent levels from streaming in assets as they're used, allowing more progressive variety in a level, as opposed to the 'few' materials and objects that fill the thermometer?
Alex Evans: Well, we could stream, and in fact do stream some assets, but in the end we've focused more on other areas of tech. It's technically possible, as are many, many other things!
One thing about LBP that makes streaming tricky is that there is no limit to 'action at a distance' - a switch in one location can spawn emitters or mechanisms anywhere else in the level, at any time.
For that reason we, for example, never sleep any object: everything is running the full physics sim, all the time. Streaming would probably add knock-on effects there, and to memory management and thermometer tracking, which, while solvable, would distract us from all the other features.
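The constraint Evans describes can be sketched in miniature. The toy Python simulation below (purely illustrative - this is not Media Molecule's code, and every class and name here is an invented stand-in) shows why a typical engine's sleep optimisation is unsafe when a switch can address any object in the level at any time: a slept body skips simulation frames, so its state silently diverges from bodies that stayed awake.

```python
# Illustrative sketch: why "action at a distance" pushes you toward
# never sleeping any object. All names here are hypothetical.

class Body:
    def __init__(self, name):
        self.name = name
        self.asleep = False   # a typical engine sets this when a body looks idle
        self.steps = 0        # how many sim frames this body has actually run

    def step(self):
        if self.asleep:
            return            # slept bodies skip the physics sim entirely
        self.steps += 1

class Level:
    def __init__(self, bodies):
        self.bodies = bodies

    def simulate(self, frames):
        for _ in range(frames):
            for body in self.bodies:
                body.step()

    def trigger(self, name):
        # A switch can target any body anywhere in the level, at any time...
        for body in self.bodies:
            if body.name == name:
                body.asleep = False  # ...so every body must always be wakeable

# A lever near the player, and an emitter far off-screen that a naive
# engine has put to sleep because it looked idle.
lever = Body("lever")
far_emitter = Body("far_emitter")
far_emitter.asleep = True

level = Level([lever, far_emitter])
level.simulate(60)             # the slept emitter misses 60 frames
level.trigger("far_emitter")   # a distant switch wakes it mid-level
level.simulate(60)

print(lever.steps, far_emitter.steps)   # 120 vs 60: their states have diverged
```

Running everything all the time avoids that divergence at the cost of CPU - which is exactly the trade-off Evans says they accepted.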
One thing that is nice though, in LBP2, is level links. You can chain levels together (with a brief transition) to make longer experiences, without needing the player to return to the pod. A simpler solution that gets most creators most of what streaming would give them.
Digital Foundry: Prior to release, LBP was described as allowing creation within gameplay, and you famously described a situation where there's a chasm to cross and a player could dial in a tank he had built earlier. This would have been a landmark departure from existing user-created content, but sadly this never made it into LBP proper, which instead shipped with a discrete editor. What were the reasons for this? Will LBP2 feature in-game creation - if not creating geometry, then at least allowing components to be assembled in situ?
Alex Evans: It's not something we've specifically tackled for LBP2 (yet), but one of the QA guys here has actually rebuilt a working Pop-It using the direct control devices in LBP2. In other words, he managed to make a create-in-gameplay mode, from scratch, using direct control and emitters. The possibilities of meta-game creation like that are quite mind-boggling...
Digital Foundry: There have been hints that the LBP2 engine will make the already-available LBP1 user-generated levels look even better. Can you go into more depth on that?
Alex Evans: All the tech mentioned above applies equally to LBP1 content, so you get nicer MLAA, you get volumetric fog light shadows, you get better transparency, you get sharper anisotropic filtering, you get shadows on backgrounds, and several other tweaks and enhancements.
If you use LBP2 content, then you get even more good stuff - hairy materials, new transparent materials, animating textures, animating backgrounds, new lighting, and so on. So it's good for all!
Digital Foundry: So players will be able to import their old levels directly into LBP2 and improve them with all the new features you've added to the editor?
Alex Evans: Yes.
Digital Foundry: Finally, with Media Molecule now firmly established as a part of Sony's impressive range of first-party studios, does this afford you any additional access to tech and personnel from within the group? How does this relationship work? The MLAA side of things is self-evident, but can you talk about examples of other code or knowledge shared within the Sony group that has helped LBP2 become the game it is?
Alex Evans: It's early days but Sony has always been incredibly inclusive of Media Molecule. The fact that I can now regularly sit in a room with the tech directors of Uncharted 2, God of War III, MAG and Killzone, among others, and listen to them all debating every aspect of game development, is exhilarating.
There are some incredibly talented people within the Sony group of devs, and it's the people and informal exchanges that ultimately stimulate me most as a game developer.
In terms of actual code, or tech, even on LBP1, Sony London and San Diego helped us out immeasurably with things like SPU optimisation, online technology and delivery, and that is continuing for LBP2. So while it's nice to brag about how tiny we (still) are at MM, we couldn't have done it without them.
Sony WWS knows how we like to make games, and we're very happy to carry on making things in our slightly eccentric, non-scientific way. In the end, the players and creators seem to like it!