GamesBeat: It sounds like there’s a lot of breakthroughs you could plan for or expect. It’s not just a linear path that you can see from Moore’s Law. There’s a lot of inventing along the way that will have to make this happen.
Sweeney: Mark Rein and I visited Magic Leap a few months ago. I had no idea that some of the things they were doing were even possible. I can’t talk about some things because of the NDA, but it was like stepping into a Harry Potter movie, seeing all these magical inventions that we hadn’t seen before. There will be a lot of very rapid progress there in all areas. Their light field cameras and light field displays, some of the long-term solutions for displaying a 3D image with proper depth of field, new display devices, new input devices, cameras that can capture scenes in full 3D — they’ve put together all the different components you need.
All of our game engines and all of our technology are output-oriented. We’re good at rendering beautiful scenes, but we have no support for input in 3D. If you want to walk around and digitally scan your house as you go so you can represent it in the game engine and modify it, place paintings on your wall — there’s going to be a whole revolution on the software side.
GamesBeat: The tool-making that will make it easier to create the content and allow more people to become creators, is that another parallel path that needs to happen?
Sweeney: Absolutely. Empowering everyone to create content is a big step here. Eliminating the technical complexity of content creation — if you can paint a painting or build a sculpture, you should be able to do the same thing in the virtual world without having to spend months learning complicated software. That’s the ultimate destination for all of this.
GamesBeat: I have a kid who’s a very good artist, but she doesn’t want to learn computer animation, computer graphics, complex math. That’s the kind of future she needs.
Sweeney: It’s all about reducing the barriers to entry. The YouTube revolution isn’t a revolution in content consumption, although there’s a huge number of content consumers. It’s about how anybody with a camera or a smartphone can create a video and share it with the whole world. The revolution around alternate reality will be around enabling anyone to become a content creator too.
GamesBeat: In 2015, I’m pretty impressed with the graphics of human faces I see in games like Batman and Call of Duty. That seems to stem from the face-capture technology you talked about a bit. Are you happy with where that is right now, the ability to motion-capture faces and turn them into something animated with less effort?
Sweeney: The state of the art now, from the game industry to Hollywood, has reached an astonishing level. The challenge right now is that the processes used to create very high quality facial animation in games and movies are very expensive. The next step is to use digital scanning techniques in mainstream cameras instead of specialized motion capture to put that within everyone’s reach.
When you’re in augmented reality having a chat with someone, you want to see their face moving as they talk. You want them represented in 3D so you can walk around them. That means it has to happen at the consumer level, not just the professional level. A lot of the effort we’re putting into developing new Unreal Engine 4 systems is in enabling this world class quality of content with greatly reduced cost and complexity.
GamesBeat: I wondered if the face capture becomes so good that some of the artistry involved isn’t necessary anymore. You do want to stylize a lot of these faces — create an orc out of someone’s scanned face, maybe. It almost seems like the face capture could supersede some of the human contribution.
Sweeney: That’s the interesting thing. Right now, to get the best quality of facial animation in a game, you bring on actors. You make sure the actors’ faces map naturally to the characters in the game. Then you capture their performance — the lines, the face, the emotions — and that goes into the game. It’s not so much a human animation problem.
Every Pixar movie, though, or something like Epic’s A Boy and His Kite cinematic — those are not photorealistic characters. You really want the artistry that only traditional animators can bring. I think there’s going to be a wide range of possibilities. In all cases we want to reduce the cost and the barriers to entry.
GamesBeat: This competition with Unity — you had an interesting explanation there. You consider the convergence on C++ to be one of the core advantages Epic has in the game engine war.
Sweeney: As games get larger, the performance of all the code in the engine becomes much more important. Also, as games get more visually realistic and complex, the ability to access all of the engine from the gameplay code becomes more important as well.
Epic started out with scripting languages in the first generation of the Unreal engine, in 1998. I wrote that. There’s a place in my heart for the simplicity of programming in a scripting language. But once you go beyond relatively small-scale games to larger experiences with more realism and more characters involved, there’s a strong benefit to C++ — its performance, its complete access to the engine and the operating system.
GamesBeat: That low-level API technology that you were talking about fits in with this strategy. That combination enables mobile devices to exploit graphics to the fullest?
Sweeney: Apple’s Metal already reduces the CPU overhead of graphics by a factor of 10. You can generally render 10 times more objects in a game in real time than you could with previous OpenGL calls, which had much higher overhead. That will all come to other smartphones in the form of the Vulkan API. It enables a revolution in game complexity, especially on mobile, when it comes to object counts and realism. To access that power, you have to have a high-performance programming language and that low-level access to the whole engine.
GamesBeat: So we have to shut down the LCD manufacturers. What else are you predicting?
Sweeney: Augmented reality will change the world more than a lot of other technologies. Traveling around to meet people will be much less important when you can stand in a room and chat with a virtual representation of a person that’s so close to reality — it’ll be a whole new level.
1950s movies always talked about video phones. Skype and other services have shown that it really isn’t all that exciting. You’re looking at a little postage-stamp view of a person animating. It’s not the same as being with them. AR will give you genuine presence, the sensation that you’re really there.
Then there’s the ability to create shared experiences. It’s not just about single-player games and single-player experiences. It’s going to be social as well. Oculus created the Toy Box demo using Unreal Engine 4, which puts you in a virtual environment with another player. You can see their head and their hands. You can play tetherball with them using the virtual rackets. You can throw objects to them and play catch and do a lot of other things. It creates an awesome social experience that hasn’t been captured in traditional multiplayer games.
GamesBeat: Will we be talking about Unreal Engine 20 at some point?
Sweeney: We’ll be evolving this engine forever. It’s gone through four major generations so far. Computers have gotten approximately 100,000 times faster in that time period, going from software rendering on the CPU to multi-teraflop GPUs nowadays. The rate of progress in the industry is astonishing. We’ve been there at every step.