
Otoy aims to enable developers to create cool VR imagery

Jules Urbach, CEO of Otoy, wearing Osterhout Design Group’s augmented reality glasses.

Image Credit: Dean Takahashi

Jules Urbach likes pretty pictures. The chief executive of cloud graphics startup Otoy likes them so much that he has dedicated his career to getting better and better 3D animated images in front of consumers. And right now, that means better virtual reality imagery.

Otoy’s Octane VR tool, available for free download for developers, can be used to create cool “light field rendering” images that are realistic and make you feel like you are part of a 3D scene. With VR, it can be used to create 18K bubble-like virtual environments that look clear. Otoy makes the software tools that enable developers to create cool imagery for a variety of platforms, including VR.


I caught up with Urbach at the Nvidia GPU Technology Conference in San Jose, Calif. He showed me some demos of light field virtual reality technology, including a demo of the 3D animated Batcave. He also showed some still images built with the Unreal Engine 4 tech from Epic Games. Those images look hyper-realistic, and you can move around inside them as if you were really there. Over time, Otoy expects the tech to be able to show you moving images too. Right now, the picture gets a little fuzzy when you move around. But that’s a problem that better processing power can solve.

Here’s an edited transcript of our conversation.

Above: Jules Urbach, CEO of Otoy

Image Credit: Otoy

GamesBeat: Tell us what you’re showing here.

Jules Urbach: This is the image John Carmack [chief technology officer at Facebook’s Oculus VR division] was talking about in his keynote at the Game Developers Conference. It’s the first test we ever did in this format. This is what he was talking about when he said it’s the highest resolution he’s ever seen in a VR device. It gives you stereo in every direction you look. This comes right out of our Octane tool.

It’s such a big difference. Until we got to this point, I think everyone was thinking that VR might work just with game engines. No. If you render it right — I grant you that Carmack is a genius and he can just come up with ideas to make this work. But I think this is what is going to make the Samsung Gear VR (mobile virtual reality goggles) super successful. The quality is there. For a casual user, you can just look at these scenes and be in them for a while. It’s almost like looking at a living photograph or painting.

Above: Otoy enables a streamed hockey game in 360-degree virtual reality.

Image Credit: Otoy

GamesBeat: Did Carmack work on this, or did he tell your guys what to do?

Urbach: No. This is Oculus’s app. It was more like a collaborative project over a long time. He gave us a development kit early on. I said, “We’re doing all this crazy stuff to get light field rendering.” He said, “Maybe there’s something simpler. If you guys can render in this panoramic format, maybe we can do something really cool with Oculus 360 photos. It’s something we can do right away.” So we did it and sent it to him. He said, “Oh my God, this is the best thing we’ve ever seen. This is something compelling.” He tweeted about it and talked about it.

Right now, the still imagery side is solved. Octane VR will just spit that out. Anyone can make these things.


GamesBeat: What do you call it? A 360 photo?

Urbach: Carmack calls it a stereo cube map, but it’s not exactly that. It’s basically just a miniaturized bubble that you—You don’t really move through the scene. The bubble effect just gives it a bit of depth, no matter where you look. It looks really good.

Carmack gave us the specification. He said, “Render it at this size, about an 18K image, and I guarantee your pixels will line up in the Gear VR.” He was right. He didn’t even see it before we did, but it turned out perfectly. We have this resolution in Octane now — “render this image for Gear VR” — and it spits out exactly the right format.
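For a sense of scale, here is a quick, back-of-the-envelope pixel count for that format. The 18,432 × 2,048 layout is an assumption based on the “18K by 2K, like 40 megapixels” figure Urbach cites later in the interview; the exact dimensions are not spelled out here.

```python
# Back-of-the-envelope pixel math for the "18K" Gear VR panorama.
# The exact layout is not spelled out in the interview; 18432 x 2048 is an
# assumption taken from the "18K by 2K, like 40 megapixels" figure cited later.

WIDTH_PX, HEIGHT_PX = 18_432, 2_048           # assumed "18K by 2K" layout

megapixels = WIDTH_PX * HEIGHT_PX / 1e6
print(f"One panoramic frame: {megapixels:.1f} megapixels")   # ~37.7 MP
```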

What’s cool is, since that’s happened, we’re seeing a lot of people get super excited about this kind of pipeline. The Keloid stuff I showed yesterday — my buddy in Barcelona has been working on this short. It’s an insane piece of work. It’s really realistic. But we’re taking shots from this and rendering a shot in VR right now. The guy who created all this — and I think this is going to be true of any computer-generated imagery artist — said, “I’ve been working on this thing for 18 months. I realize now that I’ve been filming in black and white when I should have been making color TV.” You see details you could never see in a movie.


Above: Octane VR renders VR still images that look real.

Image Credit: Otoy

GamesBeat: So the difficult thing is, when it moves like this it fogs up? You’re not creating it for motion, right?

Urbach: No, not yet. Here’s the thing. We haven’t found a person who wants to show the animation work we’re doing with them. But that’s probably going to be the first commercial project that comes out with this format — animation with interactive narrative experiences ready to go.

GamesBeat: What’s the limit of what you can do now? Is crossing into motion really difficult?

Urbach: No, it’s actually very easy. The only issue we have in Gear VR is that this is a huge format. Even if we made it stream video, it’s still an issue for a computer to play it back. It’s so much more data to play back a movie at 4K, let alone 18K. But it’s still essentially just imagery. There’s nothing super special about this.
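To make the playback problem concrete, here is a rough comparison of raw, uncompressed frame data at 4K versus the assumed 18K layout. Real players compress heavily, so these numbers are upper bounds rather than what Otoy actually ships.

```python
# Rough, uncompressed data-rate comparison: 4K UHD vs. the assumed 18K
# panorama layout. Real pipelines compress heavily; this only shows why
# the raw data problem is so much worse at 18K.

BYTES_PER_PIXEL = 3          # 8-bit RGB, no compression (assumption)
FPS = 30                     # illustrative frame rate (assumption)

def raw_rate_gbps(width, height, fps=FPS, bpp=BYTES_PER_PIXEL):
    """Raw bits per second, in gigabits, for uncompressed frames."""
    return width * height * bpp * 8 * fps / 1e9

print(f"4K UHD  : {raw_rate_gbps(3_840, 2_160):.1f} Gbit/s raw")   # ~6.0
print(f"18K pano: {raw_rate_gbps(18_432, 2_048):.1f} Gbit/s raw")  # ~27.2
```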


If you only pull up the part of the video you’re looking at, though, which is the trick we’re doing, you can get the same effect in video. That’s what we’re working on. By the time we launch the Warner Bros. thing, we’ll have it figured out.
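That “only pull up the part you’re looking at” idea can be sketched as tile-based, view-dependent streaming. The tiling scheme and numbers below are illustrative assumptions, not Otoy’s implementation.

```python
import math

# Illustrative sketch of view-dependent streaming: split the panorama into
# horizontal tiles and request only the tiles that overlap the current field
# of view. The tile count and field of view are assumptions, not Otoy's code.

NUM_TILES = 16        # tiles around the full 360 degrees (assumption)
FOV_DEGREES = 96      # headset horizontal field of view (assumption)

def visible_tiles(yaw_degrees):
    """Return indices of the tiles that overlap the viewer's field of view."""
    tile_width = 360 / NUM_TILES
    half_fov = FOV_DEGREES / 2
    first = math.floor((yaw_degrees - half_fov) / tile_width)
    last = math.floor((yaw_degrees + half_fov) / tile_width)
    return [i % NUM_TILES for i in range(first, last + 1)]

# Looking straight ahead (yaw 0), only 6 of the 16 tiles need to be streamed.
print(visible_tiles(0.0))     # [13, 14, 15, 0, 1, 2]
```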

Above: Explore the Batcave with holographic video

Image Credit: Otoy

I can show you an example of how things look animated. Do you remember the Batman cartoon from the ‘90s? Here’s the intro done completely in this format. We haven’t announced that that’s going to be in it, but if you hit the side button it’ll start playing. It looks exactly like the original TV show, but now it’s in VR.

All the kids who grew up watching this same intro, a lot of us are adults now. I think it’s important to show a translation of something you watched on TV into this experience. It’s easy for us to keep making this kind of content. We can do animated episodes. As we get into doing the more realistic stuff, it’s still rendered in Octane. That’s where everything we’ve been doing with Light Stage for years comes in. We can take humans, actors, and put them in these scenes and render it this way.

That shot you saw, in this format, you can move through it like a light field or hologram. It’s built into the rendering pipeline. If you want to support Magic Leap or HoloLens and you want it to be this size of space, instead of rendering to this or to video, you get it automatically sent as a light field. It’s all handled on the cloud. Depending on the device that says, “I want to watch this video,” you pull down the right version. It’s a simple approach to giving this ultimate cinematic, holographic experience, no matter what the device is.
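The “pull down the right version” step amounts to per-device format selection in the cloud. A minimal sketch follows; the device names and format labels are hypothetical, since Otoy has not published this interface.

```python
# Minimal sketch of "pull down the right version" per device. Device names
# and format labels are hypothetical placeholders, not Otoy's actual API.

FORMAT_BY_DEVICE = {
    "gear_vr":    "stereo_panorama_18k",   # the still/video "bubble" format
    "hololens":   "light_field",           # positional, holographic viewing
    "magic_leap": "light_field",
    "phone":      "mono_panorama",         # fallback for flat screens
}

def pick_format(device):
    """Choose which cloud-rendered asset a client should download."""
    return FORMAT_BY_DEVICE.get(device, "mono_panorama")

print(pick_format("hololens"))   # light_field
print(pick_format("gear_vr"))    # stereo_panorama_18k
```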


GamesBeat: It looks viable. Why are you giving Octane VR away?

Urbach: I think people are going to start rendering images. Then they’ll want to make a movie, and then they’ll want this light field. That’s not something you can do quickly on your own laptop. You’re going to go to the cloud service. You can design and build the whole scene in Octane VR for free—I mean, if we start losing tons of money after 90 days we’ll pull it. But I don’t think we will.

When we made Octane $49, we had 5,000 customers in one day, which is insane. I always thought that we’d try this again, but I wanted to do it around something that’s compelling. This is about as good as it’s going to get. My sense is that because it’s 18K, even if Octane is super fast and you can render one frame on your laptop, when you’re rendering 24 frames a second, or 60 or 120, the cloud service is going to be useful, and for that we’ll charge money.

Above: Jules Urbach, CEO of Otoy

Image Credit: Otoy

GamesBeat: It’s sort of like a free version and a more powerful version.


Urbach: Yeah. The more powerful version is in the cloud. I think this whole space is going to be so big, especially for cinematic content. It’s not like the gaming content where you get motion sickness. This is a pretty obvious sell. Everyone from Warner Bros. to the NHL to individual artists, they want to have their stuff experienced in a better way. People make movies in IMAX because it fills more of your field of vision, because you really feel like you’re there. This does all that too. It gives that to the same space of users with mobile devices.

Carmack is focused on mobile VR. I am too. Companies like Magic Leap, and even Microsoft, these things are all really mobile devices. HoloLens is totally self-contained. This kind of effect, where you can look through this holographic window, is really interesting when you have AR tracking.

These glasses are prototype Osterhout Design Group glasses. You may have seen them at CES. If you think of this thing as truly a window into a world–Imagine if you have an iPad where you’re watching a movie on it. Your screen can pull in a light field that would then be projected.

Here’s another demo. This is the Unreal Engine running on Amazon, but you can see that I can control where the viewpoint is. If it’s a light field, which we’re just about to hook up, in that Keloid trailer or the Batman thing, you’ll be looking at it as a hologram. We’re waiting on this and these other things to all converge. It’s going to be pretty awesome.

This is running in the cloud. The stuff you’ve written about before is this portion. We can load in anything in the cloud – Unreal Engine 4, our own light field stuff. The thing is, we may not need Unreal for these kinds of experiences. With the Octane 3 experience we announced yesterday, you’re going to be able to bake stuff into that. But the fidelity we get with a light field render is better than anything else. Either way, it’s the same principle. We render it in the cloud. It’s instantaneous. When I loaded the app, it connected and pulled down this demo. Because this is just a phone, we use the camera to do the tracking. That gives you some sort of position tracking, even on mobile.

Above: Octane VR image

Image Credit: Otoy

GamesBeat: Where are you going with this?

Urbach: The next step is, we want to get this and our own viewer app to synchronize, so that when you’re in VR you can still plot something on there. It gives you both vectors of tracking. You can move the viewpoint and then also move around as you’re wearing the glasses. We’re trying that with the ODG things. It’s getting there. It’s a little wonky. This is just a prototype. But I think that HoloLens and Magic Leap won’t need markers.

The thing is, you’re wearing those glasses, and you also want this thing here, or you want it in the palm of your hand. Imagine the hockey demo, or a UFC match is a good example. You’re watching a UFC match in VR mode. You can’t see the room around you. But with HoloLens or Magic Leap, you have it in the palm of your hand. If you want to go to the fridge or something, you can watch the game while you’re doing that. That kind of stuff is insanely cool.

That’s the future of live events. For cinematic stuff, though, VR is perfect. You don’t want to see anything else. You want to get the story. You want to immerse yourself in it. That’s why cinematic VR, even on mobile, is so compelling. We’ve hit a sweet spot in this collaboration with getting our renders into Gear VR this way. There’s nothing that’s going to get you close to that other than this process.

I was talking to some guys from Pixar at the booth. They said, “We want to use this format, but we don’t want to use Octane.” I said, “All right. You want to use RenderMan. How long does it take you to render one megapixel? We need to render 18K by 2K, like 40 megapixels.” That’s about 100 hours for them. You’re not going to get into this thing unless you switch to a faster renderer.
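The “about 100 hours” figure follows from simple arithmetic on the frame size and an implied per-megapixel render time. The rate below is inferred from the quote itself, not a published RenderMan benchmark.

```python
# Sanity check of the "about 100 hours" figure. The per-megapixel render time
# is inferred from the quote itself, not a published RenderMan benchmark.

frame_megapixels = 18_432 * 2_048 / 1e6   # "18K by 2K, like 40 megapixels"
hours_per_megapixel = 2.5                 # implied by ~100 h / ~40 MP (assumption)

total_hours = frame_megapixels * hours_per_megapixel
print(f"Frame size : {frame_megapixels:.1f} megapixels")   # ~37.7
print(f"Render time: {total_hours:.0f} hours per frame")   # ~94, i.e. "about 100"
```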

That’s one of the reasons why we have it dual-purpose. We have to get Octane to be adopted. That’s one reason we went all-out with all the features. There’s nothing in Octane 3 that would make somebody not want to switch. As people see this is an emerging medium, we’ll have success getting them to go over to a faster renderer, more than we ever did just doing faster rendering of films.

Above: Batmobile

Image Credit: Otoy/Warner

GamesBeat: The Batman assets, that’s a 2D film. How does a company go about saying, “Hey, we want to make this 3D”?

Urbach: You definitely don’t want to start with older things that are drawn in 2D. Batman was a 2D cartoon, but we rebuilt it all in Octane as a 3D scene, as if it were something re-rendered in full CG. We just made it look like it was drawn by the original guys. That’s why you look at the intro and think, “Oh, this is a really cool 2D version.” It’s actually not.

If you want to have a look, this is the inside of the Batcave. It looks like it’s hand-drawn, but it actually has this immersive feeling going on, which is pretty cool. There are so many fans of this TV show. Warner Bros., for whatever reason, stopped the cartoon in 2004. I was like, “This is the only version of Batman I care about!” Finally, because we lucked into this project — Warner Bros. said, “We’ll bring that version of Batman back in VR.” We went from 300 Twitter followers to 30,000, just because of that announcement. People were writing us emails. “You better have the voice of Mark Hamill!” Because he said he’d never do the voice of the Joker again.

My role in a lot of our work right now is evangelizing this stuff. Sometimes it’s paid off in these crazy ways. In some respects — we wouldn’t have done the NHL thing if I hadn’t been pushing on this. “Look how amazing this is. Let’s get this implemented and make it happen for tennis, hockey, and UFC.”

Above: Otoy 3D art

Image Credit: Otoy

GamesBeat: Your tools are useful beyond VR?

Urbach: There’s a bigger story than just the VR side of things. You see the pieces of that with these various different players in the mix. Microsoft’s HoloLens is not just about the media or the games. It’s also about the apps. They’re showing things like Skype and 3D modeling apps and Minecraft. I know Google is building a team called Android VR. But they’re all looking at a bigger thing called immersive computing, which is really just computing without being limited to a square that’s physically in front of you.

To me, that is the big step up from mobile. First we went from PCs to mobile devices. The next thing isn’t really just wearables, and it isn’t just VR. Those things are very narrow parts of something where there is absolutely no barrier between what you’re seeing in front of you and your eyeballs. All these devices are attacking this problem from different angles. But the operating system of the future is going to be something like that. It’s not going to be Google Glass or just Oculus, necessarily. That’s why there are different takes on it.

The guys that are taking it most seriously are Microsoft, with HoloLens, and Magic Leap. Magic Leap has been very clear. They’re not just going to create experiences. Apps are going to be written for it. I think they’re right. Whoever gets there first and gets developer mindshare, like Apple, will have an insurmountable lead. A thousand developers started with iOS and stuck with it, and they do their stuff right.

Microsoft or Facebook — any of them could win. But the guys who are focused on the AR space are going to get furthest first. It’s much harder to do, and it’s much easier for me to imagine turning this device into that, just by blocking out the lenses here. I thought maybe Magic Leap would get there first, but I don’t think they have that working yet. Somebody will. You’ll have one device for both, and then it’s going to get cool.

Above: Otoy 3D art

Image Credit: Otoy
