Above: Octane VR renders VR still images that look real.

Image Credit: Otoy

GamesBeat: So the difficult thing is, when it moves like this it fogs up? You’re not creating it for motion, right?

Urbach: No, not yet. Here’s the thing. We haven’t yet found a partner who wants to show the animation work we’re doing with them. But that’s probably going to be the first commercial project that comes out in this format: animation with interactive narrative experiences ready to go.

GamesBeat: What’s the limit of what you can do now? Is crossing into motion really difficult?

Urbach: No, it’s actually very easy. The only issue we have on Gear VR is that this is a huge format. Even if we set it up to stream video, it’s still hard for a computer to play it back. It’s so much more data to play back a movie at 4K, let alone 18K. But it’s still essentially just imagery. There’s nothing super special about this.

If you only pull up the part of the video you’re looking at, though, which is the trick we’re doing, you can get the same effect in video. That’s what we’re working on. By the time we launch the Warner Bros. thing, we’ll have it figured out.
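If 18K means roughly 18,000 pixels across, an 18K panorama carries on the order of (18/4)², or about 20 times, the pixels of a 4K frame, which is why pulling down only the visible region matters. Purely as an illustration of that viewport-dependent streaming idea (this is not Otoy’s actual pipeline, and every name below is hypothetical), a tile-based fetch might look roughly like this, assuming the panorama is pre-cut into fixed-angle tiles:

    TILE_DEG = 30    # assumed tile size in degrees (illustrative)
    FOV_DEG = 96     # assumed headset field of view (illustrative)

    def visible_tiles(yaw_deg, pitch_deg):
        """Tile indices (yaw, pitch) covered by the current gaze direction."""
        half = FOV_DEG / 2
        tiles = set()
        for d_yaw in range(-int(half), int(half) + 1, TILE_DEG):
            for d_pitch in range(-int(half), int(half) + 1, TILE_DEG):
                yaw = (yaw_deg + d_yaw) % 360
                pitch = max(-90.0, min(90.0, pitch_deg + d_pitch))
                yaw_idx = int(yaw // TILE_DEG)                       # 0..11
                pitch_idx = min(5, int((pitch + 90) // TILE_DEG))    # 0..5
                tiles.add((yaw_idx, pitch_idx))
        return tiles

    def fetch_visible(frame_no, yaw_deg, pitch_deg, fetch_tile):
        """Pull only the tiles the viewer can currently see for this frame.
        fetch_tile is a hypothetical callback that downloads one tile."""
        return {t: fetch_tile(frame_no, t) for t in visible_tiles(yaw_deg, pitch_deg)}

With 30-degree tiles there are 12 x 6 = 72 tiles covering the full sphere, but a 96-degree field of view only touches a fraction of them (around 16) on any given frame, which is where the bandwidth saving of “only pull up the part you’re looking at” comes from.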

Above: Explore the Batcave with holographic video

Image Credit: Otoy

I can show you an example of how things look animated. Do you remember the Batman cartoon from the ‘90s? Here’s the intro done completely in this format. We haven’t announced that that’s going to be in it, but if you hit the side button it’ll start playing. It looks exactly like the original TV show, but now it’s in VR.

All the kids who grew up watching this same intro, a lot of us are adults now. I think it’s important to show a translation of something you watched on TV into this experience. It’s easy for us to keep making this kind of content. We can do animated episodes. As we get into doing the more realistic stuff, it’s still rendered in Octane. That’s where everything we’ve been doing with Light Stage for years comes in. We can take humans, actors, and put them in these scenes and render it this way.

That shot you saw, in this format, you can move through it like a light field or hologram. It’s built into the rendering pipeline. If you want to support Magic Leap or HoloLens and you want it to fill this size of space, instead of rendering to this or to video, you get it automatically sent as a light field. It’s all handled on the cloud. Depending on the device that says, “I want to watch this video,” you pull down the right version. It’s a simple approach to giving this ultimate cinematic, holographic experience, no matter what the device is.
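As a minimal sketch of that “pull down the right version” step, assuming a hypothetical cloud endpoint that maps each device class to a pre-rendered representation (the device names and format names here are illustrative, not Otoy’s API):

    # Hypothetical output formats; the real service's names will differ.
    STEREO_CUBEMAP = "stereo_cubemap_18k"   # orientation-only HMDs, e.g. Gear VR
    LIGHT_FIELD    = "light_field_volume"   # positional viewing, e.g. HoloLens, Magic Leap
    FLAT_VIDEO     = "flat_video_4k"        # ordinary screens

    def pick_representation(device: str) -> str:
        """Choose which cloud-rendered version of the scene to stream to a device."""
        d = device.lower()
        if d in {"hololens", "magic_leap"}:   # head position matters -> light field
            return LIGHT_FIELD
        if d in {"gear_vr", "cardboard"}:     # head orientation only -> stereo cube map
            return STEREO_CUBEMAP
        return FLAT_VIDEO                     # fall back to ordinary video

The design point this tries to capture is that the scene is rendered once on the cloud side, and the choice of representation is deferred to the moment a particular device asks to watch it.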

GamesBeat: It looks viable. Why are you giving Octane VR away?

Urbach: I think people are going to start rendering images. Then they’ll want to make a movie, and then they’ll want this light field. That’s not something you can do quickly on your own laptop. You’re going to go to the cloud service. You can design and build the whole scene in Octane VR for free—I mean, if we start losing tons of money after 90 days we’ll pull it. But I don’t think we will.

When we made Octane $49, we had 5,000 customers in one day, which is insane. I always thought that we’d try this again, but I wanted to do it around something that’s compelling. This is about as good as it’s going to get. My sense is that because it’s 18K, even if Octane is super fast and you can render one frame on your laptop, when you’re rendering 24 frames a second, or 60 or 120, the cloud service is going to be useful, and that’s what we’ll charge money for.
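To put rough numbers on that claim (the per-frame render time below is an assumed figure, purely for illustration):

    # Illustrative only: assume a laptop needs about 10 minutes per 18K frame.
    MINUTES_PER_FRAME = 10
    FPS = 24
    CLIP_SECONDS = 120                      # a two-minute animated clip

    frames = FPS * CLIP_SECONDS             # 2,880 frames
    laptop_hours = frames * MINUTES_PER_FRAME / 60
    print(frames, laptop_hours)             # 2880 frames, 480 hours (20 days)

Even under generous assumptions, a single still is laptop territory, while animation at 24, 60, or 120 frames per second quickly becomes a job for the paid cloud service.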