
Otoy takes 3D graphics and virtual reality app development into the cloud

Otoy 3D animated art

Image Credit: Otoy

Otoy makes tools that artists can use to create stunning 3D art that looks as real as anything captured on film. The company recently launched its X.IO App Streaming service so that developers can create cloud graphic services such as next-generation cloud games, streaming virtual reality media, and workstation applications that can run on low-end devices.

The service is the latest cloud-based innovation from Los Angeles-based Otoy, which has also created cloud-based tools for filmmakers via Octane Render and for game makers with its Brigade tools. Those tools enable artists to create photorealistic images for games or movies using cloud-based computing resources, said Jules Urbach, Otoy's chief executive, in an interview with GamesBeat.

[aditude-amp id="flyingcarpet" targeting='{"env":"staging","page_type":"article","post_id":1603328,"post_type":"story","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"business,games,media,","session":"D"}']

Otoy has funding from Russian investor Yuri Milner, former Morgan Stanley boss John Mack, and Autodesk. Its advisers include Google chairman Eric Schmidt, talent agent Ari Emanuel, former IBM CEO Sam Palmisano, and former IBM exec Irving Wladawsky-Berger.

Here's an edited transcript of our interview.


Above: Jules Urbach, the CEO of Otoy

Image Credit: Dean Takahashi

GamesBeat: Tell us what you’ve been doing.

Jules Urbach: X.IO has launched. That means developers can upload any application in a zip file. It could be a game. It could be a Windows app. It’s a little bit like YouTube. You just get a link back that allows you to play back the stream from a URL. There’s no App Store here. This is just running Unreal Engine 4 from my iPhone, and it works.
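
Urbach doesn't walk through the developer-facing mechanics here, but a publish flow like the one he describes, upload a zip, get back a link, could look something like the sketch below. The endpoint, field names, and response shape are illustrative assumptions, not Otoy's documented API.

```python
# Hypothetical sketch of the X.IO publish flow described above:
# upload a zip of your app, get back a streaming URL to share or
# embed, YouTube-style. Endpoint and response fields are assumptions.
import requests

def publish_app(zip_path: str, api_key: str) -> str:
    """Upload an app bundle and return the playback URL."""
    with open(zip_path, "rb") as f:
        resp = requests.post(
            "https://x.io/api/apps",             # assumed endpoint
            headers={"Authorization": f"Bearer {api_key}"},
            files={"bundle": f},
        )
    resp.raise_for_status()
    return resp.json()["stream_url"]             # assumed response field

# url = publish_app("my_ue4_game.zip", "YOUR_API_KEY")
# print(url)  # anyone with this link can play back the stream
```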

We’ve been doing this for a while, getting it to run in the browser in pure JavaScript. You now have no barrier to entry for deploying high-end content. You can use one graphics processing unit (GPU) in the cloud, four GPUs, scale way beyond anything that consoles or PCs have. A lot of the promise of gaming is finally realized.

Going forward, we’re going to port to devices that are different from just a 2D screen – things like Gear VR, the Samsung device, which is launching shortly. You have things like Project Tango, Google’s play on VR, where you can take your tablet and move it through space. We’re working toward streaming not just a video of what’s happening but almost a hologram. We announced this technology at [graphics conference] Siggraph. The idea is that when you have a holographic video stream, it allows you to look around, like through a window. You have a portal into that world. It’s super low latency.

This is something we’re doing specifically for VR streaming. You can have the ability, whether it’s with Oculus’s Development Kit 2 (DK2) or the Samsung Gear, to connect with this perfectly rendered virtual world and get a stream. It solves a huge number of problems, especially with Gear VR, where you have very low rendering power and you don’t have a lot of storage. That’s the plan. We’ll migrate from streaming existing 2D applications in the browser to doing full VR immersive streams.

Then you have some of these interesting new device categories, like Magic Leap, which is working on AR. The closer you get to the wearable glasses or contact lenses, the more you need the cloud to stream this stuff. That’s what we’re working toward. A big part of that is the developer backend we now have, and improvements in the codec that allow you to stream things like depth and multiple layers to give this holographic effect.

[aditude-amp id="medium1" targeting='{"env":"staging","page_type":"article","post_id":1603328,"post_type":"story","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"business,games,media,","session":"D"}']

GamesBeat: I wonder about virtual reality. When you’re streaming VR to something, do you have to have everything there, everything sent to you? If you’re looking this way on a screen, it can just deliver what is visible in that direction. You don’t have to render what’s over here or there.

Urbach: We created a solution to get around that. This is an example of rendering on Octane, where I'm rendering everything. This is the entire 360. With ray tracing, it becomes pretty easy. This is one of the things we're running on Amazon. You don't have to render one view. You're rendering everything that's around you, so the latency doesn't matter, because as you look around, it's all there.

It’s hard to do that with traditional rasterization, but with Octane and Brigade, we can render this stuff in 360 with multiple layers. That makes the whole effect in VR work much better. Even without that, we’re able to stream — this is streaming on the Samsung Note 4. This is already the speed we’re getting – better than the browser – with our native VR app. It’s pretty low latency and 120 frames per second.
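
To make the latency argument concrete: once the full 360 panorama is on the device, a head turn is just a local lookup into that image rather than a server round trip. Here's a minimal sketch of the equirectangular mapping involved; the layout (yaw 0 at the left edge, full 360-by-180 coverage) is an illustrative assumption, not Otoy's actual format.

```python
# Why a prerendered 360 frame hides network latency: turning your
# head becomes a local pixel lookup into the panorama, no new data
# needed from the server.
def direction_to_pixel(yaw_deg: float, pitch_deg: float,
                       width: int, height: int) -> tuple[int, int]:
    """Map a view direction to (x, y) in an equirectangular panorama."""
    u = (yaw_deg % 360.0) / 360.0          # 0..1 across the full turn
    v = (90.0 - pitch_deg) / 180.0         # 0 at zenith, 1 at nadir
    return int(u * (width - 1)), int(v * (height - 1))

# Looking 45 degrees right and 10 degrees up in a 4096x2048 panorama:
print(direction_to_pixel(45.0, 10.0, 4096, 2048))  # -> (511, 909)
```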

[aditude-amp id="medium2" targeting='{"env":"staging","page_type":"article","post_id":1603328,"post_type":"story","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"business,games,media,","session":"D"}']

We have two modes. We have the ability to stream what's in the view at 120Hz, and we also have what you were seeing before, where we use a second stream to send the entire ray-traced panorama as well. If you look around or the connection drops, you still don't see any missing pieces. That's the plan for VR. As we go forward, we can also send out more complex information that allows you, with one chunk, to move around and see without having to re-download any new information from the server.
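
A client following the two-mode scheme he describes might arbitrate between the streams roughly as below. The frame type, decoder, and staleness threshold are hypothetical stand-ins, not Otoy's codec.

```python
# Sketch of the dual-stream idea: prefer the fresh 120 Hz viewport
# frame; if it's late or the link drops, fall back to reprojecting
# the last full ray-traced panorama so nothing goes black.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Frame:
    pixels: bytes
    age_ms: float

def pick_frame(viewport: Optional[Frame], panorama: Optional[Frame],
               max_age_ms: float = 16.0) -> Optional[Frame]:
    """Return the freshest usable frame for the current head pose."""
    if viewport is not None and viewport.age_ms <= max_age_ms:
        return viewport                 # normal path: low-latency stream
    return panorama                     # fallback: sample the 360 frame
```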

First step is just getting the 360 parts down. Step two is rendering the layers behind that so you can navigate through the scene to a certain point. All of those things are already part of the codec that we’ve built.

In the Gear VR, this is what you see in your viewport as we're streaming it down, from Amazon and X.IO. If you look around too quickly, you'll see a black edge, but I'm over LTE and there's no black edge. It connects right into the time warping that Oculus's John Carmack created, which re-projects the view very quickly so you don't get nausea. We use that to send the server predictive information so it can send down the next image based on where you're going to look. We never had that in traditional cloud gaming. It helps.
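
At its simplest, the predictive piece reduces to dead-reckoning the head pose one network round trip ahead and requesting that view. A constant-angular-velocity sketch, which is my assumption rather than Otoy's actual prediction model:

```python
# Extrapolate head yaw by angular velocity so the server can render
# the view the user is about to need, complementing client-side
# time warp. Constant-velocity prediction is an illustrative choice.
def predict_yaw(yaw_deg: float, yaw_velocity_dps: float,
                round_trip_ms: float) -> float:
    """Predict where the user will be looking one round trip from now."""
    return yaw_deg + yaw_velocity_dps * (round_trip_ms / 1000.0)

# Turning at 120 deg/s with a 50 ms round trip: request the frame
# for about 6 degrees ahead of the current gaze.
print(predict_yaw(30.0, 120.0, 50.0))  # -> 36.0
```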

Above: Otoy 3D art

Image Credit: Otoy

GamesBeat: I wonder what the quality of Samsung's Gear VR is like.

[aditude-amp id="medium3" targeting='{"env":"staging","page_type":"article","post_id":1603328,"post_type":"story","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"business,games,media,","session":"D"}']

Urbach: It's way better than the desktop VR. It's a higher-res screen. This is the device. It's 2560 by 1440. DK2 is 1920 by 1080, so this is nearly double the number of pixels. It's a higher-quality screen. My assumption is that when Oculus launches the final version of a consumer desktop system, it'll be this quality or better. But the better VR experience is on mobile. Mobile is the bleeding edge.
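
For reference, the raw pixel counts behind those two resolutions; the comparison works out to about 1.8x rather than a strict doubling:

```python
# Pixel counts for the two panels quoted above.
gear_vr = 2560 * 1440   # 3,686,400 pixels
dk2     = 1920 * 1080   # 2,073,600 pixels
print(gear_vr / dk2)    # -> 1.777..., roughly 78% more pixels
```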

We’re working toward getting mobile VR to be a replacement for desktop VR. I think Carmack believes in that. He devoted his last nine months to getting Gear VR to work. That’s his big thing. We’re the software complement to that. We want to make that happen. The cloud and streaming and all these other tricks we’re doing with panoramic streaming and depth streaming and layers are the ways to make it work. He’s been very supportive. He got us in the development program for Gear VR very early on.

Our business model with X.IO is pretty straightforward. Normally at Amazon, you need to buy a server for an hour. At scale, we’ve figured out how to slice that into per-minute costs. We announced our pricing today. If you want a quick one-minute game experience on Gear VR, we can deliver that for five cents. It’s five cents a minute. Costs will go down as we get more users.
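
The economics he sketches come down to slicing an hourly instance into sellable minutes. In the sketch below, only the five-cent rate comes from the interview; the hourly GPU cost is an assumed placeholder, not a quoted Amazon figure.

```python
# Per-minute slicing of an hourly GPU server, using X.IO's announced
# five-cent rate. hourly_server_cost is an illustrative assumption.
hourly_server_cost = 1.50          # assumed $/hour for a GPU instance
cost_per_minute    = hourly_server_cost / 60
price_per_minute   = 0.05          # announced X.IO rate, $/minute

sessions_per_hour  = 60            # one-minute sessions, fully packed
revenue_per_hour   = sessions_per_hour * price_per_minute
print(f"cost/min ${cost_per_minute:.4f}, revenue/hour ${revenue_per_hour:.2f}")
```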

Above: Otoy 3D art by Hugues Giboire

Image Credit: Otoy

Urbach: It's exciting to get these pieces in place. It's good timing, with Gear VR launching soon and Project Tango launching soon. There's a demand for high-end graphics in the cloud in a way that we never had before with cloud gaming. You had a pretty good experience on your PC or console. With VR, it's a completely different world. And it's not just VR. The way we're looking at it, there are lower-latency experiences you're going to need for things like Project Tango and AR and other things that make all this much more interesting and compelling.

[aditude-amp id="medium4" targeting='{"env":"staging","page_type":"article","post_id":1603328,"post_type":"story","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"business,games,media,","session":"D"}']

It’s thanks to Amazon that we’ve stepped over some of the pitfalls that OnLive hit, where they invested in all this infrastructure and GPUs. We had Amazon willing to do that for us. With X.IO we can launch worldwide and do permanent pricing and scale it as high as we want. A big part of what we’re doing is also supporting existing partners like Oculus, who invested in us to deliver apps for the cloud. We’re building this backend for them. We’re mixing games, app streams, and higher-quality GPU ray tracing all together. The more customers and usage we get, the cheaper the whole service becomes.

We have our very first partner on the VR side that we announced last month, which is Warner Bros. We’re doing the Batman animated series rendered in the cloud with the service for Gear VR.

GamesBeat: When you say that the more people you get, the cheaper the service becomes, does that mean Amazon is giving you a better rate?

Urbach: Yes. Right now we're paying on-demand pricing. We just launched, so we don't know how many customers we'll have. We'll get a good idea eventually. But if we get 100 percent usage between render jobs, we can lower the price by half. If we have less usage than we're expecting, we'll probably lose money, but I don't think that will be the case, given how many customers we have on Octane who can use the service, as well as how many Gear VR experiences we expect to stream. The tricky part is getting permanent Amazon costing in place – being able to move from one session to the next and do that very quickly.
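
The "lower the price by half" remark follows directly from utilization: an hourly server bills for 60 minutes whether or not every minute is sold. A toy model, again with an assumed hourly cost:

```python
# Cost per *sold* minute as a function of utilization. At 50% usage
# each sold minute carries two minutes of server time; at 100% the
# cost halves, matching the remark above. Numbers are illustrative.
def cost_per_sold_minute(hourly_cost: float, utilization: float) -> float:
    return (hourly_cost / 60) / utilization

print(cost_per_sold_minute(1.50, 0.5))   # -> 0.05  ($/minute at 50% usage)
print(cost_per_sold_minute(1.50, 1.0))   # -> 0.025 (halved at full usage)
```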

[aditude-amp id="medium5" targeting='{"env":"staging","page_type":"article","post_id":1603328,"post_type":"story","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"business,games,media,","session":"D"}']

We’ve also added storage options. You can pull in files from Dropbox or OneDrive or any of these other third-party storage services. If you’re using things like Photoshop or Autodesk, all of that will be supported. The big piece is moving toward light field streaming. That’s the next frontier in rendering and cloud graphics.

Above: Otoy 3D art

Image Credit: Otoy

GamesBeat: What are some things that will be possible when all this stuff is in place?

Urbach: The first thing that'll happen is, the case for doing ray-traced graphics, whether it's mobile or the web or even PC, becomes much more tenable. You can have a huge amount of graphics power in the cloud for pennies per minute and deliver those experiences. They don't have to be long, 60-minute experiences. I imagine that for VR on mobile, a 10-minute high-quality experience is the equivalent of going to the arcade and putting 50 cents in something that's better than what you could get on your home console.

The second thing is, for virtual reality cinema, where we have huge amounts of rendered frames – that's what we're doing for Warner Bros. – we can have terabytes of light field data that's stored in the cloud and rendered in the cloud and delivered in this compact panoramic view. You can deliver experiences that look photo-real, that don't cost more to render, and that are essentially linear.

As far as the apps go, the entire reason why Autodesk invested and wants to move apps to the cloud isn’t just getting rid of having to download the app to your PC. It’s that once you have Max or Maya in the cloud, you can tap into 30 GPUs and get this high-end rendering to happen in real time. That changes the entire economy of scale well beyond what you could do by just place-shifting your apps.

Those are the main areas that I see being disrupted by having cheap on-demand access. For developers, it changes the way that you can think of delivering apps. Being able to stream something to a browser is the ultimate goal that we've had. There's no store that stops you from doing it. Apple's is one of the most complex app stores there is, one of the most demanding. You'll be able to deliver Unreal Engine 4 content for anything that you can imagine.

GamesBeat: I visited the Second Life guys recently. They’re working with OnLive to get Second Life viewable on an iPad for a fee – a pretty hefty fee, if I remember. But they’re also toying with Oculus quite a bit. Whatever they’re doing with their next generation, they’re doing that in VR.

Urbach: It’s compelling to stream it if you can deliver way better graphics. Second Life is designed to run on your home computer, so putting it in the cloud, to me, isn’t that compelling. You have mobile devices that can probably render Second Life just fine. You need to go to a new level of rendering quality. That’s where being able to do the ray trace stuff in the cloud makes a lot of sense.

There are lots of ways to make this model more efficient. If you create your room in Second Life and then turn that into a light field, you don’t need to re-render. You can navigate through that light field very inexpensively. That becomes a pretty compelling experience. That’s where I think the future is headed for these kinds of virtual worlds.

I don’t think anyone else is doing anything like this. We’ve been working closely with a lot of people who are connected in these spaces – Google, Oculus. We’re pioneering at this level. We’re focused on developers and content creators more than creating a portal, especially given that the end result of our service is that you get a link you can embed in your own app or web page and do what you want with. We don’t get between you and the customer. That was a mistake that OnLive made early on that caused a lot of friction. We make money just by taking a piece of the Amazon cost. It’s a cost-plus model.

At some point there’s a turnover where, even if you manage to replicate everything we did with Amazon, on-demand pricing will be more expensive than what we’re doing at scale. We can go below that. That’s a pretty good model. It’s a win-win.

Above: Otoy 3D art by Niuqcam

Image Credit: Otoy

GamesBeat: The timing is interesting. How fast is some of this stuff coming?

Urbach: The launch of X.IO is now. People can start publishing apps today. It’s available almost worldwide. South America isn’t covered. Pretty much everywhere else in the world is. The pieces that go beyond the traditional stuff — if you want to push your Unreal Engine 4 game, all that works now, even with a joystick. Going beyond that, when we support VR or light field stuff, that’s probably a few months down the road.

We’re timing that next generation of our service to coincide with the launch of Gear VR. Probably before the end of the year, you’ll have the ability to stream in VR mode. I don’t think the DK2 is our target. The first consumer device is going to be the Samsung Gear VR. That’s the first piece of VR that you can really buy.

The other option is Google Cardboard, which seems to be getting more excitement from Google's end. I think it's not a joke. You can have a real casual experience buying something at Barnes & Noble for a dollar and putting it over your phone. We're going to support that as well. We have an Android app that will run in VR mode for the Gear VR, but it'll also support Google Cardboard. There are like five people making adapters for iPhones, so it'll do VR mode even on that. But the best experience for VR, the special one, will be Gear VR. That's where it's so low-latency that you get a better experience than even on Oculus. I imagine some interesting experiences will be possible with Project Tango, which is not quite VR, but as you move your tablet, you move your view through the scene and get to slide into this virtual world.

Even without VR, one texture from a holographic stream means you just look at it like you’re looking through a window. This is using the DK2 tracking. I put the goggles on my head and had my iPhone next to me. This is to show the effect. Inside the VR thing, it’s like you’re looking through a window. This effect is super cheap. It can be decoded on any device. One frame is like a hologram. It blows open the door to all sorts of crazy stuff, because all these streams I’m showing you, when you have just one frame of it, you actually can look around. If the Internet’s bad, you still have the ability to have presence.
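
The look-around-from-one-frame effect he demonstrates is, in generic terms, depth-image-based reprojection: given a depth value per pixel, the client can shift pixels by parallax for a small camera move without a server round trip. A simplified horizontal-only sketch of that general technique, not Otoy's actual codec:

```python
# One frame plus per-pixel depth lets the client synthesize nearby
# viewpoints locally: nearer pixels shift more than distant ones.
import numpy as np

def reproject(depth: np.ndarray, focal: float, dx: float) -> np.ndarray:
    """Shift each pixel by screen-space parallax for a small camera
    translation dx (meters), given depth in meters and focal length
    in pixels."""
    h, w = depth.shape
    xs = np.tile(np.arange(w), (h, 1)).astype(np.float64)
    parallax = focal * dx / depth      # pixels; inversely tied to depth
    return xs + parallax               # new x coordinate per pixel

depth = np.full((2, 4), 2.0)           # flat wall 2 m away
depth[0, 0] = 0.5                      # one near pixel
print(reproject(depth, focal=800.0, dx=0.01)[0])
# -> [16. 5. 6. 7.]: the near pixel shifts 16 px, far pixels only 4 px
```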

It looks photo-real. This is our renderer running on X.IO. You can set up this scene and build it in Max or Maya, also in the cloud. Once it's done, you get back a stream that's sent down as a panorama like the one you saw before, but this type of panorama, even though it's cut off at the top and bottom, lets you move around in it. That's the magic of holographic streaming that we're excited to launch.

We have the decoder working on PC. We're working on a decoder for mobile devices. This is also awesome for Project Tango, which is exactly this effect. You have the tablet moving through space, and as you move it, you're moving through that holographic stream, or even one frame. We send the coordinates to the server. The server sends one frame. That's the one frame, and even without sending another one, it's also double-sided. If you flip around, you can see it from the other angle. It's the first tech of its kind that can stream a hologram down.

You can also store it offline. For one meter by one meter of holographic volume, it’s 64 megabytes. That’s not very big. A PNG that’s a panorama is 12 megabytes. It’s pretty small. Maybe four or five times more than a 2D PNG. Video is also something that we’re supporting. We’re working on doing live capture. I don’t know if you saw our Light Stage stuff, but we’re adapting that light field recording technology so we can record holographic, volumetric information and put that in the cloud and stream that down.
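
Taking the quoted figures at face value, the overhead relative to a flat panorama is modest:

```python
# Quoted sizes: 64 MB for the one-meter holographic volume,
# 12 MB for a panoramic PNG.
holo_mb, png_mb = 64, 12
print(holo_mb / png_mb)  # -> 5.33..., in line with "four or five times"
```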

There are a lot of sports leagues that are interested in this technology. They want to tap into VR and next-gen experiences. We're covering that as a superset of what we can do with this kind of streaming.

Above: Otoy 3D art

Image Credit: Otoy

GamesBeat: How have you done with folks like filmmakers, who want to use laptops as their workstations?

Urbach: Everyone who's tried our stuff loves it. Once you get the idea that you have more compute power in your web browser than you'd ever have installing this stuff on your desktop — I have my own product, Octane, on this computer. It's not super fast, but if I go to my website, I can run it at a very fast speed.

GamesBeat: I went to DreamWorks Animation. They showed me all the stuff they did for How To Train Your Dragon 2. I was a little surprised when I talked to their CTO. He was talking mostly about their partnership with Intel. He wasn’t talking much about GPUs and all that stuff everyone’s starting to take advantage of. They have the cloud infrastructure in place at Intel and all that, but they seem like they could still make more progress on enabling every artist to work on something like a laptop. 

Urbach: We’re in a coffee shop and I’m running 3D Studio Max with four Titans in a web browser. This is a live link to our servers. We have all our tech running on this thing. You can run anything on here. Running it in a browser with this kind of access – it’s secure and encrypted – is pretty awesome. It’s the entire Windows content creation app market.

We had a good shot in the arm when Autodesk invested in us to build all this out and make it happen. Their goal was always, “Get it working in a browser so we can integrate it in A360.” We’ve done that. That’s a big deal. The content creation is in the cloud. The rendering is in the cloud. The delivery mechanism can take advantage of that. Storage is going to zero, and eventually compute will get really inexpensive as well.

As a company, Otoy will always find ways of leveraging unlimited rendering power. The GPU ray tracing we do can scale to infinity. The more GPUs we have, the less noise there is in that live render, the bigger that render can be, and the more holographic it can be. There are a couple of orders of magnitude that we can still leverage, and the costs keep going down.

We don’t need to worry so much about watts. There’s probably a trend toward getting lower-power and thinner devices. In the cloud it doesn’t really matter if you can split a rendering job in parallel and just use more GPUs. Those GPUs become cheaper. That’s where we see everything headed for games, media, entertainment, and content creation tools.

GamesBeat: Are any other game companies doing much with this yet?

Urbach: We've done the integration work to get Unreal Engine hooked up, so Unreal developers can just publish this way. We're working on getting game developers and publishers to work with it. But I think it's just too early right now. This is day one. Some people we still can't talk about have used our service. They're streaming stuff now. That's pretty exciting.

Above: Otoy 3D art

Image Credit: Otoy

GamesBeat: What about all these other rivals that seem to be out there? There’s the Unity cloud effort.

Urbach: That's totally different. The Unity cloud connects multiplayer between all your different Unity games, but it doesn't handle streaming or ray tracing. In fact, we'll probably do with Unity the same thing we've done with Unreal, where we integrated the engine into our service. It's just that Unreal is open source. The Epic guys have been very helpful in getting us going on that, so that's why we picked them first. Unity is definitely on our road map. Then we'll look at other engines, and maybe custom integrations after that.

GamesBeat: They seem pretty far away from the notion of making a game in the cloud, then?

Urbach: We could take the UDK, the entire Unreal development system, and put that in the cloud. There's no reason we couldn't do that alongside Autodesk tools and any other application. We'd just have to work out how that would work with Epic. That's definitely where things could go. Epic does a lot of work now so that you can compile for iOS on your Windows device. If you build everything in the cloud, developers could have a much simpler time creating content. That's our focus with X.IO, in addition to supporting the end result of that: these higher-quality experiences.

For game streaming, Unreal Engine 4 is already working out of the box. You can take anything you created in the UDK and publish it and get it streaming back. With Unity it’s also pretty straightforward. Even if you don’t have those engines, you can still upload an executable and, with a little bit of work, get it to work within our APIs.

You can turn on VR mode in Unreal and get that to stream back down. That’s one of the key things we’ve done. We’ll keep doing that with other engines. Unity is another iteration.

GamesBeat: How many people do you have now? Are you raising any money?

Urbach: We’re 60 people. We did a big round almost a year ago. We’re generating revenue now and doing well.

Above: Otoy 3D art

Image Credit: Otoy
