Above: Dean Takahashi of GamesBeat moderating AR VR panel at the Intel Capital Global Summit.

Image Credit: Milena Marinova

GamesBeat: Dan, let’s bring augmented reality into this a little bit too. You don’t have presence with augmented reality glasses. It’s not immersive. Why would we want to do it?

Eisenhardt: Because I spend most of my time in the real world. When you live and work in the real world, sometimes instead of being immersed in a completely different environment — whether it’s for entertainment or training or other things — you want to live your day-to-day life. You want to be able to see the world and understand things in it. That’s why I think some people project, as you said, that the market for AR will ultimately be larger than the market for VR.

Losing situational awareness is a big thing for people. They might love the Oculus Rift, but they would never wear something like that on the plane or in public. There’s this notion that you’re isolated, that you lose a bit of your humanity. That’s a challenge. But in enterprise, the opportunity is so huge, for training and all kinds of other things.

I’m wondering, what are the challenges in VR? Is it price point? Is it a mature ecosystem for the software? What is it, actually? It seems like the opportunity is so huge that it should outweigh any of this.

Azuma: One of the most serious challenges is vection. The systems that are going to be available in 2016 — I’ve been very impressed by the technology, and I assume the price points are going to be accessible to consumers. But if you’re in a situation like those of us up here, sitting in a chair, you’re only moving your head a bit and looking around. You’re not going to be able to get up and walk around a space like this. Vection describes the sensation of flying or moving around in a first-person shooter game even though I’m actually sitting still. It’s a conflict between what the inner ear is saying and what I’m seeing. That’s a very serious problem.

GamesBeat: You don’t want your customers throwing up, right?

Azuma: One approach to try to deal with this is using electromagnetic fields to induce a sensation in your inner ear and match what’s happening visually. My response to that is, “You go first.” Omni-treadmills try to provide the sensation of walking, but with all of the treadmills, it’s not the same as actually walking yourself.

Beall: We may need to frame this problem a bit, because I’m not sure everybody has experienced enough VR to know. One of the holy grails is being able to experience and navigate large spaces. For those of you who’ve tried Oculus, the experience may have been seated. You’re allowed to look around, maybe move your head a little bit, but that’s far different from, say, walking around this room, or using some sort of controller to just blast through the wall as if you’re Superman.

Obviously that has a serious impact. Ron’s pointing to the fact that if you allow full control, you get serious motion sickness problems. There are a few techniques, one of which we use: get off the fixed stool and onto a pivot stool. Don’t ever let the virtual person rotate with the joystick. Have them pivot with the stool instead, and you avoid some of the inner-ear conflicts. But as you can see, we’re clearly dealing with some subtle human performance and perception problems.
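The rule Beall describes, translate with the controller but never rotate the virtual body with it, maps to one constraint in a locomotion update loop. Below is a minimal sketch in Python; the function and parameter names are my own illustration rather than anything from the panelists' software. Joystick input only ever moves the body, and yaw is read exclusively from the tracked headset, so the view rotates only when the user physically pivots on the stool.

```python
# A minimal sketch (not from the panel) of the pivot-stool rule:
# translate with the controller, but never rotate the virtual body with it --
# yaw comes only from the physically tracked head/stool pivot.
import math

def update_pose(pos, head_yaw_rad, stick_x, stick_y, speed, dt):
    """Advance the virtual body using joystick translation only.

    pos          -- (x, z) virtual body position in meters
    head_yaw_rad -- yaw read from the tracked HMD (physical rotation)
    stick_x/y    -- joystick axes in [-1, 1]
    speed        -- movement speed in m/s
    dt           -- frame time in seconds
    """
    # Move relative to where the user is physically facing.
    forward = (math.sin(head_yaw_rad), math.cos(head_yaw_rad))
    right = (forward[1], -forward[0])

    dx = (right[0] * stick_x + forward[0] * stick_y) * speed * dt
    dz = (right[1] * stick_x + forward[1] * stick_y) * speed * dt

    # Note: no joystick term ever touches yaw; the view rotates only
    # when the user physically pivots on the stool.
    return (pos[0] + dx, pos[1] + dz)
```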

Above: HTC Vive virtual reality headset, which uses Steam VR, is already amazing despite how little we know about making games for the platform.

Image Credit: Valve

GamesBeat: Then you also start tripping and bumping into your furniture.

Davis: Yeah. There’s the chaperone problem. We ran into this with Kinect. People didn’t want to move the coffee table, and then they complained when they kicked their coffee table. The interesting thing we’re learning from a lot of VR and AR developers, though, is that you acclimate to a lot of the physiological impacts and dissonance.

Some of our core developers say, “I don’t trust you when you say this won’t make anyone sick. Let me get our person who always gets sick.” Because there is an acclimation that starts to happen. Another thing is, how much of this problem is for us to solve now, in this generation, as opposed to in the future when people take these things for granted? It’s going to feel a little funky at first, but the experience you get in return more than outweighs the cost.

GamesBeat: Is processing power going to solve this problem?

Azuma: Is processing power going to solve motion sickness? We’re talking about physiological problems.

GamesBeat: But if you can refresh fast enough….

Azuma: You can do better tracking. You can do lower persistence and other things that have largely licked it when you’re just sitting and rotating in place.

Beall: To the point about adaptation — we’re on an interesting trajectory. We’re on the cusp of many aspects of VR being viable as a large-scale option. That said, there is a rich history, going back 10 and 20 years, of putting up with artifacts. I’m not saying we should, but there are cases in which there’s enough return on that investment, enough productivity to be gained. There are cases where we need to accept some compromise in order to gain what we can today. We can’t expect this to be absolutely perfect.

Davis: We’ve found that if you maintain a consistent frame rate, 90Hz being the holy grail for VR right now, and you avoid things like dramatically changing the environment in a single frame, the brain is really good at adapting to the rest. We’re constantly trying to reconstruct reality from poor sets of data. This is just an extension of that.
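Davis’ two rules, hold the frame rate and never let the scene change dramatically between consecutive frames, translate into a very small amount of logic. This is a rough sketch under my own assumptions, not code from any of the panelists: at 90Hz the budget is roughly 11.1 milliseconds per frame, render quality is dialed down before frames are allowed to drop, and scene swaps are spread over time instead of happening in a single frame.

```python
# A rough illustration (my sketch, not the panelists' code) of the two rules Davis cites:
# hold a ~11.1 ms frame budget at 90Hz, and never snap the environment in one frame.

TARGET_HZ = 90
FRAME_BUDGET_S = 1.0 / TARGET_HZ  # ~0.0111 seconds per frame

def choose_quality(recent_frame_times, current_quality):
    """Drop render quality before dropping frames, since missed frames are what
    break comfort. recent_frame_times is a list of frame durations in seconds."""
    worst = max(recent_frame_times)
    if worst > FRAME_BUDGET_S:           # blowing the budget: back off
        return max(0, current_quality - 1)
    if worst < 0.8 * FRAME_BUDGET_S:     # comfortable headroom: raise quality
        return current_quality + 1
    return current_quality

def blend_environment(old_scene, new_scene, seconds_since_swap):
    """Cross-fade between environments over half a second instead of swapping
    them in a single frame, so nothing changes dramatically frame to frame."""
    alpha = min(1.0, seconds_since_swap / 0.5)
    return {"from": old_scene, "to": new_scene, "blend": alpha}
```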

Eisenhardt: You asked about what some of the challenges might be. One thing that’s worrying on the business side, the application side, is that we have to field-test equipment with a given experience. That doesn’t mean every application that gets developed will follow those kinds of guidelines. I don’t know how the companies prevent that from happening. Then the consumer is in a bind: they’ve spent a lot of money on this, they go through a few of these experiences, they don’t like the reaction, and then what?

Above: Robots from Portal in VR

Image Credit: Valve

Davis: You’re right. As a developer, that’s one of our main fears. If you look at the Oculus and Valve/HTC devices right now, they require a PC. The PC represents a lot of variables — what drivers you have, what CPU you have, what GPU you have. Even if the developer does their best, the consumer might still have a terrible experience.

This isn’t some random application crashing on your computer. That sucks, but you just reinstall it or whatever. If a VR application performs terribly or crashes, that’s going to have physical effects on you. You’re going to say, “Yeah, no more VR for me. I’m done.” The stakes are that much higher for people. I don’t know that every developer has totally bought into that yet. They don’t understand it yet. We all need to hold ourselves to that high bar. Otherwise we’re going to make people vomit.

GamesBeat: There’s another high bar in AR as well, Dan. Magic Leap, which drew a half-billion-dollar-plus investment from Google and some other folks, has the stated goal of creating augmented reality images you can see through your glasses that are indistinguishable from real life. Think Jurassic World. You’re walking around a park seeing dinosaurs, and they’re fake, but you can’t tell. Are we going to get there? Do we have to get there?

Eisenhardt: That was going to be my comment — do we have to get there? We have to solve problems, good problems. They can’t just be gadgets. That’s what we’ve been focusing on at Recon and now at Intel. We’re solving problems and becoming essential.

VR is essential, or it should be essential. For AR, at least in the space that I’m interested in, it’s mostly outside. We’re wearing something on our faces and then through that we have access to information that’s more convenient than our traditional interfaces.

What’s happening, as far as macro trends go, is that we’re checking our phones a hundred times a day — text messages, emails, everything else. Fast-forward five years, to when we have 50-billion-plus devices — your refrigerator generating data, your dog’s collar generating data — and you’ll be more and more bound to your phone all day. That’s the thing about the humanity aspect I brought up. You lose touch with humanity.

So how do we solve that problem? The solution is different depending on the situation. We’ve gone through the vertical. We’re going to create products that solve real problems right now. When we’ve done that, we’ll go on and build the next one. Eventually we’ll get to a space where everyone will be buying and wearing these devices.

Azuma: It’s a great visual that you bring up, Dean, but I don’t think the field has to get to that for AR to be useful. Think of the yellow first-down line when you’re watching a football game. That’s an augmentation in real time. Is it useful? Sure, in terms of understanding the game. That doesn’t mean you mistake it for reality, or that it has to be indistinguishable from reality in order to be useful. If I’m walking around Seoul and I can’t read Hangul, and I have an app that translates signs into something I can read, that may not flex much graphical muscle in how it displays the text, but it’s still useful.

Davis: One of the things I’ve found really impactful in augmented reality app development — right now, with my smartphone, I have a hundred apps or so, and they’re always with me, all the time. In order to find one I have to browse through all hundred of them. But in augmented reality, you have a notion of space. You can start to map that out in relation to what’s on the phone.

If you use a recipe app, you generally have it open in your kitchen or at the grocery store. You can match that with the computer’s understanding of location — oh, we’re in the kitchen now, let me make sure that app is accessible. It’s there before I even know to activate it.
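As a toy illustration of the kitchen example (the mapping and function here are hypothetical, not a real API or anything the panelists shipped), the core of the idea is just a lookup from a recognized space to the apps that tend to be used there:

```python
# Hypothetical sketch of location-aware app surfacing, as Davis describes it:
# map apps to the places where they are actually used, and surface them when the
# device recognizes that space, instead of making the user dig through a launcher.

APP_CONTEXTS = {
    "kitchen": ["recipes", "grocery_list", "timer"],
    "grocery_store": ["grocery_list", "recipes"],
    "living_room": ["media_remote", "smart_lights"],
}

def apps_for_location(recognized_space, installed_apps):
    """Return the installed apps relevant to the space the device thinks it's in."""
    relevant = APP_CONTEXTS.get(recognized_space, [])
    return [app for app in relevant if app in installed_apps]

# Example: scene recognition says we're in the kitchen.
print(apps_for_location("kitchen", {"recipes", "timer", "email"}))
# -> ['recipes', 'timer']
```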

You mentioned something that triggered a thought for me. The internet of things is ridiculous. We’re going to be just overwhelmed by too much data all the time. The idea of machines starting to get that understanding of our environment and our life, being able to position these things through AR experiences — that sounds fantastic.

Above: 110 Stories is an AR iPhone app that lets you see where the twin towers would be.

Image Credit: Brian August

Azuma: Think of a parallel digital world that’s going to be tracked by Internet of Things devices. How are we as human beings going to interface with that and understand that within our environment? AR might be the interface we use to solve that problem.

Here’s another example for the question of whether it has to be truly photorealistic. One of the most compelling experiences I’ve ever seen is something called 110 Stories. This was an iPhone app that, if you pointed it at the New York skyline, would draw an outline of where the World Trade Center towers would be.

To me, there were two design decisions that made this particularly compelling. One is that even though we have the graphics power to render something that looks real, the designer chose not to do that. Instead, he added just an outline of the towers, as if they were sketched in against the sky with a grease pencil. To me that made it much more compelling, because it played into the message it was trying to convey. They’re not really there.

The second aspect was, you could take a picture, and then it invited you to write a few sentences. Why did you take this picture? What does it mean to you? If you go to the website and read some of the stories — as you might expect, some of them were very emotional. For me, that captures the power of augmented reality. For VR, the power is presence. For AR, it’s about making a meaningful connection between reality and something virtual, the combination of the two. It’s a different type of experience from an untouched reality or an entirely virtual one.

Davis: One of the things I see as a game developer — games are about empowerment. The most amazing games out there make you feel like a superhero or a wizard or whatever your fantasy might be. Augmented reality is going to play a part in that, especially combined with the internet of things. I have all kinds of things I can control with technology now. A virtual experience can start to invade reality in a way that nothing could before. That opens up all kinds of new horizons.

My favorite thought experiment about this is, hey, I’m going into Starbucks. I love going to Starbucks. It’s my favorite coffee shop. Imagine if no matter what coffee shop I go into, my augmented reality glasses can remap it, and I have my Starbucks experience.