Above: Mooly Eden relaxes after leading an Intel press event at CES.

Image Credit: Dean Takahashi

VentureBeat: What sort of path looks the most promising for the best applications? If I look back on history, hardcore gamers love their Xbox 360 and Xbox One games, but they don’t like Kinect games that much.

Eden: That’s a fair question. At the beginning, we’ll see more casual gaming. The hardcore gamer with a Core i7 processor and two graphics cards, they’re not much more than a small percentage of the market, and that’s not the first audience we’re going to move toward. But one of the biggest CEOs in the gaming industry has said to me, “If I can monitor your face while you’re playing a game and change the flow of the game based on your emotions, that’s very interesting.” When you look at perceptual computing, it’s not just what you can tell the computer through your gestures. It’s what it can learn about you.

If you’re asking me about shooting games, hardcore games, maybe not yet. But if we’re talking about using this technology to make games more immersive, yes, there’s definitely much more to see. What are you trying to do in a game? You’re trying to create an augmented reality, to take the real world and make it better. The more immersed you are, the better the game is. I’m not sure we’ll replace the mouse, but the more three-dimensional we are, the more we’ll be able to do.

Above: Intel’s RealSense camera can see your fingers.

Image Credit: Dean Takahashi/Intel

VentureBeat: At the same time, I think gamers also want something different now. Maybe they’re not satisfied with just one more generation of consoles and a standard game controller.

Eden: I don’t know all the answers, but I feel like bringing in additional senses will make games more interactive. When you speak to me and we’re looking at each other — in perceptual computing we call it multi-modality.

Today, when you’re playing with a computer, I think it’s safe to say that it’s engaging two things. You have sound created by the gameplay, shooting or whatever, and you can see what’s happening. But what if the game could measure your heartbeat and use that? We can bring additional dimensions to games. We’ll definitely work within the ecosystem. To bring out just one title costs tens of millions of dollars. We need to show what the SDK can do and work with people to make it happen.
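As a rough illustration of the kind of multi-modal adaptation Eden describes — a game reading the player's emotional state and changing its flow — a game loop might poll a stress estimate from a camera-based sensor and adjust difficulty. The `EmotionSensor` class and its methods below are hypothetical stand-ins, not the actual Intel SDK API:

```python
import random

class EmotionSensor:
    """Hypothetical stand-in for a perceptual-computing SDK's
    face-analysis module; returns a stress estimate in [0, 1]."""
    def read_stress(self):
        # Placeholder for real camera input.
        return random.random()

def adjust_difficulty(base_difficulty, stress, gain=0.5):
    """Ease the game when the player looks stressed,
    ramp it up when they look bored (low stress)."""
    # Map stress 0..1 to a multiplier: low stress -> harder, high -> easier.
    multiplier = 1.0 + gain * (0.5 - stress)
    return base_difficulty * multiplier

if __name__ == "__main__":
    sensor = EmotionSensor()
    difficulty = 1.0
    for frame in range(3):  # a few frames of a game loop
        stress = sensor.read_stress()
        difficulty = adjust_difficulty(difficulty, stress)
        print(f"frame {frame}: stress={stress:.2f}, difficulty={difficulty:.2f}")
```

The feedback loop is the point, not the specific mapping: any biometric signal (heartbeat, gaze, expression) could feed the same adjustment step.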

VentureBeat: If I were Microsoft watching everything Intel shows at its press conference, I would ask whether these guys are my enemy or whether they are still my partner. You had Android onstage. You had Steam onstage. You’re doing your own perceptual thing.

Eden: First of all, we’re not enemies. If you look at the overall ecosystem, each company is trying to optimize its own things. Microsoft is doing things with our competitors. They’re working with ARM and such, which is the right decision for them. Intel is working with Microsoft, and working with Android as well. I’m not sure if we’re all happy, but at the end of the day there’s a huge amount of space for collaboration.

They have a great operating system. They’re still a great company. They’re a collaborator in some spaces, and in some spaces they’re working with someone else. I respect their decision, in the same way that I think they respect ours.

Above: iJustine showed off RealSense working with Microsoft’s Skype.

Image Credit: Dean Takahashi

VentureBeat: How are you deciding what you have to do, as opposed to something like Oculus, with their own headset?

Eden: It’s different. What they’re doing is nice. I had a chance to play with it, and I think it’s great for gaming. But it’s not what we’re doing. When I speak about collaboration and simple gaming—I don’t want my five- or six-year-old to sit with this on his head all the time. It’s complementary, but it’s a different market, a different usage. I believe there’s space in the market for more than one company.

VentureBeat: Within all of this, where do you see the biggest computing challenge?

Eden: There are two computing challenges. One is to close the gap between our computing capability and the human brain. It’s going to happen. It’s scary, but it’s going to happen. I can tell you what the futurist Ray Kurzweil said. I believe he said that we’ll be able to get to the computing power of a mouse by 2016, the computing power of a human brain by 2020, and the computing power of humanity by 2050.

Of course, we have a lot of redundancy in what we know as a race. Maybe 80 percent of our knowledge is common. And he might be wrong as far as the timing — it could be 2025 or 2035. But technology is moving forward. The challenge for computing is only in how the architecture is going to change. So that’s one challenge. The other challenge is the man-machine interface. I want a computer device that will work with me in the same way I can work with you. That’s my utopia. Will it happen? Definitely. The question is just when.

The one thing that’s a little bit scary, although this is just one man’s opinion — this is not a technical revolution. It’s a social revolution. It will change us. If you think we can have machines that will think like humans — you know Isaac Asimov’s stories about robots. If you believe that by 2050, 2100, a machine could have the capacity of the human race, some futurists claim there won’t even be a homo sapiens anymore. We won’t be the homo sapiens we’ve known for the last several thousand years. One way or the other, this won’t be something I know. It will be something totally different.

Above: Eden makes a point at CES.

Image Credit: Dean Takahashi

VentureBeat: You started perceptual computing before Brian Krzanich was CEO, right?

Eden: Yes. It took us more than six months to do this. [Laughs]

VentureBeat: Has he helped change your direction or your focus?

Eden: No. We can just focus on what we’re doing even more. In order to advance, you need resources. But we’re still exploring. There are a lot of questions we need to answer. You learn as you move. This is a very risky venture, but it’s been fun.

VentureBeat: When you had the brain chart in your presentation, it almost seemed like you had the computing part done. Then you put the sensors around it, and then you had the brain. It’s sort of like thinking that the computer is the brain, right?

Eden: It’s a good question. During my presentation, if I had done too deep a dive, I probably would have lost 90 percent of the audience. Overall, we need more computing power. It’s no secret that there’s a big difference between a microprocessor and the neurons in the brain. They say that one neuron can be connected to as many as 10,000 other neurons. One transistor can be connected to 10 transistors.

When I say that we’re going to close the gap with the brain, that implies we’ll need much more performance, and we’ll need to change many things about architecture. We’ll need much more acceleration. More than 50 percent of the brain goes toward simply deciphering what your eyes see. So what I was trying to allude to was the need for that additional performance. Even if you tried to simulate it differently — and I believe Microsoft is trying to do that — the compute power necessary is huge. There’s not enough. Today, our brain uses only approximately 20 watts.
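Eden's connectivity comparison can be put in rough numbers. The back-of-envelope sketch below assumes the commonly cited figure of about 86 billion neurons in the human brain (that count is my assumption, not from the interview); the 10,000-synapse and 10-transistor fan-out figures are the ones Eden cites:

```python
# Back-of-envelope comparison of brain vs. chip connectivity.
neurons = 86e9             # assumed: commonly cited human neuron count
synapses_per_neuron = 1e4  # "up to 10,000" per the interview
transistor_fanout = 10     # "connected to 10 transistors" per the interview

total_synapses = neurons * synapses_per_neuron
fanout_ratio = synapses_per_neuron / transistor_fanout

print(f"Estimated synapses: {total_synapses:.1e}")
print(f"Fan-out gap: {fanout_ratio:.0f}x per node")
```

Even before considering the 20-watt power budget Eden mentions, the thousand-fold difference in per-node connectivity suggests why he argues the architecture, not just raw performance, has to change.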

Above: Eden explains perceptual computing at CES 2014.

Image Credit: Dean Takahashi

VentureBeat: Maybe, before we get there, getting all the sensors in place is still a big project.

Eden: It’s a big project, but now I believe it’s almost inevitable. You can argue that the jury is out and we still need to prove it. But the fact is that people want a natural man-machine interface. You’ve seen it in books. You’ve seen it in movies. That futuristic, science-fiction stuff, that’s what people want. It’s been a dream for many years. But we’re on the verge of taking the fiction away from the science.

This couldn’t have been done three years ago. Intel couldn’t have done it three years ago. We may be delayed by a year or two, but I’m confident that within a few years we’ll see this work.

By the way, you asked me how I know this is going to be successful. When you’re working with the consumer — why do you buy something today? Because you love it. Not because you need it. You already have what you need. How do you know you love something? When you smile, when you say “wow.” Here’s a picture of you when you were playing this thing. [Laughs] We did a lot of professional testing, and the jury is still out. But if you ask me about the adoption of stuff like this, if it leads to those kinds of smiles, we have a great opportunity.

