The tantalizing promise of augmented reality games

Jeri Ellsworth (left) of CastAR and Nick Beliaeff of Spin Master at CES 2017.

Image Credit: Dean Takahashi

CES 2017, the big tech trade show in Las Vegas last week, was loaded with augmented reality smartglasses, which layer digital animations and information on top of the real world. They promise us wonderful connections between the digital and physical worlds, and games where we can chase cartoon characters through our own furniture.

Mainstream venture capitalists pumped up startups in virtual and augmented reality in the third quarter, with a record $2.3 billion invested in the last 12 months, according to tech advisor Digi-Capital. And the company predicts that AR will be a $90 billion market by 2020.

I talked about this great promise of AR on a panel at CES with Jeri Ellsworth, cofounder of CastAR, and Nick Beliaeff, vice president of production at Spin Master Studios, a toy company that recently launched its Air Hogs Connect: Mission Drone game.

Here’s an edited transcript of our panel.

Nick Beliaeff: I’m the VP of production at Spin Master Studios. That’s a division of Spin Master, which is a 20-plus-year-old toy company. My job is to bridge the gap between physical play patterns and digital play patterns as connected devices become more accessible to the kids of today. They’ve stopped playing with toys, because digital is such a compelling medium. It’s my job to re-engage them and go from there.

One thing we love about AR is how it’s such a natural pair with physical play. We’ve been investing in it since 2014. We released our first product late last year, called Mission Drone, where you transform your room into an AR sci-fi universe and fly a real physical drone on rescue missions, battling aliens. Our next one, which we’re showing over at Luxor, is Nitro Boost, with multiplayer RC car racing on different circuits. What’s cool about AR and connected toys is we can do stuff with a physical RC car crashing into an AR virtual car and spinning out. There’s a certain magic to the experiential nature of AR. We’re huge fans.

Above: Air Hogs Connect: Mission Drone

Image Credit: Spin Master

Jeri Ellsworth: I’m an inventor. I’ve been an inventor my entire life, and also an entrepreneur. I got my start building and driving race cars in my first career. I opened up retail computer stores after that, at the height of the internet boom in 1995. Eventually I moved into electrical engineering and product design, and toy design as well, which is a very fun field.

I ended up at Valve Software running their hardware department, putting the team together that created the HTC Vive. We were tasked with exploring how to bring new gamers into the Steam platform. Just recently, in the last few years, I founded a company called CastAR, where we’re making AR glasses that allow you to blend virtual graphics with real-world objects on your table.

GB: There are predictions from Digi-Capital, a tech advisor, that AR is going to be a $90 billion market by 2020, three times the size of virtual reality by their forecast. They’re very bullish on AR. As far as the backdrop goes, how far back does the thinking around AR go? What sort of science fiction dreams inspired you to think about AR and get into it?

Beliaeff: The one that spurred us at the toy company came from the head of our Air Hogs division, the RC division. He’s also a very big Star Trek fan. He wanted a holodeck experience with his toys. That’s where Mission Drone came from.

Ellsworth: I’m another huge Star Trek nerd, and Star Wars of course. When I saw holo-chess as a kid, I didn’t know what the gameplay was, but I wanted to experience that someday. All the sci-fi didn’t necessarily push me directly to AR, though. It was all the combined inputs and outputs and the blending of your digital experience. That excites me. Where are we going in the next 10 years? It’s going to be a blend of all our digital content into the real world in a seamless way. Science fiction predicts that for us.

Above: CastAR wants you to play tabletop games in mixed reality.

Image Credit: CastAR

GB: We have that very bullish forecast right now, but that wasn’t always the case. Where do you think this turn toward optimism about AR came from?

Beliaeff: There are a few different touch points. Some things you don’t necessarily think of as AR that are technically AR, like the weather person doing the weather in front of a green screen. It’s been fairly pervasive, but it just hasn’t been labeled. There’s a bit more awareness there.

Meanwhile, the technology you guys all have in your pockets or in your laps — your smartphones and tablets and even PCs — has gotten powerful enough that there’s enough horsepower to run this technology. The software has advanced a lot as well. SLAM, simultaneous localization and mapping, is getting more accessible. As hardware and software mature, awareness is driven by wonderful things like Pokemon Go. That’s not the perfect implementation of AR, but it drove awareness like nothing else.

Ellsworth: Pokemon Go is an interesting moment in time, where it shows this gap we have in entertainment that needed to be filled. I saw families getting together and playing this. Couples were walking around the street looking for Pokemon. It’s amazing, the technology we have that can unlock that. All these various sensors that have been commoditized by cell phones are enabling that. Even five years ago we didn’t have enough technology to do it. It’s all here now and we can start to have these experiences.

GB: With toys, we’re seeing things like Lego AR. You can snap the QR code on the box with your camera and then you can see the characters tied to that toy come to life on your screen.

Beliaeff: That’s another widespread but not necessarily labeled use of AR, in advertising. If you look at companies like Vuforia, which is an AR middleware provider, their top clients are all in advertising. It’s becoming more mainstream. People are recognizing what it is and they can put two and two together.

If you look at what’s going on in phones, we’re soon going to have stereoscopic cameras in phones and things like that. That’s also stuff that’s going to help AR move forward. Once you can start adding depth, you’re not going to need markers anymore.
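
For readers wondering what the “markers” here actually involve: most phone AR today finds a printed fiducial in the camera image and anchors virtual content to it, which is exactly the step depth sensing would make unnecessary. Below is a minimal, hypothetical sketch of that marker-detection step using OpenCV’s aruco module — not any panelist’s product, and the exact API names vary a bit between OpenCV versions.

```python
# Minimal marker-based AR sketch: find a printed fiducial marker in a webcam
# frame and draw its outline where virtual content would be anchored.
# Assumes OpenCV 4.7+ with the aruco module; names differ slightly in older builds.
import cv2

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

cap = cv2.VideoCapture(0)  # default webcam
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = detector.detectMarkers(gray)
    if ids is not None:
        # A full AR app would estimate the marker pose here and render a 3D
        # model at that pose; drawing the outline stands in for that step.
        cv2.aruco.drawDetectedMarkers(frame, corners, ids)
    cv2.imshow("marker AR sketch", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```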

Ellsworth: Having a more seamless experience is going to be essential for the adoption of AR. We’ll start seeing more platforms doing it in a smoother way. Instead of having to install applications to do this, it’ll be part of the OS and part of the hardware itself. You don’t have to disrupt your normal day to have an AR experience. It’ll just happen automatically.

Above: VR Fund’s AR industry landscape.

Image Credit: VR Fund/Tipatat Chennavasin

GB: When it comes to games, it seems like the mad rush has been straight into VR instead of AR. Jeri, at Valve they did have that sort of choice point, and they chose to do Steam VR and get behind the HTC Vive. Your project was spun out, basically. Why did you choose AR as something to do with gaming?

Ellsworth: VR is amazing. I absolutely love it. But usually the experiences I have with VR — I pull out my Vive, set it all up, move all the furniture out of the way in the living room, move my main computer over, and then I have the experience for 20 minutes. Then I take it all down and pack it away and wait a month for the next cool experience.

The thing that excites me about AR is that it’s going to be a daily experience. When all of the technology is there so we don’t have to disrupt our lives to experience it, then it’s going to cover a much broader demographic than just hardcore gamers like me.

Beliaeff: If you look at the evolution of video games, they’re trying to control everything a person is experiencing. They want you to look at the television or the PC monitor in front of you and completely immerse you. VR is the next generation, where it’s giving you that room-scale experience, but still totally taking over and wanting to dominate what it is you’re taking in.

AR allows you to mix realities, blending what’s going on in the room, or wherever you are, with all this stuff that wasn’t there before. That’s one of the things Pokemon Go did well, that experiential part. If you saw those first pictures in Central Park, you got this experiential blend of the video game portion of it and the reality around you.

Ellsworth: That’s important. VR is inherently isolating. AR, you can have your friends and family around and share an experience that’s blended with your world. You have that comfortable safe place that they’re a part of.

GB: Magic Leap is a big presence in the AR discussion. They’ve probably raised half the money that’s gone into AR and VR in the past year, that one company alone. I wonder what the presence of that means for AR’s future. It seems like their goal is different from where you guys are going. They want to create imagery in your AR view that’s indistinguishable from real life.

Beliaeff: I’m always a big fan of investment. The rising tide is going to raise all ships. It’s why I’m a big fan of Pokemon, a big fan of everything Jeri and her team are doing. The more things raise awareness and popularize this technology, the better it’s going to be for all of us. It’s an emerging tech. A lot of it is figuring out what’s going to work and what isn’t. As you look at any technology over time, there will be periods of consolidation. Players will drop out and new platforms will appear. But the great thing about AR in particular, when you’re thinking about it versus VR, is it doesn’t necessarily require an HMD and dedicated hardware. A lot of it can just work with a cell phone or a tablet.

You’re going to get a lot of innovation out of what they do, particularly as they get closer to publicizing what they do. Outside of the sort of people in this room and in the games press, I don’t think most people know what Magic Leap is. Their unveiling, hopefully, will open a lot of eyes and get people excited about what we’re doing.

Ellsworth: I totally agree with that. This market is going to be a constant evolution. To have that walk-around AR experience is going to be a difficult thing to do, and to do it all in one chunk, it’ll be hard to stick that landing. We’re trying to focus on a tabletop experience in the comfort of your home where you feel OK putting some glasses on. We’re figuring out what those experiences are going to be, not pulling out a crystal ball and deciding how people are going to interact out in the real world. Even looking at my house today, how do I fit a game into my house? There’s clutter on the floor. There’s an old beer bottle on the coffee table. There’s a lot to learn about how we can start to blend these different experiences into the world.

It has to come in stages. We’re staging our company to nail the table first, then lift people up from the table and let them move around their home. Eventually we’ll let them leave their home. But we’re focusing on things we know there’s a hunger for, like this communal aspect, making the table the center of your home again. Communication over distance is going to be very important for AR. Collaborating in a holographic space, having direct voice communication. All these things will be figured out in the next two to five years, and that will shape what walk-around AR is going to be like.

Above: ODG augmented reality smartglasses at CES 2017.

Image Credit: Dean Takahashi

GB: Why did you decide to focus on the tabletop at first?

Ellsworth: Again, it’s difficult to know how to handle all these different interactions. If your hardware is thousands of dollars, too, it’s going to be difficult to find mass-market adoption. By constraining ourselves to the tabletop experience, we can nail that. Everything you experience there is going to be great, and from there we can springboard to people moving around the room or interacting with stuff in the home. That gives us time for sensors and technology to shrink, and for customers to get used to wearing glasses. It’s evolution. Customers are going to rebel against new technology that they have to take a big step into.

GB: How have you seen the technology advance? There are some interesting companies here, like ODG. We’re seeing AR glasses already that are around four ounces. The larger ones are six ounces. I can remember times when these things were well over a pound. How fast is the technology moving when it comes to weight and cost?

Beliaeff: It’s going very fast. When you talk about investment, certainly Magic Leap has garnered the most, but ODG just closed a big round. If you look at AR investment over the last two or three years, it’s more than a billion dollars in total. That’s what you’re seeing.

We got into AR seriously in 2014. We released our first project, the Mission Drone, last year, and now we have the second one, Nitro Boost, for this fall. There’s stuff we can do with a car that we couldn’t do with a drone, and it’s only been 12 months. The level of acceleration is far more than what we’re used to. You were talking about Moore’s Law before. Right now I think we’re exceeding that.

For the companies that are doing dedicated HMDs, the hardware is accelerating very fast. But then the software is also accelerating. When you get that combination of hardware and software acceleration, the innovation, the speed of tech advancing, is glorious.

GB: Mike Abrash, formerly of Valve, now at Oculus, their chief scientist, he made this interesting observation about Moore’s Law. Everyone always thinks that Moore’s Law is inevitable, that it’s going to happen no matter what. He says that’s not the case. When you break it down, it’s really the work of very talented people figuring out how to do something that otherwise would be impossible. Advancements don’t automatically happen. It seems sometimes that some companies can either over-promise or get into trouble because their expectation is that Moore’s Law will automatically happen without enough going into the engineering side.

Ellsworth: Having been in engineering for many years, you see these pressures build up in an industry that force change to happen. We’re at an interesting point right now where you can’t really add any more pixels to my iPhone and make it any better, for me at least. You can’t make the screen any curvier. The audio fidelity is already great.

Above: Some augmented reality applications should clearly never be created.

Image Credit: Dean Takahashi

GB: It’s like design exhaustion.

Ellsworth: Right. It seems obvious, at least to me, that starting to replace the iPhones and tablets with other devices is the pressure that will force these different technologies to mature. Otherwise there wouldn’t be pressure to make investment happen or force prices down so these devices can exist.

A lot of our work in designing our product is going out and working with vendors and selling them on our dream so we can hit a price point that everyone in the world can afford. Some of the sensors are expensive because they aren’t being used in commodity products yet. Once they’re in commodity products, there will be tons of money for the industry, for whoever lets go of the higher-end markets they’re used to and ramps up production.

Beliaeff: When you watch the trends in engineers, it’s interesting. Engineers really are the straw that stirs the drink for this sort of innovation. On the hardware side it’s building in the efficiencies, but on the software side right now it’s really computer vision. When you look at all the investment going on right now, with everyone sponging up every computer vision engineer alive, if this level of investment continues and we start seeing more commercial successes in AR, there’s going to be a shortage of computer vision guys.

We already have people working on toys — these guys used to work on displays for military helicopters. We have multiple PhDs doing stuff. We have guys who’ve contributed to EU computer vision white papers. The talent out there is amazing, but there’s not a lot of it. At some point it’ll either be a driver of consolidation, putting all those smart guys in a bigger room together, or it’ll be a stall-out point.

Ellsworth: On top of that, machine learning is going to be huge in AR. Just with all the volumetrics and the sensors we can put on someone with a wearable device, we can start to understand more about their life and help them. We can predict and learn their patterns, unlocking all of this benefit that couldn’t be there without smarter machines.

GB: It seems like we’ve had this fortunate path. GPU computing came along. It made the big data crunching that’s required for neural networks and pattern recognition possible. Then you get AI and computer vision. Those things almost become something you can drop into any product. You can train a neural network and it’s not going to take 10 years anymore. You’ve had some good progress on the required technologies for what you guys want to do. What else still has to happen, though, to make AR more pervasive and more affordable?

Beliaeff: Jeri touched on this a bit earlier. The components you use need to become available at a mass level. Stereoscopic cameras are an example. We’re seeing glimmers of those in Tango and other things. That’s helpful. The processing chips, getting those bought in mass quantities so they become available at a device level. You touched on getting people to start having AR libraries integrated in their OS, graphics libraries and physics libraries and sound libraries. It needs to be part of the package.

Ellsworth: You were talking about AI and how much benefit it can provide. One of the learnings I had at Valve Software, when we were trying to figure out how to make games more fun — we found all kinds of little things. You could read people’s skin resistance. When you fed that back into the game, it would be just that little bit more fun for the end user. It’s hard to name a product that only makes a game two or three percent more fun, but if you can take a device and pack a bunch of those two or three percents in there, you get real value for the end user. To make AR more viable over the longer term, we’re going to have to find all those little bits and smash them all together in a tight package and make it seamless for the end user.
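
To make that concrete, here is a small, hypothetical sketch of the kind of biofeedback loop Ellsworth describes: smooth a skin-resistance (galvanic skin response) reading and turn it into a single intensity knob a game could consume. The read_gsr() function and its value range are stand-ins, not Valve’s actual setup.

```python
# Hypothetical biofeedback loop in the spirit of the skin-resistance experiment
# described above: smooth a galvanic skin response (GSR) reading and map it to
# a single 0..1 intensity knob a game could consume. read_gsr() is a stand-in
# for real sensor I/O, and the 2-12 range is an arbitrary assumption.
import random
import time

def read_gsr() -> float:
    """Placeholder for a real sensor read."""
    return random.uniform(2.0, 12.0)

def run_feedback_loop(alpha: float = 0.1) -> None:
    smoothed = read_gsr()
    while True:
        # Exponential moving average keeps noise from jittering the game.
        smoothed = alpha * read_gsr() + (1 - alpha) * smoothed
        # Clamp the arousal estimate into a knob for pacing spawns or music.
        intensity = min(max((smoothed - 2.0) / 10.0, 0.0), 1.0)
        print(f"gsr={smoothed:.2f}  game_intensity={intensity:.2f}")
        time.sleep(0.5)

if __name__ == "__main__":
    run_feedback_loop()
```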

GB: How do you look at the timeline of what’s going to be possible? When do you want to get products into the market? How much runway do you need as far as funding? Jeri, you guys basically transformed your company in the past year.

Ellsworth: It’s been an interesting ride. We’ve been at it a little more than three years now. Rick and I left Valve. We had to work on a modest product at the time, because it was just the two of us and a bunch of cats in his house. We made a PC peripheral. Now that we’re working with Playground Global and we have more investment, we can lay out some of our bigger dreams for AR. We’re looking at how we can make it seamless, make it 60 seconds to play. Hit the power button, flip the board open, and have your experience.

It’s been great for us. It’s given us the runway to make the product what we want and what we believe the user is going to want. We’ve been able to come up with a strategy of at least three stages: get the customer using the experience on a daily basis, get them using it on the table, and then take the full experience out into the world.

Above: ODG’s booth at CES 2017.

Image Credit: Dean Takahashi

GB: You’ve brought some interesting people onto your team — Peter Dille from PlayStation, Darrell Rodriguez from LucasArts.

Ellsworth: It’s pretty obvious that games are going in this direction. There’s a huge content gap right now. You have hardcore gaming VR experiences on high-end PCs. You have phones for your snackable content. But everything in between, where you can bring your whole family together — it’s really exciting, that we can start to make those experiences and tap into that market.

GB: Nick, how are you guys marshalling the resources in your company to focus on AR games?

Beliaeff: We’re a big believer in AR. Looking at how to bridge that gap between physical play and digital play, it’s very organic. From a kid’s perspective, they don’t see the “wow” in the technology that we do, because they’re growing up with it. They just expect it to work. But it does work.

For us, a lot of the play patterns are built on top of the core experience you have with an RC toy. Then we add video game depth to it, as well as the social aspect. We all know that’s the glue that keeps people playing anything. Doesn’t matter if it’s a board game or a video game or a toy. If you can do it socially, that’ll keep you into it longer. That’s a lot of where we’re going, multiplayer play. With the car, not only can you play with people in the room and have multiple cars going against the AI cars, but you can do a race, save your ghost, and upload it for a friend across the country to race against. That competition really opens it up.
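
The “ghost” mechanic Beliaeff mentions is simple to sketch: record timestamped poses during a lap, save them, and replay them for a remote opponent. The fields and file format below are purely illustrative and assume nothing about Spin Master’s actual implementation.

```python
# Illustrative "ghost" recording: capture timestamped car poses during a race,
# save them, and look them up during a later race so a remote friend can race
# the recording. The fields and JSON format are assumptions, not Spin Master's.
import json
from dataclasses import asdict, dataclass

@dataclass
class PoseSample:
    t: float        # seconds since race start
    x: float        # position on the track plane
    y: float
    heading: float  # degrees

def save_ghost(samples: list, path: str) -> None:
    with open(path, "w") as f:
        json.dump([asdict(s) for s in samples], f)

def load_ghost(path: str) -> list:
    with open(path) as f:
        return [PoseSample(**d) for d in json.load(f)]

def ghost_pose_at(samples: list, t: float) -> PoseSample:
    """Nearest recorded pose to time t; a real game would interpolate."""
    return min(samples, key=lambda s: abs(s.t - t))

# Record a toy lap, save it, reload it, and query the ghost mid-race.
lap = [PoseSample(t=i * 0.1, x=i * 0.5, y=0.0, heading=90.0) for i in range(50)]
save_ghost(lap, "ghost_lap.json")
print(ghost_pose_at(load_ghost("ghost_lap.json"), 1.23))
```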

We’re taking a bit of a different route, because we use devices our customers already own. We use smartphones and tablets. We don’t care if it’s Android or iOS. There’s enough computing power and fidelity in the cameras that we can get a rich experience out of it. We’re going toward more multiplayer experiences, more immersive experiences, and using things that don’t require a massive investment on the consumer’s part.

GB: What’s going to be the technology that reduces AR glasses to the size of the glasses I have now, do you think? What leads to these things Mark Zuckerberg and Larry Ellison are talking about, AR devices that fit on your head like any pair of ordinary glasses?

Ellsworth: It’s going to be a blend of glasses and non-glasses experiences. If you project out five years, optics are getting better. Sensors will get better. Compute and batteries will be small enough to fit in glasses and let us have great experiences. But I imagine a day where I’m on a bus and I replace my phone with my Cast glasses. I call my friends by tapping on my hand. Or I replace my iPad by drawing a square on the back of the seat. Getting to work, it’ll replace one of my compute devices there.

But when I get home, I want to take the glasses off and have a similar type of experience throughout my home. I’m deeply interested in the internet of things. On its own an internet-enabled thermostat isn’t terribly interesting, but if I can interact with it the same way I interact with my glasses — if I can wear my glasses and see my thermostat across the world and turn it down with a gesture — that’s actually what we’re going to see. It’ll be a blend of different display techs, different glasses, different sensors and ways for us to control and interact with all the devices in our home.

Beliaeff: Whether it’s VR or what I’ve seen released of AR now, it’s definitely been function over form. It gets the job done, but if you put it on, you don’t look cool doing it.

GB: Google Glass.

Beliaeff: Not exactly what you want. What’s going to drive adoption is when the platform, whatever’s going on with the hardware, is stable enough that you stop worrying about function and you can start getting some engineers worried about the aesthetics of it. When you put them on, it’s not obvious that you’re wearing an AR device. It’s when you’re wearing glasses that look like glasses, when it doesn’t look like you’re spying on someone. Once you have that natural, organic integration, that’s when adoption will start going through the roof.

Ellsworth: I think it might be halfway in between. It might be when the end user gets comfortable with these glasses that look slightly different. I look at Google Glass and it was really interesting. It was pretty stylish for what it did. But it was so early. Our natural reaction is to buck anything so different and new. If you look at Snapchat glasses, I’m pretty excited about how that’s moving customers forward. It doesn’t seem like they’re getting the same negative reaction to that.

Beliaeff: But Snapchat’s glasses help make the point. They actually look like normal glasses. You don’t look like you’re putting on a construct.

Ellsworth: Yeah. The end user is going to move partway, and then technology is going to move the rest of the way over the next five or 10 years. Eventually we’ll all be walking around with AR devices.

Above: Pokemon Go.

Image Credit: GamesBeat

GB: We’ve been talking about the ultimate nirvana of augmented reality, but what level of graphical fidelity is going to be good enough to blend with the real world? It seems like one of the lessons of Pokemon Go is that you don’t really need great graphics to make a very successful AR experience.

Beliaeff: The brand authenticity — you can critique Pokemon Go however you want, but if you watch the cartoons or play the games and see how the universe is presented, and then you play Pokemon Go, it’s so brand authentic. I’m walking down the street and there’s a Pokemon. That’s exactly what happens on the show. It was brilliant at that level. It did a great job of showing that.

From a graphics standpoint, I don’t necessarily know if it’s the fidelity or if it’s the motion-to-photon latency. It’s getting rid of the carsickness when you’re using it. When you use HoloLens, you can get that little bit of a dissociative experience. It’s having those really wide fields of view when you’re using the device. You want to make sure that when you take the device off, there’s no hangover.

Above: Microsoft’s HoloLens in action.

Image Credit: Microsoft

Question: I work for Ashley Furniture. We’re working on an augmented reality application where you can scan a room, take products from our catalog, and put them in that scene. What we’re struggling with is getting the CEO of the company to understand that I can’t have that perfect fidelity, because it has to happen in real time. Have you seen any constraints with regard to the level of visual quality in certain projects? How do you deal with other stakeholders, getting them to understand?

Ellsworth: It’s interesting that we’re at CES, where everything is about specs and the tangible aspects of everything. One of the lessons I learned working in the toy industry — this was actually beat into me by executives — is that when it’s fun, it’s fun. I’d come in with this super complicated piece of electronics — hey, I just made this better! — but it was fun before when it was half the price and easier to manufacture.

It’s going to be an evolution, I think, for AR gaming and AR applications. It has to function seamlessly enough that the end user will accept it. Some of the struggles with AR now on phones and tablets — you have to download an app and go through all this friction to make it happen. Once some of that gets smoothed out and it’s just persistent all the time in your OS or your glasses, then we’ll start to transition into a phase where it’s a specs war. Glasses manufacturers will push the boundaries of optics. Tracking developers will push the boundaries of machine learning.

Beliaeff: AR right now, because it’s not popularized out there yet — it’s so experiential. A lot of the communication issues you have revolve around how you quantify something that’s experiential. Taking a screenshot of AR doesn’t do you a lot of good. An example of something that’s working out right now in VR is the Samsung commercials, the Samsung VR ad that was running every second over Christmas. You get the reactions, the oohs and aahs. It does a good job of communicating that you have to try it. For your CEO, it’s experiential. You have to get them to try it. From a graphics standpoint, you have to start thinking about how you preload and cache stuff. Getting graphics up on a webpage quickly is a challenge.
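
On the preload-and-cache point, a minimal sketch of the idea might look like the following: fetch the catalog’s 3D assets in the background while the shopper is still browsing, so placing them in the scene feels instant. The URLs and in-memory cache are illustrative assumptions, not Ashley Furniture’s or anyone’s actual pipeline.

```python
# Sketch of "preload and cache": fetch catalog 3D assets in the background
# while the shopper is still browsing, so placing them in the AR scene feels
# instant. URLs and the in-memory cache are illustrative assumptions.
import concurrent.futures

import requests

_cache = {}  # url -> asset bytes

def preload(url: str) -> None:
    """Download one asset once and keep the bytes for instant placement."""
    if url not in _cache:
        _cache[url] = requests.get(url, timeout=10).content

def preload_catalog(urls) -> None:
    # Pull several low-poly preview models in parallel; full-resolution
    # versions could be swapped in later the same way.
    with concurrent.futures.ThreadPoolExecutor(max_workers=4) as pool:
        list(pool.map(preload, urls))

def get_asset(url: str) -> bytes:
    preload(url)  # falls back to a blocking fetch on a cache miss
    return _cache[url]
```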

GB: Is it a tradeoff against what you’d call latency, maybe? I already hate how long it takes for things to happen in Pokemon Go sometimes. I’m still playing every day, but I feel like I’m being awfully patient, waiting for the loading times.

Question: In retail, you have to be very quick, or customers are just gone.

Beliaeff: If you’re doing stuff in the store, buy some really kickass hardware to render the crap out of it. If you’re doing stuff in the cloud, send an email later after you’ve rendered it. But that’s not really an AR question anymore. It’s a graphics fidelity challenge.

Question: With this overlay that’s being put on the real world, you’re creating a sort of digital landscape. Is there something in the pipe legally, any white papers in the works, as far as creating digital real estate, so to speak? If I own a store, I don’t want a competitor being able to put up his augmented-reality sign in my space.

Beliaeff: Just as a general statement, nothing to do with AR specifically, technology always moves faster than law. The law reacts when it has a need to react. Until court cases start going — that’s what’s going to do it. For us as content creators, we’re trying to make fun things. It’s a legislative issue. Or it’s a Google problem. [laughs]

Ellsworth: It brings up an interesting point. You mentioned science fiction. What does that teach us about AR? For the sake of coming up with really awesome movie plots, they always make everything go horribly wrong, and these writers are actually predicting some of the stuff we have to think about when we’re designing systems.

Beliaeff: There might be a decent example in Pokemon Go. In the beginning, there were certain places — you’d hear the stories about how there was an awesome Pokestop in a graveyard or someplace like that. People would be wandering around this place at night in droves. That forced them to add the feature where people can lodge complaints and go from there. There are forcing functions that may happen on a per-app basis.

Question: As VR and AR become more accessible and available, as much so as any other video game platforms, do you think we’ll hit a point where people may want to just go back to the simplicity of sitting on the couch and watching a screen? Or are these platforms going to entirely replace that old paradigm?

Beliaeff: I’m never a believer in a binary outcome. You had early PC games. Consoles came along. Mobile and tablet came along. There’s been space for each of them throughout. Games in general went from being a boutique hobby to the highest-grossing form of entertainment there is. There’ll be a certain level. The percentage that a particular facet has, console or PC, might go up or down. But in general, whenever we add a new method for people to be entertained, it ends up broadening the entire market. Not necessarily immediately. But it’s another way for games and entertainment to expand.

Ellsworth: I totally agree. It’s going to broaden the experience. I often think about how my father’s going to interact with AR. He’s not a very technical guy. But some of the experiences we’re working on are so direct and so intimate — holographic content you can reach out and touch with our wand — these are all things that my father understands how to do. Launching applications from our system is as simple as a tablet. If I gave him an Xbox controller he would never feel comfortable with it. He’d probably hold it upside down. It’s going to pull some of these people who don’t play games into playing games, because it’s more direct and more intimate. You can do it with your family and friends.

Above: You play CastAR with a wand-like controller.

Image Credit: CastAR

GB: I’m a gamer. I don’t really have a favorite platform. I care about my favorite games. Maybe eight out of my top 10 favorite games of the year were console games, but the others were mobile games. Each year that makeup changes. Platforms come and go, but I still enjoy something from all of them.

Beliaeff: In 2016 I probably didn’t play a lot of PC, but then Civilization VI came out. Then my PC hours went through the roof. When it comes to phone and tablet, I play a lot because I travel a lot.

Question: Are any of you thinking about more invasive technologies, things that might relate to other senses? Like a horror game that would surprise you with a shock or a gust of wind. Or even something very simple, like a paintball game.

Ellsworth: When we were doing R&D at Valve we came up with some pretty crazy stuff. We were given free rein to explore all of these different ways to do inputs and outputs. One of my favorites was a remote-control human, where we ran a bit of electric current behind her ears. There are research papers on this. You can make it feel like people are tipping left and right, so you can steer them down the hall that way.

There’s a lot of interesting techniques out there. Working with Playground Global, there’s a company that has a backpack called SubPac, which is just amazing. It has all these transducers in it that take audio and impart it onto your body. They’re out on the show floor. There are some pretty cool headphones that are 3D, moving sound around. As you said, some of these things will come and go, while some of them will consolidate down into smaller packages and become what we use to play games or collaborate over distances in AR.

Beliaeff: It’s really a matter of what can become integrated. So long as everything is a separate peripheral — the more peripherals you have to buy to have an experience, the fewer people are going to try it. You get a subset of a subset of a subset. You look at the evolution of PC games. At first you had monochrome graphics. Then you had color. Then sound cards and graphics cards and physics cards. All that came together over time, but the process took 20 years. Building in all that sensory stuff, you’re going to look at a similar arc, assuming there’s a killer app that drives it.

I’ve seen VR apps at GDC where the company’s gone all out. There’s one where you’re in Norse times, driving this wagon along, and they did a whole thing where you’re actually sitting in a wagon and holding a set of reins to control it. They have fans set up for the feeling of wind in your face, water and steam, the whole thing. It’s really cool, but it’s a hell of a lot of work for one game. You have to get to a point where there’s engineering efficiency. It can’t just be a labor of love.

Ellsworth: There are going to be location-based experiences just like that. VR lends itself quite well to that. But in the home, you have to think about what people are going to bring into their homes. Most people aren’t going to move their furniture around to have an experience. That’s why we’re focusing on such a compact experience — flip it over and go. As the technologies evolve, maybe holodeck rooms will be what everyone installs in their house 20 years from now. But it’s hard to imagine that happening anytime soon.

GB: What do you think is a very fun game that already exists, but would be more fun in AR? Or what’s the AR experience that you’d want to ultimately have?

Ellsworth: I love board games, but board games are frustrating for me, especially when it’s a new game where I have to learn all the rules and set up all the pieces. For our platform, I’m excited about making some of those games more digital, allowing them to be persistent. If I’m playing with my friends, we can stop for the night and it remembers the game state. Then we come back and resume with no problem.

I’m also looking forward to playing over large distances. With hand tracking and eye tracking built into the headset, I can be playing with my friends across the world. If I need to move a piece I can point and move it. They’ll see a holographic version of my hand. It’ll be more intimate, more like we’re in the same room together. In the short term, board games are going to be pretty amazing.
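
A rough, hypothetical sketch of what sharing that hand over a distance could look like: sample a few tracked joints, timestamp them, and stream them as small JSON packets over UDP. The joint names, fake coordinates, and 20 Hz rate are assumptions for illustration, not CastAR’s protocol.

```python
# Rough sketch of sharing a tracked hand with a remote player: sample a few
# joints, timestamp them, and stream small JSON packets over UDP. The joint
# names, fake coordinates, and 20 Hz rate are assumptions for illustration.
import json
import socket
import time

def stream_hand(host: str = "127.0.0.1", port: int = 9999) -> None:
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    start = time.time()
    while True:
        # A headset SDK would supply these joint positions; here they are faked.
        packet = {
            "t": time.time() - start,
            "index_tip": [0.10, 0.20, 0.30],
            "thumb_tip": [0.12, 0.18, 0.29],
        }
        sock.sendto(json.dumps(packet).encode(), (host, port))
        time.sleep(0.05)  # ~20 updates per second is plenty for a table game

if __name__ == "__main__":
    stream_hand()
```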

I also want to see an RTS game, a Command & Conquer type of game, where I can direct my troops. That’ll be amazing. I can have all my friends around the table and we’re playing together with the same big map. When we all meet up in the middle to fight, I can look across the table at my friends and see them react when I get a big kill. Those are the types of experiences that are really magical with our system. When we bring people in to playtest, sometimes it’s less about the game and more about the social aspect. Because you’re sitting over there, I know you can’t see parts of my virtual world. I can run and hide my character, see you approach, and jump out to snipe you in a way I never could have done on a flat screen. A lot of interesting twists on existing games become possible with AR.

Beliaeff: In our drone project, we create these 3D landscapes. Part of it is these alien hives, with things hidden behind buildings. People have to walk around to see behind the buildings and pilot over there. That’s part of the fun of mixed reality.

A lot of things happening right now, things that are very experiential, will lend themselves well to augmented reality. Escape rooms, for example. Maybe you’ve done an escape room experience where you have to figure out what’s going on and how you get out. That would be great for AR. You can turn your apartment or your house into the escape room, making it meaningful and real for you. Laser tag, any sort of game like that where you can visualize the gameplay. Racing, where you can spawn AR goals to race around. There are so many logical extensions, where people already know how to play a game, and now it’s that much cooler.

Ellsworth: You can take a physical, tangible object, like a figurine, and put it down on the table. Then the essence of the character jumps out to battle and level up. What’s interesting is that when you start having those types of experiences, this little plastic toy suddenly becomes mine. It has all these markers of the experiences I’ve had. I’m more connected to this physical thing than if it were just a digital object on a screen. The whole toys-to-life phenomenon is going to be taken to another level. Each of your little gadgets or toys will be unique to you, because it’ll be persistent.