WOODINVILLE, Wash. — Inside a big home in the forested suburbs of Seattle, augmented reality glasses aren’t just an illusion. They’re about to become real products, dubbed CastAR, that can deliver games and other visual apps with cool 3D effects.
Inside the house, jammed with pinball machines and prototypes, a team of former Valve employees led by chip engineering wizard Jeri Ellsworth and game programmer Rick Johnson has created the startup Technical Illusions. They’re designing inexpensive augmented reality glasses that they believe will create a new level of excitement for the category.
While game publisher and Steam game distributor Valve chose not to pursue augmented reality, Technical Illusions spun out and then received huge validation for its efforts when it raised $1 million in funding on Kickstarter in November.
The CastAR glasses will be able to project 3D holographic images in front of your eyes so that you can either feel like you’re seeing a virtual layer on top of the real world, or feel like you’re immersed inside a game world. It works with the glasses and a sheet-like reflective material known as “retro-reflective” material. Technical Illusions plans to deliver products to its Kickstarter supporters this year.
We caught up with Ellsworth and Johnson at their (temporary) headquarters on our recent trip to Seattle. Here’s an edited transcript of our interview.
GamesBeat: Tell us how the CastAR augmented reality system is going to work.
Rick Johnson: On the glasses, there are two micro-projectors. The final version will be 1,280 by 720 per eye. They shine out onto the surface, the same concept as watching a movie on a movie screen. The surface is a special material called retro-reflective. It’s designed to bounce almost 100 percent of the light back to the source. If you look at the material under a microscope, it’s made of these tiny microspheres. Light enters each sphere and comes right back.
This allows for several advantages. Your eyes focus naturally, so you don’t get the eye strain that comes with near-eye optics. It also allows multiple people to use the surface simultaneously. If I’m shining onto it and you’re next to my shoulder, you’re not going to see what I’m seeing. You can play a different game, or the same game, or the same game from a different perspective. It’s a collaborative multi-user environment.
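The interview stays at the optics level here, but the rendering implication is that each tracked viewer gets an independent camera into the same shared scene. A rough illustration in Python, using standard look-at math and invented head positions (none of this comes from Technical Illusions’ actual code):

```python
# Illustrative sketch (not from the interview): because the retro-reflective
# surface returns each projector's light to its own wearer, every tracked
# viewer can be rendered an independent view of the same shared scene.
# The look_at math is standard graphics; the user poses are invented.
import numpy as np

def look_at(eye, target, up=(0.0, 1.0, 0.0)):
    """Build a right-handed view matrix from a viewer position and a target."""
    eye = np.asarray(eye, dtype=float)
    forward = np.asarray(target, dtype=float) - eye
    forward /= np.linalg.norm(forward)
    right = np.cross(forward, np.asarray(up, dtype=float))
    right /= np.linalg.norm(right)
    true_up = np.cross(right, forward)
    view = np.eye(4)
    view[:3, :3] = np.vstack([right, true_up, -forward])
    view[:3, 3] = -view[:3, :3] @ eye
    return view

# Each user's tracked head position drives their own view of the shared board.
tracked_users = {"player_1": (0.0, 0.5, 1.2), "player_2": (0.8, 0.6, 0.9)}
board_center = (0.4, 0.0, 0.0)
views = {name: look_at(pos, board_center) for name, pos in tracked_users.items()}
```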
On the glasses is a cell-phone-style camera. It’s tracking infrared LED points in the physical world. The camera has special hardware that Jeri developed that breaks down the image sensor data, so what comes out is just point data. That makes it very efficient to transfer a small amount of data to the PC to run the final world-solver.
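Johnson doesn’t spell out the math behind that world-solver, but the pipeline he describes, known LED positions in the room plus 2D point detections from the head-mounted camera, maps onto a standard perspective-n-point pose estimation. Here is a minimal sketch of that idea in Python, with placeholder LED coordinates and camera intrinsics that are purely illustrative:

```python
# Sketch of a CastAR-style "world-solver": recover the glasses' pose from
# 2D detections of infrared LEDs whose 3D positions are known in advance.
# LED layout, camera intrinsics, and detections are illustrative placeholders,
# not values from Technical Illusions' actual system.
import numpy as np
import cv2

# Known 3D positions of tracking LEDs, in meters.
led_positions_3d = np.array([
    [0.0, 0.0, 0.0],
    [0.5, 0.0, 0.0],
    [0.5, 0.3, 0.0],
    [0.0, 0.3, 0.0],
], dtype=np.float64)

# Corresponding 2D point data reported by the glasses' camera, in pixels.
led_points_2d = np.array([
    [312.0, 240.0],
    [410.0, 238.0],
    [408.0, 180.0],
    [314.0, 182.0],
], dtype=np.float64)

# Approximate pinhole intrinsics for the head-mounted camera.
camera_matrix = np.array([
    [800.0,   0.0, 320.0],
    [  0.0, 800.0, 240.0],
    [  0.0,   0.0,   1.0],
])

# Solve for the camera's rotation and translation relative to the LED frame.
ok, rvec, tvec = cv2.solvePnP(led_positions_3d, led_points_2d, camera_matrix, None)
if ok:
    rotation, _ = cv2.Rodrigues(rvec)          # 3x3 rotation matrix
    glasses_position = -rotation.T @ tvec      # camera center in the LED frame
    print("glasses position (m):", glasses_position.ravel())
```

Because the camera hardware has already reduced each frame to a handful of 2D points, only that tiny array has to cross the tether to the PC, which is what keeps the tracking cheap on bandwidth and power.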
The overall system (the projectors, the tracking) is very low-wattage. You can run it in a mobile environment: just have it tethered to your phone and walk around using it that way. In addition, we have two input devices, what we call the magic wands. That’s a three-dimensional input into the world. It also has a joystick and buttons.
There’s a radio frequency identification (RFID) rig that sits underneath the surface. Anything with an RFID tag can be tracked across that surface with centimeter-level accuracy and uniquely identified. You can put down Magic: The Gathering cards to spawn land or a creature. You can place Dungeons & Dragons figures and augment them with your stats. You could play Warhammer: since we know all the distances, you could have the computer calculate fog of war and firing ranges. You just worry about strategy at that point.
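The interview doesn’t go into how a game would actually consume that RFID data, but the Warhammer example comes down to simple distance checks over the tracked tag positions. A hypothetical sketch in Python, with invented tag IDs, positions, and weapon ranges:

```python
# Hypothetical sketch: given the positions of RFID-tagged miniatures on the
# surface (reported in centimeters by the tracking grid), work out which
# pieces are inside a unit's firing range. Tag IDs, positions, and ranges
# are invented for illustration.
from math import hypot

# tag_id -> (x_cm, y_cm) as reported by the RFID rig under the surface.
tracked_pieces = {
    "space_marine_1": (10.0, 12.0),
    "ork_boy_1": (25.0, 18.0),
    "ork_boy_2": (60.0, 40.0),
}

def targets_in_range(shooter_id, weapon_range_cm, pieces):
    """Return the other tags within the shooter's firing range."""
    sx, sy = pieces[shooter_id]
    return [
        tag for tag, (x, y) in pieces.items()
        if tag != shooter_id and hypot(x - sx, y - sy) <= weapon_range_cm
    ]

print(targets_in_range("space_marine_1", weapon_range_cm=30.0, pieces=tracked_pieces))
# ['ork_boy_1'] -- ork_boy_2 is out of range
```

The same position lookup could in principle feed a fog-of-war pass, hiding any piece that no friendly unit can see from its tracked location.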
The final thing is the AR/VR clip-ons. Those fit on top of the glasses. Through a series of optical expansions, they turn the glasses into a full VR device as well as an AR device, and neither mode requires the surface.
GamesBeat: Can you talk about how you got started, how you began this research? Was it something you pitched to Valve to get it going?
Jeri Ellsworth: The backstory is that I was hired by Valve to help them create their hardware R&D department, around 2011. We were exploring all different types of gaming experiences, input and output devices, and we were very interested in virtual reality and augmented reality. We quickly tested a bunch of AR and virtual reality (VR) rigs. We identified issues with headaches and motion sickness.
I was working on trying to solve the headache issues that people get with near-to-eye displays. I had a workbench set up with projectors and reflectors and lenses and stuff. I accidentally put a reflector in this test rig backwards, and I was trying to look into it. Instead of projecting it into my eye, it was projecting it into the room. We had a piece of this reflective material in the room for a different experiment, and I saw this beautiful image show up in the reflector. I’m like, “Wow, that’s interesting.” I grabbed the material and started looking at it, and I realized, “Wow, this solves a lot of issues with eyestrain. You can still see through the glasses and the material isn’t expensive.” So it was a happy accident that we ran into this property.
GamesBeat: Why did you even think about VR, given that it had been so discredited in the ‘90s? The conventional wisdom was that it was never going to be ready.
Ellsworth: There are points where technology converges and things become more practical. Because of cell phones and the miniaturization of cameras and displays, it started to get more interesting. At that point in the project, our objective was just to see if we could heighten gameplay and make it better. We were exploring everything, both input and output devices. Then Rick came on board to help with the project, because I was doing the hardware, but we didn’t have any game support.