Disclosure: The Siggraph organizers paid part of my expenses to Anaheim. Our coverage remains objective.
ANAHEIM, Calif. — Within sight of Disneyland, the top geeks of computer graphics met this week at the Siggraph conference. I saw the real edge of animation and gaming on display in the show’s Emerging Technologies exhibits at the Anaheim Convention Center.
They ranged from a bathtub filled with a milky white fluid that served as a display for a splashing game to an exhibit that gamified exercise for people in wheelchairs. These exhibits — from powerhouse researchers at Disney Research to student-run efforts — were insane, goofy, and inspiring all at the same time.
The theme of the Emerging Technologies event was assistive technologies for people with disabilities. But there was plenty of visionary gaming technology too. Yasushi Matoba and a team of researchers at the University of Electro-Communications in Japan built the bathtub. The so-called AquaTop Display was a true “immersive” water display, using benign chemicals that created a white liquid canvas. (Here’s a video).
“This is all about promoting cool technology that can do something for people,” said Dylan Moore, organizer of the Emerging Technologies exhibit. “But there’s also plenty of fun stuff.”
As you can see in the pictures, the team used a BenQ projector to shine images on the water from above and a Microsoft Kinect for Windows sensor to detect your hands. You could then catch light objects and shove them back at an enemy that was shooting objects at you. The depth camera detected your fingers on and under the surface of the water. It was purely goofy, but it got the biggest laughs and smiles in the place.
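For a sense of how that detection might work, here is a minimal Python sketch of the idea: treat any depth reading that is meaningfully closer to the camera than the calibrated water surface as a hand or finger. The frame size, distances, and thresholds here are my own illustrative assumptions, not the AquaTop team’s actual values.

```python
import numpy as np

# Hypothetical sketch: find "touch" points where fingers break a known
# water-surface plane in a Kinect-style depth frame (values in millimeters).
WATER_DEPTH_MM = 1200      # assumed distance from camera to the calm surface
TOUCH_BAND_MM = 25         # anything this much closer than the surface counts

def find_touch_mask(depth_frame: np.ndarray) -> np.ndarray:
    """Return a boolean mask of pixels where something pokes above the water."""
    valid = depth_frame > 0                      # Kinect reports 0 for no reading
    above_surface = depth_frame < (WATER_DEPTH_MM - TOUCH_BAND_MM)
    return valid & above_surface

# Example with a fake 480x640 frame: a flat surface with one "finger" region.
frame = np.full((480, 640), WATER_DEPTH_MM, dtype=np.uint16)
frame[200:220, 300:320] = 1100                   # finger 100 mm above the water
mask = find_touch_mask(frame)
print(mask.sum(), "touch pixels detected")
```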
Nearby was the IllumiRoom [left], created by Brett Jones from the University of Illinois and a team at Microsoft Research. It uses a Kinect for Windows sensor and a projector to blur the line between the onscreen game and the walls of your room. The projector takes the parts of a game world that you can’t see, such as the rest of the scene beyond the edges of your TV, and projects them onto your room’s walls and furniture.
The effect is a deeper sense of immersion, said Jones in an interview with GamesBeat. You can introduce special effects that make the furniture seem like part of the game world. For instance, if it’s snowing in the game, the projector can make snowflakes drop, and those snowflakes will stop falling once they hit a piece of furniture. It contributes to the illusion that you are inside a virtual world. You can change the appearance of the projected images on the wall to match the mood of the game. And you can use the projections to gain an advantage in a game. After all, if you see a grenade bouncing on your coffee table in your peripheral vision, you can try to dodge it more quickly. [See a video here.] This project won the top prize at the show.
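To make the furniture-aware snowflakes concrete, here is a toy Python sketch of the idea as I understand it: once the Kinect has captured the room’s geometry, each projected flake falls through projector image space until it reaches the first surface in its column. The shapes and numbers are invented for illustration; this is not the IllumiRoom code.

```python
import numpy as np

# Toy sketch (not the actual IllumiRoom implementation): snowflakes fall in
# projector image space and settle where the captured room geometry says a
# surface begins. Assumption: surface_row[x] holds the first row in column x
# occupied by furniture, precomputed from the Kinect's depth map.

H, W = 480, 640
rng = np.random.default_rng(0)
surface_row = np.full(W, H - 1)          # default: flakes fall to the floor
surface_row[250:400] = 300               # a "coffee table" spanning these columns

flakes = rng.integers(0, W, size=100)    # x position of each flake
rows = np.zeros(100, dtype=int)          # y position, starting at the top

for _ in range(H):                        # one simulation step per frame
    settled = rows >= surface_row[flakes]
    rows[~settled] += 1                   # unsettled flakes keep falling

print("flakes resting on the table:", np.sum(rows == 300))
```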
Meanwhile, Eric Brockmeyer of Disney Research in Pittsburgh showed Papillon [right], a way to create interactive eyes for toy characters. Brockmeyer’s team used a 3D printer to create the cute toys and then employed a PrimeSense motion-sensing camera to detect gestures. The eyes are printed with a kind of polymer that pipes light through in a way that creates the appearance of moving eyeballs. If you wave your hand in front of the toy, its eyes can sense your movement and follow your hand.
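As a rough illustration of that hand-following behavior, here is a hypothetical Python sketch: map the hand position reported by the depth camera to a normalized pupil offset. The frame dimensions and the mapping are my own assumptions, not Disney’s implementation.

```python
# Illustrative sketch (my assumption of the interaction, not Disney's code):
# turn a tracked hand position from a depth camera into a pupil offset so the
# printed eyes appear to follow your hand.

def pupil_offset(hand_x, hand_y, frame_w=640, frame_h=480, max_offset=1.0):
    """Map the hand's image position to a normalized pupil offset in [-1, 1]."""
    nx = (hand_x / frame_w) * 2.0 - 1.0   # -1 = far left, +1 = far right
    ny = (hand_y / frame_h) * 2.0 - 1.0
    clamp = lambda v: max(-max_offset, min(max_offset, v))
    return clamp(nx), clamp(ny)

# Hand near the left edge of the camera frame: the eyes look left.
print(pupil_offset(40, 240))   # -> (-0.875, 0.0)
```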
Disney is about to debut its Disney Infinity toy-game combo products in August. Those toys sit on a platform that detects the type of toy and reads the stored game data associated with it. So you can put a toy on the base and then watch the character appear in the video game. Here, with Papillon, those toys can come to life not only on your TV but in the 3D-printed object itself. Brockmeyer smiled and said to be sure to tell the Disney Infinity team about his research.
“That’s one of the applications we see for this,” he said. [See video here.]
Another favorite was the Aireal, another Disney Research project that created tactile force-feedback for games using air. The Aireal shoots jets of compressed air in an interactive way, so that you feel sensations of air blowing on your skin when you fire a gun or push a button. The air-haptic technology could complement other kinds of sensations to make a simulation seem more real to you.
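Here is a back-of-the-envelope Python sketch of the aiming problem such a system has to solve: pointing a pan-tilt nozzle at a tracked hand in 3D so the puff of air lands on your skin. The geometry and coordinate conventions are assumptions for illustration, not Disney’s control code.

```python
import math

# Assumed geometry (not Disney's code): aim a pan/tilt air nozzle at a tracked
# hand position given in meters, with z pointing straight out of the nozzle.

def aim_nozzle(hand, nozzle=(0.0, 0.0, 0.0)):
    """Return (pan, tilt) in degrees pointing the nozzle at a 3D hand position."""
    dx, dy, dz = (h - n for h, n in zip(hand, nozzle))
    pan = math.degrees(math.atan2(dx, dz))                    # left/right
    tilt = math.degrees(math.atan2(dy, math.hypot(dx, dz)))   # up/down
    return pan, tilt

# Hand half a meter ahead, slightly up and to the right of the nozzle.
print(aim_nozzle((0.1, 0.05, 0.5)))   # roughly (11.3, 5.6) degrees
```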
The Aireal could go well with the HapSeat, another kind of motion simulator that could make you feel sensations in your hand as you controlled a haptic joystick. You could also feel vibrations from actuators in a chair. The seat and its accompanying screen could give you a visceral feeling of what it is like to ride a roller coaster or a horse. Other gaming applications included the EMY: Full Body Exoskeleton (like a mech, except designed to help people with weak muscles) and the Foveated 3D Display from Microsoft Research, which conserves 3D processing power by spending less rendering effort on the parts of the screen in your peripheral vision, where your eyes can’t resolve fine detail.
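The core trick of the foveated display is easy to sketch in Python: render at full detail only near the gaze point and drop detail with angular distance from it. The eccentricity thresholds and level counts below are illustrative guesses, not the values from Microsoft’s system.

```python
import math

# Hedged sketch of the idea behind foveated rendering: spend full detail only
# near the gaze point and progressively less toward the periphery. The
# thresholds here are illustrative, not Microsoft's published values.

def lod_for_pixel(px, py, gaze_x, gaze_y, pixels_per_degree=30.0):
    """Pick a level of detail (0 = full res) from angular distance to the gaze."""
    ecc_deg = math.hypot(px - gaze_x, py - gaze_y) / pixels_per_degree
    if ecc_deg < 5.0:       # foveal region: full resolution
        return 0
    elif ecc_deg < 15.0:    # near periphery: half resolution
        return 1
    else:                   # far periphery: quarter resolution
        return 2

print(lod_for_pixel(960, 540, gaze_x=960, gaze_y=540))  # looking right at it -> 0
print(lod_for_pixel(100, 100, gaze_x=960, gaze_y=540))  # far periphery -> 2
```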
And in a demo called Skyfarer, a team gamified shoulder exercise with a mixed-reality game. Marientina Gotsis of the University of Southern California said that the system could give visual inspiration to people suffering from shoulder injuries by making them feel like they’re a bird in flight, flapping their wings, rather than sitting in a wheelchair in a hospital. [See video here.]
Some of these ideas are admirable. Some are kooky. But they put a smile on my face and made me believe that gaming still has a lot of frontiers beyond the next generation of game consoles.