Amazon’s Fire Phone is watching you. It looks you right in the eyes, and it watches your every movement. It’s cool, though, because it uses this to make things look all 3D and futuristic.
Earlier this week, Amazon finally lit the spark on its Fire Phone and revealed it to the world. It’s a $200 touchscreen device (on contract with AT&T only) that looks a lot like an iPhone and runs Amazon’s version of the Android operating system. One of its defining features is something called “Dynamic Perspective,” which produces a 3D-like image without glasses. We’ve seen this idea before with Nintendo’s 3DS handheld game console, but the execution here is different. That doesn’t mean developers will struggle to implement it, though.
[aditude-amp id="flyingcarpet" targeting='{"env":"staging","page_type":"article","post_id":1494922,"post_type":"story","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"business,games,mobile,","session":"D"}']In his onstage demonstration of the Fire Phone, Amazon chief executive officer Jeff Bezos talked about his delight over a game called To-Fu Fury. It’s a mobile platformer where you need to guide a piece of tofu through a level. What sets the app apart is that To-Fu Fury is already using the hologram-like capabilities of the Fire Phone. This enabled Bezos to move his head around to get a different perspective on the level. For him — and anyone using it — this would also create a sensation that the To-Fu Fury levels had real depth.
Check out this quick clip to get an idea of what I mean:
So, how does this work? Well, the Fire Phone is basically mimicking how real depth perception works. If you’re reading this on a phone, tablet, or computer, you can lean left or right to see more of the room around the edges of your device. You might see a vase or power outlet or a person sleeping on the subway that was previously hidden by the screen. Lean back to your original position (quick, before the sleeping subway rider sees you staring), and those objects are hidden again.
This is gonna seem obvious, but you can see around your device because your eyes are moving in relation to it and the world around you. That’s how vision works. It isn’t how static images and displays typically work, though. When you looked around your smartphone, your perspective on the world skewed and morphed, but the screen itself remained unchanged. That’s because photographs and displays have no idea where your eyes are, so they will always produce the same static perspective.
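To put a rough number on that intuition, here’s a quick back-of-the-envelope sketch (just basic geometry with made-up distances, nothing from Amazon’s SDK): when your eye slides sideways, stuff sitting right at the “window” barely shifts, while stuff far behind it shifts almost as much as your eye did.

```java
// Rough motion-parallax sketch: how far an object behind a "window" appears
// to shift on that window when the viewer's eye moves sideways.
// Pure geometry with hypothetical numbers; not Amazon's Dynamic Perspective code.
public class ParallaxSketch {

    // eyeShift: how far the eye moved sideways (cm)
    // eyeToWindow: distance from the eye to the window/screen (cm)
    // objectDepth: how far the object sits behind the window (cm)
    // Returns how far the object's image shifts across the window (cm).
    static double apparentShift(double eyeShift, double eyeToWindow, double objectDepth) {
        return eyeShift * objectDepth / (eyeToWindow + objectDepth);
    }

    public static void main(String[] args) {
        double eyeShift = 10.0;     // you lean 10 cm to one side
        double eyeToWindow = 40.0;  // phone held 40 cm from your face

        // An object right at the screen plane doesn't appear to move at all...
        System.out.println(apparentShift(eyeShift, eyeToWindow, 0.0));    // 0.0
        // ...a nearby object shifts a little...
        System.out.println(apparentShift(eyeShift, eyeToWindow, 20.0));   // ~3.3
        // ...and a distant one shifts almost as much as your eye did.
        System.out.println(apparentShift(eyeShift, eyeToWindow, 400.0));  // ~9.1
    }
}
```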
Same with TV. No matter what angle you look at it from, you’re never gonna see whether that’s really just coffee in Dave Letterman’s mug. Games are a little different. Developers often give us control over the in-world camera in 3D games, but even that doesn’t take your head position into account.
The Fire Phone can create a 3D-like hologram effect because it actually does know where your head is. It is tracking the position of your dome in space 60 times every second with its four infrared front-facing cameras. This gives To-Fu Fury developer HotGen the information it needs to connect the in-world camera controls to your head’s position. When you move the phone around or tilt your head, To-Fu Fury instantly redraws its polygonal world to reflect your change in perspective. This creates an effect that mirrors what happened when you looked around the sides of your device earlier. It’s basically like looking through a window.
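In game terms, the trick boils down to feeding that head position into the camera every frame. Here’s a hedged sketch of what that loop could look like; the HeadTracker interface and the constants are hypothetical stand-ins for illustration, not Amazon’s actual Dynamic Perspective API.

```java
// A sketch of a head-coupled camera, the general technique behind effects
// like Dynamic Perspective. Names and numbers here are hypothetical.
public class HeadCoupledCamera {

    /** Stand-in for whatever the phone's head-tracking cameras report. */
    interface HeadTracker {
        float headX(); // head offset from screen center, roughly -1..1
        float headY();
    }

    // How far (in world units) the in-game camera slides for a full head movement.
    static final float PARALLAX_STRENGTH = 0.5f;

    float cameraOffsetX, cameraOffsetY;

    // Called once per frame. The Fire Phone samples head position 60 times a
    // second, so the camera stays in lockstep with the tracker.
    void update(HeadTracker tracker) {
        // Treat the screen like a window: the virtual camera follows the
        // player's head, so leaning to one side lets you peek around
        // foreground objects while the scenery behind them slides the other way.
        cameraOffsetX = tracker.headX() * PARALLAX_STRENGTH;
        cameraOffsetY = tracker.headY() * PARALLAX_STRENGTH;

        // A full implementation would also skew the projection (an off-axis,
        // or sheared, frustum) so the screen plane stays put, then redraw the
        // whole polygonal world from the new viewpoint. That per-frame redraw
        // is what sells the depth effect.
    }
}
```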
“It’s all pretty simple, actually,” Hibernum Creations producer Louis-Rene Auclair told GamesBeat. “For the user, whether you move the phone or your head, the cameras will figure out your positioning and actually make the screen move in the opposite direction. It’s just like if you’re trying to peek behind something.”
[aditude-amp id="medium1" targeting='{"env":"staging","page_type":"article","post_id":1494922,"post_type":"story","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"business,games,mobile,","session":"D"}']
Auclair is developing a new game specifically for this technology called Saber’s Edge, which you can see for a moment in the video above. He says that Dynamic Perspective gives the game a real sense of depth. We talked more with Auclair and other developers about this technology, and you can read all about it.
If you’re reading this and thinking that the technology can’t possibly work, keep in mind that it’s not a recent invention. In fact, homebrew developers have already enabled something very similar on other mobile devices with front-facing cameras, including the iPad, using an app called i3D:
Of course, i3D doesn’t work all that well because head-tracking is difficult to pull off with a single camera that struggles in the dark. We’ll see whether four cameras capable of seeing in infrared can do any better when we get some extensive hands-on time with the Fire Phone.
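For the curious, a single-camera approach like i3D’s generally has to infer where your head is from a detected face in the video feed, roughly along these lines (hypothetical names and constants, just to show why one lens and dim light make it shaky):

```java
// Rough sketch of how a single-camera app might estimate head position from a
// detected face box in the video frame. Hypothetical names and constants;
// this is illustration only, not i3D's or Amazon's actual code.
public class SingleCameraHeadEstimate {

    /** A detected face rectangle in the camera image, in pixels. */
    static class FaceBox {
        float centerX, centerY; // center of the face in the frame
        float width;            // apparent face width in pixels
    }

    // Face width (in pixels) when the user sits at a "reference" distance.
    // In practice this needs calibration per camera and per user.
    static final float REFERENCE_FACE_WIDTH = 160f;

    // Returns {x, y, z}: sideways/vertical offset in roughly -1..1 plus a
    // crude distance factor (1.0 at the reference distance).
    static float[] estimateHead(FaceBox face, int frameWidth, int frameHeight) {
        // Sideways/vertical position comes from where the face sits in the frame.
        float x = (face.centerX - frameWidth / 2f) / (frameWidth / 2f);
        float y = (face.centerY - frameHeight / 2f) / (frameHeight / 2f);

        // Distance is guessed from apparent size: a face twice as wide in the
        // frame is roughly half as far away. This is the fragile part; if the
        // face detector loses you in dim light, the whole estimate falls apart.
        float z = REFERENCE_FACE_WIDTH / face.width;

        return new float[] { x, y, z };
    }
}
```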
[aditude-amp id="medium2" targeting='{"env":"staging","page_type":"article","post_id":1494922,"post_type":"story","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"business,games,mobile,","session":"D"}']