Lenovo and Google announced a partnership today to create a new smartphone that makes use of a 3D augmented reality technology known as Project Tango.
Johnny Lee, a member of the Tango team at Google, said that the sense of space and motion provided by Tango will now be built into a mobile phone.
“We can use the room around us to play games, and hide behind the furniture,” Lee said. “Project Tango creates a magical window.”
To show what he meant, Lee did a demo in which he used Tango’s sensors to measure the exact height of a wall inside a room at the Aquaknox restaurant in the Venetian Hotel in Las Vegas, where the press event was held. The event was one of many at the 2016 International CES, the big tech trade show in Las Vegas this week.
Jeff Meredith, vice president at Lenovo, said at the event, “We locked arms with Google to bring out a consumer device based on Tango.”
Meredith said the goal was to create a mainstream device.
Qualcomm is supplying a Snapdragon processor to be the brain of the new smartphone. Both Google and Qualcomm have collaborated with Lenovo on the device over the past year, Meredith said. The final design isn’t set just yet, nor is the timing for the phone’s launch, beyond the “summer of 2016.” The price point will be under $500.
“We are extremely proud of where we are at this stage of the effort,” Meredith said. “We don’t want this to be a niche technology.”
Meredith added that he hoped the technology would have a long life.
He said the smartphone will have a screen that’s under 6.5 inches, diagonally. More than 5,000 developers are already engaged in making apps, Meredith said. Lowe’s is working on an augmented reality ecommerce app based on Project Tango. In that app, the user can figure out if a refrigerator would fit in his or her kitchen. Google is seeking more apps through a developer search initiative with a deadline of Feb. 15.
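The core of a fit-check app like Lowe’s is straightforward: the device’s depth sensors supply measurements of a real space, and the app compares them against a product’s listed dimensions. Here is a minimal sketch in Python; the dimensions, function name, and clearance value are hypothetical illustrations, not part of any Tango API.

```python
# Hypothetical sketch: checking whether an appliance fits in a measured alcove.
# In a real Tango app, the space dimensions would come from depth sensing;
# here they are hard-coded for illustration.

CLEARANCE_M = 0.02  # leave 2 cm of clearance on each axis

def fits(space_dims, product_dims, clearance=CLEARANCE_M):
    """Return True if the product fits in the space on every axis.

    space_dims, product_dims: (width, depth, height) tuples in meters.
    """
    return all(p + clearance <= s for s, p in zip(space_dims, product_dims))

# A kitchen alcove as measured by the device vs. a refrigerator's spec sheet.
alcove = (0.92, 0.75, 1.85)  # meters, hypothetical sensor readings
fridge = (0.90, 0.70, 1.77)  # meters, hypothetical manufacturer specs

print(fits(alcove, fridge))  # → True: the fridge fits with clearance
```

The real app would layer a rendered 3D model of the appliance into the camera view as well, but the pass/fail decision reduces to an axis-by-axis comparison like this one.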
Larry Yang, a former member of Microsoft’s Xbox 360 hardware team, gave me a demo of Project Tango recently over at Google. Yang is the lead project manager for Project Tango, which equips a mobile device with 3D-sensing capabilities so that you can create games and other apps that are aware of their surroundings. Yang showed me a game that uses Tango, and it was like stepping into a future where games, 3D-sensing, motion-sensing, cameras, and animated overlays all combine into a very cool augmented reality experience.
Project Tango is yet another way to create the feeling of AR, as you insert animated objects into a real 3D space that you can view via special glasses or a tablet’s screen. In our demo, we used a special tablet, but you could also use something like the upcoming AR glasses. Augmented reality is expected to become a $120 billion market by 2020, according to tech advisor Digi-Capital. But first, companies such as Google have to build the platforms that make it possible. Google is demoing the technology this week at CES.
With Tango’s technology, the mobile device can use the sensors to detect the physical space around it, and then it can insert the animations into that 3D space. So you can hunt through your own home for those killer robots and shoot them with your smart device.
Tango taps technologies such as computer vision, image processing, and special vision sensors. When I arrived at Google’s headquarters in Mountain View, California, Yang greeted me with a tablet running Tango. I asked to use the restroom. He ran a query for it, and the tablet screen showed the scene in front of him with an animated green line overlaid, leading the way through the building. Holding the tablet out in front of him, Yang followed the green line through the corridors to the restroom. The tablet knew exactly where he was and which direction he was facing, thanks to the motion-tracking sensors.
And once Yang mapped out the Google building’s interior, Tango remembered the layout. Project Tango devices can use visual cues to help recognize the world around them. They can self-correct errors in motion tracking and become reoriented in areas they’ve seen before.
Check out the video below of Yang demonstrating Project Tango. Here’s a link to a bunch of the other apps.