
Google shares how its self-driving car uses sensors to navigate cities (video)

Image Credit: Wikipedia

It’s been a while since we last heard about one of Google’s most intriguing “moonshots”: the driverless car.

In a video teaser, Google offers a new look at how its self-driving cars perceive a busy city street. The company details that it has updated its software “so it can detect hundreds of distinct objects simultaneously — pedestrians, buses, a stop sign held up by a crossing guard, or a cyclist making gestures that indicate a possible turn.”


In a blog post on the matter, Google shares that its cars have “logged nearly 700,000 autonomous miles” in total. “We still have more work to do,” Google says, “but it’s fun to see how many situations we can handle smoothly and naturally.” Google’s driverless car project traces its roots to Stanford’s winning entry in the 2005 DARPA Grand Challenge.


Meanwhile, Nissan is apparently testing a “self-cleaning” car.
