TOULOUSE, France — Before talking about the future of search, one of Google’s top researchers wants you to understand just how dramatically search has changed in the past two years.
Speaking at the Futurapolis conference in Toulouse, Behshad Behzadi, director of search innovation at Google’s Zurich lab, pointed out that the majority of searches now happen on mobile devices.
[aditude-amp id="flyingcarpet" targeting='{"env":"staging","page_type":"article","post_id":1845684,"post_type":"story","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"business,","session":"C"}']And with Google’s cloud auto-tagging photos, searching images has become more effective. In addition, Google’s search will now even look into other apps on your smartphone for answers, he said, and open those apps that have the best info.
However, all of this is moving toward a larger goal.
“The future of search is to try to build the ultimate personal assistant,” he said.
To that end, there are four aspects of search that, according to Behzadi, will continue to be dramatically changed and reinvented in the coming years:
Voice: Google’s natural language processing has taken major leaps forward. Just two years ago, Google was seeing a one-in-four error rate on spoken-word queries; now that’s down to one in sixteen, Behzadi said. That, in turn, is driving voice searches that can sound as natural as most conversations with other people. It’s not quite “Her” quality, but Behzadi said the kind of natural back-and-forth between human and computer seen in that movie is not as far away as we might think.
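To see why that jump matters for conversational search, consider a rough, illustrative calculation. The assumptions here are mine, not Behzadi’s: treat the figures as per-word error rates and treat word errors as independent, so the odds of getting an entire query right compound with every word.

```python
# Back-of-the-envelope math on the error-rate improvement. Assumptions
# (illustrative, not from Behzadi): the figures are per-word error rates,
# and word errors are independent, so whole-query accuracy compounds.

def query_accuracy(word_error_rate: float, words: int) -> float:
    """Probability that every word in a query is recognized correctly."""
    return (1 - word_error_rate) ** words

for wer in (1 / 4, 1 / 16):
    print(f"error rate {wer:.0%}: a 5-word query comes out fully correct "
          f"{query_accuracy(wer, 5):.0%} of the time")
# error rate 25%: a 5-word query comes out fully correct 24% of the time
# error rate 6%: a 5-word query comes out fully correct 72% of the time
```

Under those assumptions, the improvement takes a five-word spoken query from mostly wrong to mostly right, which is the difference between a parlor trick and something you’d actually talk to.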
Context: Increasingly, Google’s search engine links your searches together to understand what you’re trying to find. Search for “castle” on its own, for instance, and you could get hits for castles from all over the world. But if you search first for “London” and then for “castle,” the engine remembers that you’re looking at London and automatically narrows the search for you (a toy sketch of this carry-over follows below).
Also, on Android phones, if you’re looking at a Facebook post, for instance, and hold down the home button while making a voice query, Google will scan the contents of that app (or other apps) and find relevant information without your having to copy and paste things between apps to do a search.
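To make the London-then-castle example concrete, here is a toy Python sketch of query carry-over. Everything in it (the SessionContext class, the naive prepend-the-last-query strategy) is a hypothetical illustration of the idea, not Google’s actual implementation; a real system would also have to decide when context applies, not just how.

```python
# A toy version of the session context Behzadi describes. All names and
# the naive "prepend the previous query" strategy are hypothetical.

class SessionContext:
    """Remembers the last few queries in a search session."""

    def __init__(self, max_history: int = 5) -> None:
        self.history: list[str] = []
        self.max_history = max_history

    def add(self, query: str) -> None:
        self.history.append(query)
        self.history = self.history[-self.max_history:]

def rewrite_query(query: str, context: SessionContext) -> str:
    """Narrow an ambiguous follow-up using the most recent prior query."""
    if context.history:
        return f"{context.history[-1]} {query}"
    return query

session = SessionContext()
session.add("London")
print(rewrite_query("castle", session))  # -> "London castle"
```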
Location: You can argue that this is also a type of context, but location-based searches are especially suited to mobile. If you’re out on a hike, you can ask Google, “What’s that lake?” or “What’s that store?” and it will answer based solely on knowing where you are at that moment. Behzadi said this location awareness is growing more powerful and will become more proactive about alerting you to nearby things that might interest you.
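Conceptually, a query like “What’s that lake?” is a nearest-neighbor lookup over a catalog of named places, filtered by the entity type the query asks about. Here is a minimal sketch of that idea; the place data, coordinates, and function names are made up for illustration, and nothing here is Google’s actual system.

```python
import math

# Hypothetical mini-catalog of named places near Toulouse; a real system
# would query a geospatial index. Coordinates are approximate.
PLACES = [
    {"name": "Lac de la Ramée", "type": "lake", "lat": 43.575, "lon": 1.335},
    {"name": "Basilique Saint-Sernin", "type": "church", "lat": 43.608, "lon": 1.442},
]

def haversine_km(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance between two points, in kilometers."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2
    return 2 * 6371.0 * math.asin(math.sqrt(a))

def whats_that(place_type: str, lat: float, lon: float) -> dict | None:
    """Answer "What's that lake?" with the nearest catalog entry of that type."""
    candidates = [p for p in PLACES if p["type"] == place_type]
    if not candidates:
        return None
    return min(candidates, key=lambda p: haversine_km(lat, lon, p["lat"], p["lon"]))

# A hiker near Toulouse asks, "What's that lake?"
print(whats_that("lake", 43.57, 1.34)["name"])  # -> Lac de la Ramée
```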
[aditude-amp id="medium1" targeting='{"env":"staging","page_type":"article","post_id":1845684,"post_type":"story","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"business,","session":"C"}']
Personal information: This is potentially the most transformative aspect of search, and also potentially the most controversial, especially in Europe, where privacy is a hot-button issue. As Google learns more about you, it provides more and more reminders and suggestions. If you use Gmail and Google Calendar, you’ve seen this feature develop gradually, as more of your information triggers alerts. Google has been tailoring search results to users for years, but as it collects more data, expect those results to become even more specialized, Behzadi said.
Finally, here’s a short clip of Behzadi being interviewed by Le Point, the news organization that hosted Futurapolis: