
Google Translate now provides instant visual translations in 27 languages on iOS and Android

Google Translate translating visual text instantly.

Image Credit: Google

Google today announced that within the next few days its Google Translate app for iOS and Android will be able to give users immediate visual translations of text in 27 languages.

Instant visual translation is now available for Bulgarian, Catalan, Croatian, Czech, Danish, Dutch, English, Filipino, Finnish, French, German, Hindi, Hungarian, Indonesian, Italian, Lithuanian, Norwegian, Polish, Portuguese, Romanian, Russian, Slovak, Spanish, Swedish, Thai, Turkish, and Ukrainian, according to a blog post today from Google Translate product lead Barak Turovsky. Meanwhile, the app will soon be able to provide real-time translation of speech in 32 languages, Turovsky wrote.


The news comes six months after Google first introduced instant visual translation in Translate, and a little over a year after Google’s acquisition of Quest Visual, the startup behind the Word Lens visual translation app.

People using the app can try out support for the new languages by downloading a language pack for each one. From there, the feature works even when the mobile device it’s running on has no Internet connection.


Google Translate can do this by relying on an increasingly trendy type of artificial intelligence called deep learning.

Google has trained its artificial neural network — a key technology for deep learning — on images showing letters, as well as on fake images marred by imperfections to simulate real-life scenes. From there, the Google Translate app uses the letters the network recognizes to make an inference, or educated guess, about the words the mobile device’s camera is pointed at.
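For readers who want a more concrete picture, here is a rough sketch of what that kind of setup can look like. Google has not published its exact pipeline, so the data generator, model architecture, and hyperparameters below are hypothetical stand-ins: the code renders crude placeholder "letter" images, mars them with noise and misalignment, and fits a deliberately small convolutional network to classify them.

```python
# Hypothetical sketch of training a small character classifier on synthetic,
# deliberately imperfect letter images. Everything here (data generator, model
# shape, hyperparameters) is an illustrative assumption, not Google's pipeline.
import numpy as np
import tensorflow as tf

NUM_CLASSES = 26   # e.g. Latin letters A-Z; a real system covers far more glyphs
IMG_SIZE = 32

def make_dirty_letter(rng: np.random.Generator) -> tuple[np.ndarray, int]:
    """Render a crude stand-in 'letter' and mar it with noise and misalignment
    to mimic imperfect real-world photos."""
    label = int(rng.integers(NUM_CLASSES))
    img = np.zeros((IMG_SIZE, IMG_SIZE), dtype=np.float32)
    # Stand-in for rendering a glyph: a bright block whose position depends on
    # the label (a real pipeline would rasterize actual fonts).
    x, y = 4 + label % 5, 4 + label // 5
    img[y:y + 12, x:x + 8] = 1.0
    img += rng.normal(0.0, 0.2, img.shape)            # sensor-style noise
    img = np.roll(img, rng.integers(-3, 4), axis=1)   # slight misalignment
    return np.clip(img, 0.0, 1.0), label

def make_batch(n: int, seed: int = 0):
    rng = np.random.default_rng(seed)
    pairs = [make_dirty_letter(rng) for _ in range(n)]
    xs = np.stack([p[0] for p in pairs])[..., None]   # add channel dimension
    ys = np.array([p[1] for p in pairs])
    return xs, ys

# Deliberately tiny convolutional network, small enough to plausibly run on-device.
model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(8, 3, activation="relu", input_shape=(IMG_SIZE, IMG_SIZE, 1)),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(16, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

x_train, y_train = make_batch(2048)
model.fit(x_train, y_train, epochs=3, batch_size=64)
```

The point of the synthetic "dirt" is that the network learns to recognize letters despite glare, smudges, and odd angles, which is closer to what a phone camera actually sees than clean scans of fonts.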

Historically, this sort of complex processing would happen in a remote data center, scaled out onto several servers. But Google built a very small neural network and a carefully curated training data set. That way, the computing can happen on a mobile phone with limited processing power and little if any connection to the Internet. And that’s significant.
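Google has not detailed exactly how it squeezed the network down, but the underlying trade-off is easy to illustrate. The snippet below shows one generic technique for shrinking a trained model, storing float32 weights as 8-bit integers plus a scale factor, which cuts memory roughly fourfold at the cost of a small reconstruction error. It is an illustration of the idea, not a description of Google's method.

```python
# Illustrative only: shrink a weight matrix by quantizing float32 values to
# int8 with a single per-tensor scale. A generic technique, not Google's.
import numpy as np

def quantize_int8(weights: np.ndarray) -> tuple[np.ndarray, float]:
    """Map float32 weights onto int8 using one linear scale."""
    max_abs = float(np.max(np.abs(weights)))
    scale = max_abs / 127.0 if max_abs > 0 else 1.0
    q = np.round(weights / scale).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    return q.astype(np.float32) * scale

w = np.random.randn(256, 128).astype(np.float32)   # a stand-in weight matrix
q, scale = quantize_int8(w)
print(f"size: {w.nbytes} bytes -> {q.nbytes} bytes")   # roughly 4x smaller
print(f"max reconstruction error: {np.max(np.abs(w - dequantize(q, scale))):.4f}")
```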

Google Research has more about the work in a new blog post.
