Google has begun building its own custom application-specific integrated circuit (ASIC) chips, called tensor processing units (TPUs), Google chief executive Sundar Pichai said today at the Google I/O developer conference in Mountain View, California. The name is inspired by Google's TensorFlow open source deep learning framework. But the technology is one of a kind, something that makes sense only at Google scale.
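For context, here is a minimal sketch (a hypothetical example, not code from Google's announcement) of the kind of dense tensor computation a TensorFlow program describes, which is the workload a TPU is built to accelerate. It uses the TensorFlow 1.x graph-and-session API that was current at the time; the layer sizes and variable names are arbitrary.

```python
# Hypothetical illustration: a single fully connected layer, y = relu(xW + b),
# written with the TensorFlow 1.x graph-and-session API. The heavy operation
# here is the matrix multiply -- the sort of dense tensor math TPUs accelerate.
import numpy as np
import tensorflow as tf

x = tf.placeholder(tf.float32, shape=[None, 784], name="inputs")
W = tf.Variable(tf.random_normal([784, 128], stddev=0.1), name="weights")
b = tf.Variable(tf.zeros([128]), name="bias")
y = tf.nn.relu(tf.matmul(x, W) + b)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    batch = np.random.rand(32, 784).astype(np.float32)  # dummy input batch
    activations = sess.run(y, feed_dict={x: batch})
    print(activations.shape)  # (32, 128)
```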

These TPUs were used in the AlphaGo artificial intelligence (AI) system that beat top-ranked Go player Lee Sedol, Pichai said. They also work inside Google search and Google Street View. Now it sounds like they will become available for other companies to use, too.

“When you use the Google Cloud Platform, you can take advantage of TPUs as well,” Pichai said.

Above: How TPUs compare with other chips. (Image Credit: Ken Yeung/VentureBeat)

Specialty hardware, which in some ways takes a cue from the holographic processing unit (HPU) inside Microsoft's HoloLens augmented reality headset, will not be the only thing that makes Google's public cloud stand out from market leader Amazon Web Services (AWS).

Over time, Google will also expose more machine learning APIs, Pichai said. The company has already launched the Cloud Machine Learning Platform service and the Vision API.
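As a rough illustration of what those APIs look like to a developer, here is a minimal sketch of a label-detection request to the Cloud Vision API over REST. The endpoint and request shape follow the public v1 API; the API key and image file below are placeholders.

```python
# Minimal sketch of a Cloud Vision API label-detection call over REST.
# API_KEY and photo.jpg are placeholders; substitute real values to run.
import base64
import json

import requests  # third-party HTTP client

API_KEY = "YOUR_API_KEY"  # placeholder
ENDPOINT = "https://vision.googleapis.com/v1/images:annotate?key=" + API_KEY

# Images are sent inline as base64-encoded bytes.
with open("photo.jpg", "rb") as f:  # placeholder image file
    image_content = base64.b64encode(f.read()).decode("utf-8")

request_body = {
    "requests": [{
        "image": {"content": image_content},
        "features": [{"type": "LABEL_DETECTION", "maxResults": 5}],
    }]
}

response = requests.post(
    ENDPOINT,
    data=json.dumps(request_body),
    headers={"Content-Type": "application/json"},
)
print(response.json())  # labels and confidence scores for the image
```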

“Our goal is to lead the industry on machine learning and make that innovation available to our customers,” Google distinguished hardware engineer Norm Jouppi wrote in a blog post. “Building TPUs into our infrastructure stack will allow us to bring the power of Google to developers across software like TensorFlow and Cloud Machine Learning with advanced acceleration capabilities. Machine Learning is transforming how developers build intelligent applications that benefit customers and consumers, and we’re excited to see the possibilities come to life.”

Presumably the TPU chips will be most useful for AI. By comparison, GPUs have found work beyond AI, such as powering databases like Nvidia-backed MapD. Time will tell how much the TPUs will cost and how easy they will be to develop for.