
Google launches TensorFlow 2.0 with tighter Keras integration

TensorFlow and Keras
Image Credit: Khari Johnson / VentureBeat



Google's open source machine learning library TensorFlow 2.0 is now available for public use, the company announced today. The alpha version of TensorFlow 2.0 was first made available this spring at the TensorFlow Dev Summit, alongside TensorFlow Lite 1.0 for mobile and embedded devices and other machine learning tools like TensorFlow Federated.

TensorFlow 2.0 includes a number of changes intended to improve ease of use, such as the removal of APIs deemed redundant and tight integration with, and reliance on, tf.keras as its central high-level API. Initial integration with the Keras deep learning library began with the release of TensorFlow 1.0 in February 2017.
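
To illustrate, here is a minimal sketch of the tf.keras-centered workflow in TensorFlow 2.0 (the layer sizes and randomly generated training data below are illustrative assumptions, not taken from Google's announcement):

    import numpy as np
    import tensorflow as tf

    # Define a small model with the high-level tf.keras API.
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(64, activation="relu", input_shape=(32,)),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])

    # Compile and train using built-in optimizers, losses, and metrics.
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])

    # Dummy data purely for demonstration.
    x = np.random.random((1000, 32)).astype("float32")
    y = np.random.randint(10, size=(1000,))
    model.fit(x, y, epochs=2, batch_size=32)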

TensorFlow 2.0 also promises up to three times faster training when using mixed precision on Nvidia's Volta and Turing GPUs, and with eager execution now enabled by default, the latest version of TensorFlow delivers runtime improvements as well.
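
As a quick illustration of eager execution being the default, operations in TensorFlow 2.0 run immediately and return concrete values, with no Session or explicit graph construction required (the tensors below are arbitrary examples):

    import tensorflow as tf

    # Eager execution is on by default in TensorFlow 2.0:
    # ops execute immediately and return concrete values.
    a = tf.constant([[1.0, 2.0], [3.0, 4.0]])
    b = tf.matmul(a, a)
    print(b.numpy())  # the result is available right away as a NumPy array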

The TensorFlow framework has been downloaded more than 40 million times since it was released by the Google Brain team in 2015, TensorFlow engineering director Rajat Monga told VentureBeat earlier this year.




The news was announced today ahead of TensorFlow World, an inaugural conference for developers set to take place October 28-31 in Santa Clara, California.

In other recent news, Google AI researchers have rolled out a series of natural language understanding advances, such as a multilingual model trained to recognize nine Indian languages. Last week, researchers shared news that they had created ALBERT, a language model that now sits atop the SQuAD and GLUE performance benchmark leaderboards.