Much as they are today, people were enthralled by artificial intelligence in 1973, the year Michael Crichton's film Westworld debuted. The film became MGM's biggest box office hit of the year, yet it arrived just as the first AI winter set in: a period of depleted funding, dashed expectations, and dwindling interest that lingered for years.
In 2016, Westworld is back, and immense changes in deep learning, open source, and computing power are fundamentally shifting AI's future. Compute capacity and the technology's capabilities have now advanced far enough that AI can complement and accelerate society, rather than sliding into the trough of disillusionment it hit in 1973.
[aditude-amp id="flyingcarpet" targeting='{"env":"staging","page_type":"article","post_id":2135228,"post_type":"guest","post_chan":"none","tags":null,"ai":true,"category":"none","all_categories":"ai,bots,","session":"B"}']HBO’s new version of Westworld, produced by Jonathan Nolan and Lisa Joy, is one of the most popular shows on TV today. The futuristic, western-themed show taps into our widespread AI obsession, and its popularity is showing that people are fascinated by AI’s potential. Westworld‘s success mirrors a robust AI ecosystem, where entrepreneurs, VCs, corporations, and consumers are getting in on the action.
In 1973, the underlying goals of AI were similar: to automate tasks and eliminate bottlenecks across organizations, and eventually in everyday life. Government, researchers, and corporations drove AI's progress, while consumers were infatuated (as Westworld's box office success shows) but had no channels to consume the technology or understand its applications and limitations.
Consumer interest was not enough to keep AI's funding on the same trajectory. The same year Westworld was released, James Lighthill published a pessimistic report on AI, sparking a roughly seven-year lull. The report highlighted the growing gap between hype and reality, and in its wake grant funding to university AI programs was slashed, government and defense agencies cut their AI budgets, and jobs and resources shifted to other initiatives.
Today, Ray Kurzweil, Jen-Hsun Huang, Andrew Ng, Yann LeCun, Yoshua Bengio, and others are making bold predictions about AI's potential, and corporations are moving fast to prepare for opportunities in image recognition, voice detection, and conversational intelligence. The current AI revolution stretches far beyond university and defense research into our everyday lives. It is driven by six components that were nonexistent or under-resourced in the 1970s: research, compute power, cheap storage, open source resources, talent, and investment capital.
1. Research into finding cat faces
Although the underlying components of AI have been around for more than 50 years, many point to Andrew Ng's 2012 work with the Google Brain team as the kickstart to today's AI obsession. Ng and his team made a breakthrough in unsupervised learning through neural networks, the underpinnings of deep learning (a family of algorithms loosely modeled on the brain). Ng, then a Stanford professor (founding lead of the Google Brain team, now head of Baidu's 1,200-person AI team), decided to test unsupervised learning, or training on data without labels, through a neural network. He and his team used YouTube video frames as the dataset and assembled some of the smartest AI researchers, plus a cluster of 16,000 processor cores, to see if the deep learning model would learn to recognize faces. It did, even recognizing cat faces, in what became known as the "cat experiment." This test was only possible because of improvements in deep learning algorithms built on decades of AI research.
Before 2012, traditional machine learning largely meant hand-crafting features and applying algorithms to predict a target variable. Ng's test showed that deep learning, in which neural networks learn those features from raw data themselves, had enormous potential.
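To make the contrast concrete, here is a minimal sketch (in Python with NumPy; illustrative only, not Ng's actual code or scale) of a tiny neural network learning its own intermediate features from raw inputs, which is the core idea the cat experiment scaled up to YouTube-sized data:

```python
# Minimal sketch: a two-layer neural network learns its own hidden features
# to solve XOR, a task no single linear model can handle. Toy sizes only.
import numpy as np

rng = np.random.default_rng(0)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # raw inputs
y = np.array([[0], [1], [1], [0]], dtype=float)              # targets (XOR)

W1, b1 = rng.normal(0, 1, (2, 8)), np.zeros(8)   # hidden layer (learned features)
W2, b2 = rng.normal(0, 1, (8, 1)), np.zeros(1)   # output layer

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for _ in range(10000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)        # hidden features the network discovers
    p = sigmoid(h @ W2 + b2)        # predicted probability
    # Backward pass: gradients of the cross-entropy loss
    dp = p - y
    dW2, db2 = h.T @ dp, dp.sum(axis=0)
    dh = (dp @ W2.T) * h * (1 - h)
    dW1, db1 = X.T @ dh, dh.sum(axis=0)
    # Gradient descent update
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(np.round(p, 2))  # should approach [[0], [1], [1], [0]]
```

Scaled up by many orders of magnitude in data, parameters, and compute, that same learn-the-features-yourself idea is what let the system find cats without ever being told what a cat is.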
In 1973, there was limited public research into AI, computing, neural networks, and how to run complex algorithms on real-world data. Natural language processing was in its infancy, and the field's foundational ideas, such as the Turing test, were only a couple of decades old. Researchers of the 1970s misjudged AI's progress, a miscalculation highlighted by the Lighthill report.
2. Compute power driven by GPUs
Deep learning, or running data through neural networks, requires massive computational power that did not exist at the time of Crichton’s Westworld. Even before beginning the deep learning process, data sets need to be collected, synthesized, uploaded, and ingested into large databases and distributed computing systems.
[aditude-amp id="medium1" targeting='{"env":"staging","page_type":"article","post_id":2135228,"post_type":"guest","post_chan":"none","tags":null,"ai":true,"category":"none","all_categories":"ai,bots,","session":"B"}']
Researchers and AI enthusiasts today can use graphics processing units (GPUs) to train models on their datasets. Training a neural network can consume hundreds of GPU-hours, while leveraging decades of research in algorithms. Deep learning uses large datasets, and processing them requires highly scalable performance, high memory bandwidth, low power consumption, and strong low-precision arithmetic performance. Frameworks like Hadoop and Spark offer affordable distributed storage and processing, while Nvidia has a 20-year head start in GPU design, hardware well suited to these computations. Nvidia GPUs sit inside a majority of today's intelligent hardware, such as autonomous vehicles and drones, and have driven deep learning into new applications.
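As an illustration, here is a minimal PyTorch sketch (toy model and fake data, not tied to any product named above) of what "training on a GPU" means in practice: the same code runs on a CPU, but moving the model and data to a CUDA device hands the dense matrix arithmetic to the GPU.

```python
# Minimal sketch: the same training step runs on CPU or GPU; a GPU simply
# executes the dense matrix arithmetic far faster. Model and data are toy
# placeholders, not a real workload.
import torch
import torch.nn as nn

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = nn.Sequential(
    nn.Linear(1024, 256),
    nn.ReLU(),
    nn.Linear(256, 10),
).to(device)                      # move the model's weights onto the GPU

optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

# Fake batch of data standing in for a real dataset
inputs = torch.randn(64, 1024, device=device)
targets = torch.randint(0, 10, (64,), device=device)

for step in range(100):
    optimizer.zero_grad()
    loss = loss_fn(model(inputs), targets)
    loss.backward()               # gradients computed on the same device
    optimizer.step()

print(f"device: {device}, final loss: {loss.item():.3f}")
```

The point is not the specific framework but that the heavy lifting, the matrix multiplications inside the forward and backward passes, is exactly the kind of parallel arithmetic GPUs were built for.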
In 1973, compute power was orders of magnitude weaker than it is today, as Jonathan Koomey's work charting computing efficiency over time illustrates.
3. Amount of data versus cost of storage
Today, we produce a staggering amount of data that can be fed into deep learning models, which in turn produce more accurate results in image recognition, speech recognition, and NLP. Data streams in from devices like fitness trackers, Apple Watches, and IoT sensors, and the data deluge is even greater at the enterprise level. The coinciding rise of big data from mobile, the Internet, and IoT is creating mountains of data that are prime fuel for AI.
With today's cloud services, large organizations can store and access data without breaking the bank. The rise of Box and Dropbox, combined with offerings from Google, Amazon, and Microsoft, has lowered the cost of storing big data.
[aditude-amp id="medium2" targeting='{"env":"staging","page_type":"article","post_id":2135228,"post_type":"guest","post_chan":"none","tags":null,"ai":true,"category":"none","all_categories":"ai,bots,","session":"B"}']
In 1973, there was no comparison in the amount of data: consumer watches were not streaming sleep cycles, health data, or any of the metrics we now take for granted. In the enterprise, storage was all on-site; if you didn't have the budget for servers and upkeep, you couldn't participate.
4. Collaboration and open source resources
Organizations in today's hypercompetitive market are releasing open source resources to attract AI talent; IBM Watson has taken an early lead, and competitors are racing to offer their own services. Google offers AI infrastructure via TensorFlow, and Microsoft offers CNTK, a framework for deep learning. Universities have open sourced their research as well: Berkeley released the deep learning framework Caffe, and the University of Montreal offers its Python library Theano.
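To give a rough sense of what this openness buys (a minimal TensorFlow/Keras sketch with stand-in data, not a production example): defining and training a network now takes a handful of lines, where an earlier generation of researchers would have written everything from scratch.

```python
# Minimal sketch: an open-source framework handles the network, the
# optimizer, and the training loop. The random data here is a placeholder
# shaped like flattened 28x28 images, not a real dataset.
import numpy as np
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.Input(shape=(784,)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

x = np.random.rand(512, 784).astype("float32")   # stand-in inputs
y = np.random.randint(0, 10, size=(512,))        # stand-in labels

model.fit(x, y, epochs=3, batch_size=64)
```

Caffe, Theano, and CNTK offer similar abstractions; the competitive point is that this tooling is free, public, and maintained by the very companies and universities courting AI talent.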
While many are collaborating and open-sourcing AI research to attract talent, some organizations are preparing for AI's hypothetical bad side, in case a single party surges ahead of the pack. Players in this area include OpenAI and open research initiatives like the Open Academic Society and Semantic Scholar. At the Neural Information Processing Systems conference in December, Apple announced it would open up its AI research, allowing its researchers to publish their scientific papers.
There was limited open source collaboration during the era of the first Westworld because progress was mainly happening in government and defense. Organizations shared coding languages like LISP and researchers compared notes in person, but the Internet’s distribution piping was not yet available, hindering collaboration and AI’s progress.
[aditude-amp id="medium3" targeting='{"env":"staging","page_type":"article","post_id":2135228,"post_type":"guest","post_chan":"none","tags":null,"ai":true,"category":"none","all_categories":"ai,bots,","session":"B"}']
5. AI talent is king
Students are riding the AI wave by taking data science and NLP classes, and universities and online courses have stepped up resources to meet demand. The talent pool has grown even larger through scholarship programs that fund research and swell the ranks of the computer science, math, and engineering streams that feed all levels of AI R&D.
In 1973, scholarships were rare, and European universities cut AI programs after the Lighthill report punctured the hype.
6. Investment frenzy
AI investment is now growing at a breakneck pace, with a 42 percent compound annual growth rate since 2011, according to Venture Scanner. VCs and tech giants are high on AI's potential, putting resources into people, companies, and initiatives. There have been a number of early exits in the space, many of which are glorified acqui-hires to build or enhance corporate AI teams.
In 1973, investment came primarily from defense and government organizations like DARPA. As the hype began to fade, DARPA made massive cuts to its AI budget and dollars to top researchers dwindled. At the time, venture capital was in its early days, and VCs were more interested in semiconductor companies than in AI.
[aditude-amp id="medium4" targeting='{"env":"staging","page_type":"article","post_id":2135228,"post_type":"guest","post_chan":"none","tags":null,"ai":true,"category":"none","all_categories":"ai,bots,","session":"B"}']
AI is already in our daily lives (e.g., Prisma, Siri, and Alexa). It will spread throughout the enterprise stack, touching every function: devops, security, sales, marketing, customer support, and more. The six components listed above will make an even stronger case for AI, and this wave has potential similar to the Internet wave of the '90s and the mobile wave of the '00s. The movement is already being realized in image and video recognition, as well as speech recognition and translation.
To prepare, organizations must understand the technology’s applications, limitations, and future potential. Companies like Facebook are treating AI more as a philosophy than a technology, as CTO Mike Schroepfer outlined at November’s Web Summit.
Deep learning’s influence has created the age of AI, and entrepreneurs are in the Wild West of data acquisition and training. Get your lasso, or get left behind.