Bots are currently all the rage, but they’re not a new concept.
In fact, they’re a very old one. Yet for all the discussion around bots, how will they really impact business processes? Bots are only as useful as the services they’re integrated with, and their purpose is essentially automation: creating and executing actions based on a set of criteria. To know where bots are going, though, we need to understand where they’ve been.
While no one knows exactly when bots started, they’re widely thought to have gotten off the ground with ELIZA.
The bot was built by Joseph Weizenbaum, an MIT professor, beginning in 1964. ELIZA ran scripts, the most famous of which, DOCTOR, emulated a psychotherapist’s side of a conversation with a patient. Written in MAD-SLIP (SLIP being a list-processing extension Weizenbaum originally built for Fortran), ELIZA was an early example of natural language processing. Even in the early days of bots, people were attempting to communicate with them as if they were human beings, and Weizenbaum was attempting to create a bot that would learn from its interactions.
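To give a sense of how simple ELIZA’s trick was, here is a minimal sketch of the kind of pattern-and-reflection logic a DOCTOR-style script relies on. The rules, templates, and reflection table below are illustrative stand-ins, not Weizenbaum’s originals:

```python
import re

# Illustrative rules in the spirit of DOCTOR: each pairs a pattern over
# the user's words with a response template that reuses the captured
# fragment. Pronouns in the fragment are "reflected" (my -> your,
# I -> you) before substitution.
REFLECTIONS = {"i": "you", "me": "you", "my": "your", "am": "are",
               "you": "I", "your": "my"}

RULES = [
    (re.compile(r"i am (.*)", re.I), "How long have you been {0}?"),
    (re.compile(r"i feel (.*)", re.I), "Why do you feel {0}?"),
    (re.compile(r"my (.*)", re.I), "Tell me more about your {0}."),
]

def reflect(fragment: str) -> str:
    """Swap first- and second-person words in the captured fragment."""
    return " ".join(REFLECTIONS.get(word, word)
                    for word in fragment.lower().split())

def respond(utterance: str) -> str:
    # Return the first rule whose pattern matches; otherwise fall back
    # to a stock prompt, just as DOCTOR nudged the "patient" to continue.
    for pattern, template in RULES:
        match = pattern.search(utterance)
        if match:
            return template.format(reflect(match.group(1)))
    return "Please, go on."

print(respond("I am worried about my job"))
# -> How long have you been worried about your job?
```

Everything the bot “understands” is hard-coded in the rule table, which is precisely the limitation that made learning from interactions the obvious next goal.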
More than a decade later, in the late 1970s, Roy Trubshaw, a student at Essex University, became obsessed with the computer game Zork.
But he noticed that the social element was missing. That led him to create a whole new experience, based on Zork, called the Multi-User Dungeon (MUD). The MUD is the historical predecessor of today’s Massively Multiplayer Online Role-Playing Game (MMORPG). But it was also the next evolution in bots: each of the Non-Player Characters (NPCs) in a MUD was an early version of the modern chatbot. The main difference was that these bots interacted with multiple players simultaneously.
Bots continued along those lines for quite some time. They proliferated on Internet Relay Chat (IRC), Telnet, MMORPGs, and many other online experiences throughout the 1980s, 1990s, and 2000s, until things changed drastically in 2011 with a game of Jeopardy. IBM set up a competition between two of Jeopardy’s most successful contestants and Watson, an intelligent natural language processor that uses machine learning to answer questions.
While Watson built on the legacy of IBM’s Deep Blue (the chess machine that made headlines by beating Garry Kasparov in 1997), it went much further once the machine learning elements were added. Watson is a blueprint for the future of all bots. All of the major technology companies are working on some version of the technology right now. Google, Apple, Microsoft, and Autodesk are just a small sampling of the organizations working hard to build a bot that can interact with people using natural language and learn from those experiences.
The next step for bots
We’ve seen Siri, Google Now, and Cortana used as virtual assistants with voice interfaces. While the ways in which humans interface with bots have become more sophisticated (voice, and even video interactivity, are becoming the norm), the true advancement in the technology is in the machine learning components. With machine learning, a bot can do everything ELIZA could do and, unlike ELIZA, actually learn from each interaction. From a consumer perspective, the opportunities are staggering; from an enterprise perspective, this is the inevitable next step in bot evolution.
Enterprise organizations won’t be able to rely on trivial keyword-matching algorithms and number-crunching analytics engines that spit out automated responses to predefined scenarios for much longer.
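To see why that approach hits a ceiling, consider a minimal sketch of such a responder. The keywords and canned replies here are hypothetical, not drawn from any real product:

```python
# A toy version of the keyword-matching approach described above.
# Note what's missing: the bot never learns, and any message outside
# its keyword table falls through to a generic reply.
CANNED_RESPONSES = {
    "refund": "Our refund policy allows returns within 30 days.",
    "shipping": "Standard shipping takes 3-5 business days.",
    "hours": "Support is available 9am-5pm, Monday through Friday.",
}

def auto_reply(message: str) -> str:
    text = message.lower()
    for keyword, response in CANNED_RESPONSES.items():
        if keyword in text:
            return response
    return "Sorry, I didn't understand. A human agent will follow up."

print(auto_reply("When will my shipping arrive?"))
# -> Standard shipping takes 3-5 business days.
```

Every new scenario means a human editing the table by hand; a bot with a machine learning layer can instead broaden its own coverage as conversations accumulate.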
Businesses need decision-making to happen faster as their backend systems become more integrated with the world around them through APIs and IoT. Watson is the blueprint for where virtual assistants need to go, and also for where the simple text-based bots used in Slack, Cisco Spark, and Facebook Messenger need to go.
Both consumers and businesses want relevant answers to their queries. Those answers can change overnight when something as sudden and unexpected as the UK’s decision to leave the European Union occurs. Or they can be shaped by complex, long-term trends, such as the effects of climate change on statewide real estate markets. How will businesses adjust their strategies in real time to compete in the global landscape?
It’s simple: They’ll outsource many of those real-time adjustments to algorithms. You’ve already seen this happen in the stock market, where computer-driven quantitative trading algorithms are used en masse by major Wall Street firms. This is just the beginning.
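What “outsourcing a decision to an algorithm” looks like in practice can be as simple as the classic moving-average crossover rule sketched below. The window sizes are arbitrary, and real quantitative strategies are far more sophisticated; the point is only that the decision executes with no human in the loop:

```python
# A deliberately simplified sketch of rule-based decision outsourcing:
# a moving-average crossover signal of the kind early quantitative
# strategies popularized. Windows and prices are illustrative.
def moving_average(prices: list[float], window: int) -> float:
    return sum(prices[-window:]) / window

def signal(prices: list[float], short: int = 5, long: int = 20) -> str:
    """Emit a trading decision with no human in the loop."""
    if len(prices) < long:
        return "HOLD"  # not enough history yet
    if moving_average(prices, short) > moving_average(prices, long):
        return "BUY"   # recent prices trending above the long-run average
    if moving_average(prices, short) < moving_average(prices, long):
        return "SELL"  # recent prices trending below it
    return "HOLD"
```

The business value isn’t in the rule’s cleverness; it’s that the decision is made instantly, around the clock, without waiting for a meeting.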
Watson is the blueprint; Slack, Facebook, Google, and Apple offer examples of the interfaces humans will use. And businesses will pay for more and more of their decision-making to be outsourced to algorithms.