Over the past few decades, chatbots have evolved to become ingrained in our daily lives. From virtual assistants on our phones to de facto customer service agents online, they have come a long way from their origins. However, if chatbots are ever going to outpace apps, they will need to be integrated more naturally into a wider range of technologies.
In the mid-1960s, computer scientist Joseph Weizenbaum created the first chatbot program, ELIZA, which simulated conversation through pattern matching and substitution, giving the illusion that the program understood what the person was asking.
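To make that concrete, here is a minimal sketch of the pattern-matching-and-substitution idea ELIZA popularized. The rules below are invented for illustration rather than taken from Weizenbaum's original script: each one captures a fragment of the user's sentence and reflects it back inside a canned template.

```python
import re

# Illustrative ELIZA-style rules: a regex pattern plus a response template.
# These are invented examples, not Weizenbaum's original script.
RULES = [
    (re.compile(r"\bI need (.+)", re.IGNORECASE), "Why do you need {0}?"),
    (re.compile(r"\bI am (.+)", re.IGNORECASE), "How long have you been {0}?"),
    (re.compile(r"\bmy (\w+)", re.IGNORECASE), "Tell me more about your {0}."),
]

def eliza_reply(utterance: str) -> str:
    """Reflect the user's own words back by matching the first rule that fires."""
    for pattern, template in RULES:
        match = pattern.search(utterance)
        if match:
            # Substituting the captured fragment into a canned template is what
            # gives the illusion of understanding.
            return template.format(match.group(1).rstrip(".!?"))
    return "Please tell me more."

print(eliza_reply("I am worried about my job."))
# -> How long have you been worried about my job?
```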
Today, chatbots can give users the sense that they are not only heard but understood. We see chatbots used in retail to answer basic questions on websites, in healthcare to help manage patient care, and even on social media. However, they still don’t replicate the interplay of two humans communicating. Although there is tremendous potential for growth, chatbot technology will have to overcome some hurdles before it is fully functional and truly driven by artificial intelligence (A.I.).
Before we discuss where chatbots need to evolve, let’s first look at how they function today. The typical bot understands what is being said through natural language processors, computer programs designed to attach specific meanings to spoken or written words. Many chatbots then use expert systems software, which imitates the decision-making ability of a human expert, to answer a question by pulling from a limited subset of information.
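As a rough illustration of that two-step pipeline, the sketch below pairs a hypothetical keyword-to-intent table (standing in for a natural language processor) with a small fixed knowledge base (standing in for the expert system). All of the intents, keywords, and answers are invented examples.

```python
# Hypothetical keyword-to-intent table standing in for a natural language
# processor, and a small fixed knowledge base standing in for the expert system.
KEYWORD_INTENTS = {
    "refund": "billing.refund",
    "return": "billing.refund",
    "hours": "store.hours",
    "open": "store.hours",
}

KNOWLEDGE_BASE = {
    "billing.refund": "Refunds are issued to the original payment method within 5-7 days.",
    "store.hours": "Our stores are open 9am-9pm, Monday through Saturday.",
}

def classify_intent(utterance: str):
    """NLP step: attach a specific meaning (an intent label) to the user's words."""
    for keyword, intent in KEYWORD_INTENTS.items():
        if keyword in utterance.lower():
            return intent
    return None

def respond(utterance: str) -> str:
    """Expert-system step: answer by pulling from a limited subset of information."""
    intent = classify_intent(utterance)
    return KNOWLEDGE_BASE.get(intent, "Let me connect you with a human agent.")

print(respond("What time are you open on Saturday?"))
# -> Our stores are open 9am-9pm, Monday through Saturday.
```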
Those of us working in the field see that in the near future, many more chatbots will take a deep learning approach — accessing huge data sets to predict and prompt a much wider range of responses and relevant questions. The desired effect is that customers won’t realize they aren’t communicating with a person.
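The sketch below hints at what that shift looks like: rather than following hand-written rules, a model pretrained on a large text corpus predicts a plausible next turn. It uses the open source transformers library with the small gpt2 model purely as a stand-in for the far larger, conversation-tuned models described above; the prompt and output handling are illustrative assumptions.

```python
# A pretrained language model predicts a plausible reply instead of following
# hand-written rules. gpt2 is used here only as a small, freely available
# stand-in for the much larger models the author anticipates.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

prompt = "Customer: My package never arrived.\nAgent:"
result = generator(prompt, max_new_tokens=30, do_sample=True)[0]["generated_text"]

# Keep only the model's predicted reply (everything after the prompt).
print(result[len(prompt):].strip())
```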
Limitations and opportunities
One of the biggest technical challenges for chatbots will be accessing and retrieving huge amounts of data. For example, we know from extensive retail experience that many customers simply don’t want to interact with a computer; they expect a human to help solve their problem. The only way chatbots can offer a simulated human-like experience is through A.I. However, to achieve this, we must first tackle the deep learning conundrum of building massive data sets, which requires some way to accumulate that data.
After compiling the various data sets, one must also consider the psychology of the individuals interacting with the bot to figure out how personality will integrate into the experience. How are people going to interact with the chatbot? Think of the diverse wording that each member of a large group would use to describe the same issue. Also, when does the chatbot interject? How does it steer a broad audience to the correct solution? With the use of large data sets, we will begin to see chatbots more closely replicate human conversation, with a better understanding (and anticipation) of language and situational vocabulary.
A.I. and chatbots
Trying to teach a chatbot to have the same ebb and flow found in our everyday conversations is extremely challenging. From a computational perspective, it is a massive problem. Some A.I. algorithms currently address pieces of this problem. For example, the A.I. community has done a good job of language assessment for chatbots integrated into search engines. Think about what happens when you Google a topic by asking a question; even if you aren’t 100 percent correct in how you phrase the question, it is very likely that you will still find the correct answer in the top results.
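A toy illustration of why that works: if candidate answers are ranked by how much vocabulary they share with the query rather than by its exact wording, two awkwardly phrased versions of the same question still surface the same result. The documents and the simple overlap score below are invented for illustration; real search engines use far richer signals.

```python
import re

def tokens(text: str) -> set:
    """Lowercase the text and keep only word-like tokens."""
    return set(re.findall(r"[a-z0-9]+", text.lower()))

def rank(query: str, documents: list) -> list:
    """Order documents by how much vocabulary they share with the query (Jaccard overlap)."""
    q = tokens(query)
    def score(doc: str) -> float:
        d = tokens(doc)
        return len(q & d) / len(q | d)
    return sorted(documents, key=score, reverse=True)

docs = [
    "The Eiffel Tower is 330 meters tall.",
    "The Louvre is the world's largest art museum.",
    "Paris is the capital of France.",
]

# Two awkward, differently phrased queries still rank the same document first.
print(rank("how tall eiffel tower?", docs)[0])
print(rank("height of the tower eiffel", docs)[0])
```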
When you compare that success to chatbots, you quickly see that using human language in a conversation is not as easy. The technology must take a unique conversation, understand what was said and how current statements connect to and modify past statements, and still provide an intelligent, human-like answer. The A.I. community continues to develop techniques to address this problem, with the hope that we can eventually create responses that imitate what you would experience in a real conversation. Unfortunately, the limitations of deep learning mean that few computer scientists are yet focused on adding personality to chatbot responses. At their core, A.I. and chatbot technologies are trying to solve a lot of complex problems to provide an understandable, eloquent conversational partner.
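Here is a deliberately naive sketch of that context problem, built around a made-up product catalog: the bot keeps the conversation history and tracks the most recently mentioned product, so a follow-up like "How much does it cost?" is interpreted in light of an earlier turn. Real systems model dialogue state far more deeply than this.

```python
# Made-up product catalog; the point is only the use of conversation history.
PRODUCTS = {"standing desk": "$499", "office chair": "$249"}

class Conversation:
    def __init__(self):
        self.history = []          # every user utterance so far
        self.last_product = None   # the most recently mentioned product

    def reply(self, utterance: str) -> str:
        self.history.append(utterance)
        lowered = utterance.lower()
        # Track the most recently mentioned product so later turns can refer to "it".
        for name in PRODUCTS:
            if name in lowered:
                self.last_product = name
        if "how much" in lowered or "price" in lowered:
            if self.last_product:
                return f"The {self.last_product} is {PRODUCTS[self.last_product]}."
            return "Which product do you mean?"
        if self.last_product:
            return f"Happy to help with the {self.last_product}. What would you like to know?"
        return "What can I help you find today?"

chat = Conversation()
print(chat.reply("Do you sell a standing desk?"))
print(chat.reply("How much does it cost?"))  # "it" is resolved from the earlier turn
```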
The future of chatbots
Chatbots are one of the hottest subsets of A.I. right now, and in the coming years they are going to become both more prevalent in their use and more hidden in their implementation. New and innovative use cases will help answer questions about where chatbot use is acceptable and how it can better serve users across many different domains. Facebook, for example, integrates chatbots into its Messenger app so businesses can create interactions for customers; Amazon Echo gives customers access to chatbots that let them play music or pay their credit card bill; and Domino’s lets customers place pizza orders via social media. Of course, the real question is whether a chatbot can ever live up to users’ expectations, and whether one can ever convince a user they are chatting with a human.
[aditude-amp id="medium1" targeting='{"env":"staging","page_type":"article","post_id":2082535,"post_type":"guest","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"bots,business,","session":"C"}']
If we look at the evolution of the field of A.I. over the past 50 years, the biggest successes are algorithms that started as A.I. research and that we no longer even think of as A.I., such as web search. We are seeing the same thing with chatbots as these algorithms are implemented in other places. Indeed, within 10 years, we will likely find that we no longer think of chatbots as a distinct technology, but as just an aspect of apps or another technology we have yet to imagine.
Cory Kidd is an active member of the Association for Computing Machinery (ACM). In recognition of the 50th anniversary of the ACM Turing Award, the most prestigious technical award in the computing industry, the ACM invites us to participate in activities celebrating the award and computing’s greatest achievements. This content was prepared as part of that celebration.