There’s a lot of chatter about chatbots these days and how we might be able to use them in the future. The biggest question seems to be whether chatbots can be useful enough to convincingly replace human conversation. Answering that isn’t easy. Before chatbots can reach that point, they’ll need to develop and mature into a technology that enables human communication with a computer using natural language. Most bots today are not at the level where they can flawlessly replicate conversation.
Some chatbots today simply aren't fed enough data. Others have ample data and leverage machine learning, using natural language processing to pick up keywords and phrases and interpret human language. Even then, a bot's quality often depends on how much data it has learned from in order to understand the language.
For example, there are bots that cannot recognize that "20" and "twenty" are the same thing. When the bot asks, "How old are you?" it's able to recognize numeric data, but if it were to ask again and the user responded in words ("twenty"), the bot would not recognize that the user has given the same answer, albeit in a different format.
More sophisticated bots, by contrast, can recognize that a variety of location-based words have the same meaning — for example, "SF" and "San Francisco."
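The normalization problem described above can be sketched in a few lines. This is a hypothetical illustration, not code from any real bot framework; the alias tables and function names are made up for the example.

```python
# Illustrative sketch: mapping equivalent user inputs to one canonical value
# before the bot compares answers. The tables below are hypothetical.

NUMBER_WORDS = {
    "twenty": 20, "twenty-one": 21, "thirty": 30,  # extend as needed
}

LOCATION_ALIASES = {
    "sf": "San Francisco", "san francisco": "San Francisco",
    "nyc": "New York City", "new york": "New York City",
}

def normalize_age(text: str):
    """Treat '20' and 'twenty' as the same answer; None if unrecognized."""
    token = text.strip().lower()
    if token.isdigit():
        return int(token)
    return NUMBER_WORDS.get(token)

def normalize_location(text: str):
    """Treat 'SF' and 'San Francisco' as one canonical place name."""
    return LOCATION_ALIASES.get(text.strip().lower())

print(normalize_age("20"))              # 20
print(normalize_age("twenty"))          # 20
print(normalize_location("SF"))         # San Francisco
```

A bot without this kind of canonicalization layer is exactly the one that fails the "twenty" test above.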
We have some great technologies like the IBM Watson API, Facebook bot API, and Microsoft Bot Framework that can help developers create interactive bots based on different taxonomies. (You can try Watson for yourself here with this sample app.)
Line, a leading Japanese chat platform, provides a bot API, and developers are building bots by connecting it to other APIs. One example is a bot that responds to user queries with images or recommends restaurants nearby. These bots can be used for fun and entertainment, but they still can't perfectly decode the intricacies and nuances of human dialogue in different languages.
Bots and neural networks
While discussing the future of chatbots, we cannot avoid mentioning deep learning, as leading companies like Facebook, Microsoft, and others are already using that technology for bots. When a deep learning system receives data, it captures the signature of that data through multiple layers of neural networks and can identify categories for each piece of data. What's a neural network, you ask? Great question. Here's as simple an explanation as possible, from TensorFlow:
It’s a technique for building a computer program that learns from data. It is based very loosely on how we think the human brain works. First, a collection of software “neurons” are created and connected together, allowing them to send messages to each other. Next, the network is asked to solve a problem, which it attempts to do over and over, each time strengthening the connections that lead to success and diminishing those that lead to failure.
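That "strengthen on success, weaken on failure" loop can be shown with the smallest possible network: a single artificial neuron learning logical AND. This is a toy sketch to make the quoted description concrete, not code from TensorFlow or any chatbot.

```python
# A minimal sketch of the idea in the TensorFlow quote: one software "neuron"
# tries a problem over and over, and each mistake adjusts its connections.

import random

random.seed(0)  # fixed seed so the run is reproducible

# Training data: (inputs, expected output) for logical AND.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

weights = [random.random(), random.random()]  # connection strengths
bias = random.random()
lr = 0.1  # how much each mistake adjusts the connections

for _ in range(100):  # attempt the problem over and over
    for (x1, x2), target in data:
        output = 1 if (weights[0] * x1 + weights[1] * x2 + bias) > 0 else 0
        error = target - output
        # Adjust each connection in proportion to its input and the error:
        # wrong answers shift the weights, correct answers leave them alone.
        weights[0] += lr * error * x1
        weights[1] += lr * error * x2
        bias += lr * error

for (x1, x2), _ in data:
    pred = 1 if (weights[0] * x1 + weights[1] * x2 + bias) > 0 else 0
    print((x1, x2), "->", pred)  # (0,0)->0 (0,1)->0 (1,0)->0 (1,1)->1
```

Real deep learning systems stack many layers of such neurons and train them with gradient descent rather than this simple perceptron rule, but the principle — repeated attempts that reshape connection strengths — is the same.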
Neural networks are the core of deep learning architecture. Soumith Chintala of Facebook AI noted in a recent article, “Deep learning — neural networks that have several stacked layers of neurons, usually accelerated in computation using GPUs — has seen huge success recently in many fields such as computer vision, speech recognition, and natural language processing, beating the previous state-of-the-art results on a variety of tasks and domains such as language modeling, translation, speech recognition, and object recognition in images.”
Essentially, chatbots built on neural networks are more likely to replicate human conversation in a believable way.
Take this simple sentence, for example:
“Hayato plays soccer.”
The bot can recognize:
Hayato (noun) plays (verb) soccer (noun).
An understanding of words and sentence structure gives the bot the power to categorize and form logical sentences. In another example, the bot can recognize:
“I want to go running around the park.”
as easily as it recognizes
“I want to go jogging in the park.”
and can respond to either sentence with “Hey, there’s a great park for running near your house” and related links or maps.
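The running/jogging example above amounts to matching different surface wordings to one underlying intent. Here's a hypothetical sketch of that idea; the synonym sets and function names are invented for illustration, and a real system would learn such equivalences from data rather than hard-code them.

```python
# Illustrative sketch: two differently worded sentences trigger the same
# bot response because required concepts match via synonym sets (made up here).

SYNONYMS = {
    "running": {"running", "jogging", "run", "jog"},
    "park": {"park"},
}

def matches_intent(sentence: str, concepts) -> bool:
    """True if every required concept appears (via a synonym) in the sentence."""
    words = set(sentence.lower().replace(".", "").split())
    return all(words & SYNONYMS[c] for c in concepts)

for s in ("I want to go running around the park.",
          "I want to go jogging in the park."):
    if matches_intent(s, ["running", "park"]):
        print("Hey, there's a great park for running near your house")
```

Both sentences print the same reply, which is the behavior the article describes: the bot responds to the meaning, not the exact wording.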
Currently, deep learning language processing is still in the R&D phase. With improvements in this technology, computers will be able to understand human language almost perfectly, and we may be able to create a very high quality, real-time machine translation bot.
Just a box of tech?
A bot is simply a box of technology; inside the box we put deep learning or other A.I.-powered tech. Today, the shape of the box looks like a chatbot, but it doesn't have to be chat, or even a bot, in the future. It may be a tangible device, like an iPhone. We may start calling it a device bot, but it will still have the A.I.-powered technology to make it act human. This could replace the information-based work that humans are doing right now. (Have a look at this fascinating example named Pepper.)
Is a conversation-based interface the best technology for the bots of the future? Consider that calling an Uber from Facebook Messenger doesn't deliver the same experience you'd have using Uber's smartphone app. Similarly, a chatbot in Facebook Messenger wouldn't be the ideal way to use every bot system. If a user wants to have everything rooted in chat, then a chatbot is perfect. However, those who don't use chat heavily will be disappointed by a chatbot's limitations — as with the Uber scenario just described.
Google announced that its natural language processing system, Parsey McParseface, is now able to identify 94 percent of word dependencies within an English sentence. As that technology improves, it would be possible for computers to write a book called, hypothetically, "Why Yosemite is the best place in California for families to vacation" by analyzing all the opinions on review sites. But of course, there are many things we can't decide based on statistics alone; sometimes we need to make decisions based on vague facts and instincts that only a specialized professional might have. Those skills are hard for a computer to acquire.
To have that ability, an expert would have to input data for the bot. Look at a genius like Elon Musk. His decision-making skills are only in his head, and a bot cannot translate something like that. Yet. But could this considerable ability be incorporated into the bot of the future? Time will tell.