Just a few short months after the category took off, bots are everywhere. Facebook Messenger already hosts more than 18,000 bots, and many of them are very good at the tasks they're supposed to do.
So far, the conversation has been about what a bot can do. There are several bots that can order your pizza. Other bots are great at setting up reminders on the go. Some can schedule your meetings; others brief you about the weather each morning. No matter the function, people are thrilled to automate the minutiae of their personal and work lives with bots. But is that enough to keep people engaged? Is it enough to make people willing to hand over the more important and sensitive tasks in their lives?
[aditude-amp id="flyingcarpet" targeting='{"env":"staging","page_type":"article","post_id":2075677,"post_type":"guest","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"bots,business,","session":"D"}']If we want users to ask more of their bots, they’re going to need more than dutiful execution of simple tasks. And merely teaching a bot to complete complicated tasks won’t be enough to earn a user’s trust.
That’s precisely why the next frontier for bots will focus on how they build relationships with people.
What’s the ‘how’?
The daughter of Next IT’s founder once asked him whether SGT STAR, the U.S. Army’s virtual assistant, had a favorite color.
The question threw him for a loop: why should a bot have a favorite color? Certainly not to pick out curtains or match paint. Then he realized that his daughter's question implied a curiosity about more than color. Not having a favorite color compromised SGT STAR's ability to demonstrate human qualities like honesty and trustworthiness. SGT STAR is very good at answering questions about life in the Army, but even the best answers don't satisfy users who want to hear from a source that feels as trustworthy as a human.
Today, when you ask SGT STAR what his favorite color is, he says: “I am partial to red, white, blue, and of course, good ol’ Army Green!” It’s a small detail, for sure, but it’s incredibly important.
In this case, SGT STAR demonstrates human traits of likes, dislikes, and preferences, which ultimately makes him feel more candid, relatable, and trustworthy. Would a conversation with a person who expressed no preferences inspire trust?
The ability to communicate values such as honesty or trustworthiness is the 'how' that will inevitably underwrite the 'what' in the bot market.
Values move to the foreground
As we welcome bots into our everyday lives, we train them to recognize the languages we speak, discover our preferences, and even analyze our tone of voice and deduce our state of mind.
[aditude-amp id="medium1" targeting='{"env":"staging","page_type":"article","post_id":2075677,"post_type":"guest","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"bots,business,","session":"D"}']
Consider Verbot, one of the earliest commercial chatterbots, deployed in 2000, and the company that created it. The Verbot program is based on Dr. Michael Mauldin's early work in natural language processing and chatterbots. The company that built it, Virtual Personalities, traces its technology back to Dr. Mauldin's studies at Carnegie Mellon University, as well as Peter Plantec's work in personality psychology and art direction. Personality psychology may have very little to do with a bot's ability to integrate with Facebook Messenger, but it has everything to do with its ability to engage people authentically.
Since then, the technology has progressed dramatically, and we have developed tools that help machines communicate better in the human world; natural language processing and sentiment analysis are good examples. When the chatbot Eugene Goostman was judged to have passed a Turing Test in 2014, the era of human-machine communication turned a corner.
Now companies are pushing the boundaries even further. Consider IBM Watson's Tone Analyzer, which uses linguistic analysis to detect emotional, social, and language tones in written text.
The goal of these tools is to improve bots as conversational interfaces. With them, you can teach a bot to express empathy for the user, an even more direct expression of values than having a favorite color. Much of the conversation about trust has centered on the foundational technology powering these bots' performance, but these tools go one step further. They may not change what bots do, but they change how bots do it. They make it easier for bots to communicate values.
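To make that concrete, here is a minimal sketch of a tone-detection call using IBM's ibm-watson Python SDK. The API key, service URL, version date, and sample message are placeholders, and the SDK's surface has changed over time, so treat the details as assumptions rather than a definitive recipe.

```python
# Minimal sketch: detecting a user's emotional tone with IBM Watson's
# Tone Analyzer via the ibm-watson Python SDK (pip install ibm-watson).
# The API key and service URL below are placeholders.
from ibm_watson import ToneAnalyzerV3
from ibm_cloud_sdk_core.authenticators import IAMAuthenticator

authenticator = IAMAuthenticator("YOUR_API_KEY")
tone_analyzer = ToneAnalyzerV3(version="2017-09-21", authenticator=authenticator)
tone_analyzer.set_service_url(
    "https://api.us-south.tone-analyzer.watson.cloud.ibm.com")

user_message = "My flight just got canceled and I have no idea what to do."

# The service returns document-level tones such as anger, sadness, or
# tentativeness, each with a confidence score between 0 and 1.
result = tone_analyzer.tone(
    tone_input={"text": user_message},
    content_type="application/json",
).get_result()

for tone in result["document_tone"]["tones"]:
    print(f'{tone["tone_name"]}: {tone["score"]:.2f}')
```

A bot can then branch on the highest-scoring tone before composing its reply, which is where the expression of empathy actually happens.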
[aditude-amp id="medium2" targeting='{"env":"staging","page_type":"article","post_id":2075677,"post_type":"guest","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"bots,business,","session":"D"}']
Bot makers beware
From bot maker to bot user and back, there is a growing transfer of values. We don’t know the limit of how well bots can reflect human values, but we’re making tremendous progress already. Teaching bots to exhibit values will multiply the potential applications for the technology in our lives, but it will also differentiate bots and their makers.
Insofar as bots can only display values that their makers teach them, the makers will be under a giant spotlight. Bad actors will be easy to spot, and branding will matter more than ever.
Imagine two similarly effective bots that help you book travel tickets. A bot that can empathize with a user rebooking after a canceled flight will stand head and shoulders above one that merely books the new flight without acknowledging the user's frustration, exhaustion, or whatever else they may be feeling at the time. Users can spot the difference instantly, and their loyalties will follow.
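As a purely hypothetical sketch of that difference, a travel bot might branch on the detected tone before composing its reply. The compose_rebooking_reply function, the tone labels, and the stub detect_dominant_tone below are all invented for illustration; a real bot would plug in a service like the Tone Analyzer call sketched above.

```python
# Hypothetical sketch: conditioning a rebooking reply on the user's tone.
# detect_dominant_tone stands in for a real tone-analysis call; its
# keyword matching is illustrative only.

def detect_dominant_tone(text: str) -> str:
    """Placeholder: return a coarse tone label for the message."""
    negative_cues = ("canceled", "cancelled", "delayed", "stranded", "furious")
    return "anger" if any(cue in text.lower() for cue in negative_cues) else "neutral"

def compose_rebooking_reply(user_message: str, new_flight: str) -> str:
    tone = detect_dominant_tone(user_message)
    if tone in ("anger", "sadness"):
        # Acknowledge the frustration before delivering the practical result.
        return ("I'm sorry about the cancellation; that's a rough way to start "
                f"a trip. I've found you a seat on {new_flight} and can book "
                "it right now.")
    # Neutral or positive tone: get straight to the point.
    return f"You're confirmed on {new_flight}."

print(compose_rebooking_reply(
    "My flight just got canceled and I'm stuck at the gate.", "flight 482"))
```

The practical outcome is identical in both branches; only the framing changes, which is exactly the 'how' this piece argues will separate one bot maker from another.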