
People are reporting crimes to this chatbot — but it’s not a police officer

Credit: Tony Webster.

As AI and machine learning advance, the consequences of bots trying to act human can be unpredictable.

For Sequel, a platform for creating bot personas, one unforeseen reaction is that Detective Kees Larsen, the persona behind its Probable Cause bot, has started receiving actual crime tips from users. The bot, which debuted in April alongside the Facebook Messenger bot platform, talks with people as they work through a murder mystery.

[aditude-amp id="flyingcarpet" targeting='{"env":"staging","page_type":"article","post_id":1977948,"post_type":"story","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"bots,business,","session":"D"}']

“We truly get at least 3-5 emails a week from people who are emailing our help line and reporting crimes. They’re like ‘Hey there’s this lady on my block. She’s not very nice. I think she sells drugs.’ And so we emailed her back to say this is not the real police. Please dial 911 and report it,” Omar Siddiqui, CEO of Sequel maker Kiwi, said.

Siddiqui shared this story of confused users on the wrong side of the uncanny valley with a crowd of about 50 people Monday at Botness, a two-day gathering of chatbot makers in San Francisco. The event was organized by some of the biggest names in the messaging startup community, including Kik, Slack, and Microsoft, together with Chris Messina, who declared 2016 "the year of conversational commerce."


Initially, Sequel wanted Detective Kees to be seamless and give no indication that the detective is a bot, but after users began trying to report crimes, the company made a few changes.

“We’ve actually now taken a step back and before you engage with the detective, we are thinking really clear that hey this is a simulated experience. It’s not real. You can type ‘stop’ to stop the police from talking to you at any time because some people were like ‘Why are the police talking to me? Why are you after me? What have I done?'”

Like a lot of companies creating bots, Sequel wants to give its bots personality, so much so that it avoids the name "bots" altogether. Instead, the company calls them "personas" and gives them names, like Detective Kees.

“Bots are not what people want to connect with. It’s got a very tech-y, unapproachable, Silicon Valley sort of label on it,” Siddiqui said.

Siddiqui thinks the crime reports demonstrate a change in thinking that will have to take place within the growing bot ecosystem.

Bot makers will have to be deliberate about reminding users that they're speaking to a bot, and consumers will have to adjust to the idea that some conversations happen with people and others happen with a bot.

[aditude-amp id="medium1" targeting='{"env":"staging","page_type":"article","post_id":1977948,"post_type":"story","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"bots,business,","session":"D"}']

Sort of makes you miss the days when we all wondered if the guy using a Bluetooth headset was talking to himself.
