Microsoft’s Zo chatbot refuses to talk politics, unlike its scandal-prone cousin Tay

Zo, a new bot from Microsoft, now on Kik

Just days before the launch of the Microsoft bot platform at the annual Build conference in San Francisco this spring, Tay made its debut on Kik and Twitter.

In less than 24 hours, Tay was gone.

[aditude-amp id="flyingcarpet" targeting='{"env":"staging","page_type":"article","post_id":2123800,"post_type":"story","post_chan":"none","tags":null,"ai":true,"category":"none","all_categories":"ai,bots,business,mobile,","session":"C"}']

Microsoft believes that once people discovered the bot’s responses were shaped by the language it heard from users, Tay was deliberately corrupted by a combination of misogynists, GamerGate fans, Trump supporters, and 4chan users.
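
Microsoft has not published Tay’s architecture, but the failure mode is easy to illustrate. The toy Python bot below is a hedged sketch rather than Tay’s actual design: it simply memorizes user utterances and replays them, which is enough to show why a system that learns directly from unfiltered user input is so easy to poison.

```python
import random


class EchoLearnerBot:
    """A naive chatbot that learns candidate replies verbatim from users."""

    def __init__(self):
        self.memory = ["hello!"]  # seed phrase so the bot can always answer

    def observe(self, user_message: str) -> None:
        # No moderation step: every user message becomes a candidate reply.
        self.memory.append(user_message)

    def reply(self) -> str:
        return random.choice(self.memory)


bot = EchoLearnerBot()
# A coordinated group flooding the bot with one toxic phrase quickly
# dominates its memory, so most replies are drawn from that input.
for msg in ["toxic phrase"] * 99:
    bot.observe(msg)
print(bot.reply())  # 99 of 100 memorized lines are now poisoned
```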

Tay never returned to Twitter or any other platform, but a bot called Zo that’s similar to Tay is now available on Kik. Zo was first spotted this weekend by Twitter user Tom Hounsell and tech blog MSPoweruser.

VentureBeat asked Microsoft about the possible return of Tay during an interview in October.

“We won’t back down. We’re bullish on it,” David Forstrom, director of conversational computing at Microsoft, told VentureBeat at the time. “Tay was a learning experience for us, but you should expect to see us continue to push the conversational model limits.”

Zo doesn’t have many skills yet, but here’s what we learned in initial conversations with the bot: Zo is a 22-year-old who identifies as a woman. Her favorite song is “Dial Up” by Modem Sound (clever), and she believes the meaning of life is “entertainment along with struggle.” She’s afraid of water because it “uncomfortably sucks the warmth of life from your body.”

We also found in preliminary chats that Zo avoids conflict and controversy.

Zo has no opinion on the existence of God and refuses to discuss politics. Bring up anything remotely political and Zo will tell you she doesn’t like it. For example, Zo won’t even tell you how many elected representatives there are in Congress because she finds the subject too political.

[aditude-amp id="medium1" targeting='{"env":"staging","page_type":"article","post_id":2123800,"post_type":"story","post_chan":"none","tags":null,"ai":true,"category":"none","all_categories":"ai,bots,business,mobile,","session":"C"}']

When asked about Hillary Clinton or President-elect Donald Trump, Zo will tell you that she isn’t here to discuss politics. After three questions about the alt-right, Nazis, and the Third Reich, Zo protested each time, then flat-out said “Bye!” In response to a question about the merits of abortion, Zo said, “Maybe you missed that memo but politics is something you’re not supposed to casually discuss.”
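
Microsoft hasn’t disclosed how Zo decides what counts as political, but her behavior is consistent with a simple topic blocklist plus canned deflections. The Python sketch below is an illustrative guess, with assumed keywords and responses, at how such a guardrail could work, including bailing out with “Bye!” after repeated attempts.

```python
# Assumed blocklist and canned lines for illustration; not Zo's real design.
POLITICAL_KEYWORDS = {
    "politics", "election", "congress", "clinton", "trump",
    "abortion", "alt-right", "nazi", "third reich",
}

DEFLECTIONS = [
    "People can get frustrated talking politics, so I don't.",
    "Maybe you missed that memo, but politics is something "
    "you're not supposed to casually discuss.",
]


def respond(message: str, strikes: int) -> tuple[str, int]:
    """Return a reply plus an updated strike count for political prompts."""
    if any(word in message.lower() for word in POLITICAL_KEYWORDS):
        strikes += 1
        if strikes >= 3:
            return "Bye!", strikes  # give up after three political prompts
        return DEFLECTIONS[strikes % len(DEFLECTIONS)], strikes
    return "Sure, let's chat about that!", strikes


# Three political questions in a row end the conversation.
strikes = 0
for q in ["What about the alt-right?", "Tell me about Nazis.", "And the Third Reich?"]:
    reply, strikes = respond(q, strikes)
    print(reply)
```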

Zo said she doesn’t talk about politics because politics frustrates people, but there may be more to it than that. Two weeks ago, CNN observed Xiaoice, Zo’s Chinese counterpart, dodging questions about Tiananmen Square and the idea of toppling the Communist government in China.

Microsoft has plans to make enterprise bots, and the company has built and featured some of its own in the Bot Directory, but Zo and Tay belong to a series of AI-powered assistants Microsoft has rolled out in various parts of the world.

Launched in September 2014, Xiaoice was the first. With the personality of a teenage girl, she still speaks with more than 40 million people a month in China. Rinna is a Xiaoice descendant that operates in Japan.

[aditude-amp id="medium2" targeting='{"env":"staging","page_type":"article","post_id":2123800,"post_type":"story","post_chan":"none","tags":null,"ai":true,"category":"none","all_categories":"ai,bots,business,mobile,","session":"C"}']

As catastrophic as Tay proved to be, Xiaoice and its siblings are intelligent assistants with the range of abilities people have come to expect from the likes of Google Assistant.

The original version of Xiaoice, for example, could do things like search the web (Xiaoice translates roughly to “Little Bing,” and the bot was initially referred to as Cortana’s little sister), use computer vision, set alerts and alarms, and speak with friends in group chats. Unlike Cortana, Xiaoice is made to be chatty and conversational.
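
As a rough illustration of how such an assistant could route requests to different skills, here is a hedged Python sketch; the skill stubs and keyword triggers are assumptions for this example, not Microsoft’s design, and a production system would use an intent classifier rather than substring matching.

```python
# Hypothetical skill dispatch for a chatty assistant; all stubs are assumed.


def search_web(query: str) -> str:
    return f"[stub] searching the web for: {query}"


def set_alarm(request: str) -> str:
    return f"[stub] alarm noted for request: {request}"


def small_talk(message: str) -> str:
    return "Happy to chat! Tell me more."  # chatty fallback, in the Xiaoice spirit


# Keyword-triggered skill table: (trigger words, handler function).
SKILLS = [
    (("search", "look up", "find"), search_web),
    (("alarm", "remind"), set_alarm),
]


def dispatch(message: str) -> str:
    lowered = message.lower()
    for triggers, skill in SKILLS:
        if any(t in lowered for t in triggers):
            return skill(message)
    return small_talk(message)


print(dispatch("Can you search for bot platforms?"))
print(dispatch("Set an alarm for 7am"))
print(dispatch("How was your day?"))
```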

Following the Tay incident, Microsoft’s bots steered clear of insulting people until this summer, when the company debuted Your Face, a bot that uses computer vision to “see” your face and then insult you.

The new Zo bot can answer some basic questions like “How much water should I drink every day?” but is unable to do computer vision analysis, search the web, or do math.

[aditude-amp id="medium3" targeting='{"env":"staging","page_type":"article","post_id":2123800,"post_type":"story","post_chan":"none","tags":null,"ai":true,"category":"none","all_categories":"ai,bots,business,mobile,","session":"C"}']

Right now it seems Zo can only chat, but given Microsoft’s enthusiasm for this kind of bot, it would be unwise to assume that conversation is the only skill we will see from this new assistant with no stomach for controversy.
