The game The Last Guardian pulls off one achievement that is genuinely awe-inspiring.
It’s not the level design or even the gameplay, which are both compelling enough. (You can read our full coverage of the title here.) As a long-time gamer with fond memories of playing Ico years ago and then beating its spiritual successor, Shadow of the Colossus, in 2005 (both created by designer Fumito Ueda and his team with a focus on forging an emotional bond), I’ve kept up with this latest project. The Last Guardian features a young boy climbing around on castles and coaxing a giant dragon-like creature named Trico into helping him escape.
The interchange between the boy and the creature is what held my attention. It serves as a good lesson in how to build AI into a chatbot, a digital assistant, or any other piece of software. The creature seems to come alive as you play, and that’s all by design.
Here’s an example. In one level, I was standing on the edge of a ravine looking down. At this point in the game, you can indirectly control the creature, usually by pressing a button and looking in a certain direction. In most games, you might press a button and see a more immediate response, but in The Last Guardian, everything is more subtle. At some points, Trico would prance around, sit down and wait, or even wander away. An emotional connection develops between the boy and the creature, and it’s not because a cutscene explains the backstory. Instead, you feel as though the creature has its own identity and purpose. It has some intelligence of its own and does not always respond immediately. You have to put some effort into forming the bond, so the reward comes when the creature finally helps you.
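To make that design idea concrete, here is a minimal Python sketch of an agent that may ignore, delay, or obey an indirect command depending on how much of a bond has been built up. The class name, numbers, and behaviors are all invented for illustration; they have nothing to do with the game’s actual code.

```python
import random
import time


class Creature:
    """Toy model of an agent that may ignore, delay, or obey indirect commands.

    Everything here is a made-up illustration, not The Last Guardian's
    actual implementation.
    """

    def __init__(self):
        self.bond = 0.2  # grows slowly as the player keeps interacting

    def command(self, gesture: str, direction: str) -> str:
        # Every attempt strengthens the bond a little, even if it is ignored.
        self.bond = min(1.0, self.bond + 0.1)
        if random.random() > self.bond:
            # The creature asserts its own agency instead of obeying.
            return random.choice(["prances around", "sits down and waits", "wanders away"])
        # Even when it obeys, the response is deliberately delayed.
        time.sleep(random.uniform(0.5, 2.0))
        return f"moves {direction} in response to the {gesture}"


if __name__ == "__main__":
    trico = Creature()
    for _ in range(5):
        print(trico.command("pointing gesture", "toward the ledge"))
```

The point of the sketch is the variable compliance: the more the player persists, the more reliably the agent responds, which is what makes the eventual cooperation feel earned.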
On that ledge, Trico eventually does wait for you to climb onto its head and leap over the chasm. In another scene, Trico catches you with its tail at the last minute. Much later, you feel a stronger bond because Trico chooses to protect you and assist you, even though it took some coaxing and cajoling in the early stages of the game. It’s not only that you learn how to get along with Trico but that Trico learns how to get along with you. That’s when AI becomes truly useful.
Why is this a good example for chatbot developers? Too often, the artificial intelligence in a messaging window acts more like an ATM or an automated call with FedEx. You press a button, the bot responds, end of story. For people to form a bond with a chatbot, if that’s even possible, the bot has to exhibit some personality traits. That’s what I like about the Mitsuku bot: it doesn’t always respond perfectly. It can even insult you or withhold the information you want.
Of course, this indifference or non-compliance can turn into a nightmare. The Microsoft Tay bot was supposed to be more human-like, and in many ways it was a good experiment because Microsoft wanted to “unleash” the bot into the wild and see what would happen. Things did not turn out as planned: the bot spewed racist comments and hate speech.
Fortunately, Trico never starts blasting you with lightning from its tail and never flattens you like a pancake. The AI is utterly convincing, though. When my nephews played the game, they felt the creature was more like an animated character from a Disney movie, an intelligent being that happened to be part of a game rather than something scripted and choreographed. They believed the creature had choices to make, and that’s what helped advance the narrative. You are the one who does the convincing, so you feel like you accomplish something in the adventure.
In a chatbot, I’d prefer some of the same mystery. I’m not saying a chatbot should refuse to be helpful, but maybe it means a banking bot knows when to question your intentions. “Are you sure you want to pay that bill right now? You’re pretty low on funds” is more helpful than a chatbot that follows every command perfectly. With a travel bot, I’d rather use an intelligent assistant that “wanders” off to Expedia and finds an amazing deal on a hotel even though I’m trying to find a flight. Volunteering helpful information is a sign of intelligence; following every dictum is a sign that the developer took some shortcuts and skipped the AI.
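As a rough illustration of that kind of pushback, here is a short Python sketch of a hypothetical banking-bot handler that questions a payment when the remaining balance would dip too low. The function, the threshold, and the wording are all assumptions made up for this example, not any real banking API.

```python
from dataclasses import dataclass


@dataclass
class Account:
    balance: float


def handle_pay_bill(account: Account, amount: float, confirmed: bool = False) -> str:
    """Pay a bill, but push back when the remaining balance would be too thin."""
    low_funds_cushion = 100.0  # arbitrary threshold, purely for the example
    if account.balance - amount < low_funds_cushion and not confirmed:
        return ("Are you sure you want to pay that bill right now? "
                "You're pretty low on funds.")
    account.balance -= amount
    return f"Paid {amount:.2f}. Remaining balance: {account.balance:.2f}."


if __name__ == "__main__":
    acct = Account(balance=250.0)
    print(handle_pay_bill(acct, 200.0))                   # the bot questions the request
    print(handle_pay_bill(acct, 200.0, confirmed=True))   # the user insists, so it pays
```

The design choice worth noting is the `confirmed` flag: the bot never refuses outright, it just makes the user acknowledge the consequence before obeying.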
This is my problem with even the most advanced assistants, like Google Assistant or Amazon Alexa. They don’t really act like Trico. They obey our commands: closing the garage door, turning on the lights, ordering a pizza. That’s fine, but true AI would be more proactive and even argue with me. “John, you’ve had too much pizza this month” is maybe insulting, but then again, I’m trying to lose some weight. “John, don’t take that route to the coffee shop today; there’s a lot of congestion.” “John, I’m closing the garage door for you for security reasons.”
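A proactive assistant along those lines could be sketched as a set of rules that watch context and volunteer advice without being asked. Again, the rules, context fields, and messages below are hypothetical, invented purely to illustrate the idea rather than to describe how Google Assistant or Alexa work.

```python
from typing import Callable, Optional

# Each rule inspects a context dictionary and may volunteer a suggestion
# without being asked. The fields and messages are invented for this example.
Rule = Callable[[dict], Optional[str]]


def pizza_rule(ctx: dict) -> Optional[str]:
    if ctx.get("pizzas_this_month", 0) > 3 and ctx.get("goal") == "lose weight":
        return "John, you've had too much pizza this month."
    return None


def traffic_rule(ctx: dict) -> Optional[str]:
    if ctx.get("destination") == "coffee shop" and ctx.get("congestion", 0) > 0.7:
        return "John, don't take that route to the coffee shop today; there's a lot of congestion."
    return None


def proactive_suggestions(ctx: dict, rules: list[Rule]) -> list[str]:
    """Run every rule against the current context and collect unprompted advice."""
    return [msg for rule in rules if (msg := rule(ctx)) is not None]


if __name__ == "__main__":
    context = {"pizzas_this_month": 5, "goal": "lose weight",
               "destination": "coffee shop", "congestion": 0.9}
    for suggestion in proactive_suggestions(context, [pizza_rule, traffic_rule]):
        print(suggestion)
```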
To me, that’s when AI will become helpful: when it thinks outside of the box. You could argue that it is also when AI gets dangerous, but then again, the boy in The Last Guardian really needed to cross that ravine. Trico might be dangerous and unpredictable, but at least the creature seems alive.