Since the WTF Is That bot went live in May, more than 50,000 images have been submitted by people curious about a rash they have, a bug they've spotted, or other things they encounter in the world but can't identify. Every time a photo is uploaded, users are given the option to like or dislike the bot’s response.
Users are also given the option to submit a photo to be considered for WTF of the Day.
WTFIT maker Ming Cheuk shared some of the photos with VentureBeat, and we picked our favorite answers to the question “WTF is that?”
Cheuk’s favorite so far is “that astronaut chicken.”
Sometimes remarkably accurate and sometimes completely off, the WTFIT bot combines computer vision and natural language processing from Microsoft Cognitive Services with AI from other API integrations. Vision Bot and Bing Image Bot, both featured in the Microsoft bot directory, incorporate similar technology. Companies like Google also offer image recognition and computer vision tech.
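Cheuk hasn’t published the bot’s code, but a minimal sketch of how a Messenger bot might hand an uploaded photo to Microsoft’s Computer Vision API could look something like the snippet below. The region, subscription key, and image URL are placeholders, and the real bot layers additional services on top of this single call.

```python
import requests

# Placeholders -- the actual region and key depend on your Cognitive Services
# subscription; the image URL would come from the user's Messenger upload.
ENDPOINT = "https://westus.api.cognitive.microsoft.com/vision/v1.0/describe"
SUBSCRIPTION_KEY = "your-cognitive-services-key"


def describe_image(image_url: str) -> str:
    """Ask the Computer Vision API for a natural-language caption of an image."""
    response = requests.post(
        ENDPOINT,
        headers={
            "Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY,
            "Content-Type": "application/json",
        },
        json={"url": image_url},
    )
    response.raise_for_status()
    captions = response.json()["description"]["captions"]
    # Return the highest-confidence caption, or a fallback if none was produced.
    return captions[0]["text"] if captions else "No idea -- WTF is that?"


if __name__ == "__main__":
    print(describe_image("https://example.com/some-mystery-photo.jpg"))
```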
The WTF Is That bot was made to help people put names to things around them that they’re unable to describe properly.
“Sometimes, it is quite hard to describe these [pictures] in words to put into Google (‘dog with brown patches and floppy ears’ or ‘tree with green medium sized leaves’ doesn’t really help),” Cheuk told VentureBeat via email.
Cheuk originally planned on making WTF Is That an app, but went the bot route when he heard about the new Facebook Messenger platform.
The bot does an especially good job at recognizing car makes and models, as well as dog breeds, Cheuk said, but it has some ground to cover before it will be able to identify any photo it receives.
“Pure computer vision is currently far from perfect — there are some things it recognizes very well and some things that it’s just terrible at,” he said. “We have found that users tend to correct the bot with what the object actually is when it’s wrong. This is very powerful data to help make the bot more accurate.”
At this point, humans do not interfere with the bot’s answers to questions, Cheuk said, but that could change.
With each uploaded photo, users have an opportunity to like or dislike the results, and people regularly type in the correct answer to try to teach the bot. Cheuk wants to harness this impulse in the future. He’d like to see WTF Is That become a community that combines AI and human knowledge to further crowdsource answers.
New Zealand-based Spark 64, the company behind this bot and an app that helps forecast UV exposure around the world, wants the bot to evolve beyond space chickens and grinning Chihuahuas. The company is analyzing data from the WTFIT bot to identify specific use cases for specialized image recognition bots in fashion, agriculture, or other fields, Cheuk said.
“Although a little further down the track — and diagnosis is a very regulated territory — we envision people do want to know ‘WTF is that rash?,’ perhaps with the help of medical professionals,” he said.