Fifteen-year-old Sarafina, a female student in the capital city of Liberia, had a distressing problem at school: Her math teacher refused to give her a report card unless she had sex with him.
Every day at school, he would request sexual favors and touch her inappropriately. Embarrassed, Sarafina kept the issue hidden from everyone, even her parents, until her father overheard a sexually harassing phone call the teacher made to their home. Sarafina’s father successfully confronted the man and got the report card, but his daughter was reprimanded for reporting her teacher’s sexual advances and forced to move to another school.
1. Bots can uncover the truth
In Liberia, teachers enjoy high social status, while children, especially young girls, are raised not to speak out, fostering a culture of silence and tolerance. While Sarafina’s story sounds extreme to Westerners, her experience is painfully common yet largely ignored in her country.
Enter UNICEF’s U-Report, a social reporting bot that enables young people in developing countries to report issues in their community via SMS and other messaging platforms. U-Report polled 13,000 users in Liberia to ask if teachers at their schools were exchanging grades for sex. An astonishing 86 percent of reporters said yes.
Within a week of the U-Report discovery of the “Sex 4 Grades” epidemic, help hotlines around the country were inundated with reports of child abuse. Simply exposing a pervasive taboo inspired victims to speak up and reach out for help. Since then, UNICEF and Liberia’s Minister of Education have collaborated on a plan to stop the issue.
“U-Report is not just about getting questions answered, but getting answers back out,” explains Chris Fabian, co-lead of UNICEF’s Innovation Unit. “We get responses in real time to use the data for policy change.” With over 2.6 million U-Reporters worldwide and deep expertise building technology for developing economies, the U-Report team is uniquely positioned to tackle challenging social issues like violence against children, HIV/AIDS policy, climate change, and war and conflict.
2. Bots can raise awareness
Less than 50 percent of the population in Ethiopia has access to clean water, and only 21 percent enjoys proper sanitation services. Unfortunately, cold statistics like these rarely move people to take action.
That’s why Charity: Water teamed up with Lokai and AKQA to create Yeshi, a Facebook Messenger chatbot that humanizes the water crisis in Ethiopia. Yeshi is represented as a young girl in Ethiopia who walks 2.5 hours every day to the nearest reliable water source. She travels alone and straps huge plastic jugs to her back so she can bring gallons of water home to her family. You learn about her dreams of going to school and see a map of her journey.
Yeshi even asks you to send her a picture of where you live. “Wow! Our worlds are so different,” she remarks, before leaving you to continue her arduous walk.
The experience of “Walking with Yeshi” is undeniably emotional. Conversational experiences like this can be powerfully effective ways to convey the humanitarian challenges that face the global poor and inspire action.
Besides raising awareness, charities can also use bots and messaging platforms to raise critical funds. Charity: Water recently worked with Assist to let donors give directly from Facebook Messenger.
3. Bots can fight bureaucracy and inequality
Nineteen-year-old Joshua Browder is no typical teenager. The Stanford Computer Science undergraduate has single-handedly beaten over 160,000 unfair parking tickets with his bot, Do Not Pay. The sophisticated “robot lawyer” also helps tenants fight negligent landlords and the homeless apply for much-needed government support. Browder was inspired to help the most vulnerable segments of society acquire legal help they would otherwise never be able to afford.
“With parking tickets, the robot lawyer takes money from the government. However, so many government bureaucracies can be automated, like the DMV. Eliminating bureaucracy will actually save the government money,” points out Browder. “In the U.K., there is this really broken system where the government pays a lawyer to file an application back to the government for a homeless person to receive support. The government wastes so much money with the application process when they should just spend that money on houses.”
Browder’s vision for Do Not Pay extends far beyond simply fighting parking tickets and helping the homeless file for assistance. While some aspects of the law, like bankruptcy, are complicated and unintuitive, many legal processes can be modeled as logical decision trees by computers. Browder’s mission is to turn Do Not Pay into a legal bot platform where lawyers can identify aspects of the law that are automatable and create their own bots.
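To make the idea concrete, here is a minimal sketch of how a legal question can be modeled as a yes/no decision tree. The questions and outcomes are hypothetical illustrations, not Do Not Pay's actual logic:

```python
# Hypothetical parking-ticket appeal modeled as a decision tree.
# Each internal node holds a question; leaves hold an outcome.
APPEAL_TREE = {
    "question": "Was the signage at the location visible and legible?",
    "yes": {
        "question": "Was the ticket issued within the posted restriction hours?",
        "yes": {"outcome": "Weak case: consider paying the fine."},
        "no": {"outcome": "Strong case: appeal citing incorrect restriction hours."},
    },
    "no": {"outcome": "Strong case: appeal citing inadequate signage."},
}

def evaluate(tree, answers):
    """Walk the tree using a sequence of 'yes'/'no' answers; return the outcome."""
    node = tree
    for answer in answers:
        if "outcome" in node:  # reached a leaf early
            break
        node = node[answer]
    return node["outcome"]

print(evaluate(APPEAL_TREE, ["no"]))
# -> Strong case: appeal citing inadequate signage.
```

A lawyer's job on such a platform would be to encode the questions and outcomes for a given legal process; the traversal logic stays the same.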
Government bureaucracy is so pervasive that many other bots have cropped up to simplify civic matters. Against the backdrop of election cycle drama in the U.S., several voter registration bots — HelloVote, GoVoteBot, and VotePlz — have emerged to allow voters to skip onerous and error-prone paperwork and register simply through SMS and Facebook Messenger.
While competition between most Silicon Valley companies is fierce, voter registration is not a zero-sum game where startups squabble over limited scraps. According to Sam Altman’s VotePlz, only 54 percent of eligible young people were registered for the last presidential election, and 10 percent of millennials aren’t sure if they’re registered. Everyone is fighting for the same social goal: boosting voter turnout and helping citizens do their civic duty.
4. Bots can provide social and health services
With the threat of Zika looming over the Americas, knowing whether you’ve contracted the disease is critical to getting timely and adequate treatment. GYANT, a healthbot on Facebook Messenger, walks you through a questionnaire of symptoms to identify your likelihood of having Zika. Concerned users can get a personalized answer immediately, rather than waiting for a doctor’s appointment or ignoring the problem.
Many nonprofits and government agencies offer hotlines and support groups but face high demand and insufficient human staff. Some, like Samaritans, a suicide prevention group, are reportedly working on chatbots to offer faster response times and around-the-clock support. Such social support, whether given by a human or a bot, has a huge impact on people’s lives. Even giving senior citizens a robotic seal has been shown to reduce stress and improve socialization. Besides simply building mechanical robots to address the physical challenges of old age, social chatbots can be built to address emotional and mental needs.
In the healthcare industry, providers are often overwhelmed by the number of patients, most of whom need continued social and emotional support outside of their doctor and hospital visits. Sensely is a digital nurse bot with a human-like avatar that can save up to 20 percent of a clinician’s time by monitoring whether patients are dutifully following their prescribed regimens.
For mental health, conversational avatars like Ellie, a digital therapist developed by USC’s Institute for Creative Technologies, can interview patients and detect depression and anxiety by analyzing words, facial expressions, and tone. Professor Louis-Philippe Morency, co-creator of Ellie, says the bot cannot replace a human therapist, but it is a decision-support tool that helps to “gather and analyze an interaction sample” for doctors.
5. Bots can motivate the right actions
Can’t kick your nicotine addiction but too embarrassed to nag a friend to help every time you feel a craving? Public Health England experimented with a Facebook Messenger bot for their month-long Stoptober campaign to help smokers quit. Stoptober helped 500,000 people quit smoking last year, succeeding with an impressive 20 percent of the 2.5 million smokers who registered.
PHE marketing director Sheila Mitchell believes the addition of the Facebook Messenger bot as a support tool for smokers will increase the percentage of successful quitters. “The heart of the campaign is social,” explains Mitchell. “We found that the big numbers and responses come from social and that within this, Facebook is absolutely dominant.”
Where humanitarian bots fall short
Addressing social issues requires emotional sensitivity, a critical skill that bots universally lack. LawBot is a legal bot created by Cambridge University students to help users in the U.K. understand the complexities of the law and identify whether a crime has been committed. Users can engage with the bot to report rape, sexual harassment, injuries and assaults, and property disputes.
Unfortunately, the bot uses a strict checklist to assess if a “crime” has been committed. If your report of sexual harassment doesn’t fit within preset criteria, the bot quickly responds with a dismissive statement. Despite good intentions, the emotionally insensitive LawBot is unable to assist with sexual harassment if the harassment does not fit within a narrow set of legal technicalities.
According to the Guardian, over half of women in the U.K. have been sexually harassed at work. Even if the offending actions don’t fit into a neat legal box of being a “crime,” unwanted aggressions can cause lasting psychological damage and unnecessary suffering. Additionally, many corporate cultures discourage reporting to avoid expensive legal battles or PR nightmares.
When a bot bluntly tells a potential victim of sexual harassment that “no crime was committed here” without having a detailed understanding of the situation, the results can be counterproductive and further discourage victims from speaking up. And even if LawBot deems that a crime has occurred, the bot’s only response is to send you the address of the nearest police station.
While artificial intelligence technologies have not yet evolved to allow bots to respond with emotional acuity to difficult situations, a better solution for LawBot would be to connect distressed users with sympathetic hotlines, support groups, or expert human lawyers once the conversation has exceeded the bot’s domain expertise.
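The fallback pattern described above is straightforward to sketch. The checklist items and referral resources below are hypothetical placeholders, not LawBot's actual criteria; the point is that a report falling outside the bot's checklist triggers a handoff rather than a dismissal:

```python
# Hypothetical sketch of a "escalate, don't dismiss" fallback.
# Factors the bot's checklist recognizes (illustrative, not legal advice).
CRIME_CHECKLIST = {"physical contact", "explicit threat", "repeated incidents"}

# Human resources to hand off to when the bot is out of its depth.
HUMAN_RESOURCES = [
    "a sexual harassment helpline",
    "a local legal aid clinic",
]

def respond(reported_factors):
    """Return the bot's reply for a set of reported factors."""
    if CRIME_CHECKLIST & set(reported_factors):
        return "This may constitute a crime. Here is your nearest police station."
    # Report falls outside the decision tree: refer to humans instead
    # of declaring that no crime was committed.
    return ("I can't fully assess this on my own, but support is available. "
            "You can reach: " + "; ".join(HUMAN_RESOURCES))
```

The difference between this and a bare checklist is a single branch, but for a distressed user it is the difference between a referral and a door slammed shut.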
Help bots develop empathy and compassion
Do Not Pay’s Browder cautions that “the really big challenges are ones that require compassion and human judgment. If someone is to be granted bail, there is no set criteria. Bots work really well when there is a clear decision tree.” When bots seek to address complex issues, like rape or sexual harassment, even well-intentioned efforts like LawBot risk backfiring.
Many technology powerhouses are working to give bots the emotional sensitivity needed to make complicated judgments that can’t simply be captured with decision trees. As mentioned earlier, digital therapists like Ellie factor in facial expressions and vocal characteristics. Amazon is working to make Alexa, the cloud AI that powers the Amazon Echo, detect and respond to your emotions. SRI International, the research lab that created Apple’s Siri, is building new virtual assistants that emote just like you do.
“Humans change our behavior in reaction to how whoever we are talking to is feeling or what we think they’re thinking. We want systems to be able to do the same thing,” says William Mark, head of SRI International’s Information and Computing Sciences Division.
While bots have a long way to go to reach their potential, even the emotionless bots of today have changed the world for the better, from revealing epidemics of violence against young girls (U-Report) to automating government bureaucracies like homeless status applications (Do Not Pay). Bots can tell stories to help you empathize with humanitarian crises (Yeshi), assist your healthcare providers (GYANT, Ellie, Sensely), and help you quit your worst habits (Stoptober).
We can’t wait to see what the empathetic and compassionate bots of the future will do.
This article appeared originally at Topbots. Used with permission.