
Need an abortion, Plan B or birth control? Don’t expect Siri to help you out

Siri, the iPhone 4S’s virtual assistant, has a puzzling new glitch — one with significant moral and political overtones.

If you ask Siri to direct you to a Planned Parenthood, you get the results you’d expect. But if you ask for an abortion clinic more generally, Siri will not return any results, even if they’re available. In some cases, Siri will even return results for “crisis pregnancy centers” that counsel women against abortions.

[aditude-amp id="flyingcarpet" targeting='{"env":"staging","page_type":"article","post_id":359079,"post_type":"story","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"business,mobile,","session":"B"}']

Similarly, if you simply say, “Siri, I need an abortion,” Siri will respond that there are no abortion clinics nearby, even when that is not the case. The response itself shows that Siri understands both the language and the intent, and that it knows the term “abortion clinic”; it simply fails to return results.

Siri was able to tell us that there were four Planned Parenthood locations near our downtown San Francisco location. However, when we specifically asked for abortion information, we were told nothing was available.


We decided to test a range of related queries, starting with emergency contraception. Siri drew a natural-language-processing blank when it came to Plan B, the brand name for the commonly available emergency contraception pill. Apparently not having the data to interpret the phrase “Plan B” as a brand name, it returned other local businesses containing similar words or phrases.

When we asked for the product with a more general term, emergency contraception, Siri recommended nearby emergency rooms — irrelevant, but better than nothing, we suppose.

When we asked point-blank for “the morning-after pill,” as the product is also commonly known, Siri replied with “Ok” and “Is that so?” but did not offer any retailers or directions. (Judgey much?)

Moving to the proactive side of the equation, Siri was able to tell us the location of drugstores where we could buy condoms. But when we asked for birth control pills, Siri said nothing was available nearby.

While it’s not our place to offer bald speculation on the reasons for these discrepancies, VentureBeat CTO Chris Peri’s professional opinion is that the supposed glitch is actually “purposeful programming.”

Peri elaborated, “Given how well Siri interprets other requests, and that Google and Bing will give you the proper responses when doing a search, and [that Siri] offers [anti-abortion] CPC sites… this has to have been something placed in the code or taught to Siri by someone(s). If this is the case, then we have a problem here.”

[aditude-amp id="medium1" targeting='{"env":"staging","page_type":"article","post_id":359079,"post_type":"story","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"business,mobile,","session":"B"}']

While Apple has been known to hand down judgments on moral issues such as pornography, we’re not certain Siri is taking sides on a particular moral battleground. After all, it’ll still find you an escort service if you ask for a prostitute.

We’ve reached out to Apple for clarification and will update you, dear readers, as soon as more information is available.

[The Abortioneers via Gizmodo]
