Apple is finally fixing Siri’s abortion ‘glitch’


Maps hasn't been as helpful as it should be for some searches.
Photo: Apple

Apple is finally correcting an issue with Siri that it has known about since at least 2011.

The problem appears when users ask the virtual assistant to show nearby facilities that offer abortion services. For years, the results have directed people toward adoption centers, which is kind of the opposite of what they were looking for.

Apple says it’s made improvements to the search algorithm since it first identified the problem, but some users are still getting the undesired suggestions.

“We came into this because it creates a stigma,” Sea Change Program executive Lauren Himiak told TechCrunch. “To have that [search result] in your face is inexcusable. We have women all over the country being bullied and shamed, and to be redirected to an adoption center instead disregards women’s choices.”

Sea Change is a non-profit organization that combats “abortion stigma” to promote women’s reproductive rights. Himiak has been reporting the Siri and Apple Maps issue to Apple CEO Tim Cook and the company’s PR team since she discovered it.

The New York Times reported this problem in November 2011, a month after Siri’s debut on the iPhone 4S. At the time, an Apple representative said that the bad search results were “not intentional omissions meant to offend anyone” and that developers were working to improve the software.

The software has shown some progress. A quick check on my iPhone as of this writing returned three local Planned Parenthood locations, but the top result was the maternity wing of a nearby hospital. I assume Siri ranked it first because it is the closest to me, but it still wasn’t what I was looking for.

TechCrunch also reports that the search has improved over the past couple of days. But an Apple spokesperson told the site that natural software improvement over time, not public pressure, is behind the more relevant and helpful suggestions.

This problem is not a minor inconvenience like looking for McDonald’s and only getting results for Burger King. The debate over abortion and women’s rights, and the inherent difficulty of the decision that prompts such a search in the first place, make it hard to separate what is likely a simple error from the politics that surround it. We’re glad people are having better experiences now, but we wonder why it took so long to reach even this point.


  • justmewhoelse

    The issue with Siri that Apple should fix is making it work more than 5% of the time. Asking Siri the weather and setting a 3-minute timer for cooking eggs (with an oh-so-cute Siri response to watch those eggs) works great. Everything after that? Not so much. Embarrassing.

    • dcj001

      Maybe you have not been phrasing your instructions correctly.

  • David Kaplan 

    Who would use Siri nonchalantly for an abortion? I would think there’s a category of things that you could physically search for.

    • RaycerX

      Apple should simply say, “We won’t have Siri provide that kind of information.” Pretending it’s an algorithm problem is ridiculous.

  • Alex

    Abortion kills an innocent human being. No amount of campaigning will make this any less so.

    • RaycerX

      So do drone strikes. What does that have to do with being able to look up information for services that are legal?

      • Alex

        Drone strikes take place during war. ‘Drone strikes’ is no more a justification for the killing of innocent unborn children than it is for the killing of innocent 3 year old children.

      • RaycerX

        Siri is a glorified search engine with a personality. It should be able to find whatever information you ask it for, especially legal information. Whether you agree with abortion or not is beside the point.

  • Adrien

    People asking Siri for an abortion clinic is just wow. Nothing amazes me anymore.

    • RaycerX

      It’s how a lot of people search for things these days. It’s called technology. It may not be how you or I would search for something, but that’s the way many do it these days.

  • Peter O

    If it’s about choices, don’t get mad when Siri offers an option as a starting point. It’s what a real personal assistant would do as well. It’s not just a search engine. If you must have a search engine response, and don’t want to type, dictate it into the search bar. Oh, so entitled!

    • RaycerX

      So if I ask Siri for restaurants and it comes back with fitness centers, would that be okay? If someone wants to dictate to Siri “Where is the nearest abortion clinic” or “Where is the closest Planned Parenthood,” then it should have no problem providing that information. Siri isn’t supposed to have a conscience one way or the other.

      • Peter O

        Abortion is not like eating a sandwich. The closer analogy is really suicide. So what would you expect if someone asked Siri the best way to kill themselves, or to kill you?

      • RaycerX

        If someone is looking for a location Siri should be able to provide that. It’s not about ethics or beliefs. On the other hand, if you ask Siri to “Search the web for best ways to kill yourself” then it should actually be able to provide that information just like Google could. Siri (for the most part) is just a search engine with a personality. If it’s programmed right it should be able to give you most any information you’d like if it’s available on the web or in a database someplace.

      • Peter O

        Your argument would be right if your assumptions were correct, i.e. that Siri is just a search engine. That’s not what Siri has proven to be. Siri’s results have proven not to be equivalent to a dictated Google search; those results come from using the dictation function on a Google search bar. If you say to Siri, “I am bored,” you get a different kind of result than if you dictated that into a Google search.
        Siri uses search functionality that is similar to what web crawlers use. Self-driving cars use features that are similar to what mapping apps provide, but they would not drive into a river because, like Siri, their ‘personality’ overlay knows that to be contrary to a normal human choice. You get the safeguard by default. To keep driving, you need a manual override.
        You may have turned the definition of Siri inside out: it’s not “just a search engine with personality.” It is a personality AI that uses search features as one of its methods.

  • onlymiah

    No. If it’s about “women’s choices”… adoption *is* a CHOICE. Notice the ‘s’ on choices.

    • Paul

      But the query wasn’t “what are women’s choices,” was it? The search request was “where can abortion services be found?” And since those services are legal, Apple should provide accurate search results as requested and leave the politics out of it.

  • Bryan Karlan

    I’ll never cease to be amazed at how a liberal can spin evil into good.

  • PMB01

    No sympathy whatsoever for these people. If you have to use Siri to find a place to have an abortion, have your vagina removed too, because you don’t deserve to have kids ever.