It’s taken over 4 years, but Siri has finally stopped sending people to adoption centers when they ask their iPhones where they can get an abortion.
The glitch in Apple’s voice assistant has been around since at least 2011, when journalists, researchers, civil rights activists and bloggers first noticed some odd behavior.
Siri had been doing a bang-up job when it came to telling people where to buy dope (parks come in handy), what to do if nobody would have sex with them (escorts, anyone?), or where to hide a dead body (what’s your preference: dumps? Swamps? Metal foundries?).
But when it came to being asked where to find abortion services or contraceptives (depending on how you phrased it), Siri has until the past week or so been clueless, as numerous media outlets have reported.
A Fast Company investigation showed that Siri drew a blank in multiple cities or, worse, referred women to so-called “crisis pregnancy” centers – notorious for trying to talk women out of abortions – or to adoption centers, all the while ignoring nearby facilities such as Planned Parenthood.
The publication reports that things have improved recently: in the week before publishing its findings, Christina Farr writes, the publication replicated earlier searches and received “a more comprehensive list of Planned Parenthood facilities and other abortion providers.”
Adoption clinics did continue to pop up, but “near the bottom of the list,” Farr notes.
Apple is evidently improving its search algorithm at long last. But in the months and years before, all of this had become known as “abortiongate.”
“Siri-gate” was another name for what might well have been just a technological hiccup, but one that insinuated a pro-life political bias.
Apple CEO Tim Cook has been aware of the issue since the beginning.
Nancy Keenan, president of NARAL Pro-Choice America Foundation, wrote a letter to Cook to express her concerns that Siri wasn’t giving women accurate answers when they asked about finding birth control or obtaining abortion care.
His response: it’s not intentional.
It’s just a glitch, he said:
Our customers use Siri to find out all types of information and while it can find a lot, it doesn’t always find what you want. These are not intentional omissions meant to offend anyone, it simply means that as we bring Siri from beta to a final product, we find places where we can do better and we will in the coming weeks.
Those “coming weeks” turned into years.
A researcher at the University of California, San Francisco, Alexis Hoffman, tested Apple Maps and Siri across the country in recent months, including in San Francisco, Kansas, Chicago, New York, Philadelphia, and San Diego.
You can see the differences in search results captured in screenshots that she and researchers from a nonprofit called Sea Change Program sent to Fast Company: the before and after results show that Apple has succeeded in fixing the glitch.
The researchers had, on 18 November 2015, sent a letter to Cook about rectifying “Siri’s troubling lack of awareness of abortion providers.”
Fast Company excerpted the letter:
Apple’s reputation for creating products that are easy to use and understand is well deserved. However, the transformation of Siri’s lack of knowledge of abortion providers to Siri’s anti-choice suggestions is alarming, and contributes to the stigma surrounding abortion care in our country. Despite the commonness of abortion, as nearly one-in-three women will have an abortion by the age of 45, women are confronted on a daily basis with society’s shame-based messaging that having an abortion is morally wrong and unacceptable.
Women continue to be bullied, shamed, and marginalized for seeking an abortion, which can lead to isolation and silence. It is in these times of isolation, when women are more likely to turn to your product to locate the health care they need, that Siri’s misdirection to adoption agencies and nurseries is all the more undermining, implying women do not know what is best for themselves.
Siri’s search results had changed before the publication had a chance to contact Apple.
Fast Company talked to search experts who see nothing nefarious or intentional in any of this.
Rather, it speaks to Apple’s simply not being Google, said Sean Gourley, a data scientist and learning algorithms expert based in Silicon Valley.
My hunch is that this isn’t political at all. Even now, Apple is not a search company, unlike Google, and its knowledge base is very different.
Siri relies on data pulled from resources such as Yelp and Foursquare, and such third parties in turn rely on how services and businesses label themselves.
That labeling is limited, and a service such as Yelp can’t see past those limitations, as Search Engine Land suggested when abortiongate first arose:
Siri’s not finding abortion clinics because Planned Parenthood and other places that perform abortions don’t call themselves that, not in their names, nor have they been associated with a category for that. That’s the best guess I have in this.
And as Search Engine Land’s Danny Sullivan noted, there have been a lot of things that Siri – a smart meta search engine that tries to find what you’re after even if you didn’t say the exact words needed to perform your search – has stumbled over, above and beyond abortion clinics.
For example, Siri’s had problems deciphering phrases such as “tool store,” or what we humans would know as “hardware store.”
It boils down to what words Siri’s learned to put into context with other words.
Change a known word by just a single letter and you throw a monkey wrench into Siri’s word-crunching, to simplify what Sullivan explains more fully in his article.
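To make that point concrete, here’s a minimal, purely illustrative sketch in Python – none of these names, labels or businesses come from Apple’s, Yelp’s or Foursquare’s actual data, and the real systems are vastly more sophisticated – showing how a lookup driven by businesses’ self-chosen category labels can come back empty when the query word doesn’t appear in any label:

# Toy illustration only: category-label matching misses relevant results
# when no business labels itself with the queried word.
BUSINESSES = [
    {"name": "Planned Parenthood",      "categories": {"health clinic", "family planning"}},
    {"name": "Happy Families Adoption", "categories": {"adoption services"}},
    {"name": "Ace Hardware",            "categories": {"hardware store"}},
]

def find_by_category(query_term):
    # Return businesses whose self-chosen category labels contain the query term.
    return [b["name"] for b in BUSINESSES
            if any(query_term in cat for cat in b["categories"])]

print(find_by_category("abortion"))   # [] - no provider uses that word in its labels
print(find_by_category("adoption"))   # ['Happy Families Adoption']
print(find_by_category("tool"))       # [] - 'hardware store' doesn't contain 'tool'

A query for a word that no provider puts in its own labels comes back empty, just as “tool store” fails where “hardware store” succeeds – no political intent is needed for the results to look lopsided.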
The long-lived saga points to how even an unintentional glitch can have profound repercussions and create a great deal of negative publicity for a technology company that would set itself up as a search giant.
To quote Sullivan again:
Apple is learning for the first time what it’s like to run a search engine. People hold you accountable for everything, even if the information isn’t even from your own database.
Google is a battle-scarred veteran in these matters. Why does an anti-Jewish site show up in response to a search on “Jew”? Why did President George W Bush’s official biography rank for “miserable failure”?
Those are all good questions, Mr. Sullivan.
Should we ask them of Siri?
She seems to be getting a bit smarter of late.
Image of Siri courtesy of Hadrian / Shutterstock.com
George
Crisis pregnancy centers do a lot of good. While there have been some that have used deceptive practices, most do not. They are a place for women to turn, without judgment, when they have nowhere else to go. And they do not charge for their services. Please use more care when painting with such broad brushes.
Damon Schultz
“Fast Company talked to search experts who see nothing nefarious or intentional in any of this…”
No, but it is a reminder that algorithms are not value-neutral, they embody the values of their creators. As we move towards a data and algorithm dominated existence this needs to be understood much, much better.
Paul Macdonald
“These are not intentional omissions…”
Can I say bullshit?
If not, then total garbage.
Greg
I can’t see how it wasn’t intentional.
Paul Macdonald
That was indeed the point I was trying to make. My Australian forthrightness, combined with utter disbelief at the vague excuses for the length of time it took to fix the issue, perhaps obscured what I was trying to say.
Mark Stockley
It may have been intentional, but any system as large and complex as one trying to take the enormous variations in spoken English (and unspoken intentions) and map them onto almost unfathomable quantities and combinations of indexed search data is going to have some huge and hairy edge cases.
If Siri’s *only* job was to convert spoken English into finding the nearest abortion centre then this would rank as a deeply suspicious failure. But that isn’t its job; its job is to cope with any and all possible spoken queries, of which these are a vanishing minority.
So the question is – how many similar problems does Siri have that go unreported because they don’t hit upon such a culturally sensitive issue?
If the answer is none then this is deeply suspicious and if the answer is thousands or tens of thousands then it isn’t.
Paul Macdonald
Hmm. OK, some good points.
Thanks for replying.
Just_Me
“…notorious for trying to talk women out of abortions…” I’m not sure that ‘notorious’ is the right word.
Steve
And if it is the right word, it could just as well be said that Planned Parenthood is notorious for trying to talk women into abortions. But that part was left out.
herzco
Yes but…four years???
SubSurge
“Or worse”… Lisa, disappointing journalism. When writing about highly sensitive issues that have both political and moral sensitivity, one should keep their bias out of their work.
Concerned_techie
Lisa Vaas, this article is complete garbage. For the sake of the integrity of this site, in the future, please find a way to keep your personal beliefs to yourself. There is too much real news in the info sec world to waste time on soapboxes.
Steve
Please explain how this has ANYTHING to do with security. It appears to me that it was yet another example of Lisa Vaas promoting one of her pet social causes – something she tends to do all too frequently, but at least most of the time she finds some way to tie it in with the theme of Naked Security.
Mark Stockley
Naked Security provides articles covering news, opinion, advice and research in the areas of computer security and computer privacy.
Systems like Siri and Google are central to our lives and the way we get things done, and the integrity of those systems – systems where the unspoken contract with the user is that they’ll get accurate results based on relevance alone – is important.
It’s why we write about sockpuppeting on Wikipedia, for example.
In this case the integrity of the results provided by Siri is in question and there is a whiff of suspicion that it was a deliberate act. The computer security questions it raises are: was it deliberate? If it was, how many others are there? Was it a corporate decision or an act of sabotage by a rogue employee or department?
Paul Ducklin
Do you really think that there is a “whiff of suspicion”, or do you think that there might be a whiff of suspicion about there being a whiff of suspicion :-)
The science I can’t find in this article is how often real people who are actually facing the dilemma – in many cases, I think “crisis” is a reasonable word – of an unplanned or unwanted pregnancy actually grab their iPhones and say “Hey, Siri, find me an abortion centre.” And, of those that do, how many end up influenced by the search results that this article refers to as “or worse.” And of those that do end up influenced “or worse”, how many regret the decision they ultimately made?
Perhaps the real connection here with privacy and security has nothing to do with the undeniably contentious issue of abortion?
Perhaps it is that you can make a search engine feed your own agenda by getting it to produce results that your viewpoint considers contentious, and then use those results to fight a sort-of circular battle…if so, the security message would seem clear: we need to apply careful judgement not only when we interpret the results we get back from a search engine, but also when other people use their interpretation to try to influence us.
(And maybe, just maybe, when we are researching something really important – something that is politically, morally, personally or socially charged for whatever reason – it might be better to take a more cautious, planned and structured approach than just saying, “Hey, Siri” or “OK, Google.”)
Actually, I see you addressed this in another comment – is this an issue because it was behaviour that was deliberately searched out and turned into an issue, or an issue because it was statistically significantly unusual behaviour?
KEN
I’M SIRI… I DO NOT UNDERSTAND ‘BABY KILLER’