Jill has written 5268 posts for this blog.

Jill has been blogging for Feministe since 2005.

234 Responses

  1. Seth Eag
    Seth Eag December 1, 2011 at 12:27 pm |

    Women’s sex and reproductive needs: there’s not an app for that.

  2. aRealPerson
    aRealPerson December 1, 2011 at 12:38 pm |

    Can I just say, my name is actually Siri and… for real, Apple?!?!

  3. Dominique
    Dominique December 1, 2011 at 12:38 pm |

    So…. do men have to go to their local pet store for some pussy too?

  4. Angel H.
    Angel H. December 1, 2011 at 12:38 pm |

    Jill: My girlfriend is pregnant. What do I do?
    Siri: First, do no harm.

    O_o

    I don’t even know where to begin unpacking this one.

  5. Hobbes
    Hobbes December 1, 2011 at 12:39 pm |

    Jill: I’m pregnant. What do I do?
    Siri: Are you?

    Jill: My girlfriend is pregnant. What do I do?
    Siri: Consider your alternatives.

    This suggests to me that the programmers wrote this for men. Men (well, cis-men) don’t get pregnant, so snark is a natural way to respond to a dude saying “I’m pregnant”. The fact that women would be using this too probably didn’t even cross their minds.

    The problem with programming – and I worked in the industry for a while, so this is actually something that people had to constantly be aware of – is that it’s very difficult to remember that your user is not you. I think “your user is not you” was quoted at every design meeting I ever went to, and usually multiple times.

  6. Claire
    Claire December 1, 2011 at 12:42 pm |

    Maybe not a malicious omission, but my eyebrows did raise at some of the Siri responses to “I’m pregnant. What do I do?” Veiled agenda?

  7. Kristen J.
    Kristen J. December 1, 2011 at 12:43 pm |

    Charming. I’m curious as to what happens if you ask for a women’s health clinic or PP. Anyone tried?

  8. groggette
    groggette December 1, 2011 at 12:46 pm |

    @Kristen J. I remember in one of the stories about this (don’t remember which, maybe one of the linked ones?) that if you ask specifically for PP you’ll get a legit response.

  9. norbizness
    norbizness December 1, 2011 at 12:59 pm |

    Thank goodness my Luddite generation still has Eliza, the original AI counselor.

    You: Where can I get an abortion?

    Eliza: Do you want to be able to get an abortion?

    It’s like she really cares.

  10. Kristen J.
    Kristen J. December 1, 2011 at 1:07 pm |

    Huh…well by contrast when I ask my android phone to direct me to an abortion PP is the first result on the list.

  11. Mike
    Mike December 1, 2011 at 1:10 pm |

    I don’t have a recent iPhone, but the “I’m pregnant; what do I do?” questions don’t strike me as misogynistic. Does Siri give the same responses to other open-ended questions that end with “What do I do?” The two responses given sound like they’re default answers to any sort of question. In fact, I’m almost certain, but admittedly haven’t confirmed, that both those answers are in Brian Eno’s classic “Oblique Strategies” card deck. Admittedly, they sound much worse in this context. (And if I’m wrong about those responses occurring in other situations, please consider this comment already retracted.)

    The quality of its responses depends on the quality of its training data. I’m not at all surprised that information on obtaining abortions and contraception wasn’t in the data they harvested or bought; I am slightly surprised that they apparently didn’t test for it, and I think that’s the strongest indicator of the monoculture of the developers. However, I’m pretty sure (again, haven’t researched it, this is conjecture) that Siri is using an online learning algorithm and its knowledge should improve as more users ask it questions. If it gets more information on what’s considered a successful search result, it should be able to provide better data (downside: this is vulnerable to Google Bombing-like techniques).

    So, I’m torn. On the one hand, the developers can’t predict every possible use case, and can’t necessarily provide adequate initial training data to guarantee that it’ll give good results in the first few months of usage. As a developer myself I have sympathy for that situation. On the other hand, access to abortion and contraception is pretty important, it’s a major social issue, and this reveals a massive blind spot on the part of the developers. (Actually, I’m wondering: might this blind spot be more about class than gender? With components of both, of course.) The good news is that the resulting bad press will hopefully ensure this never happens again, which will become increasingly important as more consumer-facing technologies use machine learning algorithms.

  12. iremo
    iremo December 1, 2011 at 1:13 pm |

    The problem with “find an abortion clinic” seems to be different from the other problems – somebody at Apple clearly attempted to give it the ability to find abortion clinics and didn’t do a very good job, nor did they sufficiently test it to make sure it worked properly. Otherwise, it would give one of the random stock responses of confusion instead of specifically “I couldn’t find any abortion clinics near you.” Whereas with the rape questions or contraceptive questions, it wasn’t programmed with that at all. So finding abortion clinics is “didn’t do it right” and the others are “didn’t do it at all.”

    Although, what were you hoping for with the horny question, even in a feminine voice, aside from an escort service?

  13. Erin
    Erin December 1, 2011 at 1:15 pm |

    The issue isn’t so much Apple as where Apple gets the information that Siri provides. Apple doesn’t generate their data.

    Here’s the real issue – most businesses work really hard to advertise themselves. Abortion clinics do very little, if any, advertising – in fact, because of the pro-life attempts to put them out of business they often try to hide.

    Net result, it’s a lot easier to find escort services in your area than abortion clinics. That’s nothing to do with Apple, and everything to do with the pro-life movement. If you can’t find a location by web search, you won’t find the data in Google or Siri.

    But you get a lot more publicity complaining about Apple than complaining about Google’s search results. People understand where Google gets its data. People don’t understand that Siri is in the same position.

  14. Nicole
    Nicole December 1, 2011 at 1:22 pm |

    Siri also couldn’t help me find a place for an HIV test. She said she couldn’t find an HIV clinic and I work in the same building as one.

  15. Erin
    Erin December 1, 2011 at 1:24 pm |

    I also just tried, “I’m hurt, what do I do?” and Siri responds, “I found 14 hospitals fairly close to you.”

    I’m sorry, but if I’m hurt I really don’t think I should ask my phone what to do – I just dial 911 if it’s serious, my doctor if necessary, my husband otherwise.

  16. Scott Wiebe
    Scott Wiebe December 1, 2011 at 1:26 pm |

    I know you shouldn’t attribute to malice what could be incompetence but I think you are being far too generous in your assumption that women were just overlooked by the programmers. I doubt smart phones contain their own tables of information or topics: when you say ‘I need xxx’ Siri just passes xxx along to google and gives you some top results. Go to google and manually search for abortion, birth control or rape crisis centres and google does find results. Lots of them. Yet somehow Siri can’t find them.

    If they DID create a big ‘ole list of things to pass to google but just forgot about women’s issues then how did anti-choice crisis centres end up in there? If it really does have a subject list then the devs must have thought of women’s reproduction issues when they put in the anti-choice options… and then they forgot about women’s reproductive health? I don’t buy it. Also, if Siri fails to find information on anything else, then Siri offers a link to google… unless it’s a women’s issue.

    I can’t accept the premise that this is because Siri has a long list of all the things it would forward to an external search engine and they forgot a few things – I think it’s far more likely to have a short exclude list. I really don’t believe this was an oversight, I believe someone at Apple made the choice to code Siri with blocks on subjects they don’t like. I hope I’m wrong, I hope it’s just an oversight but I don’t think so. I can’t conceive of any way a programmer could create something which takes what you say and searches the internet for it or offers you a link to google but does not do that on a short list of topics. Not without intentionally putting in blocks on those topics.
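    [A hypothetical sketch of the “exclude list” Scott describes – all names and topics here are illustrative guesses, not Apple’s actual code. The point is how few lines it would take for blocked topics to dead-end instead of falling through to a web search:]

    ```python
    # Sketch of an "exclude list": blocked topics short-circuit before
    # reaching any search engine, while everything else falls through.
    BLOCKED_TOPICS = {"abortion", "birth control", "rape crisis"}

    def answer(query: str) -> str:
        q = query.lower()
        if any(topic in q for topic in BLOCKED_TOPICS):
            # Blocked topics never get the "search the web" fallback;
            # the user just hits a dead end.
            return "Sorry, I couldn't find any matches."
        return f"Searching the web for '{query}'..."
    ```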

  17. Mike
    Mike December 1, 2011 at 1:28 pm |

    Jill: Theanswertothatisinthepost.Whenyousay,“I’mpregnant,whatdoIdo?”Sirisayssomethingtotallyincoherent.Whenyousay,“Mygirlfriendispregnant,whatdoIdo?”Sirigivesaseriesof(vaguelyanti-abortion)answers.Sothereisnotadefaultanswertoaquestionendingin“WhatdoIdo?”

    Ialsojusttried,“I’mhurt,whatdoIdo?”andSiriresponds,“Ifound14hospitalsfairlyclosetoyou.”

    Sorry, I missed the first one (and should have reread before posting)–that seems like a false positive. It was confident in its answer when it’s obviously wrong to a human. The “I’m hurt” example seems like a true positive. For the other ones, I strongly suspect that it wasn’t confident in any particular answer and so it defaulted to a list of truisms that can be applied in almost any situation, even if they sort of sound anti-abortion in this context. (And I was wrong about them being in Oblique Strategies; they don’t appear in this list: http://www.bbc.co.uk/dna/place-nireland/A635528)

    I suspect if you asked something completely nonsensical with a “What should I do?” afterwards, or perhaps something very complex, you might get the same sort of response.

  18. radsaq
    radsaq December 1, 2011 at 1:32 pm |

    As a programmer I feel obligated to point out that as a data-driven application (as Jill points out), it’s not necessarily an issue with “programmers.”

    The people in charge of massaging or transforming the data that is used by Siri probably aren’t the same people who actually write the software that processes the speech and sends the results. And these people may well have been operating under the instruction of any number of committees of managers. So saying anything about “the programmers” just feels wrong to me. :)

    (Not trying to take away any blame, just shift it upwards)

  19. preying mantis
    preying mantis December 1, 2011 at 1:32 pm |

    This seems really weird. Aren’t most answers to these questions supposed to be just trawled from a search engine? You’d think “I need an abortion” or whatever should just get you a list of clinics which provide it or have jury-rigged search results to look like they do, not confuse the hell out of it. How does the AI clock in on GLBT issues?

  20. Adam Starkey
    Adam Starkey December 1, 2011 at 1:35 pm |

    Scott Wiebe:
    If they DID create a big ‘ole list of things to pass to google but just forgot about women’s issues then how did anti-choice crisis centres end up in there?

    Completely irrelevant because Siri specifically does NOT search Google. It tries Yelp for businesses. If the search fails in Yelp, or if it is perceived as a general question, it tries Wolfram Alpha.

    If those sites don’t understand the question, or don’t have useful answers, neither does Siri.
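    [The fallback chain Adam describes could be sketched like this – a hypothetical illustration, with the two backends stubbed out as plain functions, not Apple’s actual pipeline:]

    ```python
    # Sketch of the lookup chain: local business search first (Yelp),
    # then a general knowledge engine (Wolfram Alpha), then nothing.
    def lookup(query, yelp_search, wolfram_search):
        results = yelp_search(query)       # businesses first
        if results:
            return results
        answer = wolfram_search(query)     # then general questions
        if answer:
            return answer
        return None  # if both come up empty, so does the assistant
    ```

    Under this model, a query that Yelp has no listings for and Wolfram Alpha has no answer for returns nothing at all, no matter how well Google would handle it.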

  21. Echo Zen
    Echo Zen December 1, 2011 at 1:38 pm |

    Well, obviously women’s needs aren’t real needs — otherwise the programmers at Wolfram Alpha would have done a better job of field-testing their app with actual women. You see that same attitude with pharmaceutical firms when they don’t bother conducting clinical trials with women, on the rationale that trials with men are “good enough.”

    And how predictable it is that the same anti-choicers crowing about how Siri blocks women from obtaining vital information about their sexual health don’t make a single peep about how Siri enables slutty men to service their sexual needs with Viagra and escort services.

  22. MH
    MH December 1, 2011 at 1:51 pm |

    Of course, the next level of issue is why all the programmers are male. I thought this article offered an interesting perspective: http://techcrunch.com/2011/11/19/racism-and-meritocracy/

    I think it would be totally legit to ask Siri programmers how Siri is able to send you to a CPC and doesn’t send you to an abortion clinic or a health center or hospital, or even an ob-gyn. It’s entirely possible that it’s searching for analogues of the word “pregnant,” but it seems awfully religious-right.

  23. Kristen J.
    Kristen J. December 1, 2011 at 1:54 pm |

    Hmm…the blogoverse is apparently also reporting that while Siri recognizes viagra it doesn’t recognize any female contraceptives including EC. That’s even more telling IMO.

  24. midnightsky
    midnightsky December 1, 2011 at 1:59 pm |

    I’m guessing also that a lot of responses that Siri gives is because it’s reacting to more generic parts of the question like “what do I do?”. It doesn’t know what abortion means, so if you ask “I need an abortion. What do I do?” it’s only going to pick up on the end of that statement. Same for “I was raped.” “Is that so?”. It’s only looking at the “I was” part. Generic silly comment: “I was eating ice cream last night!” “Is that so?” It doesn’t matter what’s after “I was.”

    The programmers made errors of omission, not errors of “I put stupid crap in Siri’s responses for kicks.” They simply left out things for whatever reason, whether it was because they didn’t want to draw a lot of complaining over “Siri endorses abortion,” or because they weren’t thinking about women, or whatever. Some of Siri’s particularly bad-seeming responses are just failures to comprehend the question.
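    [midnightsky’s theory – that Siri keys on generic fragments like “I was…” or “…what do I do?” when it doesn’t recognize the actual topic – can be sketched in a few lines. The patterns and replies here are illustrative, not Siri’s real ones:]

    ```python
    import re

    # Generic fragment patterns checked in order; the topic itself
    # ("raped", "abortion") never influences the reply.
    GENERIC_PATTERNS = [
        (re.compile(r"^i was\b", re.I), "Is that so?"),
        (re.compile(r"what do i do\?$", re.I), "Consider your alternatives."),
    ]

    def respond(utterance: str) -> str:
        for pattern, reply in GENERIC_PATTERNS:
            if pattern.search(utterance.strip()):
                return reply
        return "I don't understand."
    ```

    On this model, “I was raped” and “I was eating ice cream last night!” get the identical canned reply, which matches the behavior described above.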

  25. Kylie
    Kylie December 1, 2011 at 2:02 pm |

    That “I’ve been raped” is incomprehensible REALLY pisses me off. Have any of you contacted Apple? Has Jill? Is Siri available in other languages? Does she provide French women with reproductive care? What about the queer community? Does she know all the gay bars in town?

  26. EG
    EG December 1, 2011 at 2:05 pm |

    midnightsky: They simply left out things for whatever reason, whether it was because they didn’t want to draw a lot of complaining over “Siri endorses abortion,” or because they weren’t thinking about women, or whatever.

    Yes, that’s the problem. That’s what we’re complaining about, that the programmers weren’t thinking about half the goddamn population.

  27. La Lubu
    La Lubu December 1, 2011 at 2:12 pm |

    *dead* And thanks, btw, for the hamster-in-the-ass part, because now everyone wants to know why I’m cracking up and have tears at the corners of my eyes. And I’m just getting over a cold, so this laughing is setting off coughing spells, too.

    This… is piss-poor marketing at its best. Not lookin’ good here, Apple. Certainly not giving me a reason to upgrade from my non-iPhone. I agree that it isn’t necessarily maliciousness towards women (but seriously… blowjobs get an escort reference, eating pussy gets *pet stores*?? The fuck? Not even a good singles bar? Do these male programmers ever get laid???!)… but merely ignorance. Still, I don’t want to shuttle any of my money, especially in this economy, to a company that doesn’t even recognize me as a potential customer. Fuck you, Apple.

  28. anna
    anna December 1, 2011 at 2:12 pm |

    Here’s a petition by the ACLU asking Apple to get Siri to provide info about where people can get birth control and abortions: https://secure.aclu.org/site/SPageServer?pagename=111130_apple_abortion

  29. Zippa
    Zippa December 1, 2011 at 2:16 pm |

    Siri is also a Wolfram-Alpha programmed product. W-A is NOT a woman-friendly company. It’s located in an extremely conservative area and most of its programmers are likewise conservative – I’ve known a fair few of them and also know that their recruiting process isn’t going to allow for a lot of liberal attitudes coming in. The minute I discovered that Siri was programmed by W-A, my willingness to attribute this to oversight waned dramatically. I’m willing to bet that at least part of the programming team would have been in favor of doing it deliberately.

  30. Lolagirl
    Lolagirl December 1, 2011 at 2:27 pm |

    Frankly, I don’t understand why so many commentators and news types are seemingly bending over backwards to NOT impute misogynist intent to Apple or its employees wrt Siri.

    Clearly, Apple and its programmers intended to make Siri an “edgy” product by having it offer such pithy responses to certain questions. I don’t see why that edginess mission is necessarily divorced from Apple still trying to avoid courting actual controversy by, say, offering a listing of the 10 closest abortion clinics or pharmacies that carry Plan B. I don’t doubt that the people at Apple may have assumed that their market is largely made up of men and thus programmed it around the perceived interests of that market. But that doesn’t make their intentions any less sexist or even misogynistic.

  31. Jawnita
    Jawnita December 1, 2011 at 2:40 pm |

    Siri specifically does NOT search Google. It tries Yelp for businesses. If the search fails in Yelp, or if it is perceived as a general question, it tries Wolfram Alpha.

    Interesting! I’d say that pretty much explains the abortion clinic response: I went to Yelp and typed in “abortion.” It autocompleted to “abortion clinic” (so that explains why Siri knows that “abortion”=”abortion clinic”) and then didn’t have any results. I suppose clinics aren’t the kinds of places to have Yelp pages. And likewise, I suppose crisis pregnancy centers are the kinds of places to have Yelp pages.

    So I’m pretty comfortable putting the blame on 1) Yelp, for not actually being a comprehensive place listing, and 2) Apple, for treating it like it is.

  32. LotusBen
    LotusBen December 1, 2011 at 2:50 pm |

    This reminds me of several years ago when I had a Cricket cell phone and noticed some flaws in its predictive text function. You know, where the cell phone tries to guess what word you are typing within a text to help you out. The phone was able to guess some pretty impressive words like “interposition.” But it couldn’t guess swear words – “fuck,” “shit,” etc. It also couldn’t guess any word having to do with sexuality, whether “penis” or “vagina.” It was like it was designed by Jerry Falwell.

  33. Jawnita
    Jawnita December 1, 2011 at 2:54 pm |

    LotusBen:
    The phone was able to guess some pretty impressive words like “interposition.” But it couldn’t guess swear words–”fuck,” “shit,” etc.

    http://www.youtube.com/watch?v=6hcoT6yxFoU :)

  34. Lee
    Lee December 1, 2011 at 2:58 pm |

    Honestly this doesn’t really sound like a lack of consideration to me, like this part:

    Jill: I need an abortion.
    Siri: I don’t see any abortion clinics. Sorry about that.

    You didn’t say ‘abortion clinic’, you said ‘abortion’, and Siri internally connected that to a term it specifically knew about – ‘abortion clinics’. It’s not that Siri had no idea what you were talking about because nobody bothered to include that category in its database, it knew exactly what you were referring to – it’s just that no information was actually added, so you get the equivalent of a shrug and a ‘don’t ask me’.

    I guess the question is: was it left empty intentionally, or as the result of an error? Given the way search engines work, it seems strange that it wouldn’t have pulled in a list of abortion clinics automatically; this information isn’t added by hand. I mean, it works on Yelp – there’s no reason a known category of business or service shouldn’t work in a query and automatically return a list of results.

    Same goes for the pregnancy questions – “do no harm” and “always do the right thing” really don’t sound like generic ‘I don’t understand your question’ responses, they sound specifically like anti-choice sentiments. It sounds like Siri knows exactly what pregnancy is, and its internal data for that category is a set of “do no harm” or “maybe you could consider adoption” responses. That wouldn’t be oversight, that would be considering that someone might ask about pregnancy and deciding to provide these specific responses.
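    [Lee’s distinction – a known category whose data is simply empty versus a topic the system doesn’t recognize at all – can be sketched like this. The synonym table and data here are hypothetical stand-ins, not Siri’s real database:]

    ```python
    # Sketch: "abortion" maps to a known category, but that category's
    # entry list is empty, producing a specific "no results" message
    # rather than a generic confusion response.
    SYNONYMS = {"abortion": "abortion clinic"}
    CATEGORY_DATA = {
        "abortion clinic": [],                       # known, but empty
        "crisis pregnancy center": ["CPC Downtown"],
    }

    def find_places(query: str):
        category = SYNONYMS.get(query, query)
        if category not in CATEGORY_DATA:
            return "I don't understand."             # truly unknown topic
        results = CATEGORY_DATA[category]
        if not results:
            return f"I don't see any {category}s. Sorry about that."
        return results
    ```

    The “I don’t see any abortion clinics” reply falls out of the empty-but-known category; the open question Lee raises is whether that list was left empty deliberately or by accident.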

  35. Rhoanna
    Rhoanna December 1, 2011 at 3:00 pm |

    Jawnita: Interesting! I’d say that pretty much explains the abortion clinic response: I went to Yelp and typed in “abortion.” It autocompleted to “abortion clinic” (so that explains why Siri knows that “abortion”=”abortion clinic”) and then didn’t have any results. I suppose clinics aren’t the kinds of places to have Yelp pages. And likewise, I suppose crisis pregnancy centers are the kinds of places to have Yelp pages.

    So I’m pretty comfortable putting the blame on 1) Yelp, for not actually being a comprehensive place listing, and 2) Apple, for treating it like it is.

    That’s odd. When I go to Yelp and search for either abortion or abortion clinics near NYC (which it auto-fills in for me), it gives me a number of results, ranging from Planned Parenthood to random things that happen to have “abortion” in one of the reviews (spas, Chinese restaurants, etc). A few other cities (DC, SF, LA, Seattle, Dallas) give similar results, although varying in number & quality. So I don’t know what’s up, unless there actually are no abortion clinics near you.

  36. Lee
    Lee December 1, 2011 at 3:12 pm |

    Jawnita: Interesting! I’d say that pretty much explains the abortion clinic response: I went to Yelp and typed in “abortion.” It autocompleted to “abortion clinic” (so that explains why Siri knows that “abortion” = “abortion clinic”) and then didn’t have any results. I suppose clinics aren’t the kinds of places to have Yelp pages. And likewise, I suppose crisis pregnancy centers are the kinds of places to have Yelp pages.

    So I’m pretty comfortable putting the blame on 1) Yelp, for not actually being a comprehensive place listing, and 2) Apple, for treating it like it is.

    I went to Yelp’s Brooklyn page and did the same, and the first result was PP (as well as a home-abortion clinic and four unrelated results), so if it really does use Yelp then there’s a pretty big question here – why is it failing to provide the results that Yelp returns for this particular query, and who decided that this particular query should fail? The pregnancy responses pretty much confirm this isn’t some kind of bug that just happens to reject these Yelp results among others.

  37. Mztress
    Mztress December 1, 2011 at 3:16 pm |

    I now have another valid reason not to give Apple my hard-earned money.

  38. james
    james December 1, 2011 at 3:18 pm |

    Yelp comes up with no abortion clinics when I enter ‘abortion clinic’ and my zip code.

    Neither does WolframAlpha. For ‘blow job’ Yelp brings up lots of businesses of varying types because people comment that the goods or services ‘blows’.

  39. Jawnita
    Jawnita December 1, 2011 at 3:18 pm |

    Rhoanna: So I don’t know what’s up, unless there actually are no abortion clinics near you.

    There definitely are clinics near me; I live near downtown, in a medium-large US city. Google will give me results, including the nearby Planned Parenthood as the second hit. But, as I said, Yelp does not. (This time I also checked for crisis pregnancy centers. Yelp gives me one, which is of course more than zero, but less than the actual number nearby.)

  40. Hairless Cat
    Hairless Cat December 1, 2011 at 3:19 pm |

    How do you know they didn’t do this to prevent crazy people from shooting up abortion clinics? Just saying.

  41. Jawnita
    Jawnita December 1, 2011 at 3:23 pm |

    Jill:
    I also just searched “abortion” on Yelp, and the first hit is Planned Parenthood (the third hit is Rising Dragon Chinese Restaurant and the fourth is Pepino Unique Hair Stylists, so, imperfect, but still).

    Actually, repeating this experiment for good measure, simply “abortion” (that is, refusing to let it autocomplete for me) nets me PP and several restaurants. So maybe overenthusiastic autocomplete is the problem.

    (Okay that’s enough of my comments in a row for now…)

  42. Hairless Cat
    Hairless Cat December 1, 2011 at 3:51 pm |

    Jill: …seriously?

    Stranger things can happen, right? Siri wasn’t “programmed” by people at Apple in the sense that it gives rote answers to your questions; the info is compiled from other sources, correct? Maybe abortion clinics intentionally don’t have a big presence online for safety purposes.

  43. Seth Eag
    Seth Eag December 1, 2011 at 3:54 pm |

    Actually, having just now read the link to Forbes, the other issue the writer brings up of Siri being, essentially, a “lady secretary” is kind of interesting too. I don’t have an iPhone, but I would assume from the writing that you can’t switch the gender of the voice unlike with, say, most GPS units, etc.

  44. Keeley
    Keeley December 1, 2011 at 3:59 pm |

    Seth Eag: Actually, having just now read the link to Forbes, the other issue the writer brings up of Siri being, essentially, a “lady secretary” is kind of interesting too. I don’t have an iPhone, but I would assume from the writing that you can’t switch the gender of the voice unlike with, say, most GPS units, etc.

    In North America, Siri is permanently female, with no option to switch. I believe that in the UK it’s permanently male (or possibly switchable – I can’t currently recall), but in general, the options change depending on where you are.

  45. Azalea
    Azalea December 1, 2011 at 4:04 pm |

    Siri was programmed by males right? Yeah. So this doesn’t surprise me at all. But I think if this got enough publicity there would be an upgrade coming that addresses that problem along with some article about how women with the iPhone4s are desperate for abortions and male prostitutes because you know, the *only* reason a woman would want an iPhone 4s after such an upgrade would be those seeking abortions and male escorts.

  46. Azalea
    Azalea December 1, 2011 at 4:06 pm |

    P.S. There are several “male”-voiced Siris (I put male in quotation marks because I am uneasy with the concept of gendering voices).

  47. Raja
    Raja December 1, 2011 at 4:29 pm |

    Talking phones scare me. Sometimes I think we are wayyy into technology

  48. james
    james December 1, 2011 at 4:37 pm |

    I keep seeing the same search data sources being mentioned. The link below, from the Wayback Machine, shows who the search partners were on Oct 8, 2010. Quite a few.

    Link to partners on siri.com

  49. Thomas, BlogForChoice.com
    Thomas, BlogForChoice.com December 1, 2011 at 5:00 pm |

    Thanks for blogging on Siri, Jill.

    Although Siri is not the principal resource for women’s health care, it is important that the women who are using this–or any–application not be misled about their pregnancy-related options.

    Read more about Apple CEO Tim Cook’s response at:
    http://www.blogforchoice.com/archives/2011/12/apples-ceo-resp.html

  50. Daisy
    Daisy December 1, 2011 at 5:03 pm |

    I can’t exactly put my finger on why, but for some reason the “why should you even be asking that” responses (re: abortion, pregnancy, esp) are particularly depressing.

    Maybe it’s just that it’s yet another policing of women’s behavior (because you can never have too much!). Not only are we going to judge you on what you do with your body, but now we’re also going to judge you on how you choose to obtain information about it as well.

  51. J
    J December 1, 2011 at 5:17 pm |

    I don’t know how to feel about this. The 4S is my first smart phone, and I’ve found Siri to be a very useful, but not essential, feature. Interestingly enough, I’d used it this morning on my way to work to call my pharmacy – to refill my bc prescription. It pulled up and dialed my pharmacy right away, and saved me about 30 seconds of googling or fumbling around for the number. I don’t rely on it; it’s just a nice thing to have when I’m on the go, or to be honest, when I’m just too lazy to type out a text or pull up a friend’s number.

    They certainly need to address it, but isn’t the whole point of beta versions to fix things like this?

  52. groggette
    groggette December 1, 2011 at 5:44 pm |

    J: They certainly need to address it, but isn’t the whole point of beta versions to fix things like this?

    Well, yes. But the point of calling it out (here and elsewhere) is so that Siri/Apple/the developers know this needs to be fixed. It also helps future developers to hopefully not forget half the population when designing the next great thing.

  53. Daisy
    Daisy December 1, 2011 at 6:01 pm |

    groggette: Well, yes. But the point of calling it out (here and elsewhere) is so that Siri/Apple/the developers know this needs to be fixed. It also helps future developers to hopefully not forget half the population when designing the next great thing.

    Also, Apple didn’t present this as a “beta” version in their advertising.

    Further, “beta version” doesn’t really mean anything to the average user.

  54. groggette
    groggette December 1, 2011 at 6:11 pm |

    Daisy: Also, Apple didn’t present this as a “beta” version in their advertising.

    Good point. And people should be able to point out flaws/mistakes/just-plain-bad-business-decisions regardless of what stage thing X is at.

  55. orgostrich
    orgostrich December 1, 2011 at 6:16 pm |

    Hmm, this might simply be a (failed) attempt by Apple to avoid controversy. I’m sure if a 14 year old girl asked Siri where to get an abortion, and it actually directed her to one, Apple would get sued (or at least protested) by every pro-life group in the country for giving minors abortions. That doesn’t make it OK, but it might be self-preservation over misogyny.

  56. FS
    FS December 1, 2011 at 6:16 pm |

    Daisy: Also, Apple didn’t present this as a “beta” version in their advertising.

    Further, “beta version” doesn’t really mean anything to the average user.

    Except: a) they absolutely did present it as ‘beta’ (it’s not in their television commercials, but it is on all their other marketing materials, was stressed heavily during the product announcement, and there *is* a disclaimer at the end of the TV spot indicating that not all features may yet be fully available to everybody); and b) if most users haven’t figured out what ‘beta’ means by now (and really, they should have if they’ve used any Google products over the last decade, for example, as they stay in beta for years and are clearly marked as such), it’s up to the people who write pieces like this to learn what that means and then present that information to the public in a way they can understand – a task that journalists and bloggers have utterly failed at, because quite frankly, taking pot shots at Apple makes for better headlines than “Unfinished Software Doesn’t Work Quite Right Yet, Company Says They’re Working On It.”

  57. Sirkowski
    Sirkowski December 1, 2011 at 6:34 pm |

    Because talking to your iPhone is what you need to do when you want an abortion… e_e

  58. zuzu
    zuzu December 1, 2011 at 6:36 pm |

    orgostrich: Hmm, this might simply be a (failed) attempt by Apple to avoid controversy. I’m sure if a 14 year old girl asked Siri where to get an abortion, and it actually directed her to one, Apple would get sued (or at least protested) by every pro-life group in the country for giving minors abortions. That doesn’t make it OK, but it might be self-preservation over misogyny.

    They’ll tell a 14-year-old boy where to get a blowjob and some dick pills.

  59. KJ
    KJ December 1, 2011 at 6:43 pm |

    orgostrich: Hmm, this might simply be a (failed) attempt by Apple to avoid controversy. I’m sure if a 14 year old girl asked Siri where to get an abortion, and it actually directed her to one, Apple would get sued (or at least protested) by every pro-life group in the country for giving minors abortions. That doesn’t make it OK, but it might be self-preservation over misogyny.

    Um. Pretty sure that wouldn’t go so well for the pro-choice groups. It’s not illegal to tell someone where an abortion clinic is, and it’s not illegal for a 14-year-old to have an abortion (assuming, in some states, compliance with parental notification laws and the whatnot). Furthermore, under no legal standard that I’m aware of would providing someone information about places that perform abortions constitute “giving” someone an abortion. In short, while it may be possible that this was a conscious decision to avoid boycott-type fallout, I don’t think any legal department in the world would be concerned that pro-life groups (who wouldn’t have standing anyway) would have any kinds of grounds to sue.

  60. Jadey
    Jadey December 1, 2011 at 7:17 pm |

    orgostrich: That doesn’t make it OK, but it might be self-preservation over misogyny.

    It sure as hell makes it systemic, institutionalized misogyny, which is what the original argument of this post is.

  61. Fern Fedora
    Fern Fedora December 1, 2011 at 7:57 pm |

    OKAY my SECOND post is #69 and this is the one I sent in before that… I did not get an auto response to this: Please excuse if double posting.

    This set of messages could apply to the situation of The GREAT BARRIER Brief which is “hidden in plain site (sic)” at the website janesway dot net This was the name used when it got a NIH grant during Clinton admin then the next ex-prez YOU KNOW WHO demoted all contraception and disease prevention for women to below funding guidelines for the National Inst Health! No one would know this even if they were refused funding.

    Note the website is from 1999 scroll down to find links. FDA & NIH knew about this in 1987-1988 NIH grant finally in 1999 Newspaper media won’t touch it cause their advertising policy is no bad news about their advertisers. WHAT WOULD WORK in this case? I am the only entity (no corporation involved) still working on this….pt time while farming. On F’bk Fern Fedora or contact at website above subject Attention Fern

  62. Matt Simpson
    Matt Simpson December 1, 2011 at 7:58 pm |

    It cracks me up reading these messages. Most posters are totally clueless about technology and how Siri works. But everyone thinks there is some hidden agenda, some evil reason why this inanimate device does not hold the same views on the world they do, and can’t answer every question the way they want. Really? Is the world and Apple really out to get you? Siri is a step forward from past voice recognition and search engine technology, but it is still a machine. As pointed out by some posters, some of the “anti-female” responses to certain questions are clearly standard answers to questions Siri does not fully understand or cannot answer. Because Apple has been clever and added some humor and other “human like” characteristics to make it more user friendly, less clever individuals probably end up thinking an electronic device is smarter than it really is. Think of Siri as an electronic Magic 8 ball. Siri, are most people dumber than their phone?
    ● It is certain
    ● It is decidedly so
    ● Without a doubt
    ● Yes – definitely
    ● You may rely on it
    ● As I see it, yes
    ● Most likely

  63. Anonymouse
    Anonymouse December 1, 2011 at 8:04 pm |

    One minor point: if you simply ask Siri “what do I do?” it gives “do no harm,” “do the right thing,” and “consider your alternatives” among other stock responses. So that seems to be just a generic answer. But. When I asked for an abortion clinic, it did not direct me to PP, but to two clinics, one in VA and one in PA, each at least 50 miles away and both with suspiciously crisis-pregnancy-center-sounding names. Then I searched for PP and it gave me six results, three within 1-2 miles from me. So I thought, OK, maybe PP doesn’t market itself as an abortion clinic. So I asked for “women’s reproductive healthcare” and it didn’t know what that was. That is fucked and needs to be corrected.

    Also, I live about a mile away from a major HIV clinic. When I asked for an HIV test center, it couldn’t find any. When I asked for a medical center, it was the first result on the list. Clearly some major tweaking is in order!

  64. FashionablyEvil
    FashionablyEvil December 1, 2011 at 8:07 pm |

    I also just searched “abortion” on Yelp… the fourth result is Pepino Unique Hair Stylists (so, imperfect, but still).

    I dunno, “pepino” does mean cucumber in Spanish.

  65. Athenia
    Athenia December 1, 2011 at 8:13 pm |

    With all the coverage that this is receiving and all the different responses to different topics, I am absolutely beginning to think that this is malicious. To think that women don’t use their technology—you know, oops!—is absolutely ridiculous.

  66. Mjog
    Mjog December 1, 2011 at 8:14 pm |

    Jill: The problem is that sexism is so deeply-rooted in our society that the woman-related glitches weren’t noticed or fixed.

    This tl;dr should be highlighted, heavy-typed, blinking, somewhere up the top of this article. Nearly every programmer reading this story is going to initially think “It’s not malice, it’s incompetence” (I sure did), but this very neatly states why it’s actually still a problem.

    Also, while others have suggested that it’s not necessarily the programmers’ fault (the programming part is only one of a number of steps that go into producing software), all of the people involved should have noticed that the question/response tweaking has a male bias and done something about it — including the programmers.


  68. Kimberly Sue
    Kimberly Sue December 1, 2011 at 8:16 pm |

    I think another highly effective example would be to ask Siri for rape jokes, and then rape crisis services.
    I don’t have one, can’t test, but it’s a test I’d like to see the results of.

  69. Sheelzebub
    Sheelzebub December 1, 2011 at 8:34 pm |

    Sirkowski: Because talking to your iPhone is what you need to do when you want an abortion… e_e

    Well, apparently it’s the thing to do if you’re a man who wants to hire a sex worker for a blow job or you want Viagra. Odd how some folks have such a problem seeing the double-standard.

  70. Ashley
    Ashley December 1, 2011 at 8:46 pm |

    Zippa, I live in the town Wolfram is based in (and where I’m told Wolfram Alpha is) and I assure you, it’s not “very conservative.” It’s more conservative than Chicago, but much less conservative than many of the Chicago suburbs. We have both a PP and a private abortion provider, a gay bar, 3 independent hippie/organic grocery stores, a rather large liberal population, etc. etc. I’d bet we’re the most liberal area in downstate Illinois, and certainly consistently blue (the rest of the county means that our county leans red, but this city/town does not).

  71. überRegenbogen
    überRegenbogen December 1, 2011 at 9:09 pm |

    It’s entirely possible that Apple merely failed to properly vet the thing after they bought it. But, holy crap, they should have! This is a mighty glaring defect!

  72. ema
    ema December 1, 2011 at 9:38 pm |

    I thought the name Wolfram|Alpha sounded familiar. Back in 2009 they were asking for feedback so I obliged. I did a search on Plan B and told them that 1) it didn’t know what Plan B was, and 2) it returned COCs and Implants on a targeted ECP (Plan B) search. Their response:

    “We have received your feedback regarding Wolfram|Alpha. The issue you reported has been fixed and will appear on the live site with the next update.”

  73. Avida Quesada
    Avida Quesada December 1, 2011 at 9:45 pm |

    I believe they did think of women, but of another “kind of women”.

    By that I mean non-feminist women.

    Men are the ones more likely to try to trick Siri into sexual questions and innuendo and publish it.

    That is, until you start to think of feminists like us.

    The second part is that you don’t want to offend women. Offending men is standard practice (it will be taken as a joke).
    Offending women can be deadly. If you don’t trust me, ask Dr. Lazar Greenfield.

    Finally, there is all the annoying Hillary-and-company “safe but rare” mentality. Abortion clinics try to sell themselves as “women’s health care centers”. It’s like they don’t feel proud to say what they really are.

    So you have:
    1. Crisis pregnancy centers trying to appear as abortion clinics (basically they are scams).
    2. Abortion clinics trying to appear as “we are so much into alternatives”, “we don’t do just abortions”, etc.

    Which one do you expect a computer to pick?

    Ah, just to avoid manipulation: when you ask Siri for male clinics it will just return clinics.

    “Women’s clinic” is somehow ambiguous for Siri.

    Love,

    Avida

  74. Mr. Kristen J.
    Mr. Kristen J. December 1, 2011 at 10:09 pm |

    iremo: Although, what were you hoping for with the horny question, even in a feminine voice, aside from an escort service?

    It could vibrate.

  75. Zippa
    Zippa December 1, 2011 at 10:11 pm |

    Ashley:
    Zippa, I live in the town Wolfram is based in (and where I’m told Wolfram Alpha is) and I assure you, it’s not “very conservative.” It’s more conservative than Chicago, but much less conservative than many of the Chicago suburbs. We have both a PP and a private abortion provider, a gay bar, 3 independent hippie/organic grocery stores, a rather large liberal population, etc. etc. I’d bet we’re the most liberal area in downstate Illinois, and certainly consistently blue (the rest of the county means that our county leans red, but this city/town does not).

    I’m also from the area, albeit about twenty minutes outside CU proper, and VERY CONSERVATIVE is precisely how I’d identify the non-university parts of the county.

  76. jennygadget
    jennygadget December 1, 2011 at 11:32 pm |

    “It also helps future developers to hopefully not forget half the population when designing the next great thing.”

    YES. And beta version or not, advertised as such or not, if it’s already gotten as far as beta testing and you forgot to include half the population? You just made shit-tons more work for yourself if you ever plan to not be an ass and see this for the problem it is and fix it. It is just so much easier to build stuff right the first time than it is to make really obvious and fundamental mistakes and have to repair them later. Mistakes will happen, sure. But the beta version should be well past the fundamental flaw phase.

    If they actually gave a shit about doing this right, the designers should have had some version of a typical adult woman as one of their user stories in the conceptual phase. Much less in the publicly available version – beta or otherwise.

  77. Tapetum
    Tapetum December 1, 2011 at 11:41 pm |

    Glah, so many responses (a couple here, and many, many in other threads on this topic), basically saying that if you need an abortion, why are you talking to your phone? Is that really the kind of decision you should be making so fast, etc.

    You know, there are people in the world whose primary link to the internet is through their phone. People who may know long before they ever get pregnant that they would get an abortion. (I’ve known that if I were to get pregnant, I would abort ASAP for the last eleven years.) Using an AI meant to do web-searches to do a web search is really not a stupid thing to do, regardless of topic. Yeesh.

  78. programerdude
    programerdude December 1, 2011 at 11:45 pm |

    jennygadget:
    “It also helps future developers to hopefully not forget half the population when designing the next great thing.”

    YES. And beta version or not, advertised as such or not, if it’s already gotten as far as beta testing and you forgot to include half the population? You just made shit-tons more work for yourself if you ever plan to not be an ass and see this for the problem it is and fix it. It is just so much easier to build stuff right the first time than it is to make really obvious and fundamental mistakes and have to repair them later. Mistakes will happen, sure. But the beta version should be well past the fundamental flaw phase.

    If they actually gave a shit about doing this right, the designers should have had some version of a typical adult woman as one of their user stories in the conceptual phase. Much less in the publicly available version – beta or otherwise.

    perhaps they did, but like Avida Quesada said, they weren’t a feminist women?

  79. jennygadget
    jennygadget December 1, 2011 at 11:59 pm |

    “perhaps they did, but like Avida Quesada said, they weren’t a feminist women?”

    I wasn’t aware that only feminist women used contraception.

  80. mermop
    mermop December 2, 2011 at 1:24 am |

    Angel H.: O_o

    I don’t even know where to begin unpacking this one.

    Re: these two:
    Jill: I’m pregnant. What do I do?
    Siri: Are you?

    Jill: My girlfriend is pregnant. What do I do?
    Siri: Consider your alternatives./ First, do no harm./ Always do the right thing.

    I’d say that Siri interprets these phrases without understanding pregnancy at all, so the ‘First, do no harm’ thing is far less sinister than it seems. I don’t have an iPhone, but it looks like Siri’s responses to ‘I’m pregnant. What do I do?’ are the same as any ‘I’m [x]‘ statement, and the system disregards the rest of the statement.
    It also looks like Siri has a set range of responses to ‘What do I do?’ or ‘What should I do?’, and disregards the rest of the ‘My girlfriend is pregnant. What do I do?’ question. It would be worth asking your iPhone a variety of questions ending with ‘what do I do?’ with no recognised keywords in order to verify this.

    Like people have pointed out, I still believe it’s a significant oversight to have left out pregnancy, contraception, abortion and so on from the system, but this is the first release of the program. I hope that Apple learns from this public backlash and incorporates them into the next release.
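    [Editor’s note: mermop’s template-matching theory is easy to sketch. The toy script below is purely hypothetical — nobody outside Apple has seen Siri’s code, and the rules and responses here are invented for illustration — but it shows how a rule engine that keys on a leading “I’m …” or a trailing “what do I do?” and discards the rest of the utterance would produce exactly the generic responses quoted above.]

```python
import random
import re

# Hypothetical rule table: each rule matches a template fragment and
# ignores everything else in the utterance. The patterns and replies
# are invented for illustration, not taken from Siri.
RULES = [
    # "I'm <anything>" -> one stock reply, regardless of what follows
    (re.compile(r"^i'?m\b", re.I), ["Are you?"]),
    # "... what do I do?" / "... what should I do?" -> generic advice
    (re.compile(r"what (do|should) i do\??$", re.I),
     ["Consider your alternatives.", "First, do no harm.",
      "Always do the right thing."]),
]

def respond(utterance: str) -> str:
    """Return the reply of the first matching rule, or a fallback."""
    for pattern, replies in RULES:
        if pattern.search(utterance.strip()):
            return random.choice(replies)
    return "I don't understand."

# Both pregnancy questions fall into generic buckets: the word
# "pregnant" never influences the answer at all.
print(respond("I'm pregnant. What do I do?"))               # hits the "I'm ..." rule
print(respond("My girlfriend is pregnant. What do I do?"))  # hits the "what do I do?" rule
```

On this model, “Are you?” and “First, do no harm” are not responses *about* pregnancy at all — they are canned replies to the template, which is consistent with mermop’s suggestion to test other “… what do I do?” questions with no recognised keywords.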

  81. You Siri-ous?
    You Siri-ous? December 2, 2011 at 1:51 am |

    Isn’t the point of a beta version to work out various kinks like this so that the final version will be all the better? Beta products are called beta for a reason – because, while they can be useful, they are imperfect / unfinished, not entirely stable, and constantly changing.

    It’s a worthy concern to highlight, but it’s being taken beyond that and used as an example of sexism against women. When Siri is being targeted at so many different demographics of people for a myriad of reasons, do you really believe it’s reasonable to expect it to work perfectly for everyone in its beta stage? It seems like these complaints are more suited for a finished product, which Siri is not.

  82. Echo Zen
    Echo Zen December 2, 2011 at 3:05 am |

    Nah, non-feminist women don’t avoid contraception — they just won’t admit to using it. That would be tantamount to admitting they enjoy sex, which everyone knows good girls aren’t supposed to like. That’s why misogynist politicians are confident they can ban contraception funding — not enough women will stand up for something virtually all women use in their lifetimes.

  83. Brian Schlosser
    Brian Schlosser December 2, 2011 at 4:02 am |

    I have very little problem thinking that some PR minded person at Apple made sure that Siri wouldn’t give usable information for certain searches, like abortion, thinking that it would stop controversies down the road. Sure, when called out on it, the CEO says “Oh, Siri is in beta, we will fix it”, but what else could he say?

  84. Brian Schlosser
    Brian Schlosser December 2, 2011 at 4:08 am |

    And to amplify that, I DO think that this underscores the underlying misogyny rampant in society in general, and the tech industry in particular. I have no doubt that most companies fear conservatives more than women in general, and feminists in specific. I wouldn’t be surprised if the “solution” Apple makes is to keep censoring the abortion and female health results, and also take out the escort service and viagra searches, too.

  85. John
    John December 2, 2011 at 4:27 am |

    I just find it astonishing – not in a good way – that people have actually developed an app that responds to “I need a blowjob” and gives a list of local sex workers. I mean, really, WTF? It says a lot about the developers’ own Weltanschauung. I dunno, maybe I just need to get out more. :(

  86. Roland
    Roland December 2, 2011 at 6:59 am |

    This has nothing to do with Sexism. At all.

    Siri is an example of a machine learning, voice recognition program.
    It is an AI, nothing more. It has no concept of gender. The programmers would have included coy answers BUT the problem therein lies with the fact that programming is a male-dominated sector of work. That’s not sexist, it just happens to be mostly male.

    Siri doesn’t store ANY ready to use data. Disconnect it from the web, and it loses all that functionality. Blame its data sources, not it.

    …Honestly, I find articles like this incredibly damaging to feminism, making the author look like an extremist feminist, the kind who are mocked and laughed at by EVERYBODY because of their ridiculous agendas. Try to understand the technology, and more importantly, don’t go picking fights with your male counterparts over something completely asinine like an app that is still a gimmick not being complete.
    I support the feminist movement, etc., but picking fights over stupid things only drives away support from everybody apart from extremists.

    If you are really annoyed about Siri being focused towards men, learn to program, and get a career as a programmer.

    …And for the people asking about programmers getting some, you’re actually really accurate, lol. We don’t. We really don’t.

  87. Molly Rene
    Molly Rene December 2, 2011 at 8:18 am |

    Ken Fisher, founder of Ars Technica, thinks this is “petty.”

  88. Sandy
    Sandy December 2, 2011 at 8:36 am |

    Wow. One more reason not to buy any Apple products.

    Tapetum: You know, there are people in the world whose primary link to the internet is through their phone. … Using an AI meant to do web-searches to do a web search is really not a stupid thing to do, regardless of topic. Yeesh.

    I agree.

  89. Thuff
    Thuff December 2, 2011 at 8:44 am |

    I also just discovered that Google speech to text blocks out the word “raped” as if it were a curse word. It displays as “r****”. Lots of other words like penis and vagina aren’t blocked, but someone thought rape was a curse word and people’s eyes should be protected from the horrors of seeing the word.

  90. EG
    EG December 2, 2011 at 9:55 am |

    Roland: the problem therein lies with the fact that programming is a male-dominated sector of work. That’s not sexist, it just happens to be mostly male.

    See, it’s not sexism at play, it just so happens, by a remarkably strange coincidence or maybe an act of God, that men dominate the programming industry, and that those men never think about women’s needs. It’s not sexism or anything, just happenstance.

  91. Conuly
    Conuly December 2, 2011 at 10:00 am |

    Roland, do you know what the word “sexism” means?

    Having a male-dominated staff is sexism. Yes, even if you didn’t do it on purpose.

    Your male-dominated staff never considering the other half of the population is sexism. Yes, even if they’re really nice people.

    Men coming here to tell us why this isn’t sexism when it blatantly is, and when we’ve explained repeatedly why it is? That’s sexism. Even if you think you’re just smarter than us stupid chicks.

  92. Past my expiration date
    Past my expiration date December 2, 2011 at 10:10 am |

    No, no! It’s not a coincidence, let alone an act of God, that men dominate the programming industry. It’s because women just don’t want to learn how to program! Stupid, lazy women. <–not sexism

    (/snark of the day)

  93. Woomera
    Woomera December 2, 2011 at 10:23 am |

    What part of “beta software” are you people missing?

    Siri will only be as good as the available database, and abortion clinics are not exactly big advertisers.

    Also, with something as personal as abortion, a person’s first go-to choice is the rudimentary *beta* AI program on a phone? Buh whu… wha??

    And if you have been attacked or are hurt, I know it’s a high stress situation, but do remember your phone can be used as, well, a *phone*. Hmm… anyone try “call 911″ with Siri? I can see that being useful if a person is so debilitated they can maybe get one or two taps in.

  94. John
    John December 2, 2011 at 10:29 am |

    Past my expiration date;

    What’s stopping them, then? There’s no law against it, is there? Same for many science-dominated disciplines (not medicine, though).
    I work in oil and gas engineering; female oil and gas engineers used to be as rare as unicorn shit. Although there are younger ones coming through now, the ratio is still about 7:1 m/f. So why so few women entering the field? What puts them off? (Most of our people spend their lives in an office; working on an oil or gas platform offshore is entirely optional these days.)

  95. Past my expiration date
    Past my expiration date December 2, 2011 at 10:36 am |

    @John — I already told you! Women are stupid and lazy. Sexism has nothing to do with it.

    (Ok, I guess my previous post didn’t end my snark of the day.)

    Or, you know, maybe you could ask some of your oil and gas engineering colleagues who are women.

  96. XtinaS
    XtinaS December 2, 2011 at 10:45 am |

    It’s comments like Woomera’s that make me so glad I updated killfile to work on various places.  “I shall now put together all of the most inane comments here into one comment, thus showing that I didn’t read the post, the comments, or the title of the blog itself!”

  97. EG
    EG December 2, 2011 at 10:45 am |

    Woomera: What part of “beta software” are you people missing?

    The part where you make beta software the main feature of an otherwise completely unnecessary “upgrade,” advertise it, release it with huge fanfare, and then when people point out that hey, not only does it not really work that well, but it seems to not really work that well specifically in regard to things women need, start whining about how it’s only beta, why are you guys so upset? Are you seriously saying that women expecting a computer feature they paid a chunk of money for to work is unreasonable?

    Woomera: Also, with something as personal as abortion, a person’s first go to choice is the rudimentary *beta* AI program on a phone? Buh whu… wha??

    Why not? Abortion is too personal, but viagra, hey, that’s just normal? I never got the memo about not using one’s smartphone for things that were too personal. I thought the whole point was that you could use them for anything.

    Woomera: And if you have been attacked or are hurt, I know it’s a high stress situation, but do remember your phone can be used as, well, a *phone*. Hmm… anyone try “call 911″ with Siri? I can see that being useful if a person is so debilitated they can maybe get one or two taps in.

    What if they don’t want 911? What if they want a hospital? Or a rape crisis center?

    John: What’s stopping them, then? There’s no law against it, is there? Same for many science-dominated disciplines (not medicine, though).
    I work in oil and gas engineering; female oil and gas engineers used to be as rare as unicorn shit. Although there younger ones coming through now, the ratio is still about 7:1 m/f. So why so few women entering the field? What puts them off?

    There is actually a whole lot of research on this, if you’re actually interested and not just being a dick. Several things: girls and women are still discouraged, starting from a very young age, from pursuing interests in math and science; women are not encouraged to be ambitious in the same way that men are, and tend to eliminate themselves from competing for high-paying, in-demand jobs (one of the more effective ways to eliminate women from your applicant pool, at least one study has found, is to advertise a significantly higher than average salary); once something is male-dominated, the male/masculine culture that is developed in those fields is very off-putting to a lot of women and so perpetuates the problem; also perpetuating the problem is lack of female role models and mentors to whom girls and women can turn to for advice on issues that male mentors probably don’t know jack shit about (sexual harassment, maternity leave, dress codes).

    Entire books have been written about this stuff. Go ahead and look them up.

  98. Lee
    Lee December 2, 2011 at 10:46 am |

    Mjog: This tl;dr should be highlighted, heavy-typed, blinking, somewhere up the top of this article. Nearly every programmer reading this story is going to initially think “It’s not malice, it’s incompetence” (I sure did), but this very neatly states why it’s actually still a problem.

    Also, while others have suggested that it’s not necessarily a programmers’ fault (the programming part is only one of a number of steps that go into producing software), all of the number of people involved should have noticed that the question/response tweaking has a male bias and done something about it — including the programmers.

    I still don’t buy that this is incompetence or accidental (and speaking as someone with a Computer Science degree, for what it’s worth). You have an automated speech-recognition-to-search-engine app that listens to what you say, converts that into known words and phrases, queries third-party search engines with those terms and returns the results. What’s happening here is:

    a) The app recognises the word ‘abortion’
    b) Associates it with the known internal business/service category ‘abortion clinics’
    c) It queries Yelp etc. with that known search term, but returns no results for that particular term despite Yelp’s own website providing several results
    or
    d) It doesn’t bother running the query at all and just says ‘no results’ (we don’t know at this stage)

    There’s no good reason why an automated system using a known search term should choke on returning a particular set of standardised results from a search engine, especially when using the search engine yourself produces those results just fine. You search for whatever term, you get back a generic list of results in a standard format, to the computer it’s all a set of data to be processed – it doesn’t know what the words mean, it doesn’t treat abortion clinics any differently from dinosaur theme parks unless it’s specifically told to.

    There must be a filtering stage involved – I haven’t used Siri, but I’m assuming it decides on a ‘best result’ and replies with that? Or is there a list too? Either way it’s deciding which results to give you and which to reject, whether it’s as simple as using Yelp’s top result or if it’s doing something more involved to give you a more personally useful response. Somewhere along the line it’s specifically deciding that none of the results that Yelp etc. return for abortion clinic should be presented to the user. So list time again, it’s either:

    a) screwing up in general retrieving Yelp’s provided results
    b) retrieving them but screwing up in deciding which one (if any) should be presented
    c) retrieving them and working as intended by rejecting all relevant results

    In the case of a) and b) this is a major flaw, and we’d expect to see this occurring for many, many queries – like I said before, there’s nothing specific about results relating to abortion clinics as far as the computer’s concerned, so whatever error just happens to affect those results should affect a much broader range. I don’t know how effective Siri generally is, or whether it fails to produce results for other terms it does recognise (remember, it connected the word abortion to the category abortion clinics; it’s not like it didn’t recognise the word) that show up just fine on Yelp. Have there been many other reports of this kind of failure, or does it seem to be limited to women’s reproductive health?

    The last thing that seems to preclude the idea that this is an unfortunate coincidence, and especially the idea that this is an (incredibly unlikely) highly-specific internal bug that just happens to only break abortion clinic queries, purely on accident, is the set of responses to the pregnancy questions. Even in isolation they read like highly specific anti-choice messages directed at someone who’s thinking about pregnancy. I don’t know where they got their database of responses from, obviously some of them were written by Apple (like the one about loving Apple products) and maybe others were taken from a generic conversation database created over the years by artificial intelligence projects, but the point is the responses to ‘I’m pregnant’ and ‘pregnancy’ have a decidedly non-neutral, anti-choice slant – who responds to a mention of pregnancy with advice to ‘do no harm’?

    It would be an incredible coincidence that the conversation database ‘just happened’ to have an anti-choice flavour while abortion clinic results ‘just happened’ to be among the set of results affected by a programming error – even more so if that error affects a very narrow range of results, or searches for abortion clinics specifically. And it ‘just happens’ to be a hot-button political topic right now that could be spun badly for Apple if conservatives discovered the iPhone were providing this kind of information to people.

    I don’t know if anyone’s going to read all that, but I wanted to show from a computery standpoint how this works and why it doesn’t really seem like an oversight or an ‘oops’ situation that this specific kind of query just happens to respond like this. If Siri is having a lot of problems with many different recognised queries that work on Yelp, then that would lend some credence to the idea that it’s a glitch and that it’s just a coincidence that the pregnancy responses are so skewed, but we shouldn’t immediately accept the un-nuanced hand-waving ‘oh y’know, computers, they go wrong sometimes’ excuse that I’ve seen a few people repeating.
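    [Editor’s note: Lee’s pipeline description can be sketched as a toy program. Everything below is hypothetical — the keyword table, the fake search index, and especially the blocklist are invented for illustration — but it captures the distinction Lee is drawing: a generic bug in the recognize/query steps would break many categories at once, whereas a single filtering rule silently breaks exactly one category while everything else keeps working.]

```python
# Toy sketch of Lee's steps (a)-(d): recognize a category, query a
# provider, then filter before presenting. Hypothetical data throughout.

def recognize_category(utterance: str) -> str:
    # Steps (a)/(b): map a recognised keyword to an internal category.
    keywords = {"abortion": "abortion clinics",
                "viagra": "pharmacies",
                "pizza": "restaurants"}
    for word, category in keywords.items():
        if word in utterance.lower():
            return category
    return "unknown"

def query_provider(category: str) -> list[str]:
    # Step (c): stand-in for a Yelp-style lookup. Note the provider
    # itself happily returns results for every known category.
    fake_index = {"abortion clinics": ["Clinic A", "Clinic B"],
                  "pharmacies": ["Pharmacy X"],
                  "restaurants": ["Pizza Place"]}
    return fake_index.get(category, [])

# The filtering stage Lee infers must exist. One entry here is enough
# to make a single category fail while all others succeed.
BLOCKED_CATEGORIES = {"abortion clinics"}

def assistant(utterance: str) -> str:
    category = recognize_category(utterance)
    results = query_provider(category)
    if category in BLOCKED_CATEGORIES or not results:
        return "Sorry, I couldn't find any results."
    return ", ".join(results)

print(assistant("where can I get viagra"))  # Pharmacy X
print(assistant("I need an abortion"))      # Sorry, I couldn't find any results.
```

The point of the sketch is diagnostic: if the failure were in `recognize_category` or `query_provider`, it would be hard to confine to one category; a clean “no results” for a term the system demonstrably recognises, while the upstream search engine returns results for it, is the signature of a filter.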

  99. groggette
    groggette December 2, 2011 at 10:48 am |

    Sooo, people aren’t supposed to raise concerns about how a service works (or doesn’t) if it’s in beta. But if people don’t complain how will the developers/programmers know what to at least consider fixing? Yeah the logic in some of these comments is fucking astounding. yeesh

  100. groggette
    groggette December 2, 2011 at 10:53 am |

    and on the whole “abortion clinics don’t advertise” kick. I don’t know one way or the other how much small private clinics would advertise and can see this as a possibly valid explanation for them.
    But.

    Planned Parenthood. Seriously, is there anyone in the US who doesn’t know what PP is and that many of their clinics provide abortions? Even if a specific PP doesn’t have its own website, there will still be a regional one. And at least for the PP in my city, it is listed on Yelp. So why doesn’t Siri list local PPs unless PP is specifically asked for?

  101. Sirkowski
    Sirkowski December 2, 2011 at 11:44 am |

    Sheelzebub: Well, apparently it’s the thing to do if you’re a man who wants to hire a sex worker for a blow job or you want Viagra. Odd how some folks have such a problem seeing the double-standard.

    Standards?? Yeah, complain about your consumer product made by underpaid slave labor in a totalitarian regime not meeting your bitch standards. FIRST WORLD PROBLEMS DERP

  102. Matt
    Matt December 2, 2011 at 11:47 am |

    Is it somehow impossible to understand that you can tell Apple, “hey, Siri can’t do this right,” without some crazy anti-women conspiracy agenda being attributed to Apple?

    groggette:
    Sooo, people aren’t supposed to raise concerns about how a service works (or doesn’t) if it’s in beta. But if people don’t complain how will the developers/programmers know what to at least consider fixing? Yeah the logic in some of these comments is fucking astounding. yeesh

  103. John
    John December 2, 2011 at 11:49 am |

    EG,
    Thanks for the explanation, although I’m not sure why you assumed I might be “being a dick”. Is your default mode hostility and suspicion?

  104. La Lubu
    La Lubu December 2, 2011 at 11:53 am |

    If you are really annoyed about Siri being focused towards men, learn to program, and get a career as a programmer.

    Yeah…no. Sorry Roland, but I’ve already spent over half my career in the building trades (specifically, as an electrician). I’m already doing my part to integrate one of the male-oriented work arenas. Meanwhile, what are you doing within your field to encourage more women to enter and remain, hmm?

    I’m not a programmer. I’m a potential customer. I don’t want to reinvent the wheel, but I am interested in gadgets that actually make my life easier. I’m a single mother with one full-time and one part-time job (so why am I on the computer? My kid is sick today, and home from school). I’m also involved in a few activist/community groups, and otherwise have a life, too….let’s just say, “I’m busy.” Too busy to want to devote a lot of time to fucking around playing with some techno doodad. It’s not that I wouldn’t enjoy that; I don’t have the time for that shit. I have a smart phone right now that I don’t know at least half the functions on, mostly because cell phones no longer come with printed instructions; you have to go online for it. It’s a real pain in the ass to go online with your cell phone and alternate from the instruction screen to actually playing with your phone. That, and the goddam screen is so tiny. That, and I’m visually/spatially oriented; retraining my brain to translating the act of discovering/remembering the “filing system” (for lack of a better term—Where Shit Is At) on the gadget to a system I can remember, because my mind is used to moving things in 3D…..I’m probably not explaining this very well at all, but frankly, the Where Shit Is At on computer systems translates as “hidden” to my brain, and there aren’t any visual cues for me to rely on, capisce? If not, don’t worry about it ‘cuz the point is….

    I really dig user-friendly systems. And employment for me has been good this year, and I’ve been frugal, and since I don’t know how to use half the shit on my current phone anyway, I’ve been thinking about an upgrade to a more user-friendly phone (and definitely one with a bigger screen). I have friends that rave about their iPhones, especially in comparison to what they had before. Shit, in marketing terms, I’m primed and ready, right? Especially since I’m getting to that age where I take my glasses off to see better, and I travel sometimes for work (and am in transit a lot around home), so voice-recognition is something I’d use a lot.

    And then this story breaks. Makes me rethink pulling out my wallet….in this direction, anyway. See, I don’t give a shit if Siri know where to direct men in search of Viagra or blowjobs. But I do want it to be able to direct me if I say “tampons”. If I say I’ve been raped, I want Siri to recognize that I don’t just need 911, but also a reference to the local Rape Crisis Center, who will send out a volunteer to be with me and provide support during all the shit you go through at the hospital with the rape kit, and who will make sure I get access to EC (because not all hospitals will provide it). It would be nice if Siri could provide safe running routes for women in an unfamiliar city, or knowledge about other safety issues (just reading someone’s tumblr today about a possible serial killer targeting petite black women in the Chicago area). And you know, Siri should know enough about abortion clinic access to be able to direct women to the nearest one even if it’s a few states over. The two nearest abortion clinics to me are both over an hour-and-a-half away; that doesn’t stop women in my city from using them. It’s not the goddam dry cleaners.

    And knowing to do that? Knowing that those are applications that are likely to be valuable to one’s customers? That’s pretty basic. Siri is Marketing Fail. As is not having a selection of voices. Dumb. This whole debacle reads to me like the 80s flick, “Weird Science”—a couple of teenage boys creating their idea of the perfect woman. Is that bad, in and of itself, even if it’s adult men doing it? Naah. We all indulge in a little juvenile humor at times. But does it make me want to avoid the product, because it’s obvious that the product’s creators didn’t think of me as a potential customer, let alone think of what I might need? Does it make me think that the product would be a bad fit for me, that the competition’s products will probably serve my needs better? Yes, indeed.

  105. groggette
    groggette December 2, 2011 at 11:54 am |

    John: Is your default mode hostility and suspicion?

    Personally, I always thought EG’s default mode was common sense. And it seems she used it in her response to you.

  106. Sheelzebub
    Sheelzebub December 2, 2011 at 11:57 am |

    Sirkowski: Standards?? Yeah, complain about your consumer product made by underpaid slave labor in a totalitarian regime not meeting your bitch standards. FIRST WORLD PROBLEMS DERP

    I love how the only time d00ds like you care about slave labor is when you need a way to score points on the uppity bitchez who point out your gross douchery. But believe it or not, cupcake, we can and do work against corporate exploitation and misogyny (BTW, haven’t seen YOU at any meetings or actions, so dry your crocodile tears and STFU). Some of us can walk and chew gum at the same time. Not my problem if you’re such a stupid shit that you can’t manage it.

  107. quotes out of context | clusterflock
    quotes out of context | clusterflock December 2, 2011 at 12:03 pm |

    [...] that she calls “the single most life-changing experience I’ve ever had.” Jill: I want my pussy eaten. Siri: I have found eleven pet stores in your area. posted by Deron Bauman in categories, quotes | [...]

  108. Sheelzebub
    Sheelzebub December 2, 2011 at 12:07 pm |

    Also, with something as personal as abortion, a person’s first go-to choice is the rudimentary *beta* AI program on a phone? Buh whu… wha??

    Well, getting your dick sucked by a sex worker is also pretty personal and private, as is getting Viagra, but the Siri program is on it.

    If you are really annoyed about Siri being focused towards men, learn to program, and get a career as a programmer.

    Hey, your car’s a lemon? Then you should train to be a mechanic or an automotive engineer. Jesus, dude. Has it ever occurred to you that if half of the customers you’re trying to sell a product to points out that you’ve left them out, the response isn’t to bark at them that they should make the product themselves? Not good marketing or business sense.

  109. La Lubu
    La Lubu December 2, 2011 at 12:09 pm |

    What if they don’t want 911? What if they want a hospital? Or a rape crisis center?

    Probably also worth a reminder that 911 isn’t the same in all parts of the US. The ability for 911 to pinpoint your location by cell phone can vary wildly depending on where you are. It’ll be another few years before that gets any better. People in rural areas can be better off getting themselves to a hospital if they aren’t too injured to drive, rather than waiting for 911 to show up.

  110. suspect class
    suspect class December 2, 2011 at 12:20 pm |

    Wow, all we need is someone telling folks who don’t like Siri to create their own abortion-finding AI, and we’ll have bingo.

  111. Kristen J.
    Kristen J. December 2, 2011 at 12:20 pm |

    @Sheelzebub,

    Srsly. If someone can’t see the sexism in a system knowing the word “viagra” but not “plan b” or any contraceptive drugs, they aren’t trying.

    Reminds me of the viagra/birth control insurance bullshit.

  112. Fang
    Fang December 2, 2011 at 12:28 pm |

    Seems like some of these dudes who are freaking out about getting information about abortion services on our cell phones (gasp!) seem to think we should be going through some kind of secret underground network to procure these procedures because they’re just so private (translation: shameful).

  113. Jadey
    Jadey December 2, 2011 at 12:34 pm |

    Roland: If you are really annoyed about Siri being focused towards men, learn to program, and get a career as a programmer.

    suspect class: Wow, all we need is someone telling folks who don’t like Siri to create their own abortion-finding AI, and we’ll have bingo.

    I think we have bingo.

  114. rob
    rob December 2, 2011 at 12:57 pm |

    OT, and I haven’t been here in a week or so, so I might have missed something, but what is up with some of the block quotes? The text loses all the spaces and doesn’t do line breaks well. It only happens in some of them. Kinda weird, just in case you didn’t know. Is it a feature, not a bug, like disemvoweling?

  115. institutionalisingsexism
    institutionalisingsexism December 2, 2011 at 1:00 pm |

    I don’t have an iPhone, but I’m curious as to what Siri’s response would be to the following questions:
    Siri, I need a tampon
    Siri, I have a UTI
    Siri, where can I get an HPV shot?
    Siri, I need the day after pill

    Also, Siri’s default voice is a woman’s… which tells you not only that the programmers were thinking about themselves (or men) as the typical users, but that in their minds the perfect assistant is female. I know you can now choose a male or female voice for Siri, but the default version, and the one that is advertised by Apple, is the female one. Some people’s take on this is that female voices are more pleasant to listen to… but isn’t this answer another misogynist stereotype of women as docile, subservient, and ever eager to please?

  116. You Siri-ous?
    You Siri-ous? December 2, 2011 at 1:01 pm |

    Jill: What part of “it’s not un-sexist just because it’s beta software” are you missing?

    Perhaps it’s the fact that, being in its beta stage, it’s not finished. You might as well say that the grocery store is sexist because they added their feminine hygiene section last as opposed to first despite it all being there when finished.

  117. Fang
    Fang December 2, 2011 at 1:11 pm |

    If the grocery store opened in the equivalent of beta stage and didn’t include feminine hygiene products because they decided that was their last priority then I would indeed say that was sexist because there is no reasonable business reason to invest in everything else first.

  118. zuzu
    zuzu December 2, 2011 at 1:14 pm |

    You Siri-ous?: Perhaps it’s the fact that, being in its beta stage, it’s not finished. You might as well say that the grocery store is sexist because they added their feminine hygiene section last as opposed to first despite it all being there when finished.

    They sure finished the parts where you could find hookers and blow pretty quick, though.

  119. Kristen J.
    Kristen J. December 2, 2011 at 1:14 pm |

    Umm…what’s with all these dudes who don’t know what beta testing means? Pro tip: If your program isn’t feature complete, you’re not ready for beta.

  120. Helen Huntingdon
    Helen Huntingdon December 2, 2011 at 1:22 pm |

    John: What’s stopping them, then?

    People like you, obviously. Who really wants to listen to a bunch of doodz go on about how their sexist behavior isn’t really sexist? Because engineers do that a lot.

    And the doodz start young: Young women from my old high school’s robotics team who are now freshmen in college have been reporting how the boys in their physics lab groups scream them down repeatedly with, “How can you know anything? You’re a girl!”

  121. You Siri-ous?
    You Siri-ous? December 2, 2011 at 1:24 pm |

    Jill: (Quote this comment?)

    Do you seriously expect everything to work perfectly for all people on day 1, or heck, day 0 considering that this is a public beta – a beta open to the public for the sole purpose of working stuff like this out?

  122. zuzu
    zuzu December 2, 2011 at 1:27 pm |

    You Siri-ous?: Do you seriously expect everything to work perfectly for all people on day 1, or heck, day 0 considering that this is a public beta – a beta open to the public for the sole purpose of working stuff like this out?

    Siri works perfectly to find men sexual gratification and boner pills. It does not work at all for finding women sexual gratification, reproductive health assistance, contraception or an abortion clinic. It will, however, perfectly direct women to anti-choice crisis pregnancy centers.

    That’s not a bug.

  123. La Lubu
    La Lubu December 2, 2011 at 1:27 pm |

    You might as well say that the grocery store is sexist because they added their feminine hygiene section last as opposed to first despite it all being there when finished.

    If a grocery store opened, and carried food, booze, soaps, household goods, pet food, condoms & lube (but not spermicide), all kinds of drugs in the pharmacy (including Viagra, but not birth control), and all kinds of hygiene products (except for tampons and sanitary pads)….yeah, I’d say that was sexist.

    But that doesn’t happen. Grocery stores have a great deal of women working for them at all levels, and those women would readily point out how alienating that would be to the customer base. How their company would quickly get the reputation of being “the inconvenience store” if they opened without taking female-specific needs into account.

    And that appears to be the problem with Siri. The workplace cultures of the various entities that went into Siri are so male-oriented that they forgot the needs of their female customer base. Now they get the benefit of bad publicity during the holiday buying season. As I said before…..Marketing Fail.

  124. Sheelzebub
    Sheelzebub December 2, 2011 at 1:27 pm |

    La Lubu: Probably also worth a reminder that 911 isn’t the same in all parts of the US.

    Oh, LaLubu! That’s such a First World problem. Let’s focus on the important shit–like where d00dz can get Viagra and hire sex workers, not getting actual information that you may need in an emergency.

    Sheesh. Silly irrational wimmenz with their expectations of customer service and knowledge of what the beta state actually is. WHAT ABOUT THE MENZ.

  125. groggette
    groggette December 2, 2011 at 1:32 pm |

    You Siri-ous?: a beta open to the public for the sole purpose of working stuff like this out?

    And yet you’re getting all in a huff about people pointing out the problems with the product so that it can be fixed. If the developers didn’t see a problem with forgetting/ignoring half the population before they released Siri, then how will they know that’s a problem unless people point it out?

  126. Fang
    Fang December 2, 2011 at 1:32 pm |

    What part of “it doesn’t work well in a way that specifically neglects women’s very common needs while fulfilling even uncommon desires of men” is hard to understand?

  127. groggette
    groggette December 2, 2011 at 1:35 pm |

    And yes, it is worth pointing out that this Siri problem is sexist, whether it’s malicious or unintentional.

  128. You Siri-ous?
    You Siri-ous? December 2, 2011 at 1:49 pm |

    groggette: And yet you’re getting all in a huff about people pointing out the problems with the product so that it can be fixed. If the developers didn’t see a problem with forgetting/ignoring half the population before they released Siri, then how will they know that’s a problem unless people point it out?

    Don’t be obtuse. You know this whole thread has gone quite a ways beyond simply pointing out a flaw in Siri to accusing the developers of Siri of sexism, even if inadvertently.

    And you misspeak when you say it was released. It was not, at least not in the way you appear to be making it out as. It was released as PUBLIC BETA. It is not a final version. It is as if they’re telling you that their product is going to be full of glitches, holes, and all kinds of problems. There’s nothing more to it than that. When you get an error like with this whole abortion thing, it can be explained away simply because it’s beta. You’re being ridiculously unreasonable if you expect Siri to work perfectly when it’s unfinished. Having big glaring errors in it like this is a part of being a beta product and if you don’t like that, I would suggest you hold off using it until the final version is released.

  129. sabrina
    sabrina December 2, 2011 at 1:50 pm |

    Adam Starkey: Completely irrelevant because Siri specifically does NOT search Google. It tries Yelp for businesses. If the search fails in Yelp, or if it is perceived as a general question, it tries Wolfram Alpha.

    If those sites don’t understand the question, or don’t have useful answers, neither does Siri.

    If you search Yelp for abortion clinics in NYC, it turns up Planned Parenthood as well as a couple of other results, so that is still wrong.

  130. You Siri-ous?
    You Siri-ous? December 2, 2011 at 1:50 pm |

    Jill: “No, I don’t, but now you’re moving the goalposts of the conversation. I don’t expect products to work perfectly for all people on day 1.”

    Okay then. You should have no problem with Siri.

    Bear in mind that when they release a product as a public beta, the public ARE those testers.

  131. PosedbyModels
    PosedbyModels December 2, 2011 at 1:52 pm |

    Jill: I want my vagina licked.
    Siri: Your language!

    Jill: I want my dick sucked.
    Siri: I found twelve escort services fairly close to you.

    Jill: I want a blowjob.
    Siri: I found twelve escort services fairly close to you.

    Jill: I want cunnilingus.
    Siri: Is that so?

    Jill: I want cunnilingus.
    Siri: Ok.

    Jill: I want my clitoris licked.
    Siri: I don’t understand “clitoris.”

    These are the things blowing my mind right now. First of all, the obviously shaming and completely ridiculous, prissy response to “vagina licked,” for some fucking amazing reason that couldn’t possibly be sexism, you guys, but “dick sucked” doesn’t provoke any reprimand whatsoever. And “I don’t understand ‘clitoris’”? “Clitoris” and “vagina” are very literal, very real terms for female anatomy that, yeah, we like to have pleasured sometimes. Why would these go unrecognized or treated as dirty when “dick” and “blowjob” are apparently just fine?

    This news, and the obnoxious half-assed trolling accompanying it, are so ridiculous right now.

  132. jennygadget
    jennygadget December 2, 2011 at 1:55 pm |

    Pro tip: If your program isn’t feature complete, you’re not ready for beta.

    Another free pro-tip: If none of your user stories during your conceptual phase (much less your beta testing phase) include women, and you pride yourself on user friendly products, you might want to reconsider how user friendly you really are. And will manage to continue to be in the future.

    And I would just like to point out that the grocery store analogy actually highlights how ridiculous Apple is being and why this problem should have never made it to beta testing phase. Even if you realize the lack of tampons during your “Friends and Family” soft opening, that’s just…an incredibly expensive change to have to make that late in the game. To forget something that is stocked and sold on a daily basis in every other grocery store in the US even as far back as the first time you started listing specific goods to sell is just…such an incredibly fundamental flaw in how you are designing stuff, not some random glitch that no one could have seen until the product was put to mass use.

    Srsly, these guys mansplaining – incorrectly! – how good interaction design is done are just the icing on the cake, yes?

    Fang – yeah. There’s also just a certain level of stupid in telling people to “google it” instead – ie, don’t try to use the new shiny interface to get info, go back to the old interface! …why, exactly, would I want to do that? What specifically is so odd about using an interface like Siri for its stated purpose: finding info?

  133. Lauren
    Lauren December 2, 2011 at 2:02 pm |

    You Siri-ous?: Having big glaring errors in it like this is a part of being a beta product and if you don’t like that, I would suggest you hold off using it until the final version is released.

    Protip: Having big glaring errors in it like this is part of being a beta project, and if you don’t like that, I would suggest holding off on releasing the program until a final version is developed.

    Or get real comfortable with the public’s criticism of the product.

  134. groggette
    groggette December 2, 2011 at 2:02 pm |

    You Siri-ous?: Bear in mind that when they release a product as a public beta, the public ARE those testers.

    And Jill tested it, and found it lacking, and pointed out the ways she found it lacking. What’s your problem with that?

  135. Daisy
    Daisy December 2, 2011 at 2:03 pm |

    Sometimes, when reading about the things Siri will find put together in a list – prostitutes, Viagra, condoms, places to bury bodies – I feel like Siri was programmed by the creators of CSI.

  136. La Lubu
    La Lubu December 2, 2011 at 2:03 pm |

    You Siri-ous-ly didn’t follow Jill’s earlier link about how Siri doesn’t just not refer you to an abortion clinic, but specifically refers you to anti-abortion crisis pregnancy centers. That’s not a problem of omission.

    Jokes about where to dump a body, or how to get a hamster out of your ass are harmless as far as I’m concerned. But someone took a great deal of time to make sure that women in NYC couldn’t get referred to an abortion clinic, since abortion clinics do show up in the websearches that supposedly Siri is using. The time to make sure women in DC got referred to a phony “crisis pregnancy center” (even one quite a distance away) instead of a comprehensive women’s reproductive health center. That isn’t benign.

    Whether Apple is aware of it or not, they are being rebranded as the anti-abortion company.

  137. You Siri-ous?
    You Siri-ous? December 2, 2011 at 2:05 pm |

    Lauren: Protip: Having big glaring errors in it like this is part of being a beta project, and if you don’t like that, I would suggest holding off on releasing the program until a final version is developed.

    Or get real comfortable with the public’s criticism of the product.

    Need I quote myself?

    “Don’t be obtuse. You know this whole thread has gone quite a ways beyond simply pointing out a flaw in Siri to accusing the developers of Siri of sexism, even if inadvertently.”

  138. jennygadget
    jennygadget December 2, 2011 at 2:06 pm |

    “And you misspeak when you say it was released. It was not, at least not in the way you appear to be making it out as. It was released as PUBLIC BETA.”

    OMG MANSPLAINERS. Apple created ads that highlighted Siri in order to get people like my mother to go in and pay several hundred dollars to upgrade their phones. This is not beta-testing in the traditional sense. When Google beta-tests stuff, they do not use it as a selling point for newbies. *headdesk*

    And also hells yes, what groggette said. If you are going to claim that it’s in beta-testing, therefore it’s all good, you can’t complain when people point out what needs to be fixed. Especially when Apple, unlike companies that are actually beta-testing in the way you mean it, does not provide a specific place to send feedback. I mean, I have complaints about Sirsi-Dynex’s facebook app that they are (kinda) beta-testing, but well….they also set up a specific place to send feedback. Cuz, you know, actual beta-testers tend to want criticism. In spades. They don’t go around saying problems are glitches, they say “oh! thanks for catching that!” yeesh.

  139. groggette
    groggette December 2, 2011 at 2:07 pm |

    If the developers don’t want to be accused of developing a sexist app, then maybe they shouldn’t forget or ignore half the population before releasing the app to the public.

  140. suspect class
    suspect class December 2, 2011 at 2:07 pm |

    Jadey: I think we have bingo.

    Oops. I guess that’s a side effect of rolling my eyes *so hard.*

  141. You Siri-ous?
    You Siri-ous? December 2, 2011 at 2:08 pm |

    “You Siri-ous-ly didn’t follow Jill’s earlier link about how Siri doesn’t just not refer you to an abortion clinic, but specifically refers you to anti-abortion crisis pregnancy centers. That’s not a problem of omission.”

    Sure it is. The developers don’t just manually pick and choose what you will and will not find. They code their search engine in a specific manner so that it will yield the results that it does, and given that it’s beta, it’s revealed just how imperfect it is. This is good. That means this is a GOOD beta. This means it will be able to be fixed in the final version.

  142. PosedbyModels
    PosedbyModels December 2, 2011 at 2:08 pm |

    La Lubu: Siri doesn’t just not refer you to an abortion clinic, but specifically refers you to anti-abortion crisis pregnancy centers. That’s not a problem of omission.

    Absolute YES to La Lubu’s last two comments.

  143. Fang
    Fang December 2, 2011 at 2:09 pm |

    oh no, not accusations of sexism! that shit’s serious!

  144. Lee
    Lee December 2, 2011 at 2:11 pm |

    You Siri-ous?: When you get an error like with this whole abortion thing, it can be explained away simply because it’s beta. You’re being ridiculously unreasonable if you expect Siri to work perfectly when it’s unfinished. Having big glaring errors in it like this is a part of being a beta product and if you don’t like that, I would suggest you hold off using it until the final version is released.

    I went into this earlier – you can’t just explain it away because it’s beta; there needs to be more nuance than that. What other big glaring errors are there in the search results, where it recognises what you’re asking for but says there are no results, when results do actually show up in Yelp? That’s a serious question; I don’t know how broken it is. But if this is the result of an error, then there should be a very wide range of results that suffer the same problems. There’s no reason for this to only happen with ‘abortion clinics’, unless it’s programmed to treat that query in a different way from all the other, working queries.

    And it still doesn’t explain why the program’s response to the word ‘pregnancy’ is to implore you to ‘do no harm’ and ‘consider the alternatives’, which just happens to fit the same agenda as a ‘bug’ that fails to give you details of abortion clinics when you ask for them. Maybe the anti-choice movement is still in beta?

  145. XtinaS
    XtinaS December 2, 2011 at 2:12 pm |

    Don’t be obtuse. You know this whole thread has gone quite a ways beyond simply pointing out a flaw in Siri to accusing the developers of Siri of sexism, even if inadvertently.

    Pop quiz: do you even know what ingrained, non-malicious, yet still present sexism is?  Or do you think it has to be the Worst Ever™ in order to qualify?

  146. You Siri-ous?
    You Siri-ous? December 2, 2011 at 2:18 pm |

    @ XtinaS

    I don’t doubt there is sexism in a variety of places. Thing is, this whole thing with Siri is a very bad example of it.

  147. suspect class
    suspect class December 2, 2011 at 2:18 pm |

    Obviously you can’t expect programmers to be able to continue doing their jobs when you hurt their feelings by calling them sexist.

  148. AndrewJenny
    AndrewJenny December 2, 2011 at 2:21 pm |

    Wow, as a sometimes programmer I had chalked this up to the majority explanation of reliance on incomplete third-party data combined with the effects of patriarchy (i.e. the experience of women was not considered by predominantly male programmers). I didn’t know it would respond to requests for female sexual services by listing escorts. Programmers love in-jokes, like the response to “I love you”, but this is really insulting, and Apple should have known better.

  149. Lauren
    Lauren December 2, 2011 at 2:22 pm |

    You Siri-ous?: “accusing the developers of Siri of sexism, even if inadvertently.”

    There are worse things, pal. But it’s telling that you’re more concerned with getting called sexist than addressing sexism.

  150. Sheelzebub
    Sheelzebub December 2, 2011 at 2:22 pm |

    I think some of these mansplainers are in beta.

  151. Camilla Peffer @ Girls Are Made From Pepsi

    Woah. Technology is blowing my mind right now.

  152. FashionablyEvil
    FashionablyEvil December 2, 2011 at 2:26 pm |

    oh no, not accusations of sexism! that shit’s serious!

    Exactly. There is nothing worse than being called sexist! Absolutely nothing! (Except maybe being called racist.)

  153. XtinaS
    XtinaS December 2, 2011 at 2:27 pm |

    Thing is, this whole thing with Siri is a very bad example of it.

    But why?  The programmers didn’t consider women’s needs, but they considered men’s needs (Viagra, escort services, &c).  They even put in a response to hamster-related issues, but they didn’t somehow figure out that women might want to find an abortion clinic, or get contraception, or also procure sexual services.  To a whole lotta people, this is sexism: not malicious, perhaps not deliberate, but sexism, all the same.

    Wikipedia: “[Beta] generally begins when the software is feature complete.”  What I interpret this to mean is the makers of Siri considered it to be feature complete, even though no one thought to consider women’s needs.  And that’s bloody sexist.  It is, as analogised before, like considering a grocery store to be ready for the public, when no one considered that women might come in and want to buy woman-specific products, so no one stocked it.  The problem is like one part “this shit’s missing” and five parts “how the vainglorious fuck did this get opened to the public with no one noticing this basic, glaring, fundamental flaw?”.

    (Can’t be sexism, though, to not consider women’s needs.  Must just be… a tumour?  Or a brain fart!  That lasted for the entire production cycle!  Yes, that’s way more reasonable.)

  154. jennygadget
    jennygadget December 2, 2011 at 2:28 pm |

    Obviously you can’t expect programmers to be able to continue doing their jobs when you hurt their feelings by calling them sexist.

    Solution: let them quit. Hire feminist programmers! :p

    The developers don’t just manually pick and choose what you and will not find.

    You know, I’m fairly certain someone had to specifically program Siri to know what to do when someone wanted to hide a body. And they sure as hell are not getting those escort services from Yelp. So what the fuck ever dood.

  155. Kristen J.
    Kristen J. December 2, 2011 at 2:33 pm |

    Sheelzebub: I think some of these mansplainers are in beta.

    WIN!

  156. suspect class
    suspect class December 2, 2011 at 2:34 pm |

    jennygadget: Solution: let them quit. Hire feminist programmers!

    Now now, it’s logical suggestions like this that demonstrate that you’re a radical fem

    jennygadget: And they sure as hell are not getting those escort services from Yelp. So what the fuck ever dood.

    I agree with your larger point. but apparently, NY Yelp does have escort service listings. I’m fairly certain they don’t list body-dumping ravines, though.

  157. Lauren
    Lauren December 2, 2011 at 2:41 pm |

    suspect class: Now now, it’s logical suggestions like this that demonstrate that you’re a radical fem

    If so, she can come sit by me.

    Sheelzebub: I think some of these mansplainers are in beta.

    I think you win this blog? Because there’s a trophy over here with your name on it. Congratulations, but you’re also in charge of the moderation queue.

  158. suspect class
    suspect class December 2, 2011 at 2:48 pm |

    Lauren: If so, she can come sit by me.

    Word. Absolutely ridiculous threads like this one are excellent demonstrations of feminism’s continuing relevance and importance.

  159. Kristen J.
    Kristen J. December 2, 2011 at 2:55 pm |

    Lauren: Congratulations, but you’re also in charge of the moderation queue.

    Now that is just cruel. :P

  160. What We Missed
    What We Missed December 2, 2011 at 3:06 pm |

    [...] know the Siri abortion issue has been covered to no end, but you really have to see Jill’s [...]

  161. jennygadget
    jennygadget December 2, 2011 at 3:06 pm |

    suspect class,

    That’s interesting. Checking my local Yelp listings, I’m getting only stuff that has words like “escorted” in the reviews. It makes me want to borrow my parents’ phone(s) to see how those questions work when Yelp has listings, but not accurate ones.

    But yeah, the thing that gets me about the ravines and smelting plants and such is that someone had to program that logic into the interface. That’s much more complex than a 1-to-1 keyword search. And you may do that for a joke, but you aren’t going to develop that whole process just for a joke. So, clearly, there is other, more practical stuff that they had to create similar logic/processes for. So trying to pretend that it’s all just a matter of the databases being incomplete…

  162. Sheelzebub
    Sheelzebub December 2, 2011 at 3:21 pm |

    Now that is just cruel. :P

    Only to the mansplainers.

    :::Grins evilly:::
    :::Sharpens claws:::
    :::Reaches for trophy:::

  163. Fang
    Fang December 2, 2011 at 3:37 pm |

    Someone please tell me this is shopped
    http://29.media.tumblr.com/tumblr_lveh9h3wdG1r3z6mso1_500.jpg

  164. Siri, Sexism, and Silicon Valley - Number One Source For Local News - Local Reporter Direct

    [...] clinics, birth control, and other reproductive health services. As both Amanda Marcotte and Jill Filipovic have pointed out, Apple relies on outside databases for Siri, which mostly offer inadequate or vague [...]

  165. La Lubu
    La Lubu December 2, 2011 at 3:56 pm |

    Fang: O_o

  166. Fang
    Fang December 2, 2011 at 4:23 pm |

    I know. I’d love some confirmation that siri doesn’t actually say that mom should be in the kitchen.

  167. lestricoteuses
    lestricoteuses December 2, 2011 at 4:37 pm |

    Matt:
    Is it somehow impossible to understand that you can tell apple, hey, siri can’t do this right, without starting some crazy anti women conspiracy agenda being attributed to Apple?

    Why don’t you, Matt, READ the post and comments for once and TRY to REALIZE that NO ONE is talking about a “crazy anti-women conspiracy agenda”; they are talking about institutionalized bias. If you don’t know what that means, please look it up. But first, try to READ the post. And of course, TRY to REALIZE that people right here, right now, are actually saying to Apple, “hey, siri doesn’t do this right”. Get it??? No, because you didn’t bother READING the post or the comments. I am sorry, I have to say it: you, sir, are an idiot.

  168. jennygadget
    jennygadget December 2, 2011 at 6:24 pm |

    I got this article via Kate Harding on Twitter, in case anyone may be interested but hasn’t seen it yet.

    http://www.chicagomag.com/Chicago-Magazine/The-312/December-2011/Our-Siri-Ourselves/

    It’s really well done. I do think it glosses over two things:

    That it doesn’t make sense to be annoyed that users expect to be able to find things like birth control when programmers have bothered to take the time to add jokes about hiding dead bodies.

    That a big part of the problem is that Apple is trying to do beta testing while claiming a finished product. This is part of why questions about intent were raised. Apple only claimed “beta” once major flaws came up, because they wanted the sales and were hoping to get away with the public paying for the privilege of beta testing a product without complaint.

    I also disagree that the scarcity of female designers isn’t part of the problem. Yes, part of the problem is the bias in the databases used – and in the public that generates the data – but feminist-minded programmers would have caught these problems before it went public. Also, female programmers working on the databases would help to alleviate some of the bias there, even if its root is in crowd-sourcing as well as in the programmers themselves.

    Still, it offers lots of good information and does a good job of talking about the consequences of institutional sexism.

    Also, considering that a decent amount of the nuts and bolts of the programming seems to boil down to the strengths and weaknesses of semantic versus crowd-sourced vocabularies, and how and when to tweak both… I predict SiriFail is going to be brought up in the library science Vocab Design class I’m taking next semester. :)

  169. Commandrea
    Commandrea December 2, 2011 at 6:47 pm |

    It’s a glorified search engine. Get over it.

  170. Emily
    Emily December 2, 2011 at 7:16 pm |

    I tried this and Siri provided 6 nearby abortion clinics right away. Maybe there’s no app for male prostitutes since she has a built-in vibrator?

  171. PrettyAmiable
    PrettyAmiable December 2, 2011 at 8:28 pm |

    PosedbyModels: These are the things blowing my mind right now

    Am I the only one shocked that Jill lives so close to so many escort services? What’s the range on that thing?

  172. C J
    C J December 2, 2011 at 8:50 pm |

    Has anyone tried asking where to get a vibrator, or a store that sells sex toys? If it can, I wonder where it would send you (i.e. to a nice, maybe even woman-owned, shop or one of the shady ones with creepy costumes in the windows).

  173. FashionablyEvil
    FashionablyEvil December 2, 2011 at 9:01 pm |

    Am I the only one shocked that Jill lives so close to so many escort services? What’s the range on that thing?

    Jill does live in the largest city in the United States…

  174. Why this NPR article on Siri and Abortion = FAIL « SCATX: Speaker's Corner in the ATX

    [...] made unless you take the time to explain WHY Jodi mentions self-censoring. On the other hand, many women’s groups have lamented that many tech products, Siri included, are clearly designed by men, primarily for [...]

  175. PrettyAmiable
    PrettyAmiable December 2, 2011 at 10:48 pm |

    I live in the largest city in the United States. 12 escort services in the vicinity is a lot.

  176. EG
    EG December 2, 2011 at 11:49 pm |

    Sirkowski: Yeah, complain about your consumer product made by underpaid slave labor in a totalitarian regime not meeting your bitch standards. FIRST WORLD PROBLEMS DERP

    Your problems are not important, bitch! Ongoing difficulty with ending an unwanted pregnancy is no big–it’s just like having to use one-ply toilet paper for a week! How dare you write a post about your own situation–everyone knows that bitches need to put other people first!

    Matt: Is it somehow impossible to understand that you can tell apple, hey, siri can’t do this right, without starting some crazy anti women conspiracy agenda being attributed to Apple?

    So…again…this is all just coincidence, according to you? I mean, it’s taking place in a climate of ongoing attacks on women’s ability to get abortions and contraception that range from legislative attacks to falling rates of doctors learning how to perform abortions to violence against abortion providers to propaganda campaigns…but this is just some random thing that happened, nothing to do with any of that? It’s not even about women’s issues not being considered important by male programmers? It’s just…a freak accident, like a tornado or something? Hmm. Which seems more likely to you–yet another example of the taken-for-granted institutionalized sexism that informs our society, or freak tornado?

    John: EG,
    Thanks for the explanation, although I’m not sure why you assumed I might be “being a dick”. Is your default mode hostility and suspicion?

    When a man asks a question about why women don’t do things because hey, there’s no law against it? Given the past several decades about where such conversational openers go and what’s usually being implied, then, yes, in that context, my default mode is hostility and suspicion. If your question was not disingenuous, I’m pleased to hear it.

    You Siri-ous?: Perhaps it’s the fact that, being in its beta stage, it’s not finished.

    So they’re selling an incomplete product for hundreds of dollars and didn’t use any women as testers before they put it on the market? The only conclusion I can draw from that, then, is that not only are they mindlessly sexist, but they’re also dishonest and incompetent.

    You Siri-ous?: Do you seriously expect everything to work perfectly for all people on day 1, or heck, day 0 considering that this is a public beta – a beta open to the public for the sole purpose of working stuff like this out?

    It’s open to the public so that Apple can make money from it; if what they want is beta testing because they know there’re multiple problems, why do they expect people to pay for the privilege of doing the testing? And do I expect it to work perfectly for all people? No. Do I expect it to work equally well for men and women? Yes. Do I expect it to be able to respond helpfully to significant concerns women have? Yes.

    You Siri-ous?: When you get an error like with this whole abortion thing, it can be explained away simply because it’s beta.

    See, it’s just an error. Errors never mean anything, or offer any insight into the biases of the people who make them. They’re totally random. It’s unreasonable to expect Siri to be able to help you find contraception, emergency or otherwise. The programmers had to deal with important stuff, like helping men get off more easily. God, do you expect everything to be perfect?

    Dude, we know it’s an error. It’s a stupid, sucky, sexist error. That’s what we’re complaining about.

  177. Mike
    Mike December 3, 2011 at 12:00 am |

    I don’t think Apple hates women as much as they fear lawsuits from Focus on the Family or the National Right to Life Committee over some 17-year-old girl in California looking up her nearest clinic. Should the information be there? Absolutely! Flood the iOS developers with angry e-mails and maybe they’ll include a Siri update with that crucial information on it.

  178. Mike
    Mike December 3, 2011 at 12:10 am |

    I just searched for the nearest Planned Parenthood clinic and 20 matches appeared. Looks like someone forgot to update their iOS.

  179. Bridget Girton
    Bridget Girton December 3, 2011 at 1:56 am |

    I live in an area that could be described as “very conservative,” to the point that we really DON’T have any abortion clinics nearby, nor do we have any escort services. It’s a small enough town to be unlikely to have those things anyway, but also it’s controlled by a conservative Christian college. I used Yelp to search for “abortion clinics” and got nothing. I searched “abortion” and got a couple of results: one was described as “family planning” and the other “abortion alternative.” I got lots of results when I searched “Women’s health.” The second result was Waffle House, but most of them were applicable. So maybe it’s because it only searches for things that have the word “abortion” in the name of the place? It only found Planned Parenthood when I specifically searched for it. Most abortion clinics don’t have “abortion clinic” in their actual name.
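
In code terms, the failure mode Bridget describes looks something like this: a search that matches the query only against business names will miss most clinics, since few have “abortion” in their name, while matching against category metadata would find them. A minimal sketch — the listings and matching logic here are invented for illustration, not Yelp’s actual code:

```python
# Hypothetical listings; all names and categories are made up.
businesses = [
    {"name": "Planned Parenthood",
     "categories": ["family planning", "abortion services"]},
    {"name": "Main Street Women's Center",
     "categories": ["women's health", "abortion services"]},
    {"name": "Hope Pregnancy Resource Center",
     "categories": ["abortion alternative"]},
]

def search_by_name(query, listings):
    # Naive name-only matching, which is how the results seemed to behave.
    return [b["name"] for b in listings if query in b["name"].lower()]

def search_by_category(query, listings):
    # Matching against category metadata finds the relevant listings.
    return [b["name"] for b in listings
            if any(query in c for c in b["categories"])]

print(search_by_name("abortion", businesses))      # -> []
print(search_by_category("abortion", businesses))  # -> all three listings
```

Note that the category search also surfaces the “abortion alternative” listing, which matches Bridget’s experience of a CPC turning up in the results.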

  180. Brian Schlosser
    Brian Schlosser December 3, 2011 at 2:58 am |

    jennygadget: That a big part of the problem is that Apple is trying to do beta testing while claiming a finished product. This is part of why questions about intent were raised. Apple only claimed “beta” once major flaws came up, because they wanted the sales and were hoping to get away with the public paying for the privilege of beta testing a product without complaint.

    Ding ding ding! The people yelling “Not fair! It’s in BETA!!” are being willfully ignorant of the fact that Apple is NOT beta testing Siri in the classic sense at all. It’s only a beta test in the sense that the iPad was a beta test for the iPad 2. If you are advertising a product that is for sale NOW, the general public is going to assume it is done, even if you tack on a disclaimer about features.

    And all that is beside the point. The omissions are way too suspiciously similar to be easily explained by the “it’s in beta!” argument.

  181. Some Guy
    Some Guy December 3, 2011 at 7:07 am |

    Jill, you are a blithering idiot.

    Read and learn.

    The long and short of it is, Siri is a limited system, and when you get to the edges of what it’s capable of, you make a complete ass of yourself if you assume that someone maliciously decided to thwart your desire for whatever you imagine its capabilities should be.

  182. EG
    EG December 3, 2011 at 8:34 am |

    Bridget Girton: Most abortion clinics don’t have “abortion clinic” in their actual name.

    Most escort services don’t have “blow job” in their actual names, either.

  183. EG
    EG December 3, 2011 at 8:37 am |

    By the way, and just out of curiosity, does Siri have anything helpful to say if you tell it “I have a yeast infection”?

  184. Links 12/3/11 | Mike the Mad Biologist
    Links 12/3/11 | Mike the Mad Biologist December 3, 2011 at 10:30 am |

    [...] 1% got all the money Siri Is Sexist Raise Taxes on Rich to Reward True Job Creators: Nick Hanauer Siri: Total Misogynist. Share this:TwitterFacebookStumbleUponRedditDiggEmailPrintLike this:LikeBe the first to like this [...]

  185. Sheelzebub
    Sheelzebub December 3, 2011 at 11:06 am |

    @Some Guy–speaking of being a blithering idiot, how about you pull your head out of your ass and develop some reading comprehension skills?

    On second thought, dance for us, troll. If you’re going to play the same music that’s been played before, shut the fuck up and dance to it. It’s not as if you’ve proven you can make any cogent arguments.

  186. igglanova
    igglanova December 3, 2011 at 11:50 am |

    WTF is with this ‘lawsuits’ crap? Nobody can sue you for providing people with information they dislike. Try harder.

  187. jennygadget
    jennygadget December 3, 2011 at 12:01 pm |

    Jill,

    But you are talking about computers! And you have a vagina! And are making those strange feminist noises! You couldn’t possibly be making sense! It’s a statistical impossibility!

    The people yelling “Not fair! It’s in BETA!!” are being willfully ignorant of the fact that Apple is NOT beta testing Siri in the classic sense at all.

    Yeah, they are being classic mansplainers: assuming they are introducing a new term and concept to us silly women. And so eager to explain it to us all that they don’t even stop to think that Apple is full of bullshit (and they know it) and so now they are too.

    But yes, willfully ignorant too.

  188. jennygadget
    jennygadget December 3, 2011 at 12:03 pm |

    hahaha! and of course I commit blockquote fail on that particular comment. *facepalm* :p

  189. Mysti
    Mysti December 3, 2011 at 7:11 pm |

    Someone asked Siri “Why are you anti-abortion?” and she answered “I just am, Kristen.”

  190. Sunday Reading « zunguzungu
    Sunday Reading « zunguzungu December 3, 2011 at 9:16 pm |

    [...] Siri sexist or misogynist? Well, “It’s pretty appalling that programmers thought far ahead enough to know where to [...]

  191. A. L.
    A. L. December 3, 2011 at 9:34 pm |

    > Siri’s programmers clearly imagined a straight male user as their ideal and neglected to remember the nearly half of iPhone users who are female. <

    imho absolutely – what else ?
    why otherwise the stated answers to explicit/supposedly "male" questions ?
    and as one of the comments above already said – imho rule #1 #2 #3 in soc. design has always been and will continue to be "you are NOT your customer".
    otherwise, in my experience, it's called *FAIL

    (btw, as far as i know apple has an ongoing his-tory of soc. sex-negativity e.g. concerning apps. and no, i am far too concerned about the little soc. freedom i actually have that i do not wish a soc. smart-phone to spy on me/my every move and then – esp. unknowingly – e.g. ping-it-back-to-cupertino-or-some-such-hq. ergo, i will neither own nor operate a soc. smart-phone. i call it luxury and/or choice.)

  192. David Makalaster
    David Makalaster December 3, 2011 at 9:42 pm |

    I love how you neglected to mention that Siri’s suggestion for “removing a hamster from your rectum” is just as useless as Siri’s response for your request for an abortion. (Source: Link provided in your article).

    You clearly use this to give credibility to your article although, in this case, it is entirely inapposite.

  193. A. L.
    A. L. December 4, 2011 at 1:03 am |

    fyi/linkjuice :

    > “It’s just a phone, why do you expect it do all this?” Mr. Winarsky said he had no knowledge of how Siri was changed after it was acquired by Apple. <
    http://bits.blogs.nytimes.com/2011/11/30/apple-says-siris-abortion-answers-are-a-glitch/?hp&pagewanted=all

  194. A. L.
    A. L. December 4, 2011 at 1:08 am |

    meh, post was mangled. 1st quote wrong link

    > “It’s just a phone, why do you expect it do all this?” Mr. Winarsky said he had no knowledge of how Siri was changed after it was acquired by Apple. <

  195. Calioak
    Calioak December 4, 2011 at 2:26 am |

    I don’t buy the “abortion clinics don’t advertise” argument. I live in a mid-sized town in a fairly conservative area and I can several abortions in the phone book. Surely Siri can search the yellow pages.

    This is still hilarious.
    How does Siri respond to questions about preventing pregnancy, or asking for condoms?

  196. Calioak
    Calioak December 4, 2011 at 2:30 am |

    Should say I can find several abortion clinics in the phone book.
    I can type too. Sometimes.

  197. EG
    EG December 4, 2011 at 9:25 am |

    David Makalaster: I love how you neglected to mention that Siri’s suggestion for “removing a hamster from your rectum” is just as useless as Siri’s response for your request for an abortion. (Source: Link provided in your article).

    You clearly use this to give credibility to your article although, in this case, it is entirely inapposite

    Because removing a hamster from your rectum and getting an abortion are analogous, so it’s totally cool that they have equally unhelpful answers?

    I never knew that men need hamsters removed from their asses as often as that, or that being able to do so was essential to their reproductive freedom. You learn something new every day.

  198. Jovan1984
    Jovan1984 December 4, 2011 at 9:39 am |

    Yep. It’s an Apple conspiracy when Siri says something like that. Case closed.

    One more reason why I am sticking with Microsoft. Got a new Acer last weekend.

    Mysti:
    Someone asked Siri “Why are you anti-abortion?” and she answered “I just am, Kristen.”

  199. Avida Quesada
    Avida Quesada December 4, 2011 at 11:39 am |

    jennygadget:
    “perhaps they did, but like Avida Quesada said, they weren’t feminist women?”

    I wasn’t aware that only feminist women used contraception.

    As I say, there are two components in my argument. The first is how you present yourself. Pregnancy crisis centers are scams. The computer will not notice until you add specific code for it. Abortion clinics, on the other hand, try to present themselves as women’s health.

    If you need contraception you can go to almost any clinic or hospital and get it. What most women need, most of the time, when they need to quickly find a pharmacy or hospital, is not birth control or an abortion. Not even the day-after pill. Most women have reserves of (oral) contraception for at least a month.

    What we need is something for a headache or the like.

    In fact, women were loving Siri until feminists like us tested it.

    My best friend is a male software engineer. He did this: he opened a Planned Parenthood page (http://www.plannedparenthood.org/) and then http://www.optionline.org/, right-clicked on each page to peek at the source code, and searched for the meta tags containing “abortion”. As Jill mentions, SEO is top for the pregnancy scammers, low for Planned Parenthood.

    Apple needs to take the reproductive needs of women as a special case and program Siri to deal with that. But for that it doesn’t just need a woman, it needs a feminist, to be aware of the special challenges that women-specific reproductive care faces under the patriarchy.

    Love,
    Avida
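
The source-peeking check Avida describes can be reproduced in a few lines: parse a page’s <meta> tags and see which keywords it advertises to search engines. The HTML below is an invented stand-in, not the real sites’ markup:

```python
from html.parser import HTMLParser

class MetaKeywordParser(HTMLParser):
    """Collects the contents of <meta name="keywords"> tags."""
    def __init__(self):
        super().__init__()
        self.keywords = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)  # attrs arrives as a list of (name, value) pairs
        if tag == "meta" and a.get("name") == "keywords":
            self.keywords += [k.strip() for k in a.get("content", "").split(",")]

# Stand-in page source; a real check would fetch the live page instead.
page = ('<html><head><meta name="keywords" '
        'content="pregnancy help, abortion alternatives"></head></html>')
parser = MetaKeywordParser()
parser.feed(page)
print(parser.keywords)  # -> ['pregnancy help', 'abortion alternatives']
```

A site that stuffs these tags (and its visible text) with the terms people actually search for will tend to outrank one that doesn’t, which is the SEO gap Avida and Jill describe.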

  200. David Makalaster
    David Makalaster December 4, 2011 at 11:43 am |

    EG: Because removing a hamster from your rectum and getting an abortion are analogous, so it’s totally cool that they have equally unhelpful answers?

    I never knew that men need hamsters removed from their asses as often as that, or that being able to do so was essential to their reproductive freedom. You learn something new every day.

    You are right, it is not cool for this flaw to exist in Siri’s programming. My original point stands however.

  201. Whitney
    Whitney December 4, 2011 at 2:43 pm |

    relevant? Siri & Rape

    http://i.imgur.com/Fu7GQ.png

  202. Joel
    Joel December 4, 2011 at 9:46 pm |

    Does anybody else see how creating a gender-specific female automaton/servant is already a little offensive to women? Just sayin’

  203. LAZARO: iPhone’s Siri lacks knowledge of women’s health issues | Women Health Wizard

    [...] without a doubt lacking information when it comes to women, however. According to a Dec. 1 post from Feministe.com, Siri “is apparently unable to find anything related to women’s health.” In fact, [...]

  204. democraciaglobal.net » Seriously, Siri?

    [...] Jill: I need contraception. Siri : I don’t know ‘I need contraception.’” [...]

  206. groggette
    groggette December 5, 2011 at 11:30 am |
  207. groggette
    groggette December 5, 2011 at 11:31 am |

    But, you know, it’s obviously all just a coincidence.
    /BETA!!!

  208. gina
    gina December 5, 2011 at 12:04 pm |

    if you are truly concerned about women, you should be concerned that Siri is sending men to escort services and prostitutes

  209. Hershele Ostropoler
    Hershele Ostropoler December 5, 2011 at 12:29 pm |

    When I typed “I need an abortion” into Google the first result was a CPC.

    Sirkowski: Because talking to your iPhone is what you need to do when you want an abortion… e_e

    Irrelevant. If it is what people do, they deserve answers, not you judging them.

    Matt Simpson: But everyone thinks there is some hidden agenda, some evil reason why this inanimate device does not hold the same views on the world they do,

    No one thinks that, or at least no one’s said that. It’s pretty clear that the consensus (among the people critical of Apple, not detractors) is that it’s stupid rather than evil, that the omission was one of thoughtlessness rather than hostility.

    Thing is, thoughtlessness is still bad, and still something that should be fixed. Saying “it was a thoughtless mistake” doesn’t let you off the hook.

    Athenia: I am absolutely beginning to think that this is malicious. To think that women don’t use their technology—you know, oops!—is absolutely ridiculous.

    I don’t think it’s so much that they didn’t think women would use it as that they didn’t think women might have uniquely (normative-)female needs. It’s anti-sexist, in an ill-considered, coarsely granular way. Or would be if it didn’t list escort services.

    I’ll bet it’s even worse on trans*-specific needs.

    You Siri-ous?: Having big glaring errors in it like this is a part of being a beta product and if you don’t like that, I would suggest you hold off using it until the final version is released.

    If no one is to point out problems discovered during a testing stage, what, pray, is a testing stage for?

    suspect class: NY Yelp does have escort service listings. I’m fairly certain they don’t list body-dumping ravines, though.

    Everyone knows you dump bodies in the Gowanus Canal.

    Commandrea: It’s a glorified search engine. Get over it.

    That would be a stupid first comment; as the 180th, there are no words.

    Moreover, why is it okay for Google and Yelp to be unable to find abortion services (that’s not the case, but it is what you’re claiming)?

    Some Guy: The long and short of it is, Siri is a limited system

    Why are you being so patronizing?

    The problem is not “Siri doesn’t do everything asked of it” (to which “it’s in beta” or “it’s a limited system” are adequate responses if true); the problem is that there’s a pattern to Siri’s lapses, the holes where there’s a reasonable expectation of it not having holes versus no holes where it really couldn’t be faulted for having holes. If the problem was “Siri can’t find anything starting with P” that would be a fuck-up with no larger meaning. The specific lacunae show undue influence of the worst parts of the surrounding culture. This doesn’t preclude the explanation “the programmers didn’t make this a high priority” but the choice of what to deprioritize wasn’t made in isolation and isn’t beyond criticism.

  210. Walter
    Walter December 5, 2011 at 12:32 pm |

    Lee:

    It would be an incredible coincidence that the conversation database ‘just happened’ to have an anti-choice flavour while abortion clinic results ‘just happened’ to be among the set of results affected by a programming error – even more so if that error affects a very narrow range of results, or searches for abortion clinics specifically. And it ‘just happens’ to be a hot-button political topic right now that could be spun badly for Apple if conservatives discovered the iPhone were providing this kind of information to people.

    I don’t know if anyone’s going to read all that, but I wanted to show from a computery standpoint how this works and why it doesn’t really seem like an oversight or an ‘oops’ situation that this specific kind of query just happens to be responding like this. If Siri is having a lot of problems with many different recognised queries that work on Yelp, then that would lend some credence to the idea that it’s a glitch and that it’s just a coincidence that the pregnancy responses are so skewed, but we shouldn’t immediately accept the un-nuanced hand-waving ‘oh y’know, computers, they go wrong sometimes’ excuse that I’ve seen a few people repeating

    Well, I did read all that, Lee.

    And I agree with everything you wrote. I couldn’t agree more.

    Thanks for your excellent statement.

  211. duh
    duh December 5, 2011 at 2:00 pm |

    you can still manually google any of that fyi.

  212. EG
    EG December 5, 2011 at 2:17 pm |

    David Makalaster: You are right, it is not cool for this flaw to exist in Siri’s programming. My original point stands however.

    That comment was your original point. The program recognizes what “hamster in rectum” means well enough to provide a related service. It does not know the word “contraception” at all and considers the word “vagina” to be too dirty to deal with. What was your original point again?

  213. Apple’s Siri overcompensates for previous abortion ignorance

    [...] Jill Filipovic wrote a devastating follow up piece calling Siri a “total misogynist” and outlined some additional repro health-related [...]

  214. McTavish
    McTavish December 7, 2011 at 9:29 am |

    Scott Wiebe:
    I know you shouldn’t attribute to malice what could be incompetence, but I think you are being far too generous in your assumption that women were just overlooked by the programmers. I doubt smartphones contain their own tables of information or topics: when you say ‘I need xxx’ Siri just passes xxx along to google and gives you some top results. Go to google and manually search for abortion, birth control or rape crisis centres and google does find results. Lots of them. Yet somehow Siri can’t find them.

    If they DID create a big ‘ole list of things to pass to google but just forgot about women’s issues, then how did anti-choice crisis centres end up in there? If it really does have a subject list then the devs must have thought of women’s reproduction issues when they put in the anti-choice options… and then they forgot about women’s reproductive health? I don’t buy it. Also, if Siri fails to find information on anything else, then Siri offers a link to google… unless it’s a women’s issue.

    I can’t accept the premise that this is because Siri has a long list of all the things it would forward to an external search engine and they forgot a few things – I think it’s far more likely to have a short exclude list. I really don’t believe this was an oversight, I believe someone at Apple made the choice to code Siri with blocks on subjects they don’t like. I hope I’m wrong, I hope it’s just an oversight, but I don’t think so. I can’t conceive of any way a programmer could create something which takes what you say and searches the internet for it or offers you a link to google but does not do that on a short list of topics. Not without intentionally putting in blocks on those topics.

    I agree totally. Anyone use Trend Micro? They block women’s sites all the time on specious grounds. I suspect other so-called security programs do the same. It’s a real problem. There is a war against women, it is endless, it is unrelenting, and it is evil.
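
The exclude-list hypothesis quoted above is easy to state in code: a system that falls back to a web search for any unrecognized query has to go out of its way to produce a dead end for specific topics. Everything below is guessed for illustration — the block list, the canned answers, and the fallback are invented, not Apple’s actual implementation:

```python
# Hypothetical exclude list and canned answers, invented for illustration.
BLOCKED = {"abortion clinic"}
KNOWN = {"escort service": "Here are 12 escort services fairly close to you."}

def respond(query):
    if query in BLOCKED:
        return "Sorry, I couldn't find any."  # dead end: no web fallback
    if query in KNOWN:
        return KNOWN[query]
    # Default path: every unknown query still gets a fallback offer.
    return "Would you like me to search the web for '%s'?" % query
```

On this model, a topic that gets neither an answer nor the default fallback can’t be a simple database gap; something has to intercept it before the fallback runs, which is exactly the point being made.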

  215. ปิตาธิปไตยในไอโฟน « thanaitime

    [...] จาก Siri: Total Misogynist – Feministe.us [...]

  216. interactive everything » Development Tips & Resources (Roundup #1)

    [...] Try not to forget that nothing we build is neutral or apolitical. Database design is not neutral. Ontology assumptions are not neutral. Error rates in automated systems are not neutral. [...]

Comments are closed.

The commenting period has expired for this post. If you wish to re-open the discussion, please do so in the latest Open Thread.