
Siri is probably not explicitly hiding abortion information


Next query: Siri, look harder.

So, Apple’s engineers recently found themselves with egg on their faces with regard to the iPhone 4S’ Siri application. Why? Privilege, as it happens.

My best guess about the “Siri doesn’t help find abortion clinics” maneno is that the engineers responsible for the application completely forgot to vet stuff relevant to women today, like abortion clinics, when testing their assistant’s ability to pull real phrases out of voice commands and then find relevant information from various sources. And they probably completely forgot to vet this stuff because the people testing it initially don’t particularly have that concern. They’re likely all men, or women who weren’t in positions to need reproductive services for whatever reason. Thus, those without that privilege get short shrift.

The AI technology works like this (a toy sketch in code follows the list):

  • First, you speak a phrase at Siri. Your phone sends the raw audio of your speech to a central server, where it is processed to turn it into text.
  • That text is run through a keyword search to pull out the relevant words, and through a syntax scanner that tries to determine what kind of question this is. In our example flowchart, let’s say “where can I find” clues it in that you’re looking for a nearby X, whatever X happens to be.
  • If overrides are programmed for a particular class of phrases, so that something else is searched for instead to get better results, those “aliases” are applied here, and the alias-based programming overrides the original intent of the phrasing. Case in point: ask Siri where to hide a body and it asks whether you want to search for marshes, lakes, construction sites, etc. Or consider the wide range of requests that Siri aliases into requests for escort services. It’s intended to be funny, but the same mechanism can be used to improve the service for everyone on the fly.
  • Then, since we’re assuming this is a location search, X is searched for via a location-based service like Yelp, or via some other search engine if that class of phrase is going to get better results there.
  • The results are returned to your phone, and all of this usually happens in practically no time flat.
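
To make that flow concrete, here is the toy sketch promised above, in Python. Everything in it (the trigger phrases, the alias table, the function names) is my own invention to illustrate the steps, not anything Apple has published about how Siri actually works.

```python
# Toy sketch of the pipeline described above. Every name and table here is
# invented for illustration; Apple's real pipeline is server-side and proprietary.

ALIASES = {
    # literal phrase -> (request class, what actually gets searched for instead)
    "where can i hide a body": ("location_search", "marshes, lakes, construction sites"),
}

LOCATION_TRIGGERS = ("where can i find", "where is the nearest")


def classify(text: str) -> tuple[str, str]:
    """Step 2: keyword/syntax scan to decide what kind of question this is."""
    lowered = text.lower().strip()
    for trigger in LOCATION_TRIGGERS:
        if lowered.startswith(trigger):
            return "location_search", lowered.replace(trigger, "", 1).strip()
    return "fallback", lowered


def handle(text: str) -> str:
    """Steps 2-4 for a query that has already been transcribed to text (step 1)."""
    kind, query = classify(text)
    lowered = text.lower().strip()
    if lowered in ALIASES:                      # step 3: aliases override the original intent
        kind, query = ALIASES[lowered]
    if kind == "location_search":               # step 4: hand off to a location-based search
        return f"search a location service (e.g. Yelp) for: {query}"
    return f"pass to a general engine (Wolfram Alpha / Google): {query}"


print(handle("Where can I find a pizza place"))
print(handle("Where can I hide a body"))
```

The point of the toy is the ordering: the alias table gets the last word on what actually gets searched, which is exactly why a missing or badly chosen alias can quietly wreck the results for a whole class of queries.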

As a test, I tried searching for abortion clinics on Yelp and didn’t get very many actual abortion clinics — just a few regular medical centres being derisively referred to as “like a back-alley abortion clinic”, i.e. dingy and dirty. So, the problem as far as I can tell is that nobody tested “abortion clinic” to see if the results spit back from the default location-based search engine were actually relevant. Thus the Crisis Pregnancy Centres (a.k.a. anti-abortion religious outfits pushing forced pregnancy, that love to masquerade as family planning centres) getting mixed into the search results, and Planned Parenthood going missing from many of them.
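
For anyone who wants to reproduce that spot-check, something along these lines will do it. Note the hedges: this uses Yelp’s current Fusion search API rather than whatever API Siri was wired to at launch, and the API key and city are placeholders you would have to fill in yourself.

```python
# Quick look at what a raw location search returns for "abortion clinic".
# Assumes Yelp's current Fusion API; the API key and location are placeholders.
import requests

API_KEY = "YOUR_YELP_API_KEY"  # placeholder
URL = "https://api.yelp.com/v3/businesses/search"

resp = requests.get(
    URL,
    headers={"Authorization": f"Bearer {API_KEY}"},
    params={"term": "abortion clinic", "location": "YOUR CITY", "limit": 10},
    timeout=10,
)
resp.raise_for_status()

for biz in resp.json().get("businesses", []):
    categories = ", ".join(c["title"] for c in biz.get("categories", []))
    print(f"{biz['name']}  [{categories}]")
```

Eyeballing the names and categories that come back is the whole test: if the list is mostly regular medical centres and Crisis Pregnancy Centres, the default engine was never going to give Siri anything useful to hand back.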

What’s more curious is the fact that normally, when Siri doesn’t understand a query, it sends it to Wolfram Alpha or Google. With “where can I find the morning after pill”, it doesn’t do this. I strongly suspect it’s the “morning after” part, since Siri is programmed with ease of making calendar entries and time-based reminders in mind. The syntax interpreter would read the “morning after” part and think it’s part of a time-based request, but the “where can I find” flags it as a search, and the AI probably gets confused by receiving two classes of request at the same time.
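
Here is what that collision might look like, sketched in the same toy style. This is pure speculation about the shape of the bug; the cue lists and the routing rule are mine, not Apple’s.

```python
# Speculative sketch: a phrase that matches two request classes at once may
# never reach the general-search fallback. Cue lists and routing are invented.

TIME_CUES = ("remind me", "tomorrow", "morning after", "at noon")
LOCATION_CUES = ("where can i find", "where is the nearest")


def matched_classes(text: str) -> set:
    lowered = text.lower()
    classes = set()
    if any(cue in lowered for cue in TIME_CUES):
        classes.add("reminder")
    if any(cue in lowered for cue in LOCATION_CUES):
        classes.add("location_search")
    return classes


def route(text: str) -> str:
    classes = matched_classes(text)
    if len(classes) == 1:
        return f"handle as {classes.pop()}"
    if not classes:
        return "fall through to Wolfram Alpha / Google"
    # Ambiguous: two classes matched at once. A router that refuses to guess
    # says "I don't understand" here instead of falling through to a web search.
    return "confused: matched " + " and ".join(sorted(classes))


print(route("Where can I find a pizza place"))          # handle as location_search
print(route("Remind me to pick up milk tomorrow"))       # handle as reminder
print(route("Where can I find the morning after pill"))  # confused
```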

The fix for the abortion question is to add aliases for “abortion clinic” and other such common phrases, and to direct the search process to seek out appropriate family-planning or other reproductive resource clinics via an engine that will actually provide relevant results. Preferably, it would give you results that do not skew toward the CPCs, which actively steer patients away from abortions, but I’d settle for a fix that shows both (described appropriately, of course). That fix has not yet been implemented, and there’s no real indication from Apple as to why.
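
The shape of that fix, in the same speculative style (the phrase list, the vetted directory, and every function name below are hypothetical; whatever Apple actually ships lives on their servers):

```python
# Hypothetical sketch of the fix: route reproductive-health phrases past the
# default location search and into a vetted clinic directory instead.
# The phrase list, the directory, and all names here are invented.

REPRO_HEALTH_PHRASES = {
    "abortion clinic",
    "abortion provider",
    "morning after pill",
    "emergency contraception",
    "plan b",
}


def default_location_search(query: str, near: str) -> list:
    # Stand-in for the existing Yelp-backed path.
    return [f"whatever Yelp returns for '{query}' near {near}"]


def vetted_clinic_search(query: str, near: str) -> list:
    # Stand-in for a directory that actually lists providers, with crisis
    # pregnancy centres labelled as what they are.
    return [f"Planned Parenthood and other providers near {near}",
            f"CPCs near {near}, labelled as not providing abortions"]


def location_search(query: str, near: str) -> list:
    if query.lower() in REPRO_HEALTH_PHRASES:
        return vetted_clinic_search(query, near)
    return default_location_search(query, near)


print(location_search("abortion clinic", "your city"))
print(location_search("pizza place", "your city"))
```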

Apple has claimed these issues are mere glitches, and are wholly unintentional. It strikes me as a very good thing they labeled the Siri service as beta, because they obviously alpha-tested it with a bunch of wise-asses internally, so as to get the widest range of joke answers they could manage, but only did cursory real-life testing before release. If someone using this technology had an emergency need for Plan B contraceptives, one would hope they would not have to depend on it to steer them in the right direction.

All that said, I don’t believe it is an intentional omission, and I think people screaming about some sort of conservative agenda are imagining things. With all the real-life ways conservatives are trying to fuck up everyone’s lives for their own twisted sense of morality, there are better fights to expend resources on. Apple got burned by this one, and rightly so, but they’re not villains as far as I can tell. They just got really excited about some promising bit of tech they’d cooked up, and they overhyped it as being like the Enterprise computer.

What they delivered was closer to ELIZA tied to a few search engines, with a few quips for the most obvious jokes or sexist remarks. So now they have to scrub off the raw and raggedy edges, of which there are very many, and hopefully it’ll all happen in due course. The last thing I want to see is every omission or oversight in Siri getting turned into a knock-down, drag-out fight over how embattled Apple has been in delivering on their promises. If they have to backpedal on their promises somewhat to prevent that from happening, I’d strongly advise they do so as early as possible.

In the event that they drag their feet on fixing the abortion issue, but manage to implement fixes for other issues that come up, that’s when you’re well within your rights to let slip the dogs of war.

