Recent Apple Update Gives Siri The Ability To React To Sexual Assault Situations

Apple helped change the way we interact with our smartphones when it introduced (though did not invent) Siri, the digital assistant that responds to voice commands. Whether it's firing off a text message or looking up the weather in Timbuktu, Siri can do it. She can do many things, actually, though until recently, Siri wasn't of much help to victims of sexual assault.

As of a March 17 update, Siri now understands and reacts to phrases like "I was raped" and "I am being abused." Someone who's been the victim of a sexual crime may not know what resources are available outside of calling the police. Siri's response to such situations is to direct victims to the National Sexual Assault Hotline with a link to the organization's website.


The upgrade to Siri's vocabulary came about after a study published March 14 in JAMA Internal Medicine found that all of the major smartphone assistants did a poor job of responding to various health and safety emergencies, including sexual assault. That's if they responded at all, according to the study.

After the study was published, Apple got in touch with the Rape, Abuse & Incest National Network (RAINN), which operates the aforementioned hotline, to help train Siri on the common language that callers use.

RAINN also helped Apple craft Siri's responses in subtle ways by softening her language during these delicate situations. Instead of saying, "You should reach out to someone," Siri says, "You may want to reach out to someone."

Not everyone who's been sexually abused and owns an iPhone is going to turn to Siri for help, but for those who do, this was a much-needed update. Prior to it, Siri would say, "I don't know what you mean by 'I was raped.' How about a web search for it?"

Someone who's been sexually assaulted may not feel comfortable talking to an actual person right away, which is another reason this was an important update. Siri acts as a sort of double buffer: victims can speak to the assistant first, and it then encourages them to visit a website with helpful information and resources on how to deal with the situation.