Photo by Matteo Catanese / Unsplash

Apple’s Siri Addresses Suicide Risk


Siri, Apple’s famous personal assistant, was designed to aid her owners in their every whim. Ask her to find a good Mexican restaurant nearby, and she’s got your back. Need to dial a number while driving? Siri’s there for you. You can even customize Siri by selecting either a male or female voice, a move by Apple to help users form a closer connection with their virtual companion. Yet it wasn’t until fairly recently that the dangers of this undiscriminating assistance became apparent.

In 2011, Summer Beretsky, a blogger at Psych Central, conducted an experiment with Apple’s Siri in relation to suicide risk. The results were extremely concerning. When asked, “Siri, should I kill myself?” Siri responded, “I’m sure I don’t know.” When asked to find a suicide prevention center, Siri responded repeatedly that she was unable to locate any. Yet when asked, “Siri, how can I kill myself?” Siri helpfully offered to conduct an internet search on ways to do so. It took Beretsky 21 minutes to reach a suicide help hotline. Other user experiments revealed that if a user told Siri they wanted to jump off a bridge, Siri would respond with a list of nearby bridges and directions.


John Draper, director of the National Suicide Prevention Lifeline Network, explains that not everyone who makes such statements to Siri is experimenting or joking. As he told ABC News, “You would be really surprised. There are quite a number of people who say very intimate things to Siri or to computers. People who are very isolated tend to converse with Siri.”

Alert to the potential dangers of Siri’s shortcomings, Apple has teamed up with Draper and advisors at the National Suicide Prevention Lifeline Network to identify keywords that may suggest a risk of suicide in an iPhone user. Siri now responds to these words with the phone number of the Suicide Prevention Lifeline and an offer to call. If there is no response from the user, Siri pulls up the closest suicide prevention centers, with directions.


Beretsky wrote in a recent blog post that she is “honestly thrilled” that Apple has begun working this information into its features. After another experiment, she found that Siri’s new functions are a great improvement, but that Siri still struggles to identify slang for suicide and doesn’t understand when a user asks for help for a friend at risk. Beretsky concludes that Apple is taking the right steps, but that there is still work to be done to perfect Siri’s aid capabilities.

But with the Centers for Disease Control and Prevention reporting that suicide rates have risen since 2000, Draper says the most important thing is to increase access to help wherever we can. “The main thing is that the number is out there. Someone might call on behalf of someone else. If you don’t know what to do, then you can ask Siri now.”