
Facebook to launch AI based Suicide Prevention Program

by Yash Saboo, December 5, 2017

Who would have thought software could save lives? Facebook has developed a new artificial intelligence system that scans posts for patterns of suicidal thoughts and, when necessary, sends mental health resources to the user at risk or their friends, or contacts local first responders. By using AI to flag worrisome posts to human moderators instead of waiting for user reports, Facebook can shorten the time it takes to send help.
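The article does not describe the model itself, but the general pattern it implies (score each post with a classifier and escalate high-scoring ones to human reviewers) can be sketched roughly as below. The phrases, weights, threshold and names used here are invented for illustration and are not Facebook's actual system.

```python
# Illustrative sketch only: a minimal "flag for human review" triage step.
# The phrases, weights and threshold are invented for illustration and are
# NOT Facebook's actual model or training data.

from dataclasses import dataclass

# Hypothetical phrases a trained classifier might weight heavily.
RISK_PHRASES = {
    "no reason to live": 0.7,
    "want to end it": 0.6,
    "can't go on": 0.5,
    "goodbye everyone": 0.4,
}

REVIEW_THRESHOLD = 0.5  # assumed cut-off for escalating to human moderators


@dataclass
class Post:
    author: str
    text: str


def risk_score(post: Post) -> float:
    """Crude stand-in for the probability a trained classifier would output."""
    text = post.text.lower()
    return min(sum(w for phrase, w in RISK_PHRASES.items() if phrase in text), 1.0)


def triage(posts):
    """Return the posts that should be escalated to a human review queue."""
    return [p for p in posts if risk_score(p) >= REVIEW_THRESHOLD]


if __name__ == "__main__":
    sample = [
        Post("user_1", "Had a great day at the beach!"),
        Post("user_2", "I feel like there's no reason to live anymore."),
    ]
    for flagged in triage(sample):
        print(f"Escalating post by {flagged.author} for human review")
```

In practice such a score would come from a trained model rather than a phrase list; the point is only the triage step of routing borderline cases to people rather than acting automatically.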

Facebook says the new effort will help it flag concerning posts and connect users with mental health services. It also represents a new front in the company's use of machine learning. One of the earliest cases of a live-streamed suicide was that of Abraham Biggs in 2008, who linked to the live-streaming site Justin.tv, where video showed him overdosing on prescription pills.

Recently, a Miami teen committed suicide by hanging while streaming on Facebook Live, a Georgia teen committed suicide by hanging while live streaming on live.me, several Russian teenagers reportedly committed suicide as part of a social media game called Blue Whale, and a youth in Mumbai committed suicide by jumping off the Taj Hotel while streaming on Facebook Live. These tragedies evoke strong emotional reactions in viewers: shock, disbelief, disgust, helplessness, loss, anger, senselessness and numbness.

To reach its at-risk users, Facebook says it is expanding the services that allow friends to report posts containing signs of any suicidal or self-mutilation plans, and it provides a menu of options for both those individuals and the friends who report them. Choices include hotlines to call, prompts to reach out to friends, and tips on what to do in moments of crisis. This tool will now be available for Facebook Live streams as well. Similar reporting systems exist on a number of social media platforms, including Twitter, Pinterest and YouTube. For people in the United States, Facebook is also piloting a program that lets users connect directly through Messenger, its instant messaging app, with counsellors from crisis support organizations including Crisis Text Line and the National Suicide Prevention Lifeline (NSPL).

Craig Bryan, a researcher at the University of Utah who investigates suicide risk in veteran populations, has started to examine the importance of timing in the path to suicide. “In our newer research, we’ve been looking at the temporal patterns of sequences as they emerge—where we find it’s not just having lots of posts about depression or alcohol use, for instance, but it’s the order in which you write them,” he says.
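Bryan's point, that the order of posts carries signal beyond their counts, can be illustrated with a toy comparison of order-insensitive and order-sensitive features. The topic labels and bigram representation below are assumptions chosen for illustration, not the method used in his research.

```python
# Illustrative sketch: identical topic counts, different order, different features.
# Topic labels and the bigram representation are assumptions for illustration,
# not the features used in Bryan's research.

from collections import Counter


def topic_bigrams(topics):
    """Order-sensitive features: counts of consecutive topic pairs."""
    return Counter(zip(topics, topics[1:]))


# Two hypothetical posting histories that mention the same topics
# the same number of times, but in a different order.
user_a = ["alcohol", "depression", "hopelessness"]
user_b = ["hopelessness", "depression", "alcohol"]

print(Counter(user_a) == Counter(user_b))              # True: same bag of topics
print(topic_bigrams(user_a) == topic_bigrams(user_b))  # False: different sequence
```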

Another important factor to consider, especially with teens, is how often their language changes, says Megan Moreno, a paediatrician specializing in adolescent medicine at Seattle Children’s Hospital. In a 2016 study, Moreno and colleagues discovered that on Instagram, once a self-injury–related hashtag was banned or flagged as harmful, numerous spin-off versions would emerge. For example, when Instagram blocked #selfharm, replacements with alternate spelling (#selfharmmm and #selfinjuryy) or slang (#blithe and #cat) emerged. “I continue to think that machine learning is always going to be a few steps behind the way adolescents communicate,” Moreno says. “As much as I admire these efforts, I think we can’t rely on them to be the only way to know whether a kid is struggling.”
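Moreno's observation can be made concrete with a toy check of how a filter might handle the spin-off tags she cites: fuzzy string matching catches elongated spellings such as #selfharmmm, but coined slang like #blithe or #cat shares almost no characters with the banned term, so surface-level matching misses it entirely. The banned list and similarity cut-off below are illustrative assumptions, not any platform's actual filter.

```python
# Illustrative sketch: why hashtag bans lag behind adolescent slang.
# The banned list and similarity cut-off are assumptions for illustration,
# not any platform's actual filter.

from difflib import SequenceMatcher

BANNED_TAGS = {"selfharm", "selfinjury"}
SIMILARITY_CUTOFF = 0.8  # assumed threshold for "close enough" spellings


def looks_banned(tag):
    """Flag tags that are exact or near-spelling matches of banned ones."""
    cleaned = tag.lstrip("#").lower()
    return any(
        SequenceMatcher(None, cleaned, banned).ratio() >= SIMILARITY_CUTOFF
        for banned in BANNED_TAGS
    )


if __name__ == "__main__":
    for tag in ["#selfharm", "#selfharmmm", "#selfinjuryy", "#blithe", "#cat"]:
        print(tag, "->", "flagged" if looks_banned(tag) else "missed")
    # Elongated spellings are flagged; coined slang slips through,
    # which is the gap Moreno describes.
```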

The bottom line is that this initiative could help millions of people around the world. AI is positioned to improve our understanding of suicidal ideation and to help deliver interventions that are sensitive, safe, and address one of the leading causes of death. But if it is oversimplified, non-transparent, and not held to the "do no harm" principle, it will simply add to the existing challenge of managing suicide risk and could make the situation worse.

Disclaimer: The views and opinions expressed in this article are those of the authors and do not necessarily reflect the official policy or position of thedailyeye.info. The writers are solely responsible for any claims arising out of the contents of this article.