The World Health Organization reports that every 40 seconds, someone somewhere in the world dies by suicide. What is striking is that among 15-29-year-olds, suicide is the second leading cause of death. The impact of suicide is far-reaching and devastating, destroying families and hitting loved ones the hardest. But the WHO also maintains that suicides are preventable.
It has been a year since Facebook launched its live-video streaming feature so that people around the globe could share their personal and important moments for the rest of the world to see. But things went awry as the social media giant saw a series of unfortunate incidents on Facebook Live: a 33-year-old actor in Los Angeles County shot himself in the head on Facebook Live, and a 14-year-old girl in Miami hanged herself in her foster home while streaming on the platform. Both incidents took place in January 2017.
In light of such horrifying events, Facebook CEO Mark Zuckerberg has stepped up to prevent further tragedies by incorporating new suicide prevention measures into Facebook Live and Messenger. In his manifesto written in January 2017, Zuckerberg wrote: “There have been terribly tragic events — like suicides, some live streamed — that perhaps could have been prevented if someone had realized what was happening and reported them sooner. Going forward, there are even more cases where our community should be able to identify risks related to mental health, disease or crime.”
According to the company, “Facebook is in a unique position – through friendships on the site – to help connect a person in distress with people who can support them”. Facebook has taken this much-needed step by collaborating with organizations such as Crisis Text Line, the National Eating Disorder Association, and the National Suicide Prevention Lifeline.
Individuals watching such video content will have the option to report the video to Facebook or reach out to the person in question directly, and they will also be able to connect with mental health service providers and organizations through Facebook Messenger. By employing artificial intelligence (AI), Facebook is testing “streamlined” reporting for suicide, using pattern recognition trained on posts previously reported for suicide to identify people who may be at risk. This technology will supposedly make the option to report a post about suicide more prominent for users identified as likely to harm themselves.
Though Facebook has offered suicide prevention tools for over a decade, its latest attempt is to improve these features and arm users with options to help a person in need. But how successful can this step really be? Joe Franklin, an assistant professor at Florida State University who runs the school’s Technology and Psychopathology Lab, states: “I don’t think it’s a bad thing and I think we should study it. But I would immediately have questions—I would not assume it would be effective.” Like him, many are doubtful about the real potential of this gesture.
Whenever a new step is taken, questions and doubts are bound to arise. But skepticism does not have to be the final word. In the right hands, anything can become a powerful tool for good. So we can hope that with this significant step by Facebook, we can all do our bit to raise awareness and together prevent such tragedies.