Facebook has rolled out new tools in an effort to prevent suicide, which causes a death every 40 seconds worldwide. Facebook founder and CEO Mark Zuckerberg announced the move through his Facebook page.
Suicide is the second leading cause of death among young people between the ages of 15 and 29, who make up the largest proportion of Facebook’s users. The social media giant believes it is in a ‘unique position’ to offer those in distress the support they need.
Facebook is integrating suicide prevention tools into its Live videos to help people in real time. It has also unveiled live chat support from crisis support organisations on Messenger and streamlined the reporting of suicidal content with the help of artificial intelligence.
“I wrote a letter on building a global community a couple of weeks ago, and one of the pillars of community I discussed was keeping people safe. These tools are an example of how we can help keep each other safe,” Mark Zuckerberg, Facebook CEO, said in a post.
By integrating suicide prevention tools into Live videos, Facebook is giving viewers the option to reach out to the broadcaster in real time, as well as to report the video to Facebook. The company will then display a set of resources on the broadcaster’s screen, offering the choice to reach out to a friend, contact a helpline, or see tips.
The live Messenger chat support feature adds the ability for people to connect with Facebook’s crisis support partners, including Crisis Text Line, the National Eating Disorder Association and the National Suicide Prevention Lifeline. The connection can be made through the option to message someone from the organisation’s page or through Facebook’s suicide prevention tools. In addition, Facebook has launched a video campaign with partner organisations to raise awareness of ways people can help friends in distress.
Facebook’s AI technology is currently being tested in the US on posts previously reported for suicide risk. The AI makes the option to report posts about ‘suicide or self-injury’ more prominent. It is also being tested for identifying posts ‘very likely’ to include thoughts of suicide, which would then be reviewed by Facebook staff.