Facebook rolls out new suicide prevention tools including AI tech test
- Wednesday, March 1st, 2017
Facebook has updated its tools and resources to offer support to those who may be considering suicide, and to their concerned friends and family, as part of its effort to combat the one death by suicide that occurs every 40 seconds worldwide.
Suicide is the second leading cause of death in young people – who make up the largest proportion of Facebook’s users – and Facebook believes it is in a ‘unique position’ to offer those in distress the support that they need.
The updates include suicide prevention tools integrated into Facebook Live, live chat support from crisis organisations through Messenger, and a test of AI technology to identify posts likely to include thoughts of suicide.
Facebook said: “Suicide prevention is one way we’re working to build a safer community on Facebook. With the help of our partners and people’s friends and family members on Facebook, we’re hopeful we can support more people over time.”
The Facebook Live integration means that people watching a live video will now have the option to reach out to the person directly and to report the video to Facebook. Facebook says it will also provide resources to the person reporting the video, to help them support their friend.
The live chat support feature in Messenger now lets people connect with Facebook’s crisis support partners – including Crisis Text Line, the National Eating Disorder Association and the National Suicide Prevention Lifeline. The connection can be made through the option to message each organisation from its page or through Facebook’s suicide prevention tools. In addition, Facebook has launched a video campaign with partner organisations to raise awareness of ways people can help distressed friends.
Facebook’s AI technology is currently being tested in the US on posts previously reported for suicide. The AI will make the option to report posts about ‘suicide or self-injury’ more prominent. It is also being tested for identifying posts as ‘very likely’ to include thoughts of suicide, which would then be reviewed by Facebook staff.
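Facebook has not published details of how its system works, but conceptually this kind of flagging resembles a standard text-classification pipeline: train a model on posts that were previously reported, score new posts, and queue the highest-scoring ones for human review. The Python sketch below is purely illustrative – the training posts, labels and threshold are invented placeholders, not Facebook’s data, model or code.

```python
# Illustrative sketch only -- not Facebook's actual system.
# Shows the general idea: score posts with a text classifier trained on
# previously reported examples, and surface high-scoring posts for human review.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Placeholder training data: posts previously reported (label 1) vs. not (label 0).
posts = [
    "I can't see a way forward anymore",
    "what a great day at the beach",
    "nobody would miss me if I was gone",
    "excited for the game tonight",
]
labels = [1, 0, 1, 0]

# TF-IDF features feeding a logistic regression classifier.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(posts, labels)

# Score new posts; anything above the (arbitrary) threshold would be
# queued for review by trained human staff, never acted on automatically.
new_posts = ["I don't want to be here anymore", "lunch was amazing today"]
scores = model.predict_proba(new_posts)[:, 1]
for post, score in zip(new_posts, scores):
    if score > 0.5:
        print(f"Queue for human review (score {score:.2f}): {post!r}")
```

In a real deployment the model, its features and its threshold would be far more sophisticated, and – as Facebook’s description makes clear – any flagged post goes to human reviewers rather than triggering an automated response.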