Naked Security

Facebook to listen out for posts from people vulnerable to suicide

Scheme rolling out in the US will refer posts flagged by algorithms to human beings for response

Facebook plans to update its algorithms so that it can “listen” for people who are in danger of suicide, in a move planned to roll out initially in the US. The idea will be to look out for certain key phrases and then refer the matter to human beings on the Facebook staff, who will then ask whether the writer is OK.

The move follows a similar attempt by the Samaritans on Twitter in 2014. That scheme was criticised for its design – in particular for enabling stalking, since users couldn’t opt out – and the organisation later scrapped it over privacy concerns.

Facebook founder Mark Zuckerberg announced in his much-publicised blog post last month that the company would use artificial intelligence to root out terrorist posts, but this appears to be the first time it has deployed AI in anger since then.

Stephen Buckley, head of information at mental health charity Mind, gave the move a cautious welcome:

People in crisis may ask for help in places where the help they need is not readily available. Signposting people to appropriate sources of support can be a really important step in helping people to access the help they need.

The nuts and bolts are relatively straightforward. For the moment at least the idea covers generally viewable areas and not (for example) Facebook Messenger, so if you’ve set something to be private it should remain that way.

The tips and advice on offer have been developed in conjunction with professional and clinical bodies, including the Samaritans and many others. Facebook staff handling these cases won’t be cut adrift either: the company recognises the difficulty of this sort of work and will offer them psychological support and wellness resources, which it proposes to review annually.

The company is also teaming up with external organisations to handle suicidal behaviour on its Facebook Live broadcast system in the US.

Any UK readers who are affected by the issues raised in this article are encouraged to call the Samaritans on 116 123

