
Artificial Intelligence to listen for suicidal thoughts on social media

Individuals won't be identified. Nor will intervention be attempted. The aim is, rather, to proactively spot regional trends.

Canada is planning a pilot project to see if Artificial Intelligence (AI) can find patterns of suicidality – i.e., suicidal thoughts or attempts, self-harm, or suicidal threats or plans – on social media before they lead to tragedy.
According to a contract award notice posted by the Public Health Agency of Canada (PHAC), the $99,860 project is being handled by an Ottawa-based AI company called Advanced Symbolics Inc. (ASI). The agency says the company was the only one that could do it, given that ASI has a patented technique for creating randomized, controlled samples of social media users in any geographic region.
The focus on geographic region is key: the country is reeling from a dramatic spike in suicides in Cape Breton among girls 15 years old and younger and men in their late 40s and early 50s.
The idea isn’t to identify specific individuals at risk of suicide. Nor is it to intervene. Rather, the project’s aim is to spot patterns on a regional basis so that public health authorities can bolster mental health resources to regions that potentially face suicide spikes.
The project is set to begin this month and finish by the end of June, if not before.
First, PHAC and ASI will work to broadly define the suicide-related behavior terms: ideation (i.e., thoughts), behaviors (i.e., suicide attempts, self-harm, suicide) and communications (i.e., suicidal threats, plans). The next phase will be to use the resulting classifier to research the “general population of Canada” in order to identify patterns associated with users who discuss suicide-related behavior online.
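ASI’s patented approach hasn’t been published, so what follows is purely an illustrative sketch of what a classifier of this general kind often looks like: a supervised text model trained on hand-labeled public posts. The category labels, toy posts and scikit-learn pipeline below are all assumptions, not ASI’s method:

```python
# Illustrative sketch only: a generic text classifier for suicide-related
# categories. The labels, sample data and model choice are assumptions;
# ASI's patented technique has not been published.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy training set: public posts hand-labeled with the project's three
# broad categories (ideation, behavior, communication) plus "none".
posts = [
    "I keep thinking about ending it all",          # ideation
    "I hurt myself again last night",               # behavior
    "this is goodbye, I have a plan for tonight",   # communication
    "great game last night, what a comeback",       # none
]
labels = ["ideation", "behavior", "communication", "none"]

# TF-IDF features feeding a linear classifier: a common, simple baseline.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(posts, labels)

# Classify a new public post. In the project as described, per-post labels
# would only ever be aggregated by region, never used to identify anyone.
print(model.predict(["I can't stop thinking about dying"]))
```

In practice such a model would be trained on far more data and validated carefully before any regional aggregation; the sketch only shows the shape of the pipeline.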
According to CBC News, PHAC says that suicide is the second-leading cause of death for Canadians aged 10 to 19. The news outlet quoted an agency spokesperson:

To help prevent suicide, develop effective prevention programs and recognize ways to intervene earlier, we must first understand the various patterns and characteristics of suicide-related behaviors.
PHAC is exploring ways to pilot a new approach to assist in identifying patterns, based on online data, associated with users who discuss suicide-related behaviors.

Kenton White, chief scientist with ASI, told CBC News that nobody’s privacy is going to be violated.

It’d be a bit freaky if we built something that monitors what everyone is saying and then the government contacts you and said, ‘Hi, our computer AI has said we think you’re likely to kill yourself’.

ASI’s AI will be trained to flag particular regions where suicide may be likely. In Cape Breton, for example, three middle-school students took their lives last year.
White said that there are patterns to be gleaned from Cape Breton’s spike in suicides, as well as from suicides in Saskatchewan, in Northern communities, and among college students.
ASI CEO Erin Kelly told CBC News that the AI won’t analyze anything but public posts:

We’re not violating anybody’s privacy – it’s all public posts. We create representative samples of populations on social media, and we observe their behavior without disturbing it.

CBC News reports that ASI’s technology could give regions two to three months’ warning before suicides potentially spike – a vital window in which government officials could mobilize mental health resources before the suicides take place.


This isn’t the first time that technology has been applied to suicide prevention. At least as early as 2013, Facebook was working with researchers to put its considerable data mining might to use to try to discern suicidal thoughts by sifting through the social media streams and risk factors of volunteers. Such risk factors include demographic characteristics that correlate with suicide, such as whether the victims were male (making suicide more likely), married (less likely) or childless (more likely).
Facebook and researchers at the Geisel School of Medicine at Dartmouth recruited military veterans as volunteers: a group with a high suicide rate.
At that early stage, Facebook’s effort, like PHAC and ASI’s, didn’t include intervention. The researchers weren’t empowered to intervene if suicide or self-harm was flagged.
Since then, Facebook has introduced technologies geared toward intervention.
In March 2017, Facebook said it planned to update its algorithms so as to “listen” for people in danger of suicide. The idea was to look out for certain key phrases and then refer the matter to human beings on the Facebook staff, who would then ask whether the writer was OK.
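Facebook hasn’t disclosed which phrases it watches for or how the matching works, but the general shape of key-phrase flagging with human follow-up is easy to sketch. The phrase list and function below are hypothetical stand-ins, not Facebook’s implementation:

```python
# Hypothetical sketch of key-phrase flagging with human escalation.
# The phrase list and review function are invented for illustration;
# Facebook's actual detection logic and phrases are not public.
import re

# Example phrases that might indicate a person in danger.
KEY_PHRASES = ["kill myself", "end it all", "want to die", "say goodbye"]
PATTERN = re.compile("|".join(re.escape(p) for p in KEY_PHRASES))

def flag_for_human_review(post_text: str) -> bool:
    """Return True if the post contains a key phrase and should be
    referred to a human reviewer, who would check on the writer."""
    return PATTERN.search(post_text.lower()) is not None

if __name__ == "__main__":
    print(flag_for_human_review("I just want to say goodbye to everyone"))  # True
    print(flag_for_human_review("goodbye summer, hello autumn"))            # False
```

The key design point in the article’s description is that the algorithm only flags; the decision to reach out rests with human staff.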
The move followed a similar attempt on Twitter by the Samaritans in 2014. That attempt was aborted within months amid privacy concerns: critics said the design enabled stalking, given that users couldn’t opt out.


