Canada will use AI to monitor suicidal social media posts

By Laura Lovett
12:30 pm

This year the Canadian government will start using artificial intelligence to help track social media posts that could indicate someone is at risk of suicide, according to a contract.

The Canadian government recently signed a contract with Ottawa-based AI firm Advanced Symbolics to monitor social media posts for suicidal behavior. 

In the first phase of the partnership, Advanced Symbolics will work with the government to define “suicide-related behavior,” according to the contract. Such behavior is typically characterized in terms of thoughts, behaviors, and communications. The company will then identify patterns associated with those behaviors in online data. All of the data will come from the public domain and will be anonymized.
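Neither the contract nor the company has published technical details of how those patterns would be detected. Purely as an illustration, a pattern-matching pass over anonymized public posts might look like the toy sketch below; the patterns, the anonymization step, and the post format are all invented for this example and are not Advanced Symbolics' actual method.

```python
import hashlib
import re

# Hypothetical patterns; real research would define "suicide-related
# behavior" far more carefully, as the contract's first phase requires.
RISK_PATTERNS = [
    re.compile(r"\b(want to die|end it all|no reason to live)\b", re.I),
]

def anonymize(user_id: str) -> str:
    """Replace a public username with a one-way hash so posts can be
    aggregated without identifying individuals."""
    return hashlib.sha256(user_id.encode()).hexdigest()[:12]

def flag_posts(posts):
    """Yield (anonymized author, text) for posts matching any risk pattern."""
    for author, text in posts:
        if any(p.search(text) for p in RISK_PATTERNS):
            yield anonymize(author), text

# Made-up public posts for demonstration
sample = [("user123", "Beautiful day for a hike!"),
          ("user456", "I feel like there's no reason to live anymore")]
for anon_id, text in flag_posts(sample):
    print(anon_id, "->", text)
```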

By June, Advanced Symbolics will deliver a final report to the government summarizing its findings. It will also be required to produce a mock-up of a monthly report, including at-risk demographics by age and gender, how changes in patterns affect risk, and protective factors, according to the contract. The government will then use this document to decide whether the national surveillance program is worth continuing.
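The contract does not prescribe a format for that mock-up, but a report covering those fields could be structured along the lines of the hypothetical schema below; every field name here is invented.

```python
from dataclasses import dataclass

@dataclass
class MonthlyReport:
    """Hypothetical shape of the mock monthly report described in the contract."""
    month: str                           # e.g. "2018-05"
    at_risk_by_age: dict[str, int]       # age bracket -> estimated at-risk count
    at_risk_by_gender: dict[str, int]    # gender -> estimated at-risk count
    pattern_changes: list[str]           # shifts in observed patterns and their effect on risk
    protective_factors: list[str]        # factors associated with reduced risk
```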

As part of the agreement, Canada has the option to extend the contract for up to five additional one-year periods under the same conditions. The initial contract, which starts this month and runs until June 30, 2018, costs $24,860. If the Canadian government exercises all five option years, the total cost rises to $399,860, or $75,000 per option year on top of the initial contract.

Advanced Symbolics developed its AI product, called Polly, in 2012. The company offers public opinion research, market potential analysis, real-time living surveys, and market research. 

In November, Facebook announced a new initiative that uses AI to identify posts that are suicide threats or are associated with suicide risk. Under this system, Facebook's AI prioritizes high-risk posts so that its Community Operations team (which includes dedicated self-harm specialists) addresses the most immediate danger first. The AI can also alert first responders if needed.
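Facebook has not published the internals of that ranking, but the prioritization idea itself is simple: score each flagged post and surface the highest-risk ones first. Here is a minimal sketch using a heap, where the reports and risk scores are hypothetical model outputs, not Facebook's actual system.

```python
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class Report:
    # heapq is a min-heap, so store the negated score to pop highest risk first
    neg_score: float
    post_id: str = field(compare=False)

def prioritize(reports):
    """Yield post IDs ordered from highest to lowest risk score."""
    heap = [Report(-score, pid) for pid, score in reports]
    heapq.heapify(heap)
    while heap:
        yield heapq.heappop(heap).post_id

# Hypothetical (post_id, risk score) pairs, scores in [0, 1]
queue = [("post-a", 0.31), ("post-b", 0.97), ("post-c", 0.58)]
print(list(prioritize(queue)))  # ['post-b', 'post-c', 'post-a']
```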

"Over the last month, we’ve worked with first responders on over 100 wellness checks based on reports we received via our proactive detection efforts," VP of Product Management Guy Rosen wrote in a blog post in November. "This is in addition to reports we received from people in the Facebook community. We also use pattern recognition to help accelerate the most concerning reports. We’ve found these accelerated reports — that we have signaled require immediate attention — are escalated to local authorities twice as quickly as other reports. We are committed to continuing to invest in pattern recognition technology to better serve our community.”

Canada also continues to monitor suicide trends and risks the traditional way. The government collects data from the Health Behaviour in School-aged Children Survey and the Canadian Community Health Survey, according to the Public Health Agency of Canada. It also tracks specific populations, such as the Canadian Armed Forces, veterans, people living in First Nations and Inuit communities, and incarcerated people.
