Can AI Help To Prevent Suicides Among The Homeless?

The suicide rate among young people has been rising for several years, so any attempt to improve matters deserves to be taken seriously. New research from the University of Southern California aims to use AI to help mitigate the risk of suicide.

“In this research, we wanted to find ways to mitigate suicidal ideation and death among youth. Our idea was to leverage real-life social network information to build a support network of strategically positioned individuals that can ‘watch-out’ for their friends and refer them to help as needed,” the researchers explain.

The research builds on previous work by the team, which aimed to train AI systems to identify suitable ‘gatekeepers’ in a social group who could keep an eye out for people at risk of suicide. The latest study explores how those social connections can be used to reduce the risk of suicide across the network.

Watching out

The researchers examined the social relationships of a group of homeless young people in Los Angeles, chosen because around half of them have considered suicide. Potential gatekeepers are inevitably limited in number and availability, so it’s important that they can be identified efficiently. The researchers hope their work will allow policymakers to better target the interventions they fund so that they are as effective as possible.

“Through this study, we can also help inform policymakers who are making decisions regarding funding on suicide prevention initiatives; for example, by sharing with them the minimum number of people who need to receive the gatekeeper training to ensure that all youth have at least one trained friend who can watch out for them,” they explain.
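The coverage requirement the researchers describe here, every youth having at least one trained friend, is essentially a dominating set problem on the friendship graph. The sketch below is not the study’s algorithm; it is a minimal greedy heuristic on a made-up friendship network (using networkx), purely to illustrate what “everyone has at least one trained friend” means in graph terms.

```python
# Illustrative sketch only: the requirement that "all youth have at least one
# trained friend" is, in graph terms, a dominating set problem. The graph,
# names and greedy heuristic below are hypothetical, not the researchers'
# data or algorithm.
import networkx as nx

def greedy_gatekeepers(friend_graph: nx.Graph) -> set:
    """Greedily pick gatekeepers until every person is trained or has a trained friend."""
    uncovered = set(friend_graph.nodes)
    gatekeepers = set()
    while uncovered:
        # Pick the person who would newly "watch out" for the most uncovered
        # people (themselves plus their direct friends).
        best = max(
            friend_graph.nodes,
            key=lambda n: len(({n} | set(friend_graph[n])) & uncovered),
        )
        gatekeepers.add(best)
        uncovered -= {best} | set(friend_graph[best])
    return gatekeepers

# Toy friendship network, purely for illustration.
G = nx.Graph([("A", "B"), ("B", "C"), ("C", "D"), ("D", "E"), ("E", "A"), ("C", "F")])
print(greedy_gatekeepers(G))  # a small covering set, e.g. {'A', 'C'}
```

Finding the smallest such set is part of the “computationally hard problem” the researchers refer to later; a greedy heuristic like this only approximates it.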

They believe that using an AI-based system could help to ensure both fairness and transparency. When resources are this limited, the researchers say, the shortfall tends to fall disproportionately on the most marginalized and vulnerable populations.

By using AI, they believe they can reach these people more effectively and ensure that resources are directed to those who need them most.

“One of the surprising things we discovered in our experiments based on social networks of homeless youth is that existing A.I. algorithms, if deployed without customization, result in discriminatory outcomes by up to 68% difference in protection rate across races. The goal is to make this algorithm as fair as possible and adjust the algorithm to protect those groups that are worse off,” the researchers say.
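The article doesn’t spell out how the fairness correction works, but one simple way to picture “adjusting the algorithm to protect those groups that are worse off” is a budgeted greedy selection that gives extra weight to covering whichever group currently has the lowest protection rate. The function below is a hypothetical sketch under that assumption, not the authors’ actual method; the graph, group labels, and weighting scheme are all illustrative.

```python
# Hypothetical sketch of a fairness-aware selection: at each step, weight
# coverage of whichever group currently has the lowest protection rate more
# heavily. This is NOT the researchers' algorithm, just an illustration of
# "protecting those groups that are worse off".
import networkx as nx

def fairness_aware_gatekeepers(friend_graph: nx.Graph, group: dict, budget: int) -> set:
    """Pick `budget` gatekeepers, nudging coverage toward the worst-protected group."""
    covered, chosen = set(), set()
    groups = set(group.values())

    def protection_rate(g):
        members = [n for n in friend_graph if group[n] == g]
        return sum(n in covered for n in members) / len(members)

    for _ in range(budget):
        worst = min(groups, key=protection_rate)  # group with the lowest coverage so far

        def gain(n):
            newly = ({n} | set(friend_graph[n])) - covered
            # Newly covered members of the worst-off group count double.
            return len(newly) + sum(group[m] == worst for m in newly)

        best = max(set(friend_graph) - chosen, key=gain)
        chosen.add(best)
        covered |= {best} | set(friend_graph[best])
    return chosen

# Toy network with two made-up demographic groups.
G = nx.Graph([("A", "B"), ("B", "C"), ("C", "D"), ("D", "E"), ("E", "F")])
demographic = {"A": "x", "B": "x", "C": "x", "D": "y", "E": "y", "F": "y"}
print(fairness_aware_gatekeepers(G, demographic, budget=2))  # e.g. {'B', 'E'}
```

The gap in protection rate across groups, before and after selection, is exactly the kind of disparity the researchers report measuring and narrowing.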

By deploying the AI-based system, the researchers were able to reduce the bias in coverage across real-life groups of homeless young people by up to 20%, which has a significant bearing on the lives of this highly vulnerable population.

“Not only does our solution advance the field of computer science by addressing a computationally hard problem, but also it pushes the boundaries of social work and risk management science by bringing in computational methods into design and deployment of prevention programs,” the researchers conclude.
