A city plagued by homelessness builds AI tool to predict who's at risk
The model has identified 88 people in London, Ont. likely to become chronically homeless
A new artificial intelligence tool built in London, Ont. is being used to predict whether people will become chronically homeless, with programmers touting it as the first of its kind in Canada.
By weighing data points like age, gender, family and shelter history, the Chronic Homelessness Artificial Intelligence model (CHAI) predicts whether people are likely to seek shelter services or find themselves living rough on a long-term basis in the next six months.
So far, the City of London has used CHAI to identify 88 people who are at risk of chronic homelessness, which is defined as being homeless for 180 days in the span of a year.
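The 180-days-in-a-year definition is a sliding-window threshold that can be checked directly from shelter records. A minimal sketch, using a hypothetical list of shelter-stay dates rather than the city's actual HIFIS data:

```python
from datetime import date, timedelta

def is_chronically_homeless(shelter_days, as_of):
    """Return True if at least 180 of the recorded shelter days fall
    within the 365 days ending on `as_of` -- the article's definition
    of chronic homelessness (180 days in the span of a year)."""
    window_start = as_of - timedelta(days=365)
    recent = {d for d in shelter_days if window_start < d <= as_of}
    return len(recent) >= 180

# Hypothetical client: in shelter every night for the past 200 days.
today = date(2020, 8, 1)
stays = [today - timedelta(days=i) for i in range(200)]
print(is_chronically_homeless(stays, today))  # True
```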
One individual is John Doe, a man living in London, whose identity is being protected for reasons of confidentiality.
He's more than 52 years old, single, and doesn't have kids. He's spent 27 days in shelter in the past two months and, according to the model, has a 94 per cent chance of becoming chronically homeless.
The city is still figuring out how to use CHAI's predictions, but Jonathan Rivard, London's manager of homeless prevention, said knowing someone like John Doe is likely to become homeless and stay homeless allows social services to provide him with more support, and possibly reduce strain on the shelter system.
"We might be able to have a bit of a longer diversion conversation, or put resources toward that person where we otherwise might not," he explained.
And, because it has been built to explain its predictions, the tool might also reveal some of the reasons why people in London, specifically, are becoming homeless in the first place.
High level of accuracy
Born out of collaboration between the city's information technology and homeless prevention departments, the CHAI model uses information from the city's shared database to make its predictions.
That database, which is called the Homeless Individuals and Families Information System (HIFIS), is used by more than 20 organizations across the city. Of the 6,000 people in the system, about 4,000 have consented to having their information put through the CHAI model, said Rivard.
The algorithm uses 21 million data points to make its predictions. One of its key features is the way it explains those results, said Matt Ross, a manager of artificial intelligence and information technology for London.
"If you read about AI in popular culture, the big issue right now is unintended bias or black box models, models that give you a prediction but you don't know why," he said.
"We built this from the ground up, ensuring that the model actually can explain exactly why it made the prediction it did, and that's to do two things: build trust in the model and allow it to be implemented safely and ethically, but also reduce or eliminate unintended bias."
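The kind of glass-box explanation Ross describes is easiest to see in a linear model, where the score is a plain sum and each feature's contribution can be read off exactly. The weights and feature names below are invented for illustration; this is not the actual CHAI model:

```python
import math

# Hypothetical weights for illustration only -- not CHAI's real parameters.
WEIGHTS = {
    "age_over_52": 1.1,
    "single_no_family": 0.8,
    "shelter_days_last_60": 0.06,  # weight applied per day in shelter
}
BIAS = -2.5

def predict_with_explanation(features):
    """Score a client and break the result down feature by feature.

    Because the logit is a sum of per-feature terms, the model can
    report exactly how much each input pushed the prediction up or down."""
    contributions = {name: WEIGHTS[name] * value
                     for name, value in features.items()}
    logit = BIAS + sum(contributions.values())
    probability = 1 / (1 + math.exp(-logit))
    return probability, contributions

# A profile loosely like the article's John Doe: over 52, single, no
# family, 27 shelter days in the past two months.
prob, why = predict_with_explanation(
    {"age_over_52": 1, "single_no_family": 1, "shelter_days_last_60": 27})
print(f"predicted risk: {prob:.0%}")
for name, contrib in sorted(why.items(), key=lambda kv: -kv[1]):
    print(f"  {name}: {contrib:+.2f}")
```

A black-box model gives only the probability; here the per-feature contributions are the explanation, which is what lets staff and auditors check a prediction for unintended bias.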
Another notable detail is that no programmer ever told the system how to make its predictions.
"It learns from the data itself, the patterns that are predictive of homelessness," said Ross.
In this case, the system was trained using data on the attributes and service usage patterns of clients within London's shelter system. So far, it has learned that people who are more than 52 years old, who are male, and who don't have family are more likely to become chronically homeless.
It also boasts a 93 per cent accuracy rating.
"It's a pretty powerful model," said Ross. "We're doing literature review right now of other cities that have tried to approach the problem with machine learning and this accuracy is, as far as we can tell, the highest in the world."
Chuck Lazenby, executive director of the Unity Project, a shelter and support service in London, said she and her staff are excited to use the tool so long as it provides better outcomes for people.
"Any of these kinds of new tools or any kind of new systems that come into play, we're going to use them," she said. "If there's struggles with the outcomes, if we're not seeing it lead to better outcomes for people, then we need to re-evaluate."
But in a resource-strapped system, she said it's necessary to identify people who need more resources than others.
"We've spent many, many years just not doing that level of priority assessment, so what happens then is the people who are easiest to serve get the resources instead of those who may have some more complex issues."
However, Lazenby said the model needs to be tracked so people don't fall through the cracks.
The CHAI model has been in the works since March 2019. Rivard said the city hasn't committed a certain amount of money toward the project, and the cost hasn't been significant because it's statistical research.
The city spent $14,581 to hire a consultant to look over the system and make sure it was being used appropriately.
It's also unique to London.
Ross said there are comparable projects in Montreal, as well as in Austin, New York and Los Angeles in the United States, but they don't look specifically at chronic homelessness and they aren't live models – they're either prototypes or research projects.