New report shows bias in how homeless people are prioritized for services

A study released last week confirms something that many people in the homeless-services community had suspected: a key tool used to assess and prioritize homeless people in King County for access to services is biased against people of color.

It’s well-known, and has been repeatedly documented, that people of color — and especially African-Americans — are over-represented in the homeless community here in King County and across the country. In King County last year, about 6% of the population was black/African-American (as measured by the US Census Bureau’s American Community Survey), compared to 37% of the homeless population. And while work has been done to understand the economic and societal reasons why more people of color become homeless, much less is known about disparate impacts in the system of services offered to homeless people.

As a condition for federal grants, King County is required to maintain a database of information about those receiving services, and to “establish and operate either a centralized or coordinated assessment system that provides an initial, comprehensive assessment of the needs of individuals and families for housing and services.” This is called the “Coordinated Entry System,” or CES. While the federal government doesn’t dictate the exact tool(s) to be used for conducting that assessment, the most common tool is called VI-SPDAT. VI-SPDAT was developed and released in 2014 by OrgCode and Community Solutions, and was touted as “the standardized assessment tool of choice.” It’s a 27-question interview script that a service provider administers to a homeless person, and based upon the answers given it generates a vulnerability score between 0 and 17. Scores of 0-3 lead to a recommendation of no housing intervention; scores of 4-7 lead to a recommendation of “Rapid Rehousing”; and scores of 8 and above lead to a recommendation of permanent supportive housing or another form of “housing first” program. Within each group, scores can be used to prioritize the most vulnerable.
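The score-to-recommendation mapping described above is a simple set of thresholds. A minimal sketch in Python, based only on the ranges described here (the function name is mine, purely illustrative):

```python
def recommend_intervention(score: int) -> str:
    """Map a VI-SPDAT vulnerability score (0-17) to the recommended
    intervention tier, per the thresholds described in the article."""
    if not 0 <= score <= 17:
        raise ValueError("VI-SPDAT scores range from 0 to 17")
    if score <= 3:
        return "no housing intervention"
    if score <= 7:
        return "rapid rehousing"
    return "permanent supportive housing / housing first"
```

Note how a one-point difference at the boundary (a 7 versus an 8) changes the entire category of service recommended, which matters for the disparities discussed below.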

Since its introduction, some service providers and officials have raised concerns, based on anecdotal evidence and their own experiences, that VI-SPDAT results in lower assessments for people of color. But the recently released study did an in-depth statistical analysis of the CES implementations of four counties, including King County, and confirmed that there are indeed racial disparities. Across the four counties combined, several categories of people of color saw lower median and mean VI-SPDAT scores when compared to white people.

While the score differences may look small, it’s important to note that the mean and median scores for several groups hover right around 8 — which is a critical threshold for deciding which intervention is recommended for an individual (rapid rehousing versus permanent supportive housing). So those small differences actually represent qualitative changes in the services offered on top of the quantitative differences in relative priority for individuals within the homeless population. While the study didn’t find statistical differences for families (who have their own version of the VI-SPDAT), for individuals there was a clear and measurable difference in the percentage of people of color who were recommended each of the different levels of service. In short: more white people got the highest levels of service.

That leads deeper into the mystery: why do people of color get lower VI-SPDAT scores? Prior to the study, service providers had suggested three broad areas of concern with the VI-SPDAT:

  • the limited evidence of the tool’s validity and reliability in assessing vulnerability. It has never been validated nationally, and its creators claim that it is “evidence informed” rather than “evidence based.”
  • the fact that it relies on homeless people to self-report, including on mental and behavioral health challenges. To make matters worse, providers pointed out that the introduction of the VI-SPDAT flipped the conventional wisdom: whereas before, homeless people were told that the fewer issues they raised, the more likely they were to be recommended for services, under the new system the more issues they raised, the more likely they were to get help. And some of the VI-SPDAT questions could be considered incriminating, such as admitting to drug use or to selling prescribed medication. This gives homeless people even more reason to under-report;
  • other implementation and fidelity concerns: assessor demographics, training in effective and trauma-informed interviewing, biases held by the assessors, and relationship/rapport between assessors and clients.

The study focused on the first point: the validity and reliability of the VI-SPDAT tool itself. While it didn’t attempt a true national validation, it looked at whether components of the assessment tool led to biased results. And it found just that: among the 16 components (each worth 1 point), more than half were much more likely to be predictive of a high VI-SPDAT score for white individuals than for individuals of color.

Race was a predictor of 11 of 16 components: whites were more likely to say “yes” to eight, and people of color were more likely to say “yes” to three.

This highlights the likely flaw in the VI-SPDAT assessment: the choice of the sixteen components, and their weighting. There is no evidence that white homeless people are actually more vulnerable than homeless people of color, but a set of sixteen questions was chosen that gives more weight to issues that white homeless people experience. Why sixteen components — why not nineteen, or twelve, or five? Why these particular sixteen? Are any of them tightly correlated with each other, effectively giving higher weight to certain issues, or are they truly independent? And since the data above suggests that people of different races/ethnicities experience homelessness differently, does it make sense at all to try to have a single metric to prioritize everyone together? These are the questions that a true validation test of the VI-SPDAT (which, again, has never been done despite the wide adoption of the tool) would answer.

The study authors make several recommendations. Among them are for King County and other jurisdictions implementing a Coordinated Entry System to investigate alternative tools and/or methods to VI-SPDAT, as well as the inclusion of other factors, for assessment and prioritization; and for the federal government to reconsider the training and guidance it gives to jurisdictions on their CES implementations as well as to work on ensuring that assessment tools such as VI-SPDAT are normed using demographics that match the communities they will be used with.

It turns out that King County is ahead of the game. It is already undertaking a broad re-assessment of how it prioritizes people for services, after recognizing that its current system was producing racial inequities. The county’s stated long-term plan is to introduce a new assessment and prioritization tool that ensures equitable access to resources and services. It also plans to move to “Dynamic Prioritization,” which is “a dynamic process that uses prioritization criteria to identify the most vulnerable households based on the number of anticipated housing placements across all resources that will occur in the next 60 days.” In other words, prioritization will be influenced by the actual service capacity (especially housing) expected to be available in the short term.

But in the short term, it has already rolled out (as of January 2019) an update to its prioritization formula, through the work of an “Interim Prioritization” workgroup that looked at how to make immediate changes that would improve racial equity.

It still uses VI-SPDAT, but it also factors in the length of time that an individual has been homeless. This addresses a concern that noted expert Barb Poppe raised in her 2016 report, in which she pointed out that individuals who don’t score high on the assessment could remain in the system indefinitely and never get the help they need if a sufficiently large number of people who score higher are constantly entering the system.
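To see why adding chronicity to the ranking matters, consider a sketch of how such a combined priority might work. The county has not published its exact formula, so the weighting, names, and combination rule here are purely hypothetical; the point is only that a long-homeless person with a lower score can eventually outrank a newly arrived higher scorer, which a score-only queue never allows:

```python
from typing import List, NamedTuple

class Client(NamedTuple):
    client_id: str
    vi_spdat_score: int   # 0-17 vulnerability score
    months_homeless: int  # length of time homeless

def interim_priority_score(c: Client, chronicity_weight: float = 0.25) -> float:
    # Hypothetical combination: the real formula isn't specified in the
    # county's public description, so this weight is illustrative only.
    return c.vi_spdat_score + chronicity_weight * c.months_homeless

def prioritize(clients: List[Client]) -> List[Client]:
    # Highest combined priority first.
    return sorted(clients, key=interim_priority_score, reverse=True)
```

With this illustrative weight, a client scoring 6 who has been homeless for three years (priority 6 + 0.25 × 36 = 15) would rank ahead of a newly entered client scoring 8 (priority 8.25), rather than waiting indefinitely behind a stream of new higher scorers.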

The use of VI-SPDAT is an excellent case study on how bias can accidentally be introduced into systems that were created with the best of intentions. The sixteen components of VI-SPDAT all individually look perfectly reasonable and I have no doubt that they were selected without any racist intent, but collectively they produce a result with a disparate negative impact on people of color. It also shows us why rigorous evaluation of every aspect of the homeless response system is difficult but necessary.

Thanks for reading to the end!  If you find my reporting valuable, please consider making a financial contribution to support my work. Even just $1 a month helps.