For each of the past three years, a Seattle University research team has conducted a survey of Seattle residents to assess their views on public safety issues in their communities and on the police. Called the Seattle Public Safety Survey, it is commissioned by the Seattle Police Department as part of its Micro-Community Policing Plan (MCPP) to help the department understand how best to engage with each neighborhood in Seattle. The most recent survey was fielded last October and November, and the results were published last month. Yesterday, representatives from the Seattle University team and from SPD briefed the City Council on the report. The briefing was high-level, but there’s an ocean of data on individual neighborhoods included, so here’s a deeper dive into what’s notable and meaningful in this year’s report.
First, we must start with some caveats about potential weaknesses in the report. Overall, the survey captured 6,454 responses from self-selected participants. The team did their best to advertise it widely, and translated the survey into ten languages. But 6,454 responses is not a large sample for a city of 700,000 residents; as a city-wide aggregate it has some statistical power, but as we dive into police precincts and individual neighborhoods the sample size sometimes becomes too small to be trusted as representative. For example, the Pigeon Point neighborhood yielded only eight responses, so our ability to draw broad conclusions from so small a sample is very low.
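To put rough numbers on that caveat, here’s a quick back-of-the-envelope sketch of the margin of error at each sample size. The standard formula assumes a random sample, which a self-selected survey is not, so treat these as optimistic lower bounds on the uncertainty:

```python
import math

def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    """Approximate 95% margin of error for a proportion estimated from n responses.

    p=0.5 is the worst case; z=1.96 is the 95% confidence z-score.
    This assumes a simple random sample, which a self-selected survey
    is not, so the true uncertainty is likely larger.
    """
    return z * math.sqrt(p * (1 - p) / n)

# City-wide sample of 6,454 responses:
print(round(margin_of_error(6454) * 100, 1))  # ≈ 1.2 percentage points

# Pigeon Point's eight responses:
print(round(margin_of_error(8) * 100, 1))     # ≈ 34.6 percentage points
```

A roughly ±35-point margin makes the eight-response neighborhoods essentially anecdotal, which is why the precinct- and neighborhood-level numbers deserve more skepticism than the city-wide ones.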
The survey team reweighted the results to match the demographics of the community, in order to make it more representative, which is good practice. We can also take some comfort from the fact that the results are remarkably consistent with the two previous years on the major numerical sentiment metrics they measured:
That suggests that either the same people are responding every year, or the results are fairly representative of the community.
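The report doesn’t spell out exactly how the reweighting was done, but a common approach is post-stratification: each respondent is weighted by the ratio of their demographic group’s share of the population to its share of the sample. Here’s a minimal sketch with invented group names and numbers, purely for illustration:

```python
# Hypothetical post-stratification sketch. The groups and shares below are
# invented for illustration; the report does not publish its weighting scheme.

# Share of each demographic group in the population vs. in the survey sample.
population_share = {"group_a": 0.30, "group_b": 0.50, "group_c": 0.20}
sample_share     = {"group_a": 0.50, "group_b": 0.40, "group_c": 0.10}

# Over-represented groups get weights below 1; under-represented ones above 1.
weights = {g: population_share[g] / sample_share[g] for g in population_share}

# Weighted mean of a 1-100 metric across respondents (group, score) pairs.
responses = [("group_a", 60), ("group_a", 70), ("group_b", 55), ("group_c", 80)]
numerator   = sum(weights[g] * score for g, score in responses)
denominator = sum(weights[g] for g, _ in responses)
weighted_mean = numerator / denominator
print(round(weighted_mean, 1))  # ≈ 68.9, vs. an unweighted mean of 66.25
```

The design choice matters: reweighting corrects for *who* responded, but it can’t correct for self-selection in *why* people responded, which is why the year-over-year consistency above is the more reassuring signal.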
So with that said, let’s look at what they measured. The first part of the survey asked respondents to rate, on a scale of 1 to 100, specific metrics within three broad categories; it also collected respondents’ top public-safety concerns:
- The police: it measures both people’s self-assessed knowledge about the MCPP, as well as their perception of it. It also measures “police legitimacy,” defined as “acceptance of the rules, laws, and precepts that define the police role in society, and a willingness to grant deference to police as a consequence of the belief that they are the authorized representatives who dutifully carry out the rules and laws that make society function smoothly.” Finally, it measures views of SPD specifically and of the police in general.
- Social measures: under the rubric of “social efficacy” it measures “informal social control,” the likelihood that someone in a neighborhood would intervene if something bad or illegal were happening; and “social cohesion,” the extent to which a neighborhood is friendly and a good place to raise children, and people in the neighborhood can be trusted, will help each other, and are happy to be there. The survey also measures “social disorganization,” generally the extent to which a neighborhood is host to undesirable social behavior such as fights on the street, public urination, noise late at night, people being bothered on the streets, and buildings with broken windows or graffiti. (When you see this later, remember that this is the only metric for which a lower score is better.)
- Fear of crime. For the most recent two years of the study, in addition to asking generally about fear of crime, they asked separately about fear of crime during the day and at night to provide some additional context to the responses.
- Top public safety concerns. The report lists out the top five public-safety concerns for the city as a whole, for each police precinct, and for each of 59 neighborhoods in the MCPP.
The survey also collected 2,999 written comments from respondents. Free-text comments are hard to analyze, and the report provides only high-level thematic summaries of them that are at some level redundant with the top concerns, so I’m mostly going to leave that part aside.
We’ve already seen from the chart above that the high-level results have changed very little from last year. If we look at the five police precincts, we also see a tremendous amount of consistency across the city:
… and if you stop reading this article right here, you’ll walk away with entirely the wrong idea of what’s going on in Seattle, because while the aggregate numbers for each precinct look very similar, the individual neighborhoods vary much more:
And if we dive into each individual precinct, we see that there is diversity in each of them:
There is also a fair amount of consistency, though; it’s probably more accurate to say that each precinct has a few “statistical outliers.” There are two important takeaways from this:
- Every police precinct is different, and each has its own unique aspects. None is consistent throughout its area, even if in aggregate they all look similar.
- You can see why micro-policing strategies are so important: the issues vary significantly between neighborhoods, and so must the approach to policing. But the survey is a very useful resource, because it highlights “outliers” where things are just different. You can see in the chart below that for almost every metric there are a handful of outliers; each of those is a starting point for SPD to understand what is happening in a community and how police officers can most constructively engage.
The Seattle Times recently ran a piece looking at how fear of crime within a neighborhood doesn’t always correlate with actual crime level. This latest survey provides some additional material for us to ask: what might be driving the fear of crime? When we compare the metrics, we see that police legitimacy doesn’t correlate at all with fear of crime.
Neither does social cohesion.
Social disorganization does correlate with fear of crime, but the effect is small: the difference in fear level between neighborhoods with high and low social disorganization is not that great. Also, be careful not to infer a causal relationship here.
And to confuse matters more, social cohesion and social disorganization are strongly negatively correlated — as one might expect. That would suggest that social cohesion should have some relationship to fear of crime, but the data didn’t show us that. So there’s lots of food for thought here, but no clear answers.
And just for completeness’ sake, the “fear of crime” data for daytime, nighttime, and overall are very consistent in their relationship to each other; people are slightly more fearful at night, but there don’t seem to be differences in what’s driving daytime versus nighttime fear of crime.
So now let’s turn to the other part of the survey: the top public safety concerns as expressed by the survey respondents. I took all of the top-five lists and charted them to look for patterns. Here’s a PDF with the full chart — it’s easiest to print it and read it that way, but you can click on the picture below to expand it too.
Again, we see that in aggregate the list is very short, and very similar across most of the precincts. Two concerns are nearly universal:
- A lack of police department capacity. Based on this, it might not be prudent to run for local office next year on a platform of “abolish the police department.” There was only one neighborhood where this didn’t make the top-5 list (Miller Park), and in 42 neighborhoods it was the #1 concern.
- Car prowls. It was a major issue in every precinct, though somewhat less so downtown and in the east precinct.
Other city-wide concerns include:
- residential burglary;
- property crime;
- littering and dumping (though not in the west precinct); and
- unsafe driving and speeding.
There are a few cases where a particular precinct broadly shares a concern. Car/RV camping is a big issue across the north precinct, but hardly made the list anywhere else in the city. “Shots fired” and violence were raised as issues in the south, east and southwest precincts. And “civility issues” were a west precinct issue. On the flip side, “auto theft” wasn’t raised as an issue in the north and west precincts.
But then you see, once again, that many neighborhoods have their own unique issues. SODO complained about car and RV dumping. Belltown and the downtown commercial district listed issues related to public drug use, loitering, aggressive panhandling, and assault. Only South Park and Fremont listed “graffiti.”
The data is a bit messy because many people see different aspects of the homelessness crisis: for some it’s unsanctioned encampments; for others, it’s people engaging in unsavory behavior on downtown city streets; and for still others, it’s people living in cars and RVs parked on the streets of their neighborhood. But the overall message is the same as for the metrics: beyond the core similarities, there’s relevant information here for SPD to use to drill into the specifics of what is happening in individual neighborhoods.
For its part, SPD claims that it is indeed using the data to inform the ongoing development of its micro-community policing plans. It maintains a website where it posts details of what it believes are the top issues for each neighborhood and its strategies to address them. The department expects to produce quarterly reports on the results of the MCPP approach.
One last note on the survey results: they show that in aggregate across the city, residents have a consistent view of the Seattle Police Department, and they think more highly of it than of police in the United States in general. The survey unfortunately doesn’t break out those numbers by neighborhood, though it’s safe to assume there is increased variability there as well. And a rating in the low 60s is nothing for SPD to be proud of; the department clearly has more work to do to improve its image and regain the public’s trust.