Last week, a team of Berkeley researchers published a study on the effects of Seattle’s minimum wage on the food service industry through March 2016; it concluded that wages had indeed gone up and that there was no sign employment had gone down in response. Yesterday, a team of researchers at UW published a similar study, looking at data through September 2016, which concluded that employment had gone down significantly.
The UW study team published a study last year (funded by the City of Seattle) on the early results of the move to $11/hour, and found little to no effect on employment. In contrast, their new study, which looks at the step up to $13/hour, found that it reduced the hours worked in low-wage jobs by around 9% – versus a 3% increase in wages – leading to a total decrease in payroll for low-wage employees of about $125 per month. They attribute this to a combination of reduced positions and reduced hours for existing positions.
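Those headline figures compound in a way that’s easy to check with a little arithmetic. Here’s a quick sketch (the baseline payroll figure at the end is inferred from the study’s numbers, not taken from it):

```python
# Back-of-the-envelope check of the UW study's headline numbers.
# Illustrative only; the implied baseline payroll is my inference.

hours_change = -0.09   # ~9% fewer hours worked in low-wage jobs
wage_change = +0.03    # ~3% higher hourly wages

# Hours and wages compound, so the net proportional change in payroll is:
net_change = (1 + hours_change) * (1 + wage_change) - 1
print(f"net payroll change: {net_change:.1%}")   # roughly -6.3%

# If that net loss works out to ~$125/month, the implied baseline
# monthly payroll for the average low-wage worker is:
implied_baseline = 125 / -net_change
print(f"implied baseline monthly payroll: ${implied_baseline:,.0f}")
```

In other words, a 9% cut in hours swamps a 3% raise, for a net loss of about six percent of payroll.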
Over the weekend, I posted a look at the Berkeley study. One of the authors, Michael Reich, briefed the Council on his group’s study yesterday morning and also provided a critique of the conflicting UW study – both in the Council briefing and in a letter to the city. I took a look at his criticisms, and they raise serious concerns about the UW team’s methodology.
Let’s start by looking at some of the key differences in how the studies were done. First, they analyzed different data sets: Berkeley used Bureau of Labor Statistics data on wages and employment for all businesses, nationwide, participating in the Unemployment Insurance system; from that source they pulled information just on the food service industry. UW’s team used data from the State of Washington’s Employment Security Department across all industries. Both are legitimate data sources; the Washington state data is more granular, but its restriction to employers within the state is problematic, as we will discuss later.
Second, the two studies took different approaches to a complicated issue: how to handle companies that have multiple sites (e.g. Starbucks and McDonald’s). Some of those companies submit data to the government separately for each site; some aggregate the data for all the sites in the city; some aggregate for a much larger area, such as King County or the whole state. That means that the Seattle wage and employment data could include some employees outside of Seattle. The Berkeley group concluded that this added “noise” to the results but that multi-site employers were worth including nonetheless. The UW group, citing a previous survey of theirs showing that multi-site employers were more likely to express intent to reduce hours and positions in response to an increase in the minimum wage, reasoned that including them would bias the study – so they excluded all multi-site employers. This isn’t a trivial change; 38% of low-wage food service workers are hired into multi-site companies.
Third, the two studies constructed completely different points of comparison to Seattle. The standard practice is to create a “Synthetic Seattle” by aggregating economic data from other locations. The Berkeley team, using their national data set, assembled a collection of counties spread across the country and aggregated their wage and employment data. The UW team did a similar exercise, but selected cities and counties from within Washington state (since that was the data set available to them).
Here’s the Berkeley list:
And here’s the UW list:
OK, so with that context, let’s look at some of the problems with the UW study.
1. Excluding multi-site employers heavily biased their analysis. Excluding 38% of low-wage food-service workers is eyebrow-raising on its own. But it’s far more problematic than that, because of the way that Seattle’s minimum wage is phased in. Large firms with 500 or more employees saw a faster ramp-up than smaller firms, and larger firms are much more likely to have multiple sites. The difference in wages is significant:
In other words: there’s a strong incentive for workers to move from smaller companies to larger ones, because they’ll get paid more. But in the UW study, a worker who moves from a smaller, single-site company to a larger, multi-site one drops out of the data set – and looks like a job loss.
2. UW’s construction of “Synthetic Seattle” is highly suspect. Creating a credible point of comparison as a “control” is tricky. It’s an exercise in controlling for one, and only one, variable: the minimum wage increase. So you want to pick cities and counties that mimic all the other economic conditions of Seattle, but are independent enough that wages and employment in Seattle won’t have “spillover” effects into those other regions. Berkeley’s “Synthetic Seattle” uses counties across the nation plus a few in Washington state that are a long distance from Seattle. UW’s, however, excludes King County but includes Pierce and Snohomish Counties and other western Washington locales. The test for whether the synthetic “control” is well constructed has three elements:
- Seattle and “Synthetic Seattle” should behave similarly in the period before the minimum wage increase happened.
- Increasing the minimum wage should have a detectable effect on Seattle, but not on “Synthetic Seattle.”
- Groups (such as high-paid salaried workers) that did not get the “treatment” (i.e. a wage increase) should not show any effect (e.g. a decrease in employment) relative to their counterparts in “Synthetic Seattle.”
On the first test, the Berkeley team’s study shows very clearly that in the years preceding the new minimum wage, Seattle and their “synthetic Seattle” behave identically:
UW either did not perform this test or chose not to publish the results, so there is reason for skepticism as to how their control stands up. On the second test, UW’s inclusion of Snohomish and Pierce Counties, including Everett and Tacoma, is highly suspect as there is strong reason to believe that Seattle’s minimum wage increase put wage pressure on its neighbors to the north and south. We’ll come back to the third test in a bit.
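For readers unfamiliar with how a “Synthetic Seattle” is assembled, here is a toy sketch of the standard synthetic-control technique: choose nonnegative weights over “donor” regions, summing to one, so that the weighted average tracks Seattle’s pre-treatment trend. The region names and data below are invented for illustration; this is not either team’s actual methodology or code.

```python
# Toy synthetic control: fit weights over donor regions so the weighted
# average matches a target's pre-treatment employment series.
# All region names and numbers here are made up for illustration.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Hypothetical pre-treatment employment indices (8 quarters) for donors.
donors = {
    "Tacoma":  100 + np.cumsum(rng.normal(0.5, 0.3, 8)),
    "Spokane": 100 + np.cumsum(rng.normal(0.2, 0.3, 8)),
    "Everett": 100 + np.cumsum(rng.normal(0.8, 0.3, 8)),
}
X = np.column_stack(list(donors.values()))       # shape (8, 3)
seattle = 100 + np.cumsum(rng.normal(0.6, 0.3, 8))

def pre_period_gap(w):
    # Squared error between the weighted donors and the target, pre-treatment.
    return np.sum((X @ w - seattle) ** 2)

res = minimize(
    pre_period_gap,
    x0=np.full(3, 1 / 3),                        # start from equal weights
    bounds=[(0, 1)] * 3,                         # weights are nonnegative
    constraints={"type": "eq", "fun": lambda w: w.sum() - 1},  # and sum to 1
)
weights = dict(zip(donors, res.x.round(3)))
print(weights)  # nonnegative weights summing to 1
```

The first test in the list above is exactly a check on this fit: if the weighted donors don’t track the target closely before the treatment, the “control” is meaningless afterward.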
So that’s a bunch of reasons why UW’s analysis could be flawed, and why we should be skeptical of their results regardless of what they say. But when we look at the actual results, do they seem credible?
Here’s a paper, referenced by Reich, that looks at what tends to happen right after an increase in the minimum wage. It analyzes historical increases in the United States between 1979 and 2016 and what happens to employment. It finds a very similar pattern across the instances, represented by this chart of changes in the number of people employed at each wage level:
The “0” on the horizontal axis is the new minimum wage. So what we see is:
- compliance is very high;
- a large drop in people being paid the old minimum wage and an increase of almost exactly the same size at the new minimum wage;
- a small increase in people being paid slightly above the minimum wage, attributable to employers giving raises to low-level supervisors to keep their pay above what the people they supervise are getting paid;
- little to no effect the farther you go up (which is consistent with test #3 on “synthetic Seattle” above).
The big finding of the paper is represented by the two big bars: increases in minimum wage tend not to create large shifts in employment, but rather simple shifts in wages from the old minimum to the new one.
Here’s the equivalent graph for the UW study’s results:
Very different — and that’s a red flag that something’s amiss here. The red graph at the bottom is March of 2016, two months after the increase to $13/hour at the top end (for large firms that don’t provide health insurance). There are big drops, but no bump up at the increased rate. It looks like huge job losses, but the most likely explanation is that the 38% of low-wage employees excluded from the study are the “missing workers.” Also, there are sizable increases at wage rates well above $13, which we shouldn’t see at all; that suggests UW’s “Synthetic Seattle” isn’t doing a good job of filtering out Seattle’s booming economy.
Now the counter-argument here is that because the January 2016 increase came only nine months after the April 2015 increase, there wasn’t time for the market to absorb the first one, and what we’re really seeing is the effect of a much bigger increase delivered in two connected pieces, creating a shock to the economic system. But look at the blue graph on top, which shows UW’s result for the April 2015 minimum wage increase, the first bump up. Again, the graph looks nothing like the expected one: a big bump down at the old minimum wage, much smaller bumps up at the new one, and lots of change at the higher wage levels. These aren’t small changes. These are huge ones; according to Reich, ten times the consensus view among economists on the effect that wage changes have on employment (measured as the percent employment decrease per percent wage increase). UW provided no explanation as to why Seattle’s employers would be that much more sensitive to wage changes than employers in other cities.
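Reich’s “ten times the consensus” point follows directly from the definition of that elasticity. A quick check using the study’s headline numbers (the consensus figure of roughly -0.3 is a commonly cited ballpark, my assumption, not a number from either study):

```python
# Elasticity of employment with respect to the minimum wage:
# percent change in employment (here, hours) per percent change in wages.
hours_change = -0.09   # UW: ~9% drop in low-wage hours
wage_change = 0.03     # UW: ~3% rise in low-wage hourly pay

elasticity = hours_change / wage_change
print(f"implied elasticity: {elasticity:.1f}")               # -3.0

consensus = -0.3  # rough consensus-range value (assumption)
print(f"ratio to consensus: {elasticity / consensus:.0f}x")  # 10x
```

An elasticity of -3 would mean every 1% wage increase destroys 3% of low-wage employment, which is wildly out of line with the rest of the literature.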
The obvious conclusion is that UW’s study is hopelessly flawed. Now UW could counter this by re-running the study including the data from multi-site employers, and by publishing the historical fit for its “Synthetic Seattle” so we can verify that it isn’t accidentally incorporating some of Seattle’s economic boom into the data. But until it does those two things, we can’t verify the accuracy of its numbers — or its conclusions. Reich notes that UW’s data set would allow them to track individual employees over time and actually count the movement between single-site and multi-site employers; that would provide further insight into whether excluding multi-site employers biases their results.
Reich also critiques the UW study for cutting off its analysis at $19/hour at the top end, arguing that they should be looking higher. UW counters that they tried the same analysis with higher cutoffs (e.g. $25/hour) and saw no significant difference. We can safely let the economists argue that out in private, as there are more than enough reasons to disbelieve the UW study even without this point.
So what conclusions can we draw from the UW study? Nothing. Literally, nothing. Its underlying data set is deeply suspect, so we can’t trust any analysis or conclusions built on top of it. The safest course of action is simply to ignore it while peer review among the economics research community does its thing and the UW team re-works it to address the many criticisms. And I hope they do; the heart and soul of science (including social science) is replication, and even Berkeley’s study is of limited value until others replicate their results.