Sunday, September 25, 2011
There have been a number of news stories this week about commute times, based on information from the Census Bureau. Much of it is confusing and contradictory. For example, on Thursday the Times said that New York has the longest commute, but then on Friday a different Times blogger said that New York doesn't have the longest commute any more. What gives?
Well, the Census Bureau just released the commuting data from the 2010 American Community Survey. But on the same day they also released an analysis of commuting patterns and trends (PDF) based on data from the 2009 survey. New York has the longest mean time to work in the 2009 data (see page 16), but in 2010 it dropped to number 2, at least according to the analysis reported by Sam Roberts in the City Room and by the Washington Post. Honestly, I don't get that ranking at all. By my analysis, New York is still #1, Maryland is tied for #6, and Grand Forks, ND, is simply not listed as a metro area that has any commute time data for 2010. I have no idea what Roberts is doing with the data.
Maybe they're using a different method of calculating mean time to work? I'm dividing aggregate time to work by total number of commuters. That matches up with the rankings given by MarketWatch and the Los Angeles Times. What method are Sam Roberts and Ashley Halsey using? What are the Census analysts using?
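For concreteness, here's the kind of calculation I mean, as a minimal sketch. The metro names and figures below are made up for illustration; they are not actual ACS values. The method is just the one described above: divide aggregate travel time by the number of commuters, then rank.

```python
# Hypothetical ACS-style figures, purely illustrative:
# aggregate travel time (minutes) and number of commuters per metro area.
metros = {
    "Metro A": {"aggregate_minutes": 1_234_567, "commuters": 35_000},
    "Metro B": {"aggregate_minutes": 2_000_000, "commuters": 62_000},
}

def mean_time_to_work(aggregate_minutes, commuters):
    """Mean commute = total minutes traveled / number of commuters."""
    return aggregate_minutes / commuters

# Rank metros by mean commute time, longest first.
ranked = sorted(
    metros.items(),
    key=lambda kv: mean_time_to_work(kv[1]["aggregate_minutes"], kv[1]["commuters"]),
    reverse=True,
)

for name, d in ranked:
    avg = mean_time_to_work(d["aggregate_minutes"], d["commuters"])
    print(f"{name}: {avg:.1f} min")
```

If the Census analysts are instead averaging per-respondent times with survey weights, or using some other aggregation, that could explain a different ranking; the point is that the method matters.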
The rest of it seems like typical bureaucratic dysfunction at the Census. Why release an analysis of the 2009 data on the same day you release the 2010 data? It'll just confuse everyone. Why omit metro areas with no data entirely, instead of leaving empty rows? And why does the Census Bureau talk about Grand Forks in a press release, but not release the data?
There's a lot of other stuff I have to say about this data, so stay tuned for a few more posts about it.