Hacker News
How NYC's first “director of analytics” revolutionized building inspections (slate.com)
88 points by murtali on May 3, 2013 | 8 comments



Big data mashups like this (and a recent Microsoft article about managing their campus) have great potential for making things better.

Unfortunately, like most things, this technology also has great potential for abuse, and most column inches have been dedicated to exactly that, usually in the context of Facebook and advertising.

This is one reason I hope more governments open up their datasets.


>the most important reason for the program’s success was that it dispensed with a reliance on causation in favor of correlation

Is this necessarily so? I see a lot of good old-fashioned factors mentioned in the article: converting data to a common format (locations as Cartesian coordinates), transforming expert knowledge into computer code (the inspector's insight about the brickwork), and even pure and simple political clout (obtaining several institutions' complete data dumps). It's not like all these mountains of data and expert knowledge were open-sourced and both "causationalists" and "correlationalists" had a run at it. Sure, the head of the program attributes it to his belief in correlation over causation, but there are many other and arguably more important factors present here.
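The "common format" factor is easy to illustrate. A minimal sketch of what joining agency datasets on a shared Cartesian key might look like; this is purely hypothetical, the field names, grid size, and record shapes are all invented and have nothing to do with the city's actual systems:

```python
# Hypothetical sketch: two agencies record locations differently, so both
# are snapped to a shared coarse grid cell before joining their records.

def to_grid_key(x_ft, y_ft, cell_ft=100):
    """Snap planar coordinates (in feet) to a coarse grid cell."""
    return (int(x_ft // cell_ft), int(y_ft // cell_ft))

def join_on_location(records_a, records_b, cell_ft=100):
    """Join two lists of dicts (each with 'x' and 'y' coordinates plus
    agency-specific fields) on their shared grid cell."""
    index = {}
    for r in records_b:
        index.setdefault(to_grid_key(r["x"], r["y"], cell_ft), []).append(r)
    joined = []
    for r in records_a:
        for match in index.get(to_grid_key(r["x"], r["y"], cell_ft), []):
            joined.append({**r, **match})
    return joined
```

In practice the city apparently had a canonical building identifier to join on; the point is just that none of this requires any stance on causation vs. correlation.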


> Is this necessarily so?

No; they're just two correlated characteristics of the program, without necessarily any causal link.


This is very cool. I work with information visualization/visual analytics, which provides some very powerful analysis tools for situations where you have to make decisions based on highly multidimensional data.

This research hasn't been as heavily applied in practice as it could be, but it's obvious that it has huge potential. This article is a textbook example of what sorts of situations these analysis techniques are good for. (Except that there are no textbooks on this yet, but that's a minor detail). The major point is that these techniques will show you trends and correlations, which is usually enough information in real life. People will yell "correlation does not equal causation" all day, but often correlation is enough to be able to conduct a closer investigation in the right place or ask the right follow-up questions.
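To make the "correlation is enough to know where to look" point concrete, here is an illustrative sketch of correlation-based screening. The signal names and data are entirely made up, and this is not the city's actual method, just the general shape of the technique:

```python
# Illustrative only: rank candidate risk signals by how strongly they
# correlate with past serious-hazard findings, to decide where to
# investigate further. Signal names are invented.
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy) if sx and sy else 0.0

def rank_signals(signals, outcomes):
    """signals: {name: [value per building]}; outcomes: [1 if a past
    inspection found a serious hazard, else 0]. Returns (name, r) pairs
    sorted by absolute correlation, strongest first."""
    return sorted(
        ((name, pearson(vals, outcomes)) for name, vals in signals.items()),
        key=lambda t: -abs(t[1]),
    )
```

The output doesn't tell you *why* a signal predicts hazards, only that it's worth sending an inspector (or asking a follow-up question) where the strong signals cluster.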


At the beginning of 2012, Obama ordered federal agencies to build web APIs. Whatever happened with that? Has anyone tried using them? There are probably a lot of interesting use cases.


http://www.data.gov/

Last I checked, it was very much a work-in-progress. Some agencies - like the USGS - had boatloads of data. Others not so much.


311 has had analytics for years. Either this guy is better at it than the last few guys, or he is better at self-promotion.


It seems like he's better. From the second page of the article:

>For example, the number of calls to the city’s “311” complaint hotline was considered to indicate which buildings were most in need of attention. More calls equaled more serious problems. But this turned out to be a misleading measure. A rat spotted on the posh Upper East Side might generate 30 calls within an hour, but it might take a battalion of rodents before residents in the Bronx felt moved to dial 311. Likewise, the majority of complaints about an illegal conversion might be about noise, not about hazardous conditions.




