Friday, September 30, 2011

Those Pesky Humans: Urban Planning and its Discontents

Article first published as Those Pesky Humans: Urban Planning and its Discontents on Blogcritics.

Greg Lindsay writes in the New York Times that Pegasus Holdings, a technology company based in Washington, DC, is building a "medium sized town" on 20 square miles of New Mexico desert. The town, dubbed the "Center for Innovation, Testing, and Evaluation" (mark it on the map!), will contain infrastructure adequate to support a population of 35,000, but will be home only to a handful of engineers and other geeks from Pegasus, who plan to use it as a laboratory to build future "smart cities", where power grids, traffic, security and surveillance systems are monitored and controlled by computer.

On the face of it, "smart cities" sound like a good idea (better than, say, "dumb cities"). The idea is, in outline, simple enough: a) install sensors to gather information about how people move about and interact in cities, then b) feed this data to computers to develop complex models of human behavior, generating policies that make things work better, more efficiently. To take an obvious example, who wouldn't want traffic lights optimized to increase vehicle throughput? Or pedestrian pathways that make two-way foot traffic flow more smoothly? Makes sense, right?
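To make step (b) concrete, here is a minimal, purely illustrative sketch in Python of the traffic-light example: it takes hypothetical vehicle counts from sensors on the two approaches of an intersection and splits a fixed signal cycle's green time in proportion to demand. The function, its parameters, and the counts are assumptions invented for illustration; nothing here describes Pegasus' actual system.

    # Illustrative sketch: split a traffic signal's green time in proportion
    # to observed demand. All numbers and names are hypothetical.

    def split_green_time(north_south_count, east_west_count,
                         cycle_seconds=90, lost_seconds=10):
        """Allocate green time proportionally to sensor vehicle counts."""
        usable = cycle_seconds - lost_seconds   # time not spent on amber/all-red
        total = north_south_count + east_west_count
        if total == 0:                          # no demand: split evenly
            return usable / 2, usable / 2
        ns_green = usable * north_south_count / total
        return ns_green, usable - ns_green

    # Example with made-up counts from an hour of loop-detector data
    ns, ew = split_green_time(north_south_count=420, east_west_count=180)
    print(f"north-south green: {ns:.0f}s, east-west green: {ew:.0f}s")

Even this toy version shows where the qualitative element sneaks back in: somebody still has to decide the cycle length, what counts as "lost" time, and whose throughput matters in the first place.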

Yet, as Lindsay points out, these seemingly innocuous examples paper over a broader project that has repeatedly been exposed as folly: trying to simulate the behavior of people in cities using abstractions like computer models, rather than by gaining an understanding of what people living in cities care about and find valuable. These qualitative, subjective elements are typically what make a great city "great", smart by computer-modeling standards or not.

It would seem obvious and necessary to account for this "human factor" when constructing quantitative models for smart-city projects like Pegasus' (after all, we're talking about humans). Only, as is so often the case, the computer geeks view the "qualitative" features of a city as the very thing that needs to be analyzed quantitatively, and replaced. As Robert H. Brumley, managing director and co-founder of Pegasus, pronounced, "We think that sensor development has gotten to the point now where you can replicate human behavior."

And so Brumley and the Pegasus visionaries, in this latest round of "machine versus man", continue the tradition of remaining seemingly ignorant of the manifest lessons of over-thinking urban planning, lessons going back decades, at least to the publication of the seminal "The Death and Life of Great American Cities" by flesh-and-blood New Yorker Jane Jacobs. Jacobs repeatedly documented how the best-laid urban plans led to frustration and a sense of alienation in the neighborhoods of New York City. For example, urban planners who attempted a gentrification project in a NYC slum decided that planting strips of grass outside tenements would have a salubrious effect. But alas, the pesky human tenants saw the grass strips as ridiculous, ill-placed, and insulting. The grass had the opposite effect, in other words, an outcome that could have been "predicted" had the urban planners only taken the time to understand the neighborhood and get to know the tastes and circumstances of its inhabitants.

And there are more nefarious examples, like the 1968 RAND project to reduce fire response times in NYC, which resulted in an estimated 60,000 fires in impoverished sections of New York, as "faulty data and flawed assumptions" prompted the replacement of fire stations in Brooklyn, Queens, and the Bronx with smaller ones. The coup de grâce here was the politicization of the supposedly "scientific" project: clever RAND officials, realizing that rich folk in well-to-do neighborhoods would not tolerate the effects of "efficiency" as computed by their (flawed) simulations, simply placed those neighborhoods outside the scope of the project.

And on and on the story goes. Unintended consequences are simply part and parcel of developing causal or predictive models from quantitative data gleaned from messy, complex systems. The real folly, however, in the Pegasus project and so many others like it, is not the (basically correct) idea that quantitative analysis can provide useful information when devising strategies, for urban planning or otherwise, but the conclusion that the human element can therefore be eliminated. That latter claim does not follow, and taking it too seriously all but guarantees that one of the most important lessons we learn from the "Center for Innovation, Testing, and Evaluation" will be that innovation, testing, and evaluation are not enough.
