On the rise and decline of America’s cities in the 20th century.
The United States is one of the most urban nations on the planet, with 81% of its total population residing in cities. Of course, it wasn't always so; at the turn of the 20th century, the majority of citizens lived on farms, in small towns, and in other rural outposts, but by 1920, after decades of industrialization, the scales had tipped in favor of urban living. Wave upon wave of immigrants, along with other drastic socioeconomic changes, bolstered city growth in the ensuing decades. Throughout the 20th century (and still today) the steady rise of urban populations seemed unstoppable, and by and large it was. But while the growth never stopped, it did sputter, and substantially so.
That sputtering, often characterized by “white flight,” rising crime, and derelict buildings, took place predominantly from the late 1960s through the 1980s, and it is this moment that forms the point of departure for Stephen J.K. Walters' Boom Towns: Restoring the Urban American Dream. As Walters notes, “library shelves groaned under the weight” of books trying to diagnose the baffling decline of and divestment from America's cities, none of which, by Walters' estimation, quite hit the nail on the head. In his book he posits what he sees as the real culprit behind urban decline, one still at large today, hamstringing growth and capping the potential for citywide prosperity. What follows is an excerpt from the first chapter of his book, “What We've Lost—And Why,” in which he describes the boom and post-World War II bust of the U.S.'s major industrial cities.
The following is an excerpt from Boom Towns: Restoring the Urban American Dream:
In the first half of the twentieth century, Detroit’s black ghetto was known as Paradise Valley.
Apparently, this was not meant ironically or sarcastically. In one former resident’s memory, the place was “next to heaven!” It delivered “economic growth, first-class entertainment, and new opportunities for Detroit’s Black community.”
What an odd—even outrageous—thing to say about a ghetto. For many decades, if you lived in Paradise Valley and crossed certain streets into other neighborhoods you might get a beating—or at least some hard questioning by a cop. You faced relentless discrimination in the workplace and in public accommodations, your kids went to segregated and grossly underfunded schools, you got minimal services from City Hall, and you were daily confronted with injustices and indignities that today would make any sane person boiling mad. Yet it’s not uncommon to read similarly fond reminiscences about other segregated neighborhoods of that era, from New York’s Harlem to San Francisco’s Fillmore district.
We may be less puzzled by warmhearted portrayals of the various Chinatowns, Little Italys, Poletowns, or other ethnic enclaves that have long dotted America’s urban landscape. Perhaps we delude ourselves into thinking that such segregation was more about ethnic solidarity and personal choice than it really was, or are comforted by the belief that these groups faced less overt hostility than did blacks when they arrived as immigrants. At the least, however, the bias that consigned certain racial or ethnic groups to limited areas caused them to pay higher rents for smaller quarters of lower quality and greatly handicapped them in their pursuit of employment or goods and services.
Nevertheless, these groups kept flocking to America’s great old industrial cities and not only put up with the indignity, overcrowding, noise, and grime that typified urban living conditions at the time but commonly celebrated their quality of life. Why?
Because cities worked. Or, rather, because there was every bit as much injustice and bigotry elsewhere—plus grinding poverty and a reduced array of opportunities for work and play. In cities, there were not only plenty of jobs, but jobs that paid wages that were far higher than those in rural areas. In Detroit in 1930, for example, the average unskilled factory worker made $1,762 a year (almost $25,000 in today’s money). That may not seem like much, but it was triple the amount a similar worker could earn in, say, Madison County, Alabama, or Troup County, Georgia.
And the new residents of America’s booming cities of the first half of the twentieth century did not just have more money to spend, but more and better things on which to spend it. Their higher incomes and cities’ dense populations could sustain markets for goods and services unimaginable down on the farm. In Paradise Valley in the 1920s, that meant a sharecropper’s son could, after riding the streetcar home from a lucrative shift at the Ford plant, don a suit and take his wife to hear a top jazz band at the Music Bar at Hastings and East Adams Streets; bowl a few strings at the Paradise Bowl; or—if it was a really special occasion—see Ethel Waters, the “Charleston Dance Queen,” perform at the Gotham Hotel downtown. Nothing like that could be done in rural Alabama or Georgia in the 1920s—or, perhaps, ever.
[…]
What’s clear is that cities were not just livable but superior to the alternatives of the day. They had slums, but substandard housing was common outside central cities, too. For example, 17 percent of the homes in Baltimore in 1950 were classified as dilapidated or had no running water, private bath, or toilet, but 20 percent of those in Baltimore’s surrounding suburban census tracts were similarly classified. And though big-city crime rates were high, they were yet to become catastrophically so. In 1950, the murder rate in large cities (those with over 250,000 residents) was just one-third above the national average for all cities and suburbs, and the burglary rate was just 15 percent higher.
Even the most deprived and racially segregated neighborhoods of large cities exhibited signs of health, as sociologist William Julius Wilson has summarized:
Blacks in Harlem and other ghetto neighborhoods did not hesitate to sleep in parks, on fire escapes, and on rooftops during hot summer nights in the 1940s and 1950s, and whites frequently visited inner-city taverns and nightclubs. There was crime, to be sure, but it had not reached the point where people were fearful of walking the streets at night. . . . There was joblessness, but nowhere near the proportions . . . that have gripped ghetto communities since 1970. . . . There were welfare recipients, but only a very small percentage of the families could be said to be welfare-dependent. In short, unlike the present period, inner-city communities prior to 1960 exhibited the features of social organization—including a sense of community, positive neighborhood identification, and explicit norms and sanctions against aberrant behavior.
In sum, through most of this period American cities were magnificent engines of economic and social progress. As the great urbanologist, the late Jane Jacobs, once put it, “[A] metropolitan economy, if it is working well, is constantly transforming many poor people into middle-class people, many illiterates into skilled (or even educated) people, many greenhorns into competent citizens. . . . Cities don’t lure the middle class. They create it.” Our cities performed this wonderful work for many decades, until something fateful—and a bit mysterious—changed.
The most obvious sign that something had gone wrong, that many core cities had lost some vital life force, was the great post-war exodus to surrounding suburbs and exurbs. In the second half of the twentieth century, the population of St. Louis fell 60 percent; Pittsburgh, Buffalo, Detroit, and Cleveland weren’t far behind, losing half their residents. Newark, Cincinnati, Rochester, and Baltimore lost a third or more, Washington, Louisville, Philadelphia, Minneapolis, Boston, Birmingham, and Chicago at least a fifth, and New Orleans, St. Paul, Milwaukee, and Kansas City slightly smaller proportions. The losses would have been greater but for the fact that those cities’ buildings couldn’t sprint for the exits, too. While they slowly deteriorated, they’d shelter some inhabitants and give these cities an illusion of continuing viability.
This evacuation didn’t merely signal that there were problems, but made them worse. With smaller populations and shrinking tax bases, city governments would experience chronic fiscal crises that forced service cuts, tax hikes, or both. And core cities’ populations didn’t just fall—they changed. Those who fled tended to be better-educated and have higher incomes than those who stayed or moved in to replace them. Demand for social services grew; the wherewithal to provide them shrank.
Slowly, over a few decades, public perceptions of the American city changed. Cities had never been perfect, but had been undeniably attractive and important. By the 1960s, however, many of America’s core urban areas had become desperately poor and afflicted with the kinds of problems that both result from concentrated poverty and contribute to its endurance. Crime rates soared; illicit drug markets took root and flourished; schools became dysfunctional; neighborhoods crumbled; infrastructure deteriorated; good jobs became harder to find.
By the 1970s, it was clear that America’s core cities were no longer cornerstones of its citizens’ social, cultural, or economic lives, no longer keys to national identity and sources of strength and pride. Rather, they were things to be pitied and propped up by taxpayers living in wealthier areas or, more often, just ignored. At some point, it became routine to define cities by their problems rather than their (apparently nonexistent) virtues. By the early 1980s, for example, The World Book Dictionary would define the inner city as that part of a metropolitan area “characterized by congestion, poverty, dirt, and violence,” adding “especially U.S.” Ouch.