Blog

  • The Myth of the Back-to-the-City Migration

    Pundits, planners and urban visionaries—citing everything from changing demographics and soaring energy prices to the rise of the so-called “creative class” and the need to battle global warming—have been predicting for years that America’s love affair with the suburbs will soon be over. Their voices have grown louder since the onset of the housing crisis. Suburban neighborhoods, as the Atlantic magazine put it in March 2008, would morph into “the new slums” as people trek back to dense urban spaces.

    But the great migration back to the city hasn’t occurred. Over the past decade the percentage of Americans living in suburbs and single-family homes has increased. Meanwhile, demographer Wendell Cox’s analysis of census figures shows that the much-celebrated rise in multifamily housing’s share of new housing permits peaked at 40% in 2008; it has since fallen below 20% of the total, slightly lower than in 2000.

    Housing prices in and around the nation’s urban cores are clear evidence that the back-to-the-city movement is wishful thinking. Despite cheerleading from individuals such as University of Toronto Professor Richard Florida, Carole Coletta, president of CEOs for Cities, and the Urban Land Institute, prices in these areas have crashed in ways that match—and in some cases exceed—the losses suffered in suburban and even exurban locations. Condos in particular are a bellwether: downtown areas, stuffed with new condos, have suffered some of the worst housing busts in the nation.

    Take Miami, once a poster child for urban revitalization. According to National Association of Realtors data, the median condominium price in the Miami metropolitan area has dropped 75% from its 2007 peak, far worse than the 50% decline suffered in the market for single-family homes.

    Then there’s Los Angeles. Over the last year, according to the real estate website Zillow.com, single-family home prices in the Los Angeles region have rebounded by a modest 10%. But the downtown condo market has lost over 18% of its value. Many ambitious new projects, like Eli Broad’s grandiose Grand Avenue Development, remain on long-term hold.

    The story in downtown Las Vegas is one of massive overbuilding and vacancies. The Las Vegas Review-Journal recently reported a nearly 21-year supply of unsold condominium units. MGM City Center developer Larry Murren stated this spring that he wished he had built half as many units. Mr. Murren cites a seminar on mixed-use development—a commonplace event in many cities over the past few years—as sparking his overenthusiasm. He’s not the only developer who has admitted being misled.

    Behind the condo bust lies a simple error: ignoring people’s stated preferences. Virtually every survey of opinion, including a 2004 poll co-sponsored by Smart Growth America, a group dedicated to promoting urban density, has found that roughly 13% of Americans prefer to live in an urban environment, while 33% prefer suburbs and another 18% like exurbs. These preferences have been fairly consistent over the last several decades.

    Demographic trends, including an oft-predicted tsunami of Baby Boom “empty nesters” to urban cores, have been misread. True, some wealthy individuals have moved to downtown lofts. But roughly three-quarters of the first wave of retiring baby boomers are sticking close to the suburbs, where the vast majority now reside. Those who do migrate, notes University of Arizona urban planning professor Sandi Rosenbloom, tend to head farther out into the suburban periphery. “Everybody in this business wants to talk about the odd person who moves downtown, but it’s basically a ‘man bites dog story,’” she says. “Most retire in place.”

    Historically, immigrants have helped prop up urban markets. But since 1980 the percentage who settle in urban areas has dropped to 34% from 41%. Some 52% are now living in suburbs, up from 44% 30 years ago. This has turned places such as Bergen County, N.J., Fort Bend County, Texas, and the San Gabriel Valley east of Los Angeles into the ultimate exemplars of multicultural America.

    What about the “millennials”—the generation born after 1983? Research by analysts Morley Winograd and Mike Hais, authors of the ground-breaking “Millennial Makeover,” indicates this group is even more suburban-centric than their boomer parents. Urban areas do hold great allure for well-educated younger people, particularly those in their 20s and early 30s. But what about when they marry and have families, as four in five intend? A recent survey of millennials by Frank Magid and Associates, a major survey research firm, found that although roughly 18% consider the city “an ideal place to live,” some 43% envision the suburbs as their preferred long-term destination.

    Urban centers will continue to represent an important, if comparatively small, part of the rapidly evolving American landscape. With as many as 100 million more Americans by 2050, they could enjoy a growth of somewhere between 10 million and 20 million more people. And in the short run, the collapse of the high-end condo market could provide opportunity for young and unmarried people to move into luxurious urban housing at bargain rates.

    But lower prices, or a shift to rentals, could prove financially devastating for urban developers and their investors, who now may be slow to re-enter the market. And for many cities, the bust could represent a punishing fiscal blow, given the subsidies lavished on many projects during the era of urbanist frenzy.

    The condo bust should provide a cautionary tale for developers, planners and the urban political class, particularly those political “progressives” who favor using regulatory and fiscal tools to promote urban densification. It is simply delusional to try forcing a market beyond proven demand.

    Rather than ignore consumer choice, cities and suburbs need to focus on basic tasks like creating jobs, improving schools, developing cultural amenities and promoting public safety. It is these more mundane steps—not utopian theory or regulatory diktats—that ultimately make successful communities.

    This article originally appeared in the Wall Street Journal.

    Joel Kotkin is executive editor of NewGeography.com and is a distinguished presidential fellow in urban futures at Chapman University. He is author of The City: A Global History. His newest book is The Next Hundred Million: America in 2050, released in February 2010.

    Photo by miamism

  • Chicago Stimulus Program: A Family Affair

    Even though cities all over the United States are running large deficits, Chicago Mayor Richard Daley feels that an investment in one particular charity is an investment in the future. After School Matters, founded by the mayor’s wife, Maggie Daley, funds youth programs and helps low-income youth obtain job skills. It has received more than $46 million from the city since 2005, with nearly one-third of that total ($15 million) coming in 2009 alone, an increase of roughly 60% from 2008, when the charity received $9.36 million.

    The city has even given some of its federal stimulus money to fund After School Matters’ jobs program, which pays low-income 14- to 24-year-olds $9 to $10 an hour for four and a half hours of work each workday. The contract, signed in 2009, allots $1.31 million to the charity over three years. However, Illinois lags behind its projected job growth, and Mayor Daley must find a way to create sustainable jobs for these new workers if he is to justify this allotment of stimulus money.

    Beyond that, companies that have contracts with the city are donating money to the project as well. Mayor Daley may not be accepting campaign money from city contractors, but it certainly does not hurt that these contractors are giving millions to his wife’s charity. The mayor has encountered considerable criticism for patronage in City Hall after his nephew was found to have used city pension money to buy union land. After School Matters may represent a far more righteous investment, but the mayor seems determined to make Chicago’s budget a family affair.

    Hat tip to Steve Bartin’s Newsalert

  • Kudos to Houston Traffic from IBM

    IBM has released its annual “Commuter Pain Index,” which ranks traffic congestion in 20 metropolitan areas around the world. According to IBM, the Commuter Pain Index includes 10 issues: “1) commuting time, 2) time stuck in traffic, agreement that: 3) price of gas is already too high, 4) traffic has gotten worse, 5) start-stop traffic is a problem, 6) driving causes stress, 7) driving causes anger, 8) traffic affects work, 9) traffic so bad driving stopped, and 10) decided not to make trip due to traffic.”

    Each metropolitan area is given a score between 0 and 100, with the highest score indicating the worst traffic congestion (See Table).

    IBM Commuter Pain Index: 2010
    Metropolitan Areas Ranked by Worst Traffic Congestion
    Rank | Metropolitan Area | Score (ranked worst to best)
    1 Beijing 99
    1 Mexico City 99
    3 Johannesburg 97
    4 Moscow 84
    5 Delhi 81
    6 Sao Paulo 75
    7 Milan 52
    8 Buenos Aires 50
    9 Madrid 48
    10 London 36
    10 Paris 36
    12 Toronto 32
    13 Amsterdam 25
    13 Los Angeles 25
    15 Berlin 24
    16 Montreal 23
    17 New York 19
    18 Melbourne 17
    18 Houston 17
    20 Stockholm 15

    Favorable Urban Planning Characteristics Associated with Intense Traffic Congestion: The worst traffic congestion was recorded in the developing-world metropolitan areas of Beijing, Mexico City, Johannesburg, Moscow, Delhi and Sao Paulo. In many ways, these metropolitan areas exhibit the characteristics most admired by current urban planning principles: automobile ownership and per capita driving are low, and transit carries at least 40% of all travel in each of them. Yet traffic is intense. This is due to another urban planning “success” objective: high population densities. Higher population densities are inevitably associated with greater traffic congestion (and more intense local air pollution), whether in the United States or internationally. All six of these metropolitan areas scored 75 or above, where a score of 100 would be the worst possible congestion.

    The next five metropolitan areas have accomplished nearly as much from an urban planning perspective. Milan, Buenos Aires, Madrid, London and Paris all achieve more than 20% transit market shares, and their higher urban densities also lead to greater traffic congestion. Each scores between 35 and 52.

    Traffic congestion is lighter in the next group, which includes Toronto, Los Angeles, Berlin, Amsterdam and Montreal. With the exception of Berlin, transit market shares are lower, though the urban densities of all five are above average US, Canadian and Australian levels. Amsterdam, the smallest metropolitan area among the 20, scores surprisingly poorly, since smaller urban areas are generally associated with lower levels of traffic congestion.

    The Least Congested Metropolitan Areas: Four metropolitan areas scored under 20, achieving the most favorable traffic congestion ratings. New York scores 19, helped by its somewhat lower density (the New York urban area is less dense than San Jose). The even lower-density Melbourne and Houston score 17, tying for the second-best traffic conditions. Stockholm achieves the best traffic congestion score, at 15, despite its comparatively high density. Stockholm is probably aided by its modest size, which is similar to that of Orlando, Florida.

    The Houston Advantage: Perhaps the biggest surprise is Houston’s favorable traffic congestion ranking.

    • Houston has the lowest urban density of the 20 metropolitan areas.
    • Houston has the lowest transit market share, by far, at only 1%.
    • Houston also has the highest per capita automobile use among the IBM metropolitan areas.

    Yet Houston scored better than any metropolitan area on the list except for much smaller Stockholm. As late as 1985, Houston had the worst traffic congestion in the United States, according to the annual rankings of the Texas Transportation Institute. Public officials, perhaps none more than Texas Highway Commission chair and later mayor Bob Lanier, led efforts to improve Houston’s road capacity despite explosive population growth. Their initiatives paid off. By 1998, Houston had improved to 16th in traffic congestion in the United States. The population growth has been incessant, so much so that Houston has added more new residents since 1985 than live in Stockholm, and more than half as many as live in Melbourne. While Houston had slipped to 11th in traffic congestion by 2007, the recent opening of a widened Katy Freeway and other improvements should keep traffic moving in Houston better than in virtually all of the world’s other large metropolitan areas.

    Photo: Freeway in Houston

  • “Little Monsters”? Children and the Environment

    The idea has bubbled around the edges of the environmental pond for a while: choosing to be childfree expressly for the purpose of reducing one’s carbon footprint. An environmental correspondent at Mother Jones, for example, has pointed out that “…Nothing else you can do — driving a more fuel efficient car, driving less, installing energy-efficient windows, replacing light bulbs, replacing refrigerators, recycling — comes even close to simply not having that child… Why are we pretending that because they’re cute they’re harmless? Little monsters.”

    A Planet Green channel segment on the Voluntary Human Extinction Movement, the organization that offers voluntary human extinction as “a solution to involuntary human extinction” (slogan: “May we live long and die out”), cites the group’s 4000 Facebook friends.

    As absurd as it may seem, the concept has picked up supporters, and is actually inching into mainstream environmental thinking. It’s a trend that poses dangers, most of all to the green movement’s own sustainability.

    Attention a couple of years back focused on Australia, where the issue of a tax on (greenhouse gas emitting) newborns was raised. This month, a Princeton bioethicist, in a New York Times opinion blog headlined “Should This Be the Last Generation?”, eventually concluded, “In my judgment, for most people, life is worth living,” but then tamped down this irrational exuberance by questioning whether “the continuance of our species” really is justifiable.

    Earlier, a blog at the Nature Conservancy — the deservedly well-respected environmental group — made the case that it’s pointless to blame Bangladesh for its high birth rate when our own reproductive decisions have far greater environmental impact. The mixed reader response to author Peter Kareiva ranged from enthused zero population growth supporters — “…ninety percent of us could die without affecting our genetic diversity” — to the head-scratching “…wonder if it’s perhaps a little short-sighted,” and “…Removing ourselves from the gene pool isn’t necessarily the best idea, no?”

    Notably absent from the commentary is the potential cost to the credibility of the environmental movement. Should we accept childlessness as the ultimate pathway to carbon neutrality? Or that eco-brownie points for sidestepping the egotism and self-indulgence of procreation accrue to future generations (in absentia)? That children are yet another impulse “buy”, thrown into the shopping cart of degenerate conspicuous consumption?

    The resurgence of support for zero population growth — or even negative population growth — as a means of preserving the earth represents a twist on our nation’s spiritual — and very green — heritage. Thoreau, other transcendentalists, and essayists both before and after him recognized America’s wilderness as a spiritual sanctuary. Reverence for our natural bounty and, more broadly, the planet, is now shared by countless Americans.

    But anti-natalism takes the religion of conservation well beyond respect for the natural world, to view the very existence of humans as defilement. It rejects the notion — powerful since the 19th Century — that children are the essence of purity. Now they’re unwitting agents of the sinful pollution of nature. An earlier era’s worry over Youth’s loss of innocence through exposure to the wild world is being replaced by the opposite concern: Youth is now seen as the destroyer of the world by its mere existence.

    We’ve come full circle from the early twentieth century national hysteria over Margaret Sanger, the great birth control pioneer who was condemned by members of the old Anglo-Saxon elite for hastening the extinction of America’s “native stock”. In that era, the impulse of families to restrict their size was seen as a selfish quest for mere personal fulfillment, harmful to the growth of the nation. Today, we see the opposite: An impulse to cast procreation as a personal indulgence at the expense of the larger society.

    The reasoning is, of course, that the choice to be child-free is not merely a personal decision, but rather a laudable contribution to a more sustainable world. But as a response to global population trends — widespread fertility declines, particularly in the West, combined with record high overall global population — it’s very different than offering birth control options to those who want them.

    This particular manifestation of environmentalism — the concept of solving humanity’s problems by eliminating, as much as possible, human beings — while positioning itself as both future-focused and statistically supported, is remarkably oblivious to the worldwide drop in birthrates and its economic implications. The demographic transition, which in Europe began before the mid-1800s, is bringing us both an aging population and more widespread participation, particularly among women, in the wealth of the modern economy.

    Today’s environmental movement has always included strands of Luddites. But, like other ultra-ascetic religions, and as even the anti-natalists themselves ruefully admit, the idea is not about to conquer the world.

    The demon-seed statistical projections on the carbon output of a single infant born today are based on the premise that the world’s energy use and methods will change not one iota during its lifetime. And the calculations usually include a reproductive chain over the next century or two. The assumption that the grandkids of today’s infants will be tucking AAA batteries into their toys or gassing up their Grand Cherokees isn’t — despite the impressive spreadsheets — objective or scientific.

    Of course, religion, guilt, and the quest for purity have a long, shared history, with holiness as the garlic that wards off Armageddon. The urge to condemn anything short of perfection reeks of fundamentalism. The witch-hunt for hypocrisy has been relentless by critics of environmentalism who believe that dangers to the ecosystem have been exaggerated: Does anyone in America not now know that Al Gore has a big house with a lot of light bulbs, and that he flies around on (gasp) planes?

    Now the annoying Puritanical fervor has been taken up by those who think environmental dangers have been minimized. With fundamentalist zeal, they’ve one-upped their fellow environmentalists with a soul-purifying — and seemingly bulletproof — sacrifice of the urge to reproduce. This particular fast-track to holiness doesn’t require chastity; sex is allowed for everything except procreation. And, when considering a society where reproduction is denigrated, please imagine the mental health of children raised with the philosophy that the world would be a whole lot better off without them.

    Why has this issue gained such traction right now? The Age of Anxiety morphed into a Prozac Nation, but maybe the depression lingered on, marked by an inability to project positive outcomes, including the potential benefits of today’s infants over the coming years. The phenomenon’s growth can also be at least partly attributed to the unprecedented internet-age ability to connect with masses of like-minded individuals for group reinforcement.

    The choice to have a child or not is a purely personal decision. “Breeders,” as their critics sometimes describe them, shouldn’t need to justify their offspring with cost-benefit analyses showing that we need children to balance the national books (with social security payments) or to renew our civilization. Childless men and women still — even in our more-open-than-ever society — encounter prejudice. To respond by claiming a ‘sacred’ justification as a guardian of the earth might appeal in a moment of self-righteousness. But it stands to reason that the custodian of a precious resource shouldn’t begrudge the very existence of its future inheritors.

    Photo derived from Face_0110

    Zina Klapper is a Los Angeles-based journalist, and Deputy Editor of newgeography.com.

  • The Economic Significance of Village Markets

    Flea markets and garage sales have been around for years. But for most New Zealanders, produce markets have been associated with old European villages, or the ethnic markets of Hong Kong and other exotic locations. Village markets focus on locally made crafts, while flea markets are essentially centralized garage sales.

    At a true Farmers’ Market, vendors may sell only what they grow, farm, pickle, preserve, bake, smoke, or catch themselves within a defined area. There are now over 50 “official” Farmers’ Markets in New Zealand. But when all the flea markets, village markets, and less formal markets are tallied up there must be hundreds throughout New Zealand.

    When I grew up they simply didn’t exist – unless we count the school “Bring and Buy” and church fêtes. We simply shopped in shops. Why is this? Why did my parents feel no need for such markets? I suspect my parents would have regarded such markets as somewhat old-fashioned and even primitive. This was the sort of thing our forebears left behind in Ireland in the 1830s.

    However, they are now a part of our lives. For the last few years I have routinely – effectively every Saturday morning – shopped at our local market at Mangawhai, a nearby coastal village in Northland. It’s where people sell their own produce, but also books, bric-a-brac, power tools, and other bits and pieces. The market works for me because it is just across the road from my excellent butcher, and next door to the local lending library.

    So what’s the new appeal? The conventional theory is that the rise of these markets reflects a desire for fresh healthy food, and fruit and vegetables grown locally, and in-season rather than imported from far away. It’s also considered green to buy local and support local cultivars, and growers of eco-sourced native plants and so on.

    These markets are also a good place to meet for a chat, and they also provide a convenient means of selling off numerous “priceless objects” now growing mould in the garage.

    Indeed, last weekend, my wife and I decided to win back some space and earn some ready cash. Setting up a stall at the Mangawhai market was easy. We simply phoned the market organizer (from the local Cheese Shop) and booked a trestle table.

    We thought our real cash cow would be the plants and seedlings, but the biggest and most regular seller was our collection of vinyl records, including US pressings of jazz giants from the sixties. Our first major sale was a high-quality Akai turntable. It was fun to see grandmotherly types shuffle up to the table and enthuse over early discs by Oscar Peterson, Miles Davis, and Billie Holiday. As a bonus we gave the turntable buyer a 1950s 10-inch LP of Bill Haley and His Comets – Don’t Knock the Rock.

    The last time I thought about these markets was two weeks ago, when I wrote the sad story of the urban Onehunga Market, which had to close because Auckland City demanded a resource consent that would have cost perhaps $30,000.

    I presume our Mangawhai market operates without such costs because it is housed in the Village Hall, on public ground, shared with the Library and the Museum. Consequently our stall space and trestle cost us only $10 for the morning. But if the Council had demanded, say, $30,000 for a land-use consent, then a twenty-trestle market at $10 a trestle would take 150 weeks just to recover the consenting cost. Obviously, there would have to be many more spaces, or the rental would have to be much higher.
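
    The back-of-the-envelope arithmetic here can be checked in a few lines. This is purely an illustration using the figures quoted in this piece (the $30,000 consent fee was itself only the Council’s hypothetical demand, and the four-hour morning is my assumption, implied by the $20-an-hour figure):

    ```python
    # Break-even check on the Mangawhai market's consenting-cost arithmetic.
    # All dollar figures come from the article; none are measured data.

    consent_cost = 30_000     # hypothetical land-use consent fee, NZ$
    trestles = 20             # trestle tables rented per market day
    rent_per_trestle = 10     # NZ$ per trestle per Saturday morning

    weekly_takings = trestles * rent_per_trestle        # NZ$200 per week
    weeks_to_recover = consent_cost / weekly_takings    # 150 weeks, nearly 3 years

    # The stall-holder's side of the ledger, from our own morning:
    takings = 80              # NZ$ netted on our first (rainy) morning
    hours = 4                 # assumed length of a market morning
    hourly_rate = takings / hours            # NZ$20 an hour
    return_on_capital = takings / rent_per_trestle      # 8x the $10 stall fee

    print(f"Weeks to recover consent cost: {weeks_to_recover:.0f}")   # 150
    print(f"Hourly rate: ${hourly_rate:.0f}, return: {return_on_capital:.0f}x")
    ```

    The point the numbers make is the one in the text: a modest consent fee, trivial for a supermarket, swamps the entire revenue of a village market for years.
    
    
    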

    On our first morning we netted only about $80 (being newcomers, we were outside, and it rained). But even this represented about $20 an hour – not huge, but better than the minimum wage. On the other hand, it was an $80 return on our $10 capital investment (using simple “homespun” economics). Remember, the stuff we were selling had negative value, and I drive back and forth from the village every Saturday anyhow.
    And it was fun. But could such markets become an endangered species? As in so many areas, the culprit is heavy-handed regulation. The high costs of land and development, and the burden of consenting and development contributions already make it nearly impossible for small corner stores to make any return on capital.

    Yet, the stall renter’s capital-productivity is massive. But many regulators cannot stand to see such an opportunity slip from their grasp. So the Onehunga market had to close.

    These village markets remind us of the “power of markets”. As the heavy-handed regulators drive down capital productivity, entrepreneurs have responded by rediscovering the outdoor markets of much earlier times when capital was scarce and labour was plentiful. Market economies are like water-beds – push down on one corner and they bounce up in another.

    We are beginning to see similarly ad hoc responses in the residential and commercial property markets. The regulators have so severely constrained the supply of coastal land in New Zealand that people like my parents, who bought a bach on the coast at Tairua out of their working-class income, no longer have a hope of enjoying the sprint from the Kiwi bach straight into the sea.

    Those who have generated this scarcity then complain that only foreigners can afford to buy our coastal land. But many of us really do want to occupy a beach side property for the best weeks of summer, and then return home to our rural dwellings in the regional towns and villages. Enter the motor home.

    As farmers become more and more regulated by central planners who know nothing about agricultural economics but instead are determined to ‘save the planet’, enterprising farmers will look for new ways to supplement their incomes.

    Well, here’s one way we can solve our mutual problem. First, buy a quality self-contained motor home. Then use Google Maps to find what looks like an ideal bay, with a farm track connecting the main road to the beach.

    Then approach the farmer and negotiate a “right to occupy” this little patch of heaven. It could be no more than the right to park on the spot for perhaps eight weeks a year, but could include an obligation to fence off the area to contain any children or pets. No resource consent, no title, no lease – just a right to drive on to the farm, park on the spot, and drive away if it rains.

    Farmers supplement their income and Kiwis reclaim the low cost beach. The Environmental Puritans will gnash their teeth at the prospect of so many people having fun – but this time we might be ready for them. Markets and human ingenuity can still win in the long run.

    Owen McShane is Director of the Centre for Resource Management Studies, New Zealand.

    Photo of Mangawhai Village Saturday Market by Sids1

  • Sponge Cities on the Great Plains

    “Sponge cities” is an apt metaphor for urban communities in rural states like North Dakota, which grow by soaking up the residents of surrounding small towns, farms and ranches. North Dakota’s four largest cities, Fargo, Bismarck, Grand Forks and Minot, are growing in large part by retaining the young adults who for decades went elsewhere to other regions. In the process, rural North Dakota is facing a protracted population crisis, as significant numbers of its small communities are on a slow slide to extinction. This migration pattern is not new, nor is it unique to North Dakota. Historically, one of the most significant demographic trends in the United States has been the movement of people from rural to urban areas. In 1915 sociologist E.A. Ross declared that small Midwestern towns reminded him of “fished out ponds populated chiefly by bullheads and suckers.”

    Beginning in the 1920s, North Dakota and South Dakota youth stopped wanting to step into their parents’ worlds. According to the Department of Rural Sociology, in 1927 more than 87 percent of farmers encouraged their children to go into farming, but by 1938 less than half of Dakota high schoolers were taking that advice.

    A June 2010 survey of 111 North Dakota high school juniors and seniors offers a glimpse into the minds of the state’s young adults as they stand on the precipice of adulthood. They were asked to choose the size of community in which they aspire to live and work. Although roughly four in ten were raised in communities of fewer than 2,000 residents, out of the over 100 students surveyed, only six wished to live their adult lives in a town of fewer than 2,000. Overall, 70 percent aspired to live in larger communities than those of their childhood (see Figure 1).

    Small towns struggle to provide urban amenities that can match the sponge cities’ bustling malls, skateboarding parks, concert venues and Olive Gardens. While the serene, friendly, stable rural communities, farms and ranches reflect the way that “life is supposed to be” in the minds of many outside the region, young adults often view that way of life as dull and sluggish. City life—albeit small-scale by national standards—is perceived as fun, fast and fashionable; jobs pay better and there are more of them.

    There has always been a desire among young adults to experience life outside of where they grew up. Experts believe that economics and quality of life are the two dominant motivations people have for moving from rural areas into cities. Mark Stephens, a young college graduate who left his small town of under 400 for Fargo, the largest city in the state with a metropolitan population pushing 200,000, said: “The first thing people throw out as an excuse is increased opportunity, but let’s face it, 18- to 20-something adults are not thinking long term. For the most part, kids in that age group are really pretty shallow. In truth, I think it comes down to one word: Jealousy. They are walking down a gravel road in their tiny town with a link to massive amounts of media right in their back pockets. It’s no different than when they were little kids—they see someone with ice cream and they want some too.”

    Richard Rathke, director of the North Dakota State Data Center, notes that aggregate data demonstrate the movement of young adults to larger cities in the Great Plains. The young adult population (ages 20-30) has been migrating into metro areas each decade since 1950, while in the farm-dependent rural counties it has been leaving in sizeable numbers (see Figure 2).

    A dozen young adults moving from Edgeley, North Dakota (population 637) to Fargo is irrelevant to Fargo, which absorbs the new residents with barely a nod; but to Edgeley, the shift represents a significant and chilling loss of young, skilled, educated workers that will have a detrimental impact on the town’s future prosperity, or even its survival. Some predict that once Fargo has soaked up all the smaller communities, young professionals will then abandon Fargo for the more illustrious Minneapolis. North Dakota is touted in nationwide polls as one of the friendliest states in the nation, but schadenfreude flourishes on the Plains. Mayors of small towns that have lost their young people to growing population hubs have been known to remark, “Just wait until all of their kids move to Minneapolis.”

    Perhaps metropolises like Minneapolis or New York will not be the ultimate sponge cities. Indeed, Minneapolis has experienced a 1.4 percent drop in population since 2000. Demographers are beginning to observe that for many of us there is a point where diseconomy of size becomes real. Traffic, congestion, housing prices, crime and pollution levels may already be curtailing inmigration to the nation’s major metropolitan cities.

    The phenomenon of sponge cities will change the nature of states like North Dakota. At the turn of the twentieth century a mere 7 percent of North Dakotans were urban; by 1980 one out of every three residents was urban; and in 2010 the state is projected to be 50 percent urban and 50 percent rural, a loss of nearly 88,000 rural residents in a state whose total population has remained stagnant, hovering between 600,000 and 650,000 for over a century.

    A legitimate reaction to the entrenched loss of people from rural areas of North Dakota might be, “So what?” What does a state or nation have at stake in the health of a small town like Edgeley? After all, one community’s loss is another community’s gain. Migratory patterns are simply indicative of an efficient labor market; employers discover employees and employees find jobs that fit their skills, interests and education. Thus, one might conclude that it does not really matter if a significant percentage of rural Americans move to more urban areas. In fact, what some perceive as a seemingly endless stream of discouraging census data may actually be a positive indicator. Robert E. Lucas, Jr., University of Chicago economist and winner of the 1995 Nobel Prize in Economics, believes that a region’s successful transformation from traditional agriculture to a modern, growing economy depends on talent clustering—an accumulation of human capital that sponge cities are now performing. The future may not be rural, but sponge cities could make traditional rural states like North Dakota very viable.

    Wayne Sanstead, North Dakota’s State Superintendent of Public Schools and former lieutenant governor, has for decades watched rural schools close their doors. “We want the small communities to be a part of the state’s economic and social future, but we have to face reality and focus on retaining young adults within the state.” This may be the new reality. In large part due to the state’s booming economy, Sanstead asserts that in his 25 years as state superintendent he has never seen a better opportunity than today for the state to retain its young people.

    Deb Kantrud, Executive Director of the South Central Regional Council in Jamestown, North Dakota, has spent her entire career as a community developer. She noted that if small-town residents stay behind after high school graduation, they aren’t considered as successful as those who relocate. “We need to figure out a way to keep some smaller communities viable—turning them into sponge cities—while acknowledging that some smaller communities may not be part of the brighter future that awaits our reviving state.”

    Debora Dragseth, Ph.D., is an associate professor of business at Dickinson State University in Dickinson, North Dakota. She develops leadership training and curriculum for CHS, Inc., a diversified energy, grains and foods company. The Fortune 100 company is the largest cooperative in the United States. Dragseth’s research interests include Generation Y, outmigration and entrepreneurship.

  • Why the Great Plains are Great Once Again

    On a drizzly, warm June night, the bars, galleries, and restaurants along Broadway are packed with young revelers. Traffic moves slowly, as drivers look for parking. The bar at the Donaldson, a boutique hotel, is so packed with stylish patrons that I can’t get a drink. My friend, a local, and I head over to Monte’s, a trendy Italian place down the street. We watch a group of attractive 30-something blondes share a table and gossip. They look like the cast of the latest Housewives series.

    It might sound like an evening in the Big Apple, but this Broadway runs through downtown Fargo, N.D. A decade ago, this same street was just another unremarkable central district in a Midwestern town: bland restaurants, adequate hotels, no decent coffee. After the local stores closed for the day, the street was mostly populated by a few hard-drinking louts.

    That has all changed, part of a transformation that foreshadows the growth of the vast Great Plains region. “I come from a big city, but I like the lifestyle here,” says Marshall Johnson, an African-American who played football for the nearby University of Minnesota, Crookston, and now works for the local Audubon Society. “In a decade this place will be a small Minneapolis. Everyone sees a bright future ahead.”

    Johnson may be an anomaly in this still homogeneous state—the population is more than 90 percent white, and Native Americans constitute the largest minority by far—but he senses something very real. Throughout the good times and, more important, the bad of this new millennium, the cities of the plains—from Dallas in the south through Omaha, Des Moines, and north to Fargo—have enjoyed strong job growth and in-migration from the rest of the country. North Dakota boasts the nation’s lowest unemployment rate—3.6 percent, compared with the national average of 9.7—with South Dakota and Nebraska right behind it.

    The trend has been particularly strong in urban areas. Based on employment growth over the last decade, the North Dakota cities of Bismarck and Fargo rank in the top 10 of nearly 400 metropolitan areas, according to data analyzed by economist Michael Shires for Forbes and NewGeography.com. Much of that growth has come in high-wage jobs. In Bismarck, the number of high-paying energy jobs has increased by 23 percent since 2003, while jobs in professional and business services have shot up 40 percent.

    That’s not bad for a region best known by East Coast pundits for the movie Fargo. It got so bad a decade ago that even local boosters suggested North Dakota jettison the “North” to make the place seem less forbidding. Two Eastern academics, Frank J. Popper and Deborah Popper, predicted that the region would, in a generation, become almost totally depopulated, and proposed that Washington speed things along and create “the ultimate national park.” Their suggestion: restock the buffalo.

    Certainly, many small towns across the plains—such places as Reeder, N.D., which lost its only school, or Mott, N.D., with its struggling downtown—have withered. Others are likely to disappear altogether. But growth has rebounded in larger towns, according to Debora Dragseth, an associate professor of business at Dickinson State University. She describes places like Fargo—with a population approaching 200,000—as “sponge cities,” absorbing population from rural areas. Just a decade ago, those people fled the region entirely.

    The primary drivers of this new growth, says Dragseth, are basic industries like agriculture and energy. Salaries may be low by coastal standards, but so are living costs. And the prices of commodities like beef, soybeans, and grains have generally continued to rise, due in large part to growing demand from China, India, and other developing countries.

    But the biggest play by far is in energy, including coal, natural gas, and oil, which exist in prodigious quantities from Texas to the Canadian border. Besides the vast reserves of oil that have made it the country’s fourth-largest producer, North Dakota possesses significant deposits of natural gas and coal, as well as huge potential for wind power and biofuels. These industries are drawing hundreds of skilled workers from places like California and Michigan, who are moving into Bismarck, the state’s capital, and towns to the west.

    The energy boom has placed states like the Dakotas and Texas in an enviable fiscal situation. Oil and gas revenues are filling up their coffers, allowing them to eschew the painful cutbacks affecting most coastal states. North Dakota has a $500 million surplus, and next year the cash gusher could rise to more than $1 billion, estimates Dragseth. That could go a long way in a state with barely 600,000 people.

    Of course, the people of the plains have seen booms before—commodity prices soared early in the last century, and there was an oil-fired boom back in the 1970s. But growing demand in developing countries could sustain long-term price increases for energy and agricultural products. Niles Hushka, CEO of Kadrmas, Lee & Jackson, a growing engineering firm active in Bismarck, sees other factors working for the plains. The public schools are excellent; the Dakotas, Iowa, Minnesota, Nebraska, and Kansas enjoy among the highest graduation rates in the country. North Dakota itself ranks third and Minnesota fourth (after Washington, D.C., and Massachusetts) in the percentage of residents between 25 and 34 with college degrees.

    Nowhere is this potential clearer than in Fargo, which is emerging as a high-tech hub. Doug Burgum, from nearby Arthur, N.D., founded Great Plains Software in the mid-1980s. Burgum says he saw potential in the engineering grads pumped out by North Dakota State University, many of whom worked in Fargo’s large and expanding specialty-farm-equipment industry. “My business strategy is to be close to the source of supply,” says Burgum. “North Dakota gave us access to the raw material of college students.”

    Microsoft bought Great Plains for a reported $1.1 billion in 2001, establishing Fargo as the headquarters for its business-systems division, which now employs more than 1,000 workers. The tech boom started by Burgum has spawned both startups and spin-offs in everything from information technology to biomedicine. Science and engineering employment statewide has grown by 31 percent since 2002, the highest rate of any state.

    These jobs, and the people they attract, shower cash on Broadway’s busy bars and dining establishments. Both Burgum and his ex-wife, Karen, have been driving forces in this restoration. Karen led the effort to convert the once seedy Donaldson into a stylish downtown hotspot, featuring the work of local artists on the walls and bison on the menu. “People thought I should be put in a padded cell for doing this,” she says. Of course, entrepreneurs like the Burgums will continue to face big challenges to lure customers and workers—cold weather, isolation, and competition from more urban places. But for the first time in generations, parts of the Great Plains have a chance to be great again.

    This article originally appeared in Newsweek.

    Joel Kotkin is executive editor of NewGeography.com and is a distinguished presidential fellow in urban futures at Chapman University. He is author of The City: A Global History. His newest book is The Next Hundred Million: America in 2050, released in February 2010.

    Hotel Donaldson photo by jeffreykreger.

  • McChrystal Exit: Obama and His Generals

    General Stanley McChrystal may be the first commanding general in the history of warfare to be relieved of his command because he groaned over the receipt of an email from an ambassador, or because one of his aides whispered to a Rolling Stone reporter that the president had looked “intimidated” in a meeting with the military brass.

    In terms of carrying out strategy, the president reportedly had no military complaints about the heavy-metal general, who was walking the impossibly thin red line between a general war in Afghanistan and a campaign waged only with assassinations and drone missiles.

    Just a month before his firing, McChrystal successfully packaged a tour of the White House and Capitol Hill for President Hamid Karzai. In earlier media campaigns — notably when the president flew into Kabul in the dead of night to lecture a pajama-clad Karzai over corruption — the Afghan president was deemed unworthy of an American war effort.

    However briefly, McChrystal had succeeded in integrating the Afghan government into the order of battle. So why was he sacked for humming a few bars of Satisfaction in the presence of a rock reporter?

    No doubt McChrystal had his enemies within the bureaucracy, including the ubiquitous ambassador Richard Holbrooke and the U.S. ambassador in Kabul, former general Karl W. Eikenberry. To these two, add a legion of jealous Army politicos, all of whom would love to wear combat fatigues to a presidential photo-op.

    In relieving General McChrystal, perhaps as part of a search for his mojo, President Obama joins a long line of presidents who never figured out how to command their commanders. Here’s a brief summary of some of the more complicated relationships between American presidents and their field generals:

    President Lincoln— Often praised for his habits of command in the Civil War, he nevertheless promoted, endorsed, and endured the incompetence of such generals as McClellan, Meade, Burnside, Pope, and Rosecrans before winning the war with Grant and Sherman, both of whom would horrify a Senate confirmation hearing, let alone the editors of Rolling Stone.

    Grant was a drunk who killed thousands at Shiloh and Spotsylvania, and Sherman once celebrated the drowning of a boatload of reporters, pointing out that maybe their “heavy thoughts” had taken them to the bottom. He also burned Atlanta. Both understood how to win modern wars.

    President Madison— In the war of 1812, he had to endure generals who botched several invasions of Canada, allowed Washington to burn, and, in the case of Andrew Jackson at New Orleans, fought battles after the peace was signed. (But the Battle of New Orleans did more than Yorktown to forge American independence.)

    President Kennedy— He loathed his top generals, blaming them for the Bay of Pigs fiasco and for pushing him into Vietnam, saying “They always give you their bullshit about their instant reaction and split-second timing, but it never works out. No wonder it’s so hard to win a war.” Kennedy’s skepticism about the military command, however, pushed him to ignore their advice for invasion and air strikes in the Cuban Missile Crisis, possibly averting nuclear war.

    Presidents Carter and Johnson— In the style of the Obama White House, these two both micro-managed their war efforts. Jimmy Carter was the air traffic controller for Operation Eagle Claw, the failed attempt to rescue American hostages in Iran. Lyndon Johnson boasted that the Air Force could not hit so much as “a shithouse” in Vietnam without his authorization. Both presidencies were lost due to the foreign entanglements of the commander-in-chief.

    President Roosevelt— A successful example of a commander-in-chief; no president handled generals better than FDR, who was a shrewd judge of character. Roosevelt spent many months of the war in proximity to his fighting forces (including his own sons, who were serving officers). He vested authority in a number of competent commanders, starting with General George C. Marshall.

    Roosevelt was clear in his strategic objectives and did not meddle, for example, in the deployment of 30,000 troops. Nor did he fire General Patton when he slapped a fatigued soldier. Imagine what General MacArthur would have said about FDR to Rolling Stone. Would FDR have cared? (Eisenhower remarked: “I spent seven years under MacArthur studying dramatics.”)

    Despite all the media visibility around his decisions on Afghanistan, we know little about President Obama’s habits of military command. When he’s before large audiences, he is good at articulating the role he sees for the United States in the world. For better or worse, he is unafraid to offend traditional allies, such as Israel and Great Britain. He even sided against England in a recent flare-up around the Falkland Islands.

    Strategically, however, Obama rarely contradicts his military-industrial complex. Yes, he fired McChrystal, but he replaced him with his boss, mentor, and near Siamese twin, General David Petraeus, as if to imply that the only problem in Afghanistan was McChrystal’s joke about Vice President Biden.

    While hitching his political star to the Nobel Prize for Peace, Commander-in-Chief Obama continues to fund Israel’s war footing, stations forces in Iraq, widens the commitment in Afghanistan, attacks Pakistan with drones, and pushes for war sanctions against Iran. In the pulpit, he is Woodrow Wilson; in action, he’s George W. Bush.

    Nor has the Obama administration been able to articulate a coherent war aim behind the commitment of additional forces in Afghanistan. Look at the many mixed messages sent to Karzai, who depending on the week is “our man” or the next Diem.

    The president’s current directive to his generals is to avoid casualties, hold a mountainous country the size of Texas with eight divisions, foster rural development in places like Helmand, find bin Laden, pacify the federal tribal areas, make President Karzai look democratic, train the Afghan army and police, leer across the border at Iran, and prop up a wobbly government in Pakistan — although, politically speaking, all the administration wants is enough shock and awe so that the Republicans in the 2010 mid-term elections cannot paint it as “weak on terror” or having “lost” Afghanistan.

    In turning the strategic decisions about Afghanistan into an endless university teach-in (with all the allusions to “accountability,” “transitions,” and “benchmarks”), the president acts as if all the timing questions in this war were on his side. Let’s hope that the Taliban and other insurgents, especially those now planting car bombs in Islamabad, Baghdad, and Kabul, got the departmental memo that the United States would be on sabbatical in 2011.

    In 1815, Andrew Jackson felt he had to attack the British the very night he heard they had landed near New Orleans. By contrast, President Obama spent a leisurely year pondering the Weltanschauung of Afghanistan and publicly ruminating about strategic options. He now feels he can afford the luxury of sacking a field general for failing to sound reverential in an interview. Aren’t there better measures of a commander? (At Belleau Wood, a Marine officer said: “Retreat? Hell, we just got here.”)

    Before Lincoln could become the wartime president that we admire, he needed to find a general “who fights,” and he needed to articulate an acceptable and collective war aim, which he achieved with his Gettysburg Address and Second Inaugural. He also had to come to the conclusion that Grant, drunk, made more sense than his other generals sober.

    President Barack Obama meets with Army Gen. Stanley McChrystal. Official White House photo by Pete Souza.

    Matthew Stevenson is the author of Remembering the Twentieth Century Limited, winner of Foreword’s bronze award for best travel essays at this year’s BEA. He is also editor of Rules of the Game: The Best Sports Writing from Harper’s Magazine. He lives in Switzerland.