Category: Policy

  • The Ultimate Houston Strategy

    Last week was the 7th anniversary of my blog, Houston Strategies. After 947 posts (cream of the crop here), almost half a million visitors, and thousands of comments in an epic dialogue about Houston, I thought this would be a good time to stand back, look at the big picture, and ask "What should be next for Houston?" while linking back to some of the gems from that archive.


    First, let’s look at where we are currently. Our foundation is in great shape. Houston has started the 21st century with a set of rankings and amenities 99% of the planet’s cities would kill for: a vibrant core with several hundred thousand jobs; a profitable and growing set of major industry clusters (Energy, the Texas Medical Center, the Port); the second-most Fortune 500 headquarters in the country; top-notch museums, festivals, theater, arts and cultural organizations; major league sports and stadiums; a revitalized downtown; astonishing affordability (especially housing); a culture of openness, friendliness, opportunity, and charity (reinforced by Katrina); the most diverse major city in America; a young and growing population (fastest in the country); progressiveness; entrepreneurial energy and optimism; efficient and business-friendly local government; regional unity; a smorgasbord of tasty and inexpensive international restaurants; and tremendous mobility infrastructure (including the freeway and transit networks, railroads, the port, and a set of truly world-class hub airports).

    With all that, it’s really easy to get complacent. In fact, in some ways I think we might be coasting a bit now. But coasting is definitely not how we got here. Big initiatives are a proud tradition here: dredging the original port, founding the Texas Medical Center, establishing the Johnson Space Center, and being the first in the world to build a gigantic, futuristic, multi-purpose domed stadium – just to name a few examples. But what should be next? Where should the world’s Energy Capital put its energy, so to speak?

    I was recently inspired by the Urbanophile’s post on Indianapolis’ 40-year economic development and tourism strategy built around sports. Starting with nothing but the Indy 500, they’ve built a string of wins all the way up to hosting one of the most successful Super Bowls ever last month. We need that same sort of sustained, long-term strategy that goes beyond specific projects to a theme we can weave into everything we do over the decades ahead. We need to take the energy boom we’re currently enjoying and invest it to secure our long-term prosperity no matter how technology shifts in the future (most especially energy technology).

    In an unpredictable world, the only safe bet is a talent base that can adapt. With the Texas Medical Center, we concentrated health care talent in a district that has grown and adapted into the largest medical concentration in the world with an array of world-class facilities. We’ve done the same on an even larger scale with energy and engineering talent. The next step is to take that strategy and generalize it to focus on being the global capital of applied STEM (Science/Technology/Engineering/Math) talent. We need to mobilize the city around a common purpose of building this human infrastructure. We need to embed it into our education, tourism, cultural and economic development strategies. It’s just a perfect fit for Houston on so many levels.

    In particular, I think we should focus on applied STEM – systems-based problem solving (engineering) over pure knowledge, where we are at a competitive disadvantage with many university clusters around the country. The mission: facilitating man’s progress through innovative problem solving.

    Part of this strategy includes tourism, articulated in more detail here. We need the big tourism experience of other world class cities, and STEM is a unique niche we can build around, with a primary focus on families, schools, and STEM-related conferences. We already have some of the assets in place – JSC and Space Center Houston, the Natural Science Museum, the Health Museum, the Children’s Museum, Moody Gardens – and others with more potential, like the Texas Medical Center. But we need that signature attraction: the world’s largest institute/museum of technology. Not just a history-focused museum, but an institute actively involved in the community with a strong focus on the future. Local kids should spend frequent school days and summer camps there on fun and inspiring STEM activities. It could provide educational STEM experiences both online and on-site, helping to attract talented global youth to Houston for amazing experiences that draw them back later for college or after graduation. It should have the world’s largest hackerspace. It should be an inspiring space that attracts global academic and professional STEM-related conferences (building on the OTC) – groups trying to solve big problems and contribute to humanity’s progress (imagine a Davos or G8 of STEM…). Each conference could leave behind a new exhibit on its subject area, building the collections over time. And since it has the event space, we might as well open it up to festivals to expose more of our community to that same inspiration.

    The natural place for such an institute is clearly the Astrodome, our historic icon looking for a second life. We should embrace the Astrodome as Houston’s architectural icon the way Paris embraces the Eiffel Tower, New York the Statue of Liberty or Empire State Building, Rome the Vatican or Colosseum, and San Francisco the Golden Gate Bridge. It can find a second life as our inspiring cathedral to man’s technological progress (along with some fun mixed in – Robot Rodeo, anyone?). Most importantly, it has around a million square feet of space, a scale that compares favorably with the world’s top museums.

    But unlike every other museum in the world, where exhibits are carved up into a series of halls, almost all of the exhibits here could be visible in a giant 360-degree panorama from the floor of the Astrodome. How amazing would that space be?

    The cost, you ask?  Easily in the hundreds of millions.  But if LA can come up with $1.2 billion to build the Getty Museum, I have no doubt that Houston can muster the needed resources.  It’s a tiny fraction of the wealth of Houston’s 14 philanthropic billionaires, much less the broader base of wealth in this booming city.  We can come together to make this happen before the Astrodome’s 50th birthday in 2015, and it can put us on a path to greatness for our bicentennial in 2036 that Houston’s and Texas’ founding fathers could never have imagined.

    We, the citizens of Houston, aren’t the types to get complacent and rest on our laurels.  That’s not the legacy previous generations left us.  It’s time to step forward and tackle our next great challenge.  Are you in?

    Tory Gattis is a Social Systems Architect, consultant and entrepreneur with a genuine love of his hometown Houston and its people. He covers a wide range of Houston topics at Houston Strategies – including transportation, transit, quality-of-life, city identity, and development and land-use regulations – and has published numerous Houston Chronicle op-eds on these topics.

    Photo by telwink

  • Time for Real Solutions to Vancouver’s Housing Affordability Crisis

    Vancouver is in desperate need of new solutions to ease its worsening housing affordability crisis. The 8th annual Demographia housing affordability survey, released by the Frontier Centre, found that Vancouver has the second least affordable housing market next to Hong Kong. On average, and assuming zero interest, a house in Vancouver would cost the median family more than ten years’ income. Three years is the threshold after which a market is considered unaffordable.
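
    For readers unfamiliar with the survey’s method, the ranking rests on the "median multiple": median house price divided by gross annual median household income. Here is a minimal sketch of the calculation, using illustrative placeholder numbers rather than the survey’s actual data:

    ```python
    # Demographia-style "median multiple": median house price divided by
    # median household income. The figures below are illustrative
    # placeholders, not the survey's actual data.
    median_house_price = 670_000      # assumed, Vancouver-like price
    median_household_income = 63_000  # assumed median income

    multiple = median_house_price / median_household_income
    print(f"Median multiple: {multiple:.1f} years of income")  # -> 10.6

    # The survey treats a multiple of 3.0 as the affordability threshold.
    if multiple > 3.0:
        print("Market rated unaffordable")
    ```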

    Mayor Robertson recently announced the launch of a new task force to tackle the housing affordability crisis. The only way to solve this problem is to focus on getting more housing units onto the market.

    Much of the debate around housing affordability descends into discussions about manipulating housing prices by freezing out market mechanisms. Rent control used to be a popular remedy, until cities realized that the side effects of the cure were worse than the disease. Two common methods of tackling housing affordability today are social housing and inclusionary zoning. Social housing has been responsible for creating some of the most crime-ridden neighbourhoods in the Western world. There is a reason "the projects" have such a bad name. Yet politicians of all stripes tend to promise more "affordable housing," as they call it, knowing that it will at best benefit the narrow group of people who qualify. Inclusionary zoning—requiring developers to build a specific number of below-market-rate units in new developments—is one of the methods by which municipal governments have attempted to compensate for this shortcoming. It also misses the point. It fails to bring broad price levels down, since it increases prices substantially for market-rate units in the same development. One study by San Jose State University economists found that inclusionary zoning increases the price of market-rate new homes by $22,000-$44,000 in the median city. That is simply how developers pass on the cost of losing money on affordable units.
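
    To see how that pass-through works, consider a stylized sketch of the arithmetic. All numbers here are hypothetical, chosen only because they land near the study’s lower estimate; the actual figure depends on local prices and rules:

    ```python
    # Stylized pass-through arithmetic for inclusionary zoning.
    # All numbers are hypothetical.
    total_units = 100
    inclusionary_share = 0.10           # 10% of units must be below market rate
    loss_per_affordable_unit = 200_000  # assumed developer loss per such unit

    affordable_units = int(total_units * inclusionary_share)
    market_units = total_units - affordable_units

    # To keep the project's overall return unchanged, the total loss is
    # spread across the remaining market-rate units.
    markup = affordable_units * loss_per_affordable_unit / market_units
    print(f"Implied markup per market-rate unit: ${markup:,.0f}")  # -> $22,222
    ```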

    The policies mentioned above ignore the fundamental issue: houses are priced by supply and demand. In a desirable city like Vancouver, prices are bound to be higher than in Omaha, Nebraska, or Saskatoon. But the dramatic price escalation that started in the 90s isn’t beyond the city’s control. There are many ways to get more supply on the market. One of the commendable policies undertaken by the city has been the introduction of laneway houses. These are small units hived off from existing houses, essentially secondary suites that back onto laneways. But laneway housing won’t be anywhere near enough on its own. Vancouver needs to develop more land. The land is there, but it is off limits to development because of the agricultural land reserve (ALR). That needs to change.

    The ALR serves two purposes. The first is to preserve agricultural land. The value of doing so is contingent on whether the benefits of local agriculture outweigh the costs of taking land off the market. From a nutritional and an economic perspective, that simply isn’t the case. Flash-frozen foods are often more nutritious than "fresh" local food, and intensive farming is more economical and sustainable than small-scale farming. We would not be able to accommodate anywhere near our current population without industrial agriculture. This justification simply fails.

    The second justification for the ALR is to prevent urban sprawl. In a sense this works, since there is no sprawl development in the ALR. On the other hand, this approach is conducive to "leapfrog" development, which takes place beyond the growth boundary. It happens anywhere that a growth boundary exists. People commute further for cheaper housing. This is as true in the smart growth Mecca of Portland as it is in Toronto or Ottawa. From an economic perspective, there are reasons to worry about sprawl. People who move out into cheaper housing on the urban fringe typically pay lower property taxes, and often cost municipalities more per capita. But the ALR hasn’t solved this problem. Metro Vancouver outside of the city proper accounted for 87% of the metropolitan area’s growth between 2006 and 2011. Simply put, the ALR hasn’t prevented sprawl.

    In order to balance the concerns of housing affordability and urban sprawl, the city of Vancouver should strike a compromise: open portions of the ALR, but only to high-density development. This may not be the optimal solution for families that would prefer to purchase single-family homes, but a significant influx of new units would be a countervailing force against runaway home prices. It would also put downward pressure on housing prices in the rest of Greater Vancouver. Though opening up broad swaths of the ALR may be the ideal, this seems like a reasonable compromise.

    This type of solution would rile people on both sides of the political spectrum, but it would be a dramatic improvement over the status quo. High home prices can only be solved from the supply side. The choice between maintaining the ALR as constituted or opening up portions should be obvious. Infill development can only go so far towards solving Vancouver’s housing crisis.

    Steve Lafleur is a Policy Analyst with the Frontier Centre for Public Policy.

    Downtown Vancouver photo by runningclouds

  • Shale Revolution Challenges the Left and the Right

    In his State of the Union address, President Obama invoked the 30-year history of federal support for new shale gas drilling technologies to defend his present day investments in green energy. Obama stressed the value of shale gas—which will create thousands of jobs and billions in profits—as part of his "all of the above" approach to energy, and defended the critical role government investment has always played in developing new energy technologies, from nuclear to solar panels to wind turbines.

    The president’s remarks unsurprisingly sparked a strong response from some conservatives (here, here, here, and here), who have downplayed and even attempted to deny the important role that federal investments in hydrofracking, geologic mapping, and horizontal drilling played in the shale gas revolution.

    This is an over-reaction. In acknowledging the critical role government funding played in shale gas, conservatives need not write a blank check for all government energy subsidies. Indeed, a closer look at the shale gas story challenges liberal policy preferences as much as it challenges those of conservatives, and points to much-needed reforms for today’s mash of state and federal clean energy subsidies and mandates.

    The Government’s Role

    Some have pointed to the fact that fracking dates back to the 19th century and hydraulic fracking to the 1940s as evidence that federal funding for today’s fracking technologies was unimportant. But dismissing the importance of federal support for new shale gas technologies in the ’70s and ‘80s because private firms had succeeded in fracking for oil in the ’40s and ’50s is like suggesting that postwar military investments in jet engines were unnecessary because the Wright Brothers invented the propeller plane in 1903.

    Enhancing oil recovery from existing wells in limestone formations by injecting various combinations of water, sand, and lubricants, as private firms did starting in the 1940s, is a vastly different and less complicated technical challenge than recovering widely dispersed methane gas from rock formations like shale that are simultaneously porous but not highly permeable.

    Recovering gas from shale formations at a commercial scale requires injecting vastly more water, sand, and lubricants at vastly higher pressures throughout vastly larger geological formations than anything that had been attempted in earlier oil recovery efforts. It requires having some idea of where the highly diffused pockets of gas are, and it requires both drilling long distances horizontally and being able to fracture rock under high pressure multiple times along the way.

    The oil and gas industries had no idea how to do any of this at the time that federal research and demonstration efforts were first initiated in the late 1960s—indeed, throughout the 1970s the gas industry made regular practice of drilling past shale to get to limestone gas deposits.

    This is not just our opinion; it was the opinion of the natural gas industry itself, which explicitly requested assistance from the federal government in figuring out how to economically recover gas from shale starting in the late 1970s. Indeed, shale gas pioneer George Mitchell was an avid and vocal supporter of federal investments in developing new oil and gas technologies, and regularly advocated on behalf of Department of Energy fossil energy research throughout the 1980s to prevent Congress from zeroing out research budgets in an era of low energy prices.

    Early Efforts

    The first federal efforts to demonstrate shale gas recovery at commercial scales did not immediately result in commercially viable technologies, and this too has been offered as evidence that federal research efforts were ineffective. In two gas stimulation experiments in 1967 and 1969, the Atomic Energy Commission detonated atomic devices in New Mexico and Colorado in order to crack the shale and release large volumes of gas trapped in the rock. The project succeeded in recovering gas, but due to concerns about radioactive tritium in the gas, the project was abandoned.

    These projects are easy to ridicule. They sound preposterous to both anti-nuclear and anti-government ears. But in fact, the experiment demonstrated that it was possible to recover diffused gas from shale formations—proof of a concept that had theretofore not been established.

    A few years later, the just-established Department of Energy demonstrated that the same result could be achieved by pumping massive amounts of highly pressurized water into shale formations. This process, known as massive hydraulic fracturing (MHF), proved too expensive for broad commercialization. But oil and gas firms, with continuing federal support, tinkered with the amount of sand, water, and binding agents over the following two decades to achieve today’s much cheaper formula, known as slickwater fracking.

    Early federal fracking demonstrations can be fairly characterized as big, slow, dumb, and expensive. But when it comes to technological innovation, the big, slow, dumb, and expensive phase is almost always unavoidable. Innovation typically proceeds from big, slow, dumb, and expensive to small, fast, smart, and cheap. Think of building-sized computers from the 1950s that lacked the processing power to run a primitive, 1970s digital watch.

    Private firms are really good at small, fast, smart, and cheap, but they mostly don’t do big, slow, dumb, and expensive, because the benefits are too remote, the risks too great, and the costs too high. But here’s the catch. You usually can’t do small, fast, smart, and cheap until you’ve done big, slow, dumb, and expensive first. Hence the reason that, again and again, the federal government has played that role for critical technologies that turned out to be important to our economic well-being.

    Drilling Down into Innovative Methods

    In fact, virtually all subsequent commercial fracturing technologies have been built upon the basic understanding of hydraulic fracturing first demonstrated by the Department of Energy in the 1970s. That included not just the demonstration that gas could be released from shale formations, but also the critical understanding of how shale cracks under pressure. Scientists learned from the large federal demonstration projects in the 1970s that most shale in the United States fractures in the same direction. This led government and industry researchers to focus their efforts on technologies that would allow them to drill long distances horizontally, in a direction that situated the well hole perpendicular to the direction the fractures would run, which allowed firms to capture much more gas from each well.

    Government and industry researchers also focused on developing the ability to create multiple fracks from each horizontal well, and in 1986 a joint government-industry venture demonstrated the first multifrack horizontal well in Devonian Shale. During the same period, government researchers at Sandia National Laboratories developed tools for micro-seismic mapping, a technique that would prove critical to the development of commercially viable fracking. Micro-seismic mapping allowed firms to see precisely where the cracks in the rock were, and to modulate pressure, fluid, and proppant in order to control the size and geometry of each frack.

    George Mitchell, who is widely credited with having pioneered the shale gas revolution, leaned heavily upon these innovations throughout the 1990s, when he finally put all the pieces together and figured out how to extract gas from shale economically. Mitchell had spent over a decade consolidating his position in the Barnett Shale before he asked for technical assistance from the government. “By the early 1990s, we had a good position, acceptable but lacking knowledge base,” Mitchell Energy Vice President Dan Steward told us recently.

    Mitchell turned to the Gas Research Institute and federal laboratories for help in 1991. GRI paid for Mitchell to attempt his first horizontal well. Sandia National Laboratories provided Mitchell with the tools and a scientific team to micro-seismically map his wells. It was only after Mitchell turned to GRI and the federal laboratories for help that he finally cracked the shale gas code.

    A Counterfactual?

    But so what? Federal investments in new gas technologies may have proved critical to the shale gas revolution, but could the revolution have happened without those investments? Where is the counterfactual?

    Constructing a counterfactual can be a useful analytical method, but it can be abused. In this case, the counterfactual has been asserted as a kind of faith-based defense against the inconvenient history of the shale gas revolution. Nobody has offered a real world example—for instance, a country where private firms developed economical shale gas technology without any public support.

    Nor has anyone offered a detailed historical analysis to justify the claim that private entrepreneurs would have done the critical applied research, developed the fracking technologies, funded the explorations in new drill bits and horizontal wells, and created the micro-seismic mapping technologies that were all required to make the shale revolution possible. A close look at the development of those technologies reveals private sector entrepreneurs, like Mitchell, who were loudly and clearly asking for help because they knew they had neither the technical knowledge nor the ability to finance such risky innovations on their own.

    The Implications for Renewable Energy Subsidies

    In the end, though, we are mostly having this debate now because historical federal investments in shale gas are being compared to current investments in renewables. There is much that is in fact comparable: the federal role in the shale gas revolution went well beyond basic research, contrary to what some have claimed, and when the scale and nature of present federal support for renewables is compared with past support for unconventional gas, the two match up virtually demonstration for demonstration, tax credit for tax credit, and dollar for dollar. But that doesn’t mean that President Obama’s subsidies for green energy are immune to criticism.

    Indeed, once we acknowledge the shale gas case as a government success, not a failure, it offers a powerful basis for reforming present clean energy investments and subsidies. Federal subsidies for shale gas came to an end, and so should federal wind and solar subsidies, at least as blanket subsidies for all solar and wind technologies. In many prime locations, where there is good wind, proximity to transmission, state renewable energy purchase mandates, and multiple state and federal subsidies, wind development is now highly profitable.

    If federal investments in wind and solar are really like those in unconventional gas, then we ought to set a date certain when blanket subsidies for wind and solar energy come to an end. Imposing a phase-out of production subsidies would encourage sustained innovation and absolute cost declines. We might want to extend continuing support for some newer classes of wind and solar technologies: those innovating new methods of generating energy, or those specifically designed to perform better in low-wind or marginal solar locations. But in the ’80s and ’90s we did not provide a tax credit to all gas wells, only those using new technologies to recover gas from new geologic formations—and we should not continue to provide subsidies to wind and solar technologies that are already proven and increasingly widely deployed with no end in sight.

    Another key lesson is that many of the most important research and demonstration projects in new shale gas technologies were funded and overseen by the Gas Research Institute, a partnership between Department of Energy laboratories and the natural gas industry that was funded through a small Federal Energy Regulatory Commission-administered fee on gas prices. GRI had both independence from Congress and the federal bureaucracy, and strong representation from the natural gas industry, which allowed it to focus research and dollars on solving key technical problems that pioneers like George Mitchell were struggling with. Federal investments in applied research and demonstration of new green energy projects ought to be similarly insulated from political meddling and rent seeking.

    These and other lessons from the shale gas revolution point to far-reaching reforms of federal energy innovation and subsidy programs. If the history of the shale gas revolution challenges the tale of a single lone entrepreneur persevering without help from the government, it also challenges the present federal approach to investing in renewables in important respects. The history of federal support for shale gas offers as much a case for reform of current federal clean energy investments as it does for their preservation.

    This piece originally appeared at The American.

    Shellenberger and Nordhaus are co-founders of the Breakthrough Institute, a leading environmental think tank in the United States. They are authors of Break Through: From the Death of Environmentalism to the Politics of Possibility.

  • Don’t Bet Against The (Single-Family) House

    Nothing more characterizes the current conventional wisdom than the demise of the single-family house. From pundits like Richard Florida to Wall Street investors, the thinking is that the future of America will be characterized increasingly by renters huddling together in small apartments, living the lifestyle of the hip and cool — just like they do in New York, San Francisco and other enlightened places.

    Many advising the housing industry now envisage a “radically different and high-rise” future, even though the volume of new multi-unit construction permits remains less than half the level of 2006. Yet with new permits for single-family houses also at historically low levels, real estate investors, like the lemmings they so often resemble, are traipsing into the multi-family market with sometimes reckless abandon.

    Today the argument about the future of housing reminds me of the immortal line from Groucho Marx: “Who are you going to believe, me or your lyin’ eyes?” Start with the strong preference of the vast majority of Americans to live in detached houses rather than crowd into apartments. “Many things — government policies, tax structures, financing methods, home-ownership patterns, and availability of land — account for how people choose to live, but the most important factor is culture,” notes urban historian Witold Rybczynski.

    Homeownership and the single-family house, Rybczynski notes, rest on many fairly mundane things — the desire for privacy, the need to accommodate children and, increasingly, the needs of aging parents and underemployed adult children. Such considerations rarely enter the consciousness of urban planning professors, “smart growth” advocates and architectural aesthetes swooning over a high-density rental future.

    Just look at the numbers. Over the last decade — even as urban density has been embraced breathlessly by a largely uncritical media — close to 80% of all new households, according to the American Community Survey, chose to settle in single-family houses.

    Now, of course, we are told, it’s different. Yet over the past decade, vacancy rates rose the most in multi-unit housing, increasing 61%, from 10.7% in 2000 to 17.1% in 2010. The vacancy rate in detached housing also rose, but at a slower rate, from 7.3% in 2000 to 10.7% in 2010, an increase of 48%. Attached housing – such as townhouses – posted the smallest increase in vacancies, from 8.4% in 2000 to 11.0% in 2010, an increase of 32%.
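
    The percent changes quoted above are straightforward to reproduce from the vacancy rates themselves. A quick sketch (the article’s figures were evidently computed from unrounded data, so the rounded rates give results about a point lower):

    ```python
    # Percent change in vacancy rates, 2000 -> 2010, from the rounded
    # rates quoted above.
    rates = {
        "multi-unit": (10.7, 17.1),
        "detached":   (7.3, 10.7),
        "attached":   (8.4, 11.0),
    }
    for kind, (r2000, r2010) in rates.items():
        change = (r2010 - r2000) / r2000 * 100
        print(f"{kind}: {r2000}% -> {r2010}% (+{change:.0f}%)")
    # multi-unit: +60%, detached: +47%, attached: +31%
    ```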

    The attractiveness of rental apartments may soon be peaking just in time for late investors to take a nice haircut. Rising rents, a byproduct of speculative buying of apartments, already are making mortgage payments a more affordable option in such key markets as Atlanta, Chicago, Miami, Phoenix and Las Vegas.

    Urbanist pundits often insist the rush to rental apartments will be sustained by demographic trends. One tired cliché suggests that empty nesters are chafing to leave their suburban homes to move into urban apartments. Yet, notes longtime senior housing consultant Joe Verdoon, both market analysis and the Census tell us the opposite: most older folks are either staying put or, if they relocate, moving further out from the urban core.

    The two other major drivers of demographic change — the millennial generation and immigrants — also seem to prefer suburban, single-family houses. Immigrants have been heading to the suburbs for a generation, so much so that the most diverse neighborhoods in the country now tend to be not in the urban core but the periphery. This is particularly true in Sunbelt cities, where immigrant enclaves tend to be in suburban areas away from the core.

    Millennials, the generation born between 1983 and 2003, are often described by urban boosters as unwilling to live in their parents’ suburban “McMansions.” Yet according to a survey by Frank Magid and Associates, a large plurality define their “ideal place to live” when they get older as the suburbs, even more so than their boomer parents.

    Ninety-five million millennials will be entering the housing market in the next decade, and they will do much to shape the contours of the future housing market. Right now many millennials lack the wherewithal to either buy a house or pay the rent. But that doesn’t mean they will be anxious to stay tenants in small places as they gain some income, marry, start a family and simply begin to yearn for a somewhat more private, less harried life.

    In the meantime, many across the demographic spectrum are moving not away from but back to the house. One driver here is the shifting nature of households, which, for the first time in a century, are actually getting larger. This is reflected in part by the growth of multi-generational households.

    This is widely believed to be a temporary blip caused by the recession, which clearly is contributing to the trend. But the move toward multigenerational housing has been going on for almost three decades. After falling from 24 percent in 1940 to barely 12 percent in 1980, the share of Americans living in multigenerational households topped 16 percent before the 2008 recession took hold. In 2009, according to the Pew Research Center, a record 51.4 million Americans lived in this kind of household.

    Instead of fading into irrelevance, the single-family house seems to be accommodating more people than before. It is becoming, if you will, the modern equivalent of the farm homestead for the extended family, particularly in expensive markets such as California. This may be one of the reasons why suburbs — where more than half of owner-occupied homes are located — actually increased their share of growth in almost all American metropolitan areas through the last decade.

    Some companies, such as Pulte Homes and Lennar, are betting that the multi-generational home — not the rental apartment — may well be the next big thing in housing. These firms report that demand for this kind of product is particularly strong among immigrants and their children.

    Lennar  has already developed models — complete with separate entrances and kitchens for kids or grandparents — in Phoenix, Bakersfield, the Inland Empire area east of Los Angeles and San Diego, and is planning to extend the concept to other markets. “This kind of housing solves a lot of problems,” suggests Jeff Roos, Lennar’s regional president for the western U.S. “People are looking at ways to pool their resources, provide independent living for seniors and keeping the family together.”

    But much of the growth in multigenerational homes will come from an already aging base of over 130 million existing homes. An increasing number of these appear to be undergoing expansion to accommodate additional family members as well as home offices. Home improvement companies like Lowe’s and Home Depot already report a surge of sales servicing this market.

    A top Home Depot manager in California traced the rising sales in part to the decision of people to invest their money in an asset that at least they and their family members can live in. “We are having a great year,” said the executive, who didn’t have permission to speak for attribution. “I think people have decided that they cannot move, so let’s fix up what we have.”

    These trends suggest that the widely predicted demise of the American single-family home may be greatly overstated. Instead, particularly as the economy improves, we may be witnessing its resurgence, albeit in a somewhat different form. Rather than listen to the pundits, perhaps it would be better to follow what’s before your eyes. Don’t give up the house.

    This piece originally appeared in Forbes.com.

    Joel Kotkin is executive editor of NewGeography.com and is a distinguished presidential fellow in urban futures at Chapman University, and contributing editor to the City Journal in New York. He is author of The City: A Global History. His newest book is The Next Hundred Million: America in 2050, released in February, 2010.

    Photo by Bigstockphoto.com.

  • Clues from the Past: The Midwest as an Aspirational Region

    This piece is an excerpt from a new report on the Great Lakes Region for the Sagamore Institute. Download the pdf version for the full report, including charts and maps on the region.

    The American Great Lakes region has long been defined by the forces of production, both agricultural and industrial. From the 1840s on, the region forged a legacy of productive power, easily surpassing the old northeast as the primary center of American industrial and agricultural might.

    The Rise of the Great Lakes

    Natural endowments shaped the region: its waterways and mineral resources made it ideal for industrial development. The lakes themselves are the largest source of fresh surface water on the planet; the five lakes together are twice the size of England. This “fresh water Mediterranean” provided an essential pathway for transport between the various regions of the Great Lakes, as well as a connection to the northeast and, through the Saint Lawrence and the Erie Canal, to New York and the Atlantic.

    But more than anything, it has been the people of the Great Lakes who proved its greatest resource. In the early 19th century, the region’s development was paced by migrants from New England, who brought with them their values of thrift, hard work and a passion for education and self-improvement. Later others, notably Germans and Scandinavians, injected a similar culture of self-improvement into the area.

    Like New England, the Great Lakes, noted author John Gunther, was possessed of a “gadget mind” that sparked the innovations that gave America command of the industrial revolution. Much of the brawn for this came from the poorer parts of Europe — Russia, Italy, and most particularly Poland, which led one observer to call Chicago “a mushroom and a suburb of Warsaw.” By 1920 one third of the population of Chicago, Cleveland and Detroit was foreign born.

    Initially based largely on agricultural exports, by 1860 the region had blossomed into an urbanized industrial powerhouse. “All over the Middle West,” wrote historians Charles and Mary Beard, “crossroads hamlets grew into trading towns, villages spread into cities, cities became railway and industrial centers.” The area’s rapid growth sparked great optimism; in 1841 one journalist and land speculator predicted that by 1940 Cincinnati would be the largest city in North America and by 2000 “the greatest city in the world.” Cleveland, Cincinnati, Toledo, Milwaukee and most of all Chicago stood at the center of a “web of steel” that marked the region as the world’s preeminent industrial center. It also sparked other innovations, from the auto assembly line and the high-rise building to the mail order catalog.

    This growth cascaded in the early years of the last century, as the region became the nation’s primary growth engine. Between 1900 and 1920 Chicago added a million people, Cleveland doubled its population and Detroit, epicenter of the emerging “automobile revolution,” grew threefold. In everything from architecture and city planning to literature, the Great Lakes stood at the national, even global, cutting edge.

    A Half Century of Decline

    By the 1970s, the Great Lakes region, including Ontario, accounted for two-thirds of North America’s automobile production, 70 percent of its pig iron and three-quarters of its steel. Yet by that time, this close tie to industry was seen not as an advantage but as a curse, driving the region towards precipitous decline.

    By then America was widely seen as entering a “post-industrial era,” and the Great Lakes, the former bastion of the manufacturing economy, seemed the odd region out. Defined as the “foundry” in Joel Garreau’s Nine Nations of North America, it was the only one he identified as in decline. He described the region’s inner cities as “North America’s Gulag Archipelago.”

    Once a magnet for newcomers, the region now took a back seat as a place that attracted domestic or foreign migrants. With the exception of Chicago, the Lakes region has continued to lag in both domestic migration and foreign immigration. Newcomers were reinventing places like Los Angeles, Houston, Miami and New York, but relatively few were coming to Cleveland, Detroit or Cincinnati.

    The Great Lakes cities, again with the occasional exception of Chicago, found themselves increasingly regarded as cultural backwaters. Occasional stories of restoration and renaissance made the rounds in the media, but the trend was toward greater obsolescence, toward becoming permanently “a cultural colony” of the coasts. “To a Californian or a New Yorker,” noted Indiana-based historian Jon Teaford, “Cleveland, Detroit, Indianapolis and Saint Louis were down-at-the-heel, doughty matrons, sporting last year’s cultural fashions.”

    Until recently there has been ample reason to believe this decline would continue. Only nine of the Midwest’s 40 largest metropolitan areas have a higher per capita GDP than the national average. This reflects a deep-seated loss of jobs, paced by industrial decline and not made up for by gains in other fields.

    During this period the region not only lost many of its industrial jobs but, more pointedly, failed to replace them with the technology and service jobs that grew rapidly elsewhere. As a result, the region’s percentage of the national workforce dropped steadily over the past half century. In 1966, the Great Lakes region possessed one in four jobs in the country; by 2010 that percentage had fallen to less than one in five.

    As a response to the perception of industry-led decline, some Great Lakes leaders sought out other sources of employment and growth. In Detroit, for example, much emphasis was placed on casino development. Michigan’s former Governor Jennifer Granholm sought to reverse decline by targeting the so-called “creative class,” turning the state’s hard-hit towns into “cool cities.” Across the region, others focused on convention centers, arts attractions such as museums, and other entertainment venues as the way to improve their sagging fortunes.

    Seeds of Resurgence

    None of these efforts – although much heralded throughout the 1980s and 1990s – did much to reverse the region’s decline. Notes Jim Russell, author of the widely read Burgh Diaspora website:

    Should Akron start putting more money in skateparks or global warming?

    There are huge problems in spending money in order to attract the geographically fickle. Fads fade and the mobile – largely people under 30 – will move again…Tying up the urban budget with projects aimed at retaining the creative class has its own perils. There is little, if any, evidence indicating that this policy will decrease the geographic mobility of the well-educated. Many cities stuffed with cultural amenities also sport high rates of out-migration. Furthermore, tastes change. “Best places to live” lists change quite a bit from one year to the next.

    Instead, the region’s current rebound is occurring in surprising fashion. The real lure of the Great Lakes lies in its own fundamental advantages: lower housing prices, a favorable business climate and, perhaps more importantly, a nascent industrial rebound.

    This can be seen, most importantly, in employment numbers. Over the last few years, the area’s share of jobs has remained steady. The highest unemployment rates in the country are no longer concentrated in the Great Lakes region, but in states such as California and Nevada. In many Great Lakes states, unemployment rates have been dropping more rapidly than the national average.

    Critically, this resurgence has not resulted in a shift away from industrial growth. Instead, we are witnessing the early stages of what could be a profound increase in both the economic heft and the job creation tied to the industrial sector. But the Great Lakes rebound is not merely a cyclical, one-dimensional rise; it also includes growth in a host of other sectors, including information and, perhaps even more remarkably, energy, particularly shale gas.

    At the same time, the rise in non-industrial jobs also testifies to the growing attractiveness of the region, particularly for young families. After decades of mass outmigration, the region has begun to achieve a more favorable balance with the rest of the country. Outmigration rates for states in the region are at or below national levels.

    Migration in the Midwest, as Russell and others have pointed out, should be regarded more from the vantage point of recruitment than retention. By promoting its affordability and its improving economy, the region could improve its trailing in-migration rates. As people vote with their feet for the region, they are laying down the foundation for the area’s resurgence in the coming decades.

    The Rise of New Growth Nodes

    The Great Lakes demographic and economic turnaround does not mean that growth has occurred in the pattern of the early 20th century. Instead we see the emergence of a new set of leadership cities. If Akron, Detroit, Cleveland and Chicago paced the region’s early 20th century ascendancy, the new “winners” appear to be affordable, attractive cities, many of them home to major universities, state capitals and key research institutions.

    These areas have done well in attracting people from the less successful metropolitan areas of the region. Columbus, for example, has seen strong in-migration from the rest of Ohio and other parts of the Midwest, notably Michigan and Illinois. But perhaps more importantly, the area enjoys strong in-migration from those parts of the country — notably the Northeast and California — that have traditionally dominated knowledge-intensive industries.

    A similar pattern can be seen in Indianapolis. In recent years, as urban analyst Aaron Renn notes, the Indiana capital has enjoyed “a profile closer to the Sun Belt than the Rust Belt.” It grew its population at a rate 50 percent greater than the national average, and also posted strong net in-migration, with almost 65,000 people, on net, deciding to pack up and move to the metro area.

    Already a center of regional culture and services, the area has succeeded as well in attracting new migrants not only from big Midwestern cities such as Chicago, but also from the two coasts.

    By way of contrast, Chicago’s migration patterns look much different from those in Columbus and Indianapolis. Many regions around the country gained more people from the Windy City than Chicago gained from them. Chicago’s biggest gains have come from other, more troubled Great Lakes regions, while Indianapolis, for instance, has taken advantage of Chicagoans looking for opportunity elsewhere.

    Behind this shift in migration from the coasts lie many factors, such as taxes and regulations. But perhaps most important may be the region’s greater affordability. Even after the bubble, for example, many key east and west coast regions suffer a ratio of housing prices to annual incomes of five, six or even seven to one. Virtually all parts of the Great Lakes have ratios of three or less.

    Over time, this could prove a critical advantage to the Great Lakes. As the current millennial generation – the largest generation in American history – enters their 30s, it is likely that they will seek out places where they can afford to buy a home and enjoy a middle class quality of life. The Great Lakes will be one place that can offer that opportunity.

    Key to Recovery: Both Brain and Brawn

    The future of the Great Lakes region lies neither simply in the “information” economy nor in the brute force of manufacturing. Instead it lies in a combination of the industrial sector and the high-value service sectors that feed into it.

    Critically, the region boasts many areas where the information and service economies are particularly strong. Of the nine Midwestern metropolitan areas with per capita GDP growth above the national average, four are capital cities and six are home to major universities. Given governmental involvement in two of the fastest-growing sectors of the economy, health care and education, it is no surprise that seats of government and large state-funded research universities – which also double as the hotbeds of medical services – are growing ahead of other regions with a more traditional, and perhaps outdated, economic base.

    Indeed, some Midwestern areas are outperforming the coastal economies even in the realm of high-tech. In a recent ranking by Forbes magazine of best areas for tech growth among the nation’s 51 largest metropolitan areas, the region boasted three of the top fifteen areas, led by #3 Columbus, followed by Indianapolis and St. Louis.

    However, it would be inaccurate to portray the Midwest as depending purely on a service or information economy. Producing things for sale and export is still alive and well, and the Midwestern regions that have blended their traditional manufacturing capacity with newer fast-growing sectors of the economy have fared best.

    Cedar Rapids, Iowa enjoyed the highest rate of GDP growth from 2001-2010 of any metropolitan area in the Midwest. Between Cedar Rapids and Iowa City, home to the University of Iowa, a new high-tech corridor has grown up that takes advantage of the area’s historical manufacturing capacity and the new technology coming out of the university.

    Terre Haute, Indiana, fifth on the list of GDP leaders, reflects even more completely the blending of the “old” Midwest with the emerging one. Manufacturing has held steady as a share of the local economy at about 15.5 percent since 1991, but health and education have jumped from 14 to 17 percent, while wholesale services and agriculture have dropped. Terre Haute is home to Indiana State University and Rose-Hulman Institute of Technology, a regional leader in engineering, science, and mathematics education.

    Peoria, Illinois is second behind Cedar Rapids in GDP growth over the past ten years. It is home to more than 200 manufacturing firms, two of the world’s largest earth-moving equipment makers, and coal fields. Peoria is also a leader in college degree attainment in the Great Lakes. While its absolute attainment levels are still low, its college-educated population is growing faster than that of nearly every community in the Midwest. Peoria is one example of how brains plus brawn, and not just brains, is the key to Midwestern growth going forward.

    Consider what we might call the dynamic of the Badgers and the Wolverines. In Wisconsin, home of the Badgers, there exists an east-west corridor between Madison, home to the state university and state capital, and Milwaukee, the state’s historical center of industry and commerce. In Michigan, home of the Wolverines, an east-west corridor stretches between Ann Arbor, home to the University of Michigan, and Detroit, the state’s historical center of industry and commerce.

    In Figure 14 of the full report we see that both Ann Arbor and Madison have high levels of bachelor’s degree attainment compared to the national average. But Madison is leading the Midwest in bachelor’s degree growth while Ann Arbor’s rate remains fairly static. Meanwhile, even though Detroit surprises with a fairly high rate of bachelor’s degree growth, Milwaukee stays in front of the national average in both growth and absolute numbers of college-educated workers.

    Some might say that the Badgers are beating the Wolverines in the knowledge-intensive sectors of the economy, but that the lead in manufacturing is up for grabs. In truth, the Wisconsin corridor also enjoys positive marks in manufacturing.

    Milwaukee, for example, leads Detroit in the growth of manufacturing jobs. And Madison is emerging as a manufacturing center while Ann Arbor lags far behind. The knowledge economy and the old-time manufacturing economy can work happily together, as in the case of Madison-Milwaukee, or, so far, much less so, as in the case of Ann Arbor-Detroit.

    The New Industrial Paradigm

    Despite attempts to write it off as a spent force, manufacturing will remain a key driver of Midwestern and national growth. Notwithstanding the many job losses that hit this sector over the past generation, American manufacturing remains remarkably resilient, with a global market share similar to that of the 1970s.

    More recently, however, the American industrial base has begun to expand and to gain on its competitors. This places the Great Lakes in an advantageous position. After a decade of decline, American manufacturing has outpaced the overall recovery over the past two years, in part due to soaring exports. In 2011 American manufacturing continued to expand even as Germany, Japan and Brazil all weakened in this vital sector.

    Many factors are driving this change. One is the tie to the growing domestic energy industry, which has already sparked growth in the shale areas of eastern Ohio and other parts of the Great Lakes region. The United States now boasts some of the largest natural gas reserves in the world. In Ohio alone, new finds in the Utica shale could be worth as much as $500 billion; one energy executive called it “the biggest thing to hit Ohio since the plow.”

    The boom in natural gas has already sparked a considerable industrial rebound, including the building of a new $650 million steel plant for gas pipes in the Youngstown area. Karen Wright, whose Ariel Corporation sells compressors used in gas plants, has added more than 300 positions over the past two years. “There’s a huge amount of drilling throughout the Midwest,” Wright says. “This is a game changer.”

    The gas boom also raises the prospect that, as coal-fired plants become more expensive to operate due to concerns over greenhouse gas emissions, the region will have a new, cleaner and potentially less expensive power source.

    Another critical factor has been the rise of wage rates in both Europe and East Asia. Increasingly, American-based manufacturing is in a favored position as a lower cost producer. Concerns over “knock offs” and lack of patent protection in China may also be sparking a “back to USA” trend, something particularly favorable to the Great Lakes region.

    Yet the new industrial base will not resemble the old one. We are witnessing an industrial renaissance that is national in scope yet heavily concentrated in the Great Lakes region. And it is a resurgence that is as much brain as brawn: an industry increasingly dependent not just on hard work, but on skilled labor.

    This pattern cuts across industry lines. Indeed, even as the share of the workforce employed in manufacturing has dropped from 20 percent to roughly half that, high-skilled jobs in industry have soared 37 percent. Even after years of declining employment, manufacturers in heavy industry, such as automobiles, are running short on skilled workers. Industry expert David Cole predicts there could be demand for 100,000 new workers by 2013. Overall, 83 percent of all manufacturers, according to Deloitte Touche, suffer a moderate or severe shortage of skilled production workers.

    This remains a fundamental strength of the region. Much of the nation’s skilled labor base remains in the Midwest. The region is also home to four of the nation’s highest-ranked industrial engineering schools, according to US News: the University of Michigan at Ann Arbor, Northwestern, the University of Wisconsin at Madison and Purdue.

    Equally important for the region will be replacing its large cadre of skilled workers, many of whom are entering their late 50s and early 60s. “We have a very skilled workforce, but they are getting older,” says Wright, whose company employs 1,200 people at three Ohio factories. “I don’t know where we are going to find replacements.”

    For now the very culture of production – often seen as a liability in the past – could prove a key to the Great Lakes’ future resurgence. These advantages are already redounding to the region. Indeed, a recent Forbes survey of “heavy metal” industries – that is, those involved in heavy industry, metals, vehicles and complex machinery – found the region in surprisingly good shape.

    The Milwaukee area, for example, ranked number 2 among the 50 metropolitan areas on the list, while Detroit clocked in with a respectable sixth-place finish. Cincinnati, Kansas City and Cleveland all ranked well within the top 20. In all, the 40 Great Lakes metropolitan areas have added 50,000 heavy metal industry jobs since 2009.

    Looking Forward

    For the first time in a generation, the Great Lakes are experiencing demographic and economic trends in their favor. Yet in everything from migration to industrial growth, the region can expect to face strong competition from other areas, most notably Texas, the Southeast, the Great Plains and the Intermountain West for new jobs and production.

    To meet this challenge, and truly take advantage of improved conditions, the region must develop a strategy that is suited to its particular advantages. There is no need to try to compete with Manhattan on urban chic, with Silicon Valley in high-tech startups or with Hollywood in entertainment – as some growth theorists would likely recommend.

    The Great Lakes needs to focus primarily on those very values of production and community that sparked its original ascendance. Once these are identified and strengthened, the region can once again not only rebound, but define its own space in the national and global economy.

    Perhaps the first priority has to do with education. The Great Lakes has an enormous edge in terms of first-class engineering schools, and needs to become more focused on these programs and those associated with them, including the information sciences. It needs to supplement this focus on the top echelon with a greater effort — as we can now see in Ohio — in training more of the skilled workforce desperately needed for the region’s resurgent manufacturers.

    By 2018, 63 percent of the nation’s jobs will require some type of post-high school training credential. Increasingly successful education programs have to focus on aligning with jobs available within a state or region. This can only occur with explicit cooperation between education, government, and the business community.

    Likewise, business collaboration with universities can boost the amount and the impact of the industry R&D investments that foster innovation. University-based research and technology development can yield fast-growing, high-technology firms that create higher-paying middle-skill and professional, scientific and technical jobs.

    The second priority lies in developing critical infrastructure to keep the region’s economy humming. This includes a greater emphasis on developing energy resources, rebuilding and modernizing the freight rail, waterways and ports, as well as highways that connect the Great Lakes to the rest of the country and the world.

In the modern economy, creating economic advantage also includes paying attention to specialized infrastructure such as university and lab facilities, technology and training centers, multi-modal shipping and logistics facilities, and research parks. These infrasystems – integrated fusions of facilities, technology and advanced socio-technical capabilities – can drive innovation, particularly for future higher-value industries and higher-paying jobs. The full range of today’s infrastructure assets is illustrated in the full report, available for download below.

    Third, and perhaps most important, the region needs to maintain the housing affordability and other quality of life attributes critical to attracting both immigrants and domestic migrants. As Millennials enter their 30s in large numbers over the next decade, the region needs to improve its public schools, parks and other amenities to attract them.

    Ultimately, this represents a distinctly common-sense means to overcome a legacy of failure and create a new paradigm of success for the region. The Great Lakes, rather than trying to arrest its decline by completely running away from its past, can now recover the great sense of potential so evident in its heroic history.

    Download the full pdf version of the report, including charts and maps about the Great Lakes Region. The report was authored for the Sagamore Institute with support from the Lynde and Harry Bradley Foundation.

    Joel Kotkin is executive editor of NewGeography.com and is a distinguished presidential fellow in urban futures at Chapman University, and contributing editor to the City Journal in New York. He is author of The City: A Global History. His newest book is The Next Hundred Million: America in 2050, released in February, 2010.

    Mark Schill is Vice President of Research at Praxis Strategy Group, an economic development and research firm working with communities and states to improve their economies.

    Ryan Streeter is Distinguished Fellow for Economic and Fiscal Policy at the Sagamore Institute. You can follow his work at RyanStreeter.com and Sagamoreinstitute.org.

    Photo courtesy of BigStockPhoto.com.

  • Time to Rethink This Experiment? Delusion Down Under

The famous physicist Albert Einstein was noted for his powers of observation and rigorous observance of the scientific method. It was insanity, he reportedly said, to repeat the same experiment over and over again and expect a different outcome. With that in mind, I wonder what Einstein would make of the last decade and a bit of experimentation in Queensland’s urban planning and development assessment.

Fortunately, we don’t need Einstein’s help on this one, because even the most casual of observers would conclude that after more than a decade of ‘reform’ and ‘innovation’ in the fields of town planning and the regulatory assessment of development, it now costs a great deal more and takes a great deal longer to do the same thing, for no measurable benefit. As experiments go, this is one we might think about abandoning, or at the very least varying to try something different.

First, let’s quickly review the last decade or so of change in urban planning and development assessment. Up until the late 1990s, development assessment was relatively straightforward under the Local Government (Planning and Environment) Act of 1990. Land already zoned for industrial use required only building consent to develop an industrial building. Land zoned for housing likewise required compliance with building approvals for housing. These were usually granted within a matter of weeks or (at the outside) months.

There were small headworks charges, which essentially related to the cost of connecting services to the particular development. Town planning departments in local and state governments were fairly small and focussed mainly on strategic planning and land use zoning. It was the building departments that did most of the approving. Land not zoned for its intended use was subject to a process of development application (for rezoning), but here again the approach was much less convoluted than today. NIMBYs and hard-left greenies were around back then, but they weren’t in charge. Things happened, and they happened far more quickly, and at lower cost to the community, than now.

In the intervening decade and a bit, we’ve seen an avalanche of regulatory and legislative intervention. It started with the Integrated Planning Act (1997), which sought to integrate disparate approval agencies into one ‘fast track’ simplified system. It immediately slowed everything down. It promised greater freedom under an alleged ‘performance based’ assessment system, but in reality provoked local councils to invoke the ‘precautionary principle’ by submitting virtually everything to detailed development assessment. The Integrated Planning Act was followed, with much fanfare, by the Sustainable Planning Act (2009). Cynics, including some in the government at the time, dryly noted that a key performance measure of the Sustainable Planning Act was that it used the word ‘sustainable’ on almost every page.

Overlaying these regulations has been a constant flow of land use controls in the form of regional plans, environmental plans, acid sulphate soil plans, global-warming, sky-is-falling, seas-are-rising plans – plans for just about everything, all of which affect what can and can’t be done with individual pieces of private property.

But it wasn’t just the steady withdrawal of private property rights as state and local government agencies gradually assumed more control over permissible development on other people’s land. There was also a philosophical change on two essential fronts.

First, there was the notion that we were rapidly running out of land and desperately needed to avoid becoming a 200 kilometre wide city. Fear mongers warned of ‘LA type sprawl’ and argued the need for densification, based largely on innocuous-sounding planning notions like ‘Smart Growth’ imported from places like California (population 36 million, more than 1.5 times that of all of Australia) and Los Angeles (population 10 million, roughly three times that of south east Queensland). The first South East Queensland Regional Plan 2005-2026 was born with these philosophical changes in mind, setting an urban growth boundary around the region and mandating a change to higher density living (despite broad community indifference to density). It was superseded by the South East Queensland Regional Plan 2009-2031, which formally announced that 50% of all new dwellings should be delivered via infill and density models (without much thought, clearly, for how this was to be achieved or whether anyone particularly wanted it). Then there was the South East Queensland Regional Infrastructure Plan 2010-2031, which promised $134 billion in infrastructure spending to make this all possible (without much thought to where the money might come from), plus a host of state planning policies to fill in any gaps which particular interest groups or social engineers may have identified as needing to be filled.

The first significant philosophical change, enforced by the regional plan, was that land for growth instantly became scarcer, because planning permission would be denied in areas outside the artificially imposed boundary. Scarcity of any product, particularly during a time of rising demand (as it was back then, when south east Queensland still had a strong economy to speak of), results in rising prices. That is just what happened to any land capable of gaining development permission within the boundary: raw land rose in price much faster than house construction costs or wages.

The other significant philosophical change that took root was the notion of ‘user pays’ – which became a byword for buck-passing the infrastructure challenge from the community at large to new entrants, via developer levies. Local governments state-wide took to the notion of ‘developer levies’ with unseemly greed and haste. ‘Greedy developers’ could afford to pay, they argued, and the notion of ‘user pays’ gave them some (albeit shaky) grounds for ideological justification. Soon, developers weren’t just being levied for the immediate cost of infrastructure associated with their particular development, but were being charged for community-wide infrastructure upgrades well beyond the impact of their proposal or its occupants.

Levies rose faster than Poseidon shares in the ’70s. Soon enough, upfront levies went past the $50,000 per lot mark, and although moves have recently been made to cap them at $28,000 per dwelling, many observers seem to think that councils are now so addicted that they’ll find alternative ways to get around the caps.

So the triple whammy of ‘reform’ in just over a decade was that regulations and complexity exploded, supply became artificially constrained to meet some deterministic view of how and where we mere citizens might be permitted to live, and the costs and charges levied on new housing (and new development generally) exploded.

At no point during this period – and this has to be emphasised – can anyone honestly claim that any of this has achieved anything positive. It has made housing prohibitively expensive, and less responsive to market signals. Simply put, it takes longer, costs more, and is vastly more complicated than it was before, for no measurable gain.

An indication of this was given to me recently in the form of the Sunshine Coast Council’s budget for its development assessment ‘directorate.’ (How apropos is that term? It would be just as much at home in a Soviet planning bureau.) Their budget for the 2009-10 financial year (the documents had to be FOI’d) included total employee costs of $17.4 million. For the sake of argument, let’s assume the average directorate comrade was paid $80,000 per annum. That would imply somewhere north of 200 staff in total. Now they might all be very busy, but it surely says something about how complexity and costs have poisoned our assessment system if the Sunshine Coast Council needs to spend over $17 million of its ratepayers’ money just to employ people to assess development applications in a down market.
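
    To make that back-of-envelope figure explicit, here is a minimal sketch of the arithmetic. The $80,000 average salary is an assumption for illustration, not a number from the council’s documents, and real payroll budgets include on-costs that would lower the implied headcount somewhat:

    ```python
    # Rough implied-headcount check on the FOI'd budget figure.
    # The $80,000 average salary is an assumption for illustration,
    # not a number taken from the council's documents.
    total_employee_costs = 17_400_000   # 2009-10 employee costs budget ($)
    assumed_average_salary = 80_000     # assumed average annual pay per staff member ($)

    implied_staff = total_employee_costs / assumed_average_salary
    print(f"Implied staff: ~{implied_staff:.1f}")  # Implied staff: ~217.5
    ```

    Even if on-costs push the true average cost per employee well above $80,000, the directorate would still number well over a hundred people.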

    If there had been any meaningful measures attached to these changes in approach over the last decade, we’d be better placed to assess how they’ve performed. But there weren’t, so let’s instead retrospectively apply some:

    Is there now more certainty? No. Ask anyone. Developers are confused. The community is confused. Even regulators are confused and frequently resort to planning lawyers, which often leads to more confusion. The simple question of ‘what can be done on this piece of land’ is now much harder to answer.

    Is there more efficiency? No. Any process which now takes so much longer and costs so much more cannot be argued to be efficient.

    Is the system more market responsive? No. Indeed the opposite could be argued – that the system is less responsive to market signals or consumer preference. Urban planning and market preference have become gradually divorced to the point that some planners actively view the market preferences of homebuyers with contempt.

Are we getting a better quality product? Many developers will argue that even on this criterion, the system has dumbed down innovation: aesthetic, environmental or design initiatives have to fight so much harder to get through that they’re simply not worth pursuing.

Is infrastructure delivery more closely aligned with demand? One of the great promises of a decade of ‘reform’ was that infrastructure deficits would be addressed by aligning urban expansion with infrastructure delivery. Well, it’s been done in theory, via countless reports and press releases, but it’s hardly been delivered in execution. And when the volumes of infrastructure levies collected by various agencies have been examined, it’s often been found that the money has been hoarded rather than spent on the very things it was collected for.

Is the community better served? Maybe elements of the green movement would say so, but for young families trying to enter the housing market, the answer is an emphatic (and expensive) no. How can prohibitively expensive new housing be good for the community? For communities in established urban areas, there is more confusion about the impact of density planning, which has made NIMBYs even more hostile than before.

Has it been good for the economy? South east Queensland’s economy was once driven by strong population growth – the very reason all this extra planning was considered necessary. But growth has stalled, arguably due to the very regulatory systems and pricing regimes that were designed around it. We now have some of the slowest rates of population growth in recent history, and our interstate competitiveness – in terms of land prices and the costs of development – is at an all-time low. That’s hardly what you’d call a positive outcome.

Is the environment better served? If you believe that the only way the environment can be better served is by choking off growth under the weight of regulation and taxation, you might say yes. But then again, studies repeatedly show that the density models proposed under current planning philosophies promote less environmentally efficient forms of housing, and can cause more congestion, than the alternative. So even if the heroic assumptions for the scale of infill and high density development contained in regional plans were by some miracle actually achieved, the environment might be worse off, not better, for it.

All up, it’s a pretty damning assessment of what’s been achieved in just over a decade. Of course the proponents of the current approach might warn that – without all this complexity, cost and frustration – Queensland would be subject to ‘runaway growth’ and a ‘return to the policies of sprawl.’ The answer to that, surely, is that everything prior to the late 1990s was delivered – successfully – without all this baggage. Life was affordable, the economy was strong, growth was a positive and things were getting done. Queensland, and south east Queensland in particular, was regarded as a place with a strong future and a magnet for talent and capital. Now that’s been lost.

    Einstein would tell us to stop this experiment and try something else if we aren’t happy with the results. To persist with the current frameworks and philosophies can only mean the advocates of the status quo consider these outcomes to be acceptable.  Is anyone prepared to put up their hand and say that they are?

Ross Elliott has more than 20 years’ experience in property and public policy. His past roles have included stints in urban economics, national and state roles with the Property Council, and destination marketing. He has written extensively on a range of public policy issues centred on urban affairs, and continues to maintain his recreational interest in public policy through ongoing contributions such as this and via his monthly blog The Pulse.

    Photo by Flickr user Mansionwb

  • New Urbanism vs. Dispersionism

    The Florida real estate developer, unburdened of state regulatory agencies, may now focus his efforts on pleasing the investment community and the local market.  I recently played the role of real estate developer interviewing two consultant teams vying to help me create a new fictional community.  Fortified with readings in both the New Urbanist camp and the Dispersionist camp, each team of students pitched their method of community building to me. 

The actual debate was very lively, with many rebuttals and some serious emotional engagement.  The premise:  I have a multi-acre greenfield property.  I have shortlisted my planning candidates to two:  a New Urbanist team and a Dispersionist team.  Each team must pitch its philosophy, and I will select one team to design the community.

    Question 1:  Since I am only able to afford Phase 1, future phases will be left to future developers.  In your approach, can future generations be trusted to keep focus on high-quality development?  How would you guarantee that the property rises in value?  I asked the New Urbanists to go first.

    The New Urbanist team was ready:  As Master Planners, they will create the entire form-based vision for the property and design it around a smart code so that the future developers will obey a plan to keep property values rising.  No future developer will get to ‘cheap out’.  For this team, the Master Plan will guarantee a quality of life for all residents.

    The Dispersionists will plan Phase 1, not as a rigid image of a town, but rather as a response to the natural landscape.  This team said the community would grow organically, from its functional needs, guaranteeing  the freedom of future generations to plan their own destiny. They  scoffed at a Master Plan that determined the urban form.  What good is a guarantee of a quality of life, they asked, if future generations want something different than the Master Planner intended?

    This round, in my mind, went to the Dispersionists.  Their argument that future generations should have the freedom to plan based on their functional needs outweighed the seductive beauty of a Master Plan.  Too many Master Plans are implemented poorly, or abandoned due to their disutility based on changing needs and markets.

    Question 2:  How does your viewpoint deal with the car?  How will residents and visitors get around your community?  I asked the Dispersionists to go first this time.

    “Well,” replied the Dispersionists, “Americans love their cars, and we love the car too.  We’ll plan for sidewalks and bikes, but we know that the car is a necessity.  We know that a 5-minute walk isn’t so realistic in Florida’s hot, humid climate.”  The Dispersionists have a hearty regard for cars, and they spoke of long, sweeping curves and scenic drives.  They pointed out that most residents will need to drive to other parts of the city as well.

    The New Urbanists shuddered.  “We will plan for car-free living,” they stated.  With very clever planning, they intended to keep driving to a minimum, and will design walking trails.  One New Urbanist ventured 4-story parking garages, crowing that their proposal would not be littered with gas stations.  The New Urbanists pointed out the ugly commercial strips dominating our current city, and how little they want that to intrude into the new development.

I liked this, and challenged the Dispersionists.  Isn’t reducing vehicle dependency better for health, and for reducing oil use?  The Dispersionists asked me why, in this ten-acre community, I thought I could attract residents with 4-story parking garages.  Good point, I thought.

    Both sides had good answers, and the question did not fully go to one side or the other.  Cars do tend to  generate a lot of aesthetic horror.  On the other hand, they are not going away anytime soon, so learning how to deal with them seems like an important task for a developer looking to the future.

    Question 3:  How would you distribute density in your development?  One center, multiple centers, and centered around what?  This time the New Urbanists went first.

    The core, they stated, will be in the center of town, and could go to 8-10 stories, leaving the perimeter a green zone.  In the center will be the government and institutional buildings, carefully matched with proper style.  The point, they said, is predictability. They pledged to learn from the failures of the past, and their Master Plan will account for the full scope of development.

    The Dispersionists suggested multiple centers.  “Phase 1 will be our first density cluster,” they said, “and we’ll see how it goes.”  Unlike the New Urbanists, they didn’t want to introduce all their product at once, in case the market changes.  “We believe in New England-style green space,” they said, and wanted to evolve the community around these.  They saw the vitality of the community coming from diversity.

I asked the New Urbanists what they would do if the market changed.  When pressed, they insisted their Master Plan had plenty of contingency plans in case the original plan wasn’t workable, but it sounded like they were winging it.

    This is what  the Dispersionists saw as their own strong suit.  “We don’t have all the answers,” they said.  Their first phase would gently nudge the community in a certain direction, but it would leave future developers the choice whether to reinforce the first phase, or strike out and build another phase better suited to a unique need.

    I felt that this round went to the Dispersionists. 

    Question 4:  Do you think your development scheme can promote or discourage social values?  Why or why not?  This time the Dispersionists went first.

    The Dispersionists believed that one cannot engineer social values through urban design.  However, they can be influenced.  Conservation, for example, is a value that they would promote in their plan to conserve open space and not overtake the land with development.  A sense of community, they said, was another, giving people a loyalty to their community out of good design.  These, they felt, led to a sustainable plan.

The New Urbanists guaranteed that conservation land would always be there, and pointed to the Dispersionists’ flexibility as a negative.  The New Urbanists insisted that their sense of place would be stronger, because it would be designed.  People want predictability.  New Urbanists would engage people through walking and front porches.

    The Dispersionists speculated that neighbors will get to know one another in a cul-de-sac just as well as they would if they all had front porches.  They also felt that the shared experiences of a community would transcend the particular style or form that community took.

Although I gave this one to the New Urbanists, I was skeptical about their implication that well-behaved buildings produce well-behaved people.  The Dispersionists’ view that a cul-de-sac breeds neighborly closeness also seemed a bit disingenuous.  It was near the end of class.

    Question 5:  Give me your arguments why your strategy is sustainable.  I let the New Urbanists go first.

    For one thing, they said, they will have more efficient transportation. Vertical buildings save land, they argued, and people who choose this community will value open space more highly and be willing to live densely.  They believed that they will have less gridlock by de-emphasizing the car and will be more stable and socially cohesive.  All this will come from a well-designed Master Plan.

The Dispersionists said their community would start small and then grow.  Failures won’t cause dead zones, they claimed, because they are not sentimental about form and want a community that works.  So if a building in their development houses a failed business, the building will simply be reinvented until it succeeds.

    “Yes, but,” countered the New Urbanists, “for every successful community like yours, there are 10 that have failed and ultimately decline in value.  What guarantee do you give that you will be the one out of ten?”  They went on to cite their successes – Seaside, Celebration, and so on.

The Dispersionists noted that Seaside was a resort town and Celebration was heavily subsidized by a local employer, so those weren’t exactly good models.  In any case, they said, their community would appeal to a much broader segment of the population than the New Urbanists’, and was therefore more likely to sustain growth in the future.

    With that, the debate was concluded.  What lingers, however, are some truths that show both sides need to do some more work.

    The New Urbanists, fresh on the scene, seem overly evangelical in their approach, and demand a great deal of faith in the Master (Planner).  The slow, organically grown towns of which they are so fond were largely planned before the car.  While many of these towns, like Charleston, South Carolina, are sentimental favorites, their practical replication in today’s transportation-intensive, constantly changing real estate market is questionable.

    The Dispersionists, on the other hand, have been around for quite a long time, and are the modus operandi for much of the earth’s population.  They seem uninvolved in the aesthetics of the built environment, preferring to leave this up to individual taste, and the result is a rather shabby, cluttered contemporary American scene.  Some cleaning up is certainly in order.

While the New Urbanists have a hopeful approach in this regard, they are overreacting to the vast consumer-oriented real estate development world that operated up until 2007, and are missing the fundamentals of how a real community works.  None of their communities are built around employers or economic producers in any significant way. None admit the lowest socioeconomic groups.  Content, perhaps, to dabble with shopping districts and farmers’ markets, New Urbanists have yet to offer what contemporary employers need – space, flexibility, and room to grow.  They therefore seem doomed to create peripheral urban designs rather than communities integrated with 21st century employers.

    Dispersionists would do well to pay a bit more attention to the natural environment, for the general public is quite aware of the toll that this strategy has taken.  Developers, having overbuilt in so many markets recently, will face tough opposition to bulldozing another woodland, given the empty real estate that exists in our cities today.

    It seems inevitable that dispersionist strategies will continue; they largely dominate our real estate development world and will continue to do so.  They make the most economic sense, they leave the future choices to the future generations, and they respond to people’s natural density tendencies.  One hopes that the New Urbanists will nudge the market a bit more towards aesthetic continuity and environmental stewardship as the next wave of growth inevitably begins again, and that the debate remains healthy, productive, and positive as citizens get re-engaged about the future of their cities.

    Richard Reep is an Architect and artist living in Winter Park, Florida. His practice has centered around hospitality-driven mixed use, and has contributed in various capacities to urban mixed-use projects, both nationally and internationally, for the last 25 years.

    Photo courtesy of BigStockPhoto.com.

  • The Three Laws of Future Employment

    As a college educator I am tasked with preparing today’s students for their future careers.

    Implicit is that I should know more about the future than most people. I do not – at least not in the sense of specific predictions. But I can suggest some boundaries on the path forward.

Let’s start with the three Laws of Future Employment. Law #1: People will get jobs doing things that computers can’t do. Law #2: A global marketplace will result in lower pay and fewer opportunities for many careers. (But also in cheaper and better products and a higher standard of living for American consumers.) Law #3: Professional people will more likely be freelancers and less likely to have a steady job.

    Usually taken for granted is that future jobs depend on STEM disciplines (science, technology, engineering, and math). This view is eloquently expounded by Thomas Friedman, who argues that the US is falling behind China and India in educating for STEM careers.

Alex Tabarrok makes a case for STEM in his excellent little e-book, Launching the Innovation Renaissance. He points out that “the US graduated just 5,036 chemical engineers in 2009, no more than we did 25 years ago. In electrical engineering there were only 11,619 graduates in 2009, about half the number of 25 years ago.” Similarly, the number of US computer science grads has been flat over the past quarter century. Thus Tabarrok believes the US is falling behind in innovation and related technologies.

    But Tabarrok and much of the conventional wisdom are  wrong. The job that electrical engineers did 25 years ago has almost nothing to do with the job they do today. Computers now do much of the work that people used to do – computers design circuits, do all the drafting, plan the manufacturing, etc. It used to be that an electrical engineer designed the electronics in your car. To some extent they still do, but today even the smallest components come with operating systems – in other words, your car is programmed rather than designed. Electrical engineering is a career that follows Law #1: much of it has been (and will continue to be) computerized out of existence.

    Computer science careers illustrate Law #2. Computer science services are among the most tradable in the world. It is literally a global job market. Thus the number of computer scientists graduating from American colleges is an irrelevant number. Further, computer science jobs are themselves being computerized. The job description for today’s computer scientist is only tenuously related to what they did 25 years ago.

    Laws #1 & 2 predict that there will likely be fewer STEM jobs in the future – they are both easily computerized and tradable. People will always be employed in STEM disciplines, many of them highly paid, but they’ll be paid for smarts rather than education. The disciplines will be much more competitive, with older and less talented workers left on the sidelines. Tom Friedman and Alex Tabarrok, reflecting conventional wisdom,  are mistaken in maintaining that increasing STEM education is a key to future economic competitiveness.

    So if computerized, tradable skills won’t create much new employment, if any, what will? Clearly, it will be non-tradable skills that can’t be computerized. At their most valuable these jobs depend on human-human interaction – empathy. Counseling (of any sort: psychiatric, financial, weight loss, etc.), sales, customer service, management, and personal services all rely on empathy, as does waitressing. While much teaching can be computerized, what remains will depend more on empathy than anything else. “They don’t care what you know, but they will know if you care,” is a maxim future teachers should take to heart.

According to Ronald Coase, it is generally cheaper to engage freelance labor than to hire employees, unless the market transaction costs are too high. The internet lowers transaction costs and makes smaller firms (fewer employees) more economical. Thus we arrive at the Third Law of Future Employment: professional people will more likely be freelancers and less likely to have jobs. This already happens in computer science: projects are put out to bid on websites for global competition. Much journalism today is freelance, as are graphic design, engineering, and any number of other skills. The third law predicts this trend will grow.
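
    A toy model, my own stylized sketch rather than anything from Coase or this article, makes the logic concrete: work moves to freelancers whenever the transaction cost of contracting on the market falls below the overhead of keeping an employee. All the numbers below are illustrative assumptions.

    ```python
    # Stylized Coasean make-vs-buy comparison. All numbers are illustrative
    # assumptions, not figures from the article.
    def freelance_is_cheaper(wage: float, transaction_cost: float,
                             employment_overhead: float) -> bool:
        """True when buying labor on the market beats keeping an employee."""
        return wage + transaction_cost < wage + employment_overhead

    # Pre-internet: finding, vetting and contracting a freelancer was costly.
    print(freelance_is_cheaper(wage=100, transaction_cost=40, employment_overhead=25))  # False
    # Online bidding platforms collapse those transaction costs.
    print(freelance_is_cheaper(wage=100, transaction_cost=5, employment_overhead=25))   # True
    ```

    On this reading, the internet doesn’t change what the work itself costs; it changes which side of Coase’s threshold the work sits on.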

The bottom line is that today’s young people need to develop an individually unique set of marketable skills for tomorrow’s job market. A marketable skill is more than an education (which is not a skill), and also more than just job training (a skill, but no larger expertise). A useful benchmark is that it takes 10,000 hours to become expert in something – at 20 hours a week, roughly a decade.

I recently had a student – an English major – in my chemistry class. He had no good reason for being there; he could have fulfilled his requirements with much less effort. So I asked him why.

    “It fit into my schedule and I felt like doing it. I like it.”

    “What are you going to do with an English degree?” I asked.

    “I’m writing a novel. It’s about cowboys.”

    Now conventional wisdom says this guy is all wet. Alex Tabarrok would have him drop the English degree in favor of chemistry (or chemical engineering). His English professors will say that his chances of publishing a novel (much less earning a living off one) are next to zero. SUNY Chancellor Nancy Zimpher has Six Big Ideas for SUNY – and my student doesn’t fit into any of them.

    But think about the skill set needed to write a novel, of which writing may be the least of it. He has to have something to write about, which means nurturing a general curiosity about the world – not just cowboys, but apparently also chemistry. He learns to be a keen observer of people: their appearance, what they wear, their character, mannerisms, and language. He develops the self-discipline and self-confidence to finish a project because it is intrinsically important, not because people say “Wow, that’s wonderful. You’re writing a novel!” Because of his novel my student becomes expert in many skills that can translate into a wonderful career.

    How is that different from mere education? The typical English major writes papers comparing Proust with Balzac. Not that there’s anything wrong with that, but it isn’t building the 10,000 hours.  It simply amounts to following directions carefully, and eventually collecting a credential. True expertise, by contrast, is something self-generated, following your own passion and talents. This isn’t to say education is always a waste of time, but it will no longer be sufficient to build a career.

    So here is my career advice to today’s students:

    • If you passionately like something and are good at it, then do that. STEM, for example, will always have a place for smart, hardworking people. Likewise, good writing can’t be computerized, but you need both talent and passion to be successful.
    • Start work on the 10,000 hours. Your education may help, but very little you do in school contributes to the total. Be it car detailing, truck driving, computer programming, drawing, writing – acquire an expert skill in something. Write a novel.
    • Empathize if you can. Computers can’t do that. Jobs that involve empathy (along with other skills) will always be in demand.
    • If you got it, flaunt it. That’s something else computers can’t do. Beauty has value, especially for women but also for men. This is wonderfully described in Catherine Hakim’s book, Erotic Capital. Even if you don’t got it, take advantage of youth. Acquire a fashion sense, take care of yourself, look as good as you can.

    Work hard. Have fun. Get rich.

Daniel Jelski is a professor of chemistry at SUNY New Paltz, and previously served as dean of New Paltz’s School of Science & Engineering.

  • Indianapolis: From Naptown to Super City

I have long touted the sports strategy that Indianapolis used to revitalize its downtown as a model for cities to follow in terms of strategy-led economic and community development. I really think it sets the benchmark for how to do it, and it has been very successful.

Indy is hosting the Super Bowl on Sunday, something locally seen as a sort of crowning achievement of the 40-year sports journey. As part of that, the Indianapolis Star and public TV station WFYI produced an hour-long documentary on the journey called “Naptown to Super City.” I think it’s a must-watch for anyone who is trying to figure out how to revitalize their own downtown. An hour isn’t short, but given the billions of dollars cities pour into this, I think it’s worth doing some homework. It tells the story of how Indy went from a deserted downtown – one where local Jaycees were licensed to take their shotguns and kill pigeons – to one hosting the Super Bowl today.

    I’ll talk more about the Indy strategy in a bit, but first the show. If you are in Google Reader this won’t display for you, so click here to watch.



    One thing this brought home for me is the true magnitude of the change. Perhaps I’m being a bit uncharitable, but Indianapolis almost literally started with nothing. It was never a major, important American city. It had no brand in the market. And it had a downtown that was all but dead. Everything they have today was built almost from scratch.

Why do I think the Indy sports strategy was such a good one? Two reasons: it was a good strategic area to go after, and it was backed up with very intelligent execution.

    First, five reasons this was a good strategic goal to pursue:

1. It just fits the character of the city. Hoosiers love sports. The Indianapolis 500 and high school basketball were long established. It’s something residents could get behind in a way that they would never have gotten behind being the “vegetarian capital of the world” or some such. It was authentic to the city. If you watch the video, you’ll note how locals embraced the events that were held there. That goes a long way towards explaining the success of the strategy. You have to be authentic to a place in your development efforts.
2. It was a whitespace opportunity where Indy could get first mover advantage. Today every city thinks it can make money off sports, but Indy really pioneered the notion that you could use sports as an economic development tool. There were a lot of firsts along the path, and that’s one reason Indy was able to stake out a leadership position. Just as one example, Indy was the first to use the “build it and they will come” model of building a stadium before having a team. As a result, it was able to grab the Colts, and do so in an era when you didn’t have to mortgage your whole city to make a team relocation happen.
    3. Being America’s top city for sports events was a realistically achievable goal. I know this because the city achieved it. This is in great contrast to the umpteen cities who all claim they’ll be the “best cycling city in America” or some such.
4. There were huge collateral benefits to sports beyond the direct economic impact of the events and the jobs they support. Events bring people to the city and show it off to those who might not otherwise come. They enliven downtown and create occasions that locals might actually want to attend. They have also been an amazing branding opportunity. Just think of the Colts. How many times a week during football season does the word “Indianapolis” get said on TV? Probably hundreds if not thousands. Imagine if the city had to pay advertising dollars for that exposure. Yes, sports is expensive, but I think it could be justified as cost-efficient marketing alone. Think about how much companies pay just to put their name on a stadium. How much more is it worth to put your city’s name on the team or the event? Think about how much advertisers pay for a 30-second commercial in the Super Bowl. What’s it worth for all those mentions of your city during the Super Bowl, again?
    5. It was an initiative that had the possibility of being truly transformative for the city. Again, I know this is true because it was.

I’m not going to claim these were actually the thoughts going through people’s minds as the sports strategy developed, or that it was this calculated. But all of these things were implicitly true all along, and the people pushing sports must have grasped them at some level. So sports meets the first test of a great strategy: it set out after a good strategic goal.

    It was also something where there was a level of execution detail that far exceeded what most cities do. In business, it’s one thing to have an idea. It’s another thing to execute on it and achieve market leadership. It’s still another to generate sustainable competitive advantage that keeps you there over the long haul. Indianapolis has managed to do all of these with sports. I’ll highlight eight examples of how it did this:

    1. It invested in world class facilities. A lot of these have remained top rated even long after they opened, like Conseco Fieldhouse, which is still ranked every year as the best arena in the United States.
2. It laid out an entire district downtown around events hosting, with everything you need in close proximity – venues, the convention center, hotels, shopping, and entertainment. This is something that’s already been widely commented on by Super Bowl visitors, who are amazed you don’t have to get shuttled around all over the place and that you can actually walk directly from the media hotel to the hotels where the teams are staying.
3. Because of this, Indy is able to effectively “saturation rebrand” its downtown for an event and otherwise cater to events in a way that few other cities can or will. In effect, the city has converted its downtown into a giant sound stage. Take a look at the pictures of the city. The whole downtown has been rebranded for the Super Bowl, including, for example, plastering a huge Lombardi Trophy image on the side of the city’s premier hotel. You can debate the value of this to the city, but there’s no denying its value to the NFL. How many cities are willing to do this to the extent Indianapolis is?
    4. Indy created the Indiana Sports Corp. as the first ever non-profit management company for events. Today, everybody has adopted that model.
5. The city cultivated a large, experienced volunteer base for putting on events that is much more powerful than what other cities have.
    6. Indy has been willing to take calculated risks in support of the strategy. Building the Hoosier Dome with no team to play in it – big risk.
7. It not only went after the events, it went after the sanctioning bodies that determine where the events will be held. The most important is of course the NCAA, but there are others too. This has resulted in Indy having a “cluster” of these organizations and direct access to the people making decisions, which pays incalculable dividends. This is one area where the face-to-face discussions that occur in Indy give the city a big leg up. It’s not just better for selling; it gives Indy critical advance intelligence about how these organizations are conceiving of their future events needs.
8. Last but certainly not least, this has been a sustained, 35-year commitment. It wasn’t a party politics thing. It wasn’t a single project thing. It wasn’t a flash-in-the-pan idea. It was something that has been relentlessly pursued over the long haul.

    Add all this up and it is easy to see why still today, three or four decades after it first started and after pretty much every city decided to go after these types of events, Indianapolis is still the best place in America to host a sports event.

I hope this gives you a flavor of why the Indy sports strategy was so good and so successful. It’s certainly not without its failures and downsides. The fact that sports has consumed disproportionate civic resources is one of them, and one highlighted by the documentary. But on the whole, most people seem very happy with the results.

Something the video highlights at the end is one essential attribute for success that you can’t plan for or make happen – luck. It asks questions like: what if the “Save the Pacers” telethon had failed back in the ’70s? What if the seats in the Hoosier Dome had been the originally planned variegated colors instead of the Colts’ blue and white when Bob Irsay walked in to check it out? There were many critical turning points where, without a lucky break, the future of downtown Indy might have been radically different. It should give us some humility about the limits of our ability to simply will things into being. On the other hand, it reminds us that if you aren’t in the game, if you aren’t swinging the bat, you don’t have any chance at all of hitting that home run. You have to play if you want to win.

    This piece originally appeared at The Urbanophile.

Aaron M. Renn is an independent writer on urban affairs based in the Midwest. His writings appear at The Urbanophile, and he operates Telestrian, an online tool for economic and demographic data.

    Photo of Lucas Oil Stadium courtesy of BigStockPhoto.com.

  • Who Stands The Most To Win – And Lose – From A Second Obama Term

    As the probability of President Barack Obama’s reelection grows, state and local officials across the country are tallying up the potential ramifications of a second term. For the most part, the biggest concerns lie with energy-producing states, which fear stricter environmental regulations, and those places most dependent on military or space spending, which are both likely to decrease under a second Obama administration.

On the other hand, several states, and particularly the District of Columbia, have reason to look forward to another four years. Under Obama the federal workforce has expanded — even as states and localities have cut their government jobs. The growing concentration of power has also swelled the ranks of Washington‘s parasitical enablers, from high-end lobbyists to expense-account restaurants. While much of urban America is struggling, Washington is currently experiencing something of a golden age.

So what states have the most to lose from a second Obama term? The most obvious is Texas, the fastest-growing of the nation’s big states. Used to owning the inside track in Washington during the long years of Bush family rule, the Lone Star State now has less clout in Congress and the White House than at any time in recent memory. Texans are particularly worried about restrictions on fossil fuel energy development, which is largely responsible for robust growth throughout the state.

“Obama now wants to take credit for the increased production that has happened, but [increased production] has been opposed in every corner by the administration,” says John Hofmeister, founder of the Houston-based Citizens for Affordable Energy and former CEO of Shell USA. Hofmeister fears that in a second term, with no concern for reelection, Obama could exert even greater controls on fossil fuel development. This would have dramatic, negative implications not only for Texas but for the entire national energy belt, which includes North Dakota, Wyoming, Montana, West Virginia, Oklahoma, Alaska and Louisiana. These states fear that the nation’s recent energy boom, which has generated some of the nation’s strongest job and income growth, could implode in Obama’s second term.

Take Louisiana, which is still recovering from Hurricane Katrina in 2005 and the BP oil spill in 2010. The administration’s moratorium on offshore drilling, sparked by the spill, has had a deleterious effect on the state’s energy economy, according to a recent study, with half of offshore oil and service companies shifting their operations to other regions and laying off employees.

Since the moratorium was lifted in 2010, companies have faced long delays for new wells, with waits growing from 60 days in 2008 to more than 109 days last year. “The energy states feel they are being persecuted for their good deeds,” says Eric Smith, director of the Tulane Energy Institute in New Orleans. “There is a sense there are people in the administration who would like this whole industry to go away.”

Many of these same states also worry about the administration’s proposed downsizing of the military. Obama’s move to cut roughly $500 billion in defense spending may make sense, but it threatens places with large military presences such as Texas, Florida, Oklahoma, Virginia, Georgia, South Carolina and New Mexico.

The D.C. metro area might also be hit by defense cuts, but overall it has many reasons to genuflect toward the Obama administration. Federal wages, salaries and procurement account for 40% of the district’s economic activity, roughly four times the percentage of any state. Expanding regulation of energy, health care and financial services has sparked a steady job boom in lobbying, think tanks and other facets of the persuasion industry — including among Republicans — at a time when employment growth has been sluggish elsewhere.

    D.C. partisans hail their city as the leader of a national urban boom. The district clearly benefits from diminished job opportunities in more market-based economies, particularly for educated 20-somethings.

No place has flourished as much as the capital, but a second term would also be favorable to states such as Maryland, which depend heavily on research spending directed from Washington and where federal spending accounts for fifteen percent of the state economy, over seven times the national average. Maryland-based agencies such as the National Institutes of Health will likely expand under an increasingly federalized health care system — particularly if Democrats gain more seats in Congress with an Obama win.

Other big states that may benefit from a second term include New York, California and Illinois. New York benefits largely from the administration’s Wall Street leanings, despite the president’s recent attacks on the financial elite. Even to non-conspiracy theorists, the administration’s ties to Goldman Sachs appear unusually intimate. Powerful allies like Democratic Sen. Charles Schumer, D.C.’s greatest Wall Street booster, suggest big money has little to fear from a second term.

    Overall the administration’s basic policy approach has favored the financial giants. Support for bailouts, seemingly permanent low interest rates, few prosecutions for miscreant investment bankers, the institutionalization of “too big to fail” and easy loans for renewable fuel firms all have benefited the big Wall Street players.

    Of course, a Republican victory would not be a disaster for these worthies. Companies like Goldman Sachs are hedging their bets by sending loads of cash to the likely Republican choice, former Massachusetts Gov. Mitt Romney.

    But other New York interests, such as mass transit funding, would benefit from the current administration’s  generally pro-urban, green sensibilities. Tight regulations on carbon emissions — increasing the price of fossil fuels — may help the competitive position of New York City, which has little industry left and relatively low carbon emissions per capita, in part due to a greater reliance on hydroelectric and nuclear power.

California also has reasons to root for an Obama victory. Although among the richest states in fossil fuels, particularly oil, the Golden State has become a bastion of both climate change alarmism and renewable energy subsidization. It adamantly won’t develop its traditional energy resources — which would help boost the state’s still-weak economy — and Silicon Valley venture firms have eagerly grabbed subsidies and loans for start-ups from Energy Secretary Steven Chu’s seemingly bottomless cornucopia.

Furthermore, a more powerful EPA would make California’s current “go it alone” energy and environmental policies less disadvantageous compared to more fossil-fuel-friendly states, leveling what is now a tilted economic playing field.

Similarly, attempts to push the state’s troubled high-speed rail line — recently described in Mother Jones as “jaw-droppingly shameless” — will succeed only with strong backing from the federal government. Under a Republican administration and Congress, Gov. Jerry Brown’s beloved high-speed line would depend entirely on state and private funding, likely terminating the project.

    But no state needs an Obama victory more than his adopted home state of Illinois. To be sure, having a native son in the White House has not prevented the Land of Lincoln from suffering one of the weakest economies in the nation. The state has one of the highest rates of out-migration in the country, according to recent United Van Lines data and Census results.

Even worse, the state faces a fiscal crisis so great that it makes California look well-managed. Without a good friend in the White House, and allies in Congress, Illinois could end up replacing long-struggling, now-improving Michigan as the Great Lakes’ new leading basket case. Count Illinois’ 20 electoral votes in the Obama column.

    This piece originally appeared in Forbes.com.

    Joel Kotkin is executive editor of NewGeography.com and is a distinguished presidential fellow in urban futures at Chapman University, and contributing editor to the City Journal in New York. He is author of The City: A Global History. His newest book is The Next Hundred Million: America in 2050, released in February, 2010.

    Photo from BigStockPhoto.com.