Category: Policy

  • An Economic Recovery Program for the Post-Bubble Economy

    By Bernard L. Schwartz and Sherle R. Schwenninger, New America Foundation

    The American economy is in trouble. Battered and bruised by the collapsing housing and credit bubbles, and by high oil and food prices, it is having trouble finding its footing. The stimulus medicine the Federal Reserve and Congress administered earlier this year is already wearing off, while home prices are still falling and unemployment continues to creep upward. By the time a new president is sworn in, there is a good chance the economy will have stalled again, and the hope for a relatively quick rebound will have given way to the fear of a protracted slowdown.

    The next administration must therefore have a second dose of medicine ready that is stronger, more enduring, and different in kind from the first stimulus program of tax rebates and tax cuts for business. Tax rebates may have been appropriate for an economy entering a standard cyclical downturn. But this is clearly not a normal business recession. It is a post-bubble slowdown involving a painful de-leveraging of America’s household and financial sectors. This means that consumers and housing will be struggling for some time, and that new sources of growth are needed.

    A longer-term economic recovery program must therefore steer the economy onto a new growth path that is less dependent on the debt-financed consumption that has driven economic growth over the past decade. The most promising new sources of growth are America’s enormous public infrastructure needs and the increased global demand for American technology created by the drive for greater efficiency in economies around the world. An economic recovery program built around public infrastructure investment and demand for American technology would be more effective in stimulating the economy in the short term, and far better for it in the long run, than would another round of tax rebates for American consumers.

    Getting the Diagnosis Right
    The experience of Japan and Sweden in the early 1990s should be a warning to those who believe that all the economy needs is a bit more of the standard countercyclical treatment: a few more tax cuts or rebates here, a little more unemployment insurance there, and perhaps some assistance to state and local governments. Both countries experienced serious, prolonged recessions after their property and financial bubbles burst, and it took extraordinary fiscal and monetary measures before either enjoyed a real recovery.

    The U.S. economy is more dynamic and more flexible than Japan’s or Sweden’s. Still, there are reasons to worry about the effectiveness of standard countercyclical measures in today’s post-bubble economy, notwithstanding our economy’s many strengths. To begin with, measures like temporary tax rebates are too transitory to generate a sustainable recovery. Businesses may act quickly to restore profitability by adjusting inventory levels and cutting costs, but households generally take much longer to put their balance sheets in order and increase spending again. This is especially the case when many Americans are already overleveraged and experiencing a decline in the value of their homes. With home prices falling, many households will not be able to maintain consumption levels by tapping home equity as they have in the past. Moreover, with unemployment rising, they cannot easily or quickly replace the credit they previously relied on with new sources of income. Thus they will have no choice but to cut consumption and increase savings gradually. In light of the fact that housing markets by their nature are slow to correct, this household de-leveraging process could take years to play out. Household consumption, which at its peak accounted for more than 70 percent of the economy, may thus be a drag for some time to come, at least until wages rise or home values begin to increase again.

    Second, standard stimulus programs generally are too modest to make a substantial difference to the parts of the economy affected by the bursting of the housing and credit bubbles. The Democratic leadership in Congress is considering a supplemental stimulus package of $50 billion. But $50 billion would count for little in a $13.8 trillion economy. David Rosenberg, chief economist at Merrill Lynch, estimates that the unwinding of the housing and credit bubbles, together with rising unemployment, will create a $475 billion reduction in consumer spending. Rising food and gas prices, he estimates, will drain another $300 billion from discretionary spending. Together, these sums dwarf the current $150 billion fiscal stimulus and suggest the need for a larger and more potent economic recovery program. Even the bursting of the tech bubble, which had relatively little impact on most Americans, required a fiscal stimulus equivalent to more than 6 percent of GDP (measured by the increase in the budget deficit) over a three-year period, in addition to 16 cuts in the federal funds rate, down to 1 percent. In light of the much larger effect housing has on consumption, the unwinding of the housing and credit bubbles will require a stimulus of comparable size at the very least.
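    The mismatch in magnitudes is easy to make concrete. The back-of-the-envelope sketch below (in Python) simply restates the figures quoted above; it is an illustration of the arithmetic, not an independent estimate.

        # All dollar figures in billions, as quoted in the text.
        consumer_drag = 475       # Rosenberg: consumer-spending hit from the housing/credit unwind
        food_energy_drag = 300    # Rosenberg: drain on discretionary spending from food and gas prices
        total_drag = consumer_drag + food_energy_drag   # 775

        current_stimulus = 150    # the 2008 rebate-based fiscal stimulus
        gdp = 13_800              # a roughly $13.8 trillion economy

        print(f"Estimated drag: ${total_drag}B vs. ${current_stimulus}B of stimulus")
        # The post-tech-bubble fiscal response was roughly 6 percent of GDP over three years:
        print(f"6% of GDP benchmark: ~${0.06 * gdp:.0f}B")   # about $828B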

    Third, the standard stimulus measures are too focused on consumption and not enough on investment. Thus, to the extent such measures were successful, they would merely reinforce a suboptimal and ultimately unsustainable pattern of economic growth that over the past decade has been too dependent on debt-financed consumption and inflated asset prices. The root cause of this suboptimal pattern of growth has been the excess savings generated by the Asian export economies and the petrodollar states of the Persian Gulf, which were recycled into the U.S. financial system, fueling the credit and housing bubbles. The housing bubble in turn helped inflate consumption, as U.S. households took advantage of poorly regulated new financial instruments to purchase more expensive homes and tap rising home equity. U.S. consumption in turn helped drive Asian export growth, resulting in even higher trade surpluses. The weakness in this pattern of economic growth lay in the fact that U.S. consumption was made possible not by real wage and income gains but by unsustainable increases in home prices and household debt.

    Seen from this perspective, the bursting of the housing and credit bubbles was a necessary, albeit painful, adjustment in the pattern of U.S. and world economic growth. The goal of a new recovery program therefore must not be to recreate this pattern with more short-term consumer-oriented stimulus but to steer the economy onto a more sustainable growth path. Future economic growth will need to be driven less by debt-financed consumption and more by investment that leads to the creation of good jobs and rising wages, and by exports to those economies that have underconsumed for much of the past decade.

    A new economic recovery program would not preclude measures such as the extension of unemployment insurance or assistance to state and local governments to ease the adjustment many households are now experiencing. But these worthwhile measures are not a substitute for what must be the overriding goal of a new economic recovery and growth program: finding a big new source of economic growth that can replace personal consumption as the main driver of growth in the short term and that over the medium term can lead to higher wages and incomes to support increased household consumption.

    There are two areas of enormous pent-up demand on which such a recovery program can be based. The first and most important is the pent-up demand in the United States for public infrastructure improvements in everything from roads and bridges to broadband and air traffic control systems to new energy infrastructure. We need not only to repair large parts of our existing basic infrastructure but also to put in place the 21st-century infrastructure for a more energy-efficient and technologically advanced society. This project, entailing several trillion dollars in new government spending over the next decade, would provide millions of new jobs for American workers.

    The other significant source of potential growth is the enormous pent-up demand in China and other emerging economies for both consumer goods and the productivity-enhancing and energy-efficient technology needed to sustain both corporate profitability and rising living standards. For years now, these economies have suppressed domestic demand at the expense of the living standards of their workers and have been able to use low wages to offset the rising cost of energy and other materials. But high energy prices, together with rising wages, are beginning to force a change toward more consumption-oriented economies that must do more to increase productivity and energy efficiency. This shift will increase demand for U.S. goods and services, allowing the United States to improve its trade balance and remove a drag on economic growth.

    These two areas of potential growth in turn will help fuel both domestic and international demand for American technology across a broad range of new growth clusters where U.S. companies enjoy a leadership position or, with new investment, could do so in the future. These areas include not just such traditional American strengths as aerospace, information technology, and networking, but emerging growth areas associated with what might be called the “triple green revolution” in agriculture, efficiency-enhancing clean technology, and renewable energy sources. Increased world and domestic demand for American technology will help spur new investment and, with it, a new generation of technological innovation.

    Public Infrastructure Investment
    The main pillar of an economic recovery and growth program must be a massive increase in public infrastructure investment, in part because it has the greatest multiplier effect of any stimulus and also because it provides the foundation for private investment in the productive economy. There is increasing public recognition that two decades of underinvestment in public infrastructure have created a backlog of public infrastructure needs that is undermining our economy’s efficiency and costing us billions in lost income and economic growth. The American Society of Civil Engineers estimates that we need to spend $1.6 trillion over the next five years to bring our basic infrastructure up to world standards. In addition, we need to spend sizeable sums in newer areas of infrastructure, like broadband access and new energy infrastructure for wind, solar, and clean coal.

    Public investment of this magnitude would give a significant boost to the economy, filling the gap left by the falloff in housing construction and consumer spending, while laying the foundation for a more productive economy. Indeed, public infrastructure investment is the most effective way to increase demand and investment at the same time, and thus the best way to counter an economic slowdown caused by the unwinding of the housing and credit bubbles. If, in spite of low interest rates, companies will not commit to more investment spending because of weak demand or uncertainty, the best way to jump-start investment is to increase public investment outlays directly. Public investment in turn will help stimulate new private investment by increasing the efficiency and potential returns of that investment, and by adding demand to the overall economy.

    Public infrastructure investment would have the advantage of creating more jobs, particularly more good jobs, and thus would help counter the negative employment effects of the collapsing housing bubble. For example, the U.S. Department of Transportation estimates that for every $1 billion in federal highway investment, 47,500 jobs would be created, directly and indirectly. Similarly, an analysis by the California Infrastructure Coalition concludes that each $1 billion in transit system improvements, including roadways, would produce 18,000 direct new jobs and nearly the same level of indirect and induced employment. If all public infrastructure investment created jobs at the same rate as transit improvements in California, $150 billion in infrastructure investment would create more than 2.7 million jobs directly, more than offsetting the jobs lost since the bursting of the housing bubble.
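    The job arithmetic here is simple multiplication. This minimal Python sketch uses only the per-billion-dollar figures quoted above; the DOT highway figure is included for comparison only and is not applied to the $150 billion total in the text.

        # Jobs created per $1 billion invested, as quoted above.
        dot_highway_jobs = 47_500        # U.S. DOT: direct and indirect jobs, federal highways
        ca_transit_direct_jobs = 18_000  # California Infrastructure Coalition: direct jobs, transit

        investment_billions = 150        # proposed infrastructure investment

        # 150 x 18,000 = 2,700,000, the "more than 2.7 million jobs" cited above
        print(investment_billions * ca_transit_direct_jobs)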

    Public infrastructure investment not only creates jobs but generates a healthy multiplier effect throughout the economy by creating demand for materials and services. The U.S. Department of Transportation estimates that for every $1 billion invested in federal highways, more than $6.2 billion in economic activity would be generated. Mark Zandi, chief economist at Moody’s Economy.com, offers a more conservative but still impressive estimate of the multiplier effect of infrastructure spending, calculating that every dollar of increased infrastructure spending would generate a $1.59 increase in GDP. By comparison, a combination of tax cuts and tax rebates is estimated to produce only 67 cents in demand for every dollar of lower taxes. Thus, by Zandi’s conservative estimates, $150 billion in infrastructure spending would generate a nearly $240 billion increase (or close to a 2 percent increase) in GDP in the first year.
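    Zandi’s multiplier arithmetic can be reproduced directly; the sketch below only restates the calculation implied by the figures above.

        infrastructure_multiplier = 1.59   # Zandi: GDP added per dollar of infrastructure spending
        tax_cut_multiplier = 0.67          # Zandi: GDP added per dollar of tax cuts and rebates
        spending = 150                     # $ billions
        gdp = 13_800                       # $ billions, a ~$13.8 trillion economy

        boost = spending * infrastructure_multiplier            # 238.5, i.e. nearly $240 billion
        print(f"${boost:.1f}B, or {100 * boost / gdp:.1f}% of GDP")         # about 1.7 percent
        print(f"Same $150B as tax cuts: ${spending * tax_cut_multiplier:.1f}B")  # about $100B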

    Public infrastructure investment would not only help stimulate the economy in the short term but help make it more productive over the long term. America’s current economic structure, with its heavy reliance on financial services, entertainment, and certain tech industries, reflects our low investment in public infrastructure over the past two decades. However, many of the potential new growth sectors of the economy in agriculture, energy, and clean technology will require major infrastructure improvements or new public infrastructure: new transmission grids to tap the potential of wind and solar power in the Southwest and the Great Plains, better broadband access and new airports to support the growth of agribusiness and new tech companies in the lower-cost areas of the American heartland, and a new generation of information technology to reduce traffic congestion and speed up all sorts of transactions.

    In the first year, the increase in public infrastructure investment envisioned here could be funded as part of a second stimulus package. But to ensure adequate continued funding of public infrastructure over the next decade, the next administration will want to move quickly to establish a National Infrastructure Bank, along the lines proposed by Senators Christopher Dodd and Chuck Hagel, or a National Infrastructure Development Corporation, such as that proposed by Congresswoman Rosa DeLauro. If properly structured, the proposed entities would enable the federal government to tap the private capital markets by issuing long-term special purpose bonds to help fund state and local infrastructure projects of national significance.

    Inevitably, a massive increase in public infrastructure investment will raise concerns about the deficit. But, as we have noted, the government deficit will need to widen for the next year or two in any case to fill the gap created by the falloff in consumer and business spending. It is better that it increases as a result of public infrastructure investment than as a result of tax cuts and other spending, because spending on infrastructure will create more new jobs and economic activity.

    Rising Exports from More Balanced World Demand
    Given the magnitude of the housing and credit bubbles, a massive public infrastructure program may not be enough to offset consumer weakness and jump-start new business investment. Therefore, rising exports must constitute the second pillar of an economic recovery and growth program. Thanks to a weaker dollar and strong growth in emerging economies, exports are in fact contributing positively to U.S. economic growth for the first time in more than 15 years. Over the past two quarters, the improvement in the net exports of goods and services has contributed the equivalent of 1 percent of GDP growth on an annual basis.

    However, there is a danger that this export boomlet will be cut short as other economies begin to feel the effects of weaker consumer demand in the United States. The next administration must therefore adopt an international strategy to encourage China and other large current account surplus economies (Japan, Germany, and the large oil-exporting countries) to expand domestic demand to offset weaker U.S. consumer growth.

    There are a number of factors that will give the next administration leverage to move China and other surplus economies in the direction of more balanced economic growth. As we have noted, one of the main factors is pent-up consumer demand and the accompanying political pressure for rising living standards within large emerging economies. Over the past decade, investment and savings have grown faster than consumption in Asian export-oriented countries as well as in oil-exporting economies. Thus, there are enormous pent-up consumption needs in these societies. China, for example, has one-half as many televisions, one-quarter as many computers, and one-third as many cell phones per capita as Europe.

    At the same time, higher food and energy costs are creating pressure on China and other Asian exporting economies to let wages rise in order to avoid political tensions. Higher wages would increase the purchasing power of Asian workers and augment consumer demand, which would help create a healthier balance between demand and savings in these societies. China has an unusually high savings rate of more than 50 percent, while consumption constitutes only 35 percent of GDP. This combination of extraordinarily high savings and low consumption is unique among newly industrialized economies.

    Higher wages would also force companies in emerging economies to seek out new productivity gains to compensate for rising wage levels. The drive for more rapid productivity growth in emerging economies would in turn increase the demand for labor-saving and efficiency-enhancing technology. This would benefit many American technology companies that supply software and networking equipment, as well as American companies that are developing cutting-edge technology to improve energy and materials efficiency.

    In short, there are both political and economic reasons for large surplus economies to shift their economic policy toward more balanced economic growth in the near term. The next administration needs to do a better job of sending the message to large current-account-surplus economies, including the advanced economies of Japan and Germany, that they need to do more to generate their own demand. In the case of China, it can do so by pushing Beijing on international labor rights, by encouraging currency appreciation to stem inflation, and by using the OECD and the World Bank to help create a social safety net and develop a home mortgage market. Because China lacks a real safety net and does not have reliable systems of health care and education, Chinese workers engage in enormous precautionary saving, which is holding down consumption. The best way to reduce this high level of precautionary savings is to encourage China to put in place a modern social safety net and do a better job of providing education and health care for its citizens.

    The biggest threat to the favorable rebalancing of world trade now getting underway is higher inflation in emerging economies. If these economies tighten their monetary policy to stem inflation, the mini export boom that has kept the U.S. economy out of recession will be cut short and one of the new drivers of U.S. economic growth will come to a premature end. An early priority of the next administration, therefore, must be to reach an understanding with other economies about how to best handle the incipient global inflation threat. Inflation in many emerging economies is the result of their policy of pegging their currency to the dollar, whether formally or informally, in order to maintain export competitiveness. Hence, as the value of the dollar has fallen so have their currencies, raising the cost of imported food and energy. (The accumulation of large foreign currency reserves has also spurred monetary growth in these economies, in spite of efforts to “sterilize” capital inflows to reduce their effect on inflation.)

    The alternative to relying solely on monetary tightening would be for these economies to re-peg their currencies, letting them appreciate against the dollar without abandoning the dollar peg entirely. This would create the best of both worlds for the U.S. economy: it would provide continued support for the dollar while also increasing domestic demand within the Asian and oil-exporting economies, thus expanding the market for U.S. goods and services. For this reason, the next administration should move quickly to a new set of understandings about world currencies that would facilitate these currency adjustments. The goal of these understandings should be to manage the dollar over the next few years so that it neither appreciates so much that it cuts short America’s export boom nor falls so far that it provokes a currency crisis.

    Capitalizing on the Next Tech Boom
    Expanded public infrastructure investment in the United States and the transition to intensive, energy-efficient growth in emerging economies will greatly increase the demand for American-made technology, setting the stage for new investment in a wide range of American technology companies. As we have noted, U.S. companies still enjoy a competitive advantage in a range of technology areas, from aerospace to business software to networking. What has been missing in recent years has been a new demand catalyst to drive new investment and innovation.

    Higher commodity and energy prices are also helping drive a new tech boom in other areas. In addition to benefiting many American producers, high commodity prices are setting the stage for new growth industries aimed at tapping scientific breakthroughs in agriculture, biotechnology, nanotechnology, the life sciences, energy extraction, and materials. The United States needs to position itself to take advantage of potential huge returns from new investments in the emerging growth industries of the triple green revolution: agriculture and biotechnology, clean technologies and energy and resource efficiency, and new energy sources.

    We have potential competitive advantages in each of these areas. We still lead the world in agricultural production and in related agricultural products and services, as well as in the life sciences. While parts of the world have resisted some American innovations in genetically modified seeds and materials, the need for new drought- and disease-resistant crops capable of greater yields is increasingly apparent. American agricultural companies turned biotech companies, like Monsanto, stand to benefit from the pressure to feed more people and improve the diets of millions of new members of the global middle class.

    In the area of energy and resource efficiency, rising commodity prices and concerns over global climate change are creating a huge demand for technology that can help make traditional industries more efficient and eco-friendly. Technology for squeezing more production out of existing oilfields, for example, is in great demand. So is technology for extracting minerals in a more environmentally friendly way. These same factors are also leading to a new cluster of clean technology companies, which specialize in technology to enhance energy efficiency and reduce carbon emissions. The demand for such engineering solutions has the potential to create a rebirth in America’s industrial heartland, especially in the old mining and commodity belt of the Upper Midwest.

    High oil prices have also spurred a mini investment boomlet in new renewable energy companies: wind and solar power, second-generation biofuels, and clean coal. Wind technology has advanced to the point that it is now cost competitive with traditional sources of electricity generation, and U.S. companies are becoming competitive with their European counterparts. Solar is not far behind. However, as we have noted, the lack of appropriate energy infrastructure is an obstacle to future growth. Wind and solar power are plentiful in what energy investor T. Boone Pickens calls the “Saudi Arabia of wind and solar” (namely the Southwest and the Great Plains), but this is the region that least needs more electricity generation. Future growth therefore will depend on new transmission lines to get the electricity to the parts of the country that need it most.

    In order to fully capitalize on these technological trends, the United States needs a more conscious technology and competitiveness strategy. One of the main short-term goals of this strategy should be to help start-up companies developing new energy technology grow by sustaining demand for energy efficiency, not only domestically but globally. The government can do so by putting a floor under oil and gas prices and by mandating ever higher energy efficiency standards so that any temporary fall in prices does not deter further investment. Another goal should be to create incentives for new technology companies to invest and create more high-value-added jobs domestically. A technology competitiveness strategy would lower the cost of doing business in the United States by providing better infrastructure and more skilled workers, eliminating the tax incentives for companies to move their operations abroad, and adding tax incentives for companies to increase investment and job creation in the United States.

    With the right technology and competitiveness policies, we will be able to take advantage of the increased global demand for technology to spur investment in a cluster of new growth companies. In the process, we will be able to broaden the productive base of the American economy and create millions of new jobs that pay middle-class wages, helping to reverse the slow growth in wages that has held back living standards over the past several decades.

    A Strategy of Mutual Prosperity
    In the short term, the new economic recovery and growth program outlined here will help sustain U.S. and global economic growth during a period of painful adjustment following the bursting of the housing and credit bubbles. Over the longer term, it will put the U.S. and emerging economies on the path to mutually reinforcing productivity revolutions and mutually rising living standards. Increased public investment in the United States will lead to increased private investment and greater productive capacity, enabling American-based companies to take advantage of rising export demand for their goods and services. It will also lead to rising wages, enabling households to reduce their debt burdens without cutting back on consumption.

    Meanwhile in large emerging economies, higher wages and more consumer spending will increase domestic demand, allowing these export-oriented economies to weather a slowing of U.S. consumer demand. Rising living standards in turn will accelerate the transition in these economies to more sustainable growth based on rising productivity and resource efficiency. This new growth orientation in turn will open up even greater growth opportunities for American companies at the forefront of the triple green revolution.

    It will be up to the next administration to turn this opportunity into reality. To do so, it must have a bold and optimistic economic recovery plan that goes beyond conventional thinking and harnesses the American economy to the new growth drivers of public infrastructure investment and rising demand for efficiency-enhancing technology.

    Bernard L. Schwartz is Chairman and CEO of BLS Investments, LLC. Sherle R. Schwenninger is Director of the Economic Growth Program at the New America Foundation.

  • America is More Small Town than We Think

    America has become an overwhelmingly metropolitan nation. According to the 2000 census, more than 80 percent of the nation’s population resided in one of the 350 combined metropolitan statistical areas. It is not surprising, therefore, that “small town” America is often regarded as a burdensome anachronism.

    Nothing could be further from the truth. America is more “small town” than we often think, particularly in how we govern ourselves. In 2000, slightly more than one-half of the nation’s population lived in jurisdictions — cities, towns, boroughs, villages and townships — with fewer than 25,000 people or in rural areas. Planners and geographers might see regions as mega-units, but in fact, they are usually composed of many small towns and a far smaller number of larger cities. Indeed, among the metropolitan areas with more than one million residents in 2000, the average-sized city, town, borough, village or township had a population of little more than 20,000.

    Although local government consolidation and regional governance is all the rage in policy circles, most Americans seem content with a diverse, even fractured governmental structure. According to the 2002 U.S. Census of Governments, there were more than 34,000 local general-purpose governments with fewer than 25,000 residents and 31,000 local general-purpose governments with fewer than 10,000 residents (accounting, with rural areas, for 38 percent of the nation’s 2000 population). With so many “small towns,” the average local jurisdiction population in the United States is 6,200.

    Even in big metropolitan areas, citizens are often governed by small local institutions. People in Brecksville, Ohio (population 13,000), may tell their friends from far away that they live in Cleveland and residents of Woodway, Wash. (population 1,000), may claim to live in Seattle. But in reality their local governments are located not in the great City Hall downtown but in a usually quite modest nearby building.

    This large number of governments horrifies some organizations and people. Planners, the media and many often well-meaning local activists argue that local governments should be consolidated to eliminate waste and duplication. And so, in recent years there have been strong initiatives to force local government consolidations. Bigger, the argument goes, is usually better and more efficient — and certainly easier to cover if you are a journalist and to influence if you are a big business interest.

    Yet the reality is that the evidence rarely confirms the claims of greater efficiency. Both Pennsylvania and New York recently started initiatives to consolidate their governmental structures. They took to heart the usual mantra that there are thousands of governments in the state and that they must be consolidated to save money. In both states, the efforts were clothed in promises that local government consolidation would improve competitiveness relative to other states.

    However, the proponents never bothered to look at the data.

    We did, and the results were stunning. In both states, an equivalent “market basket” of spending was compared. In Pennsylvania, the largest local jurisdictions spent 150 percent more per capita than jurisdictions with between 5,000 and 10,000 residents (the figures include a per capita allocation of county expenditures, so that Philadelphia could be included, and exclude social service spending). The largest jurisdictions (those with more than 250,000 people) spent 200 percent more than jurisdictions with fewer than 2,500 residents.

    Moreover, it is not a matter of urban versus rural. In both the Philadelphia and Pittsburgh areas, there are literally hundreds of suburban jurisdictions that spent at less than one-half the per capita rate of the central cities.

    The story was little different in New York. The largest jurisdictions (those over 100,000) spent nearly twice as much per capita as jurisdictions with between 5,000 and 10,000 residents (the gap would have been even greater had it been possible to include New York City). The big governments spent even more (more than 150 percent more) compared with jurisdictions with between 1,000 and 2,500 residents. The differences were even greater within metropolitan areas, where smaller jurisdictions were even more efficient relative to the largest ones.
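    Because “X percent more” is easy to misread, a small sketch may help. The base figure below is a hypothetical placeholder, not a number from either report; only the ratios come from the comparisons described above.

        # Hypothetical base: per-capita spending of a 5,000-10,000-resident jurisdiction.
        base = 1_000   # dollars per resident (placeholder, not from the reports)

        pa_largest = base * (1 + 1.50)   # "150 percent more" means 2.5x the base -> 2,500
        ny_largest = base * 2.0          # "nearly double" means about 2x -> 2,000

        print(pa_largest, ny_largest)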

    Why should this be? Perhaps it’s the old, all too often neglected Jeffersonian principle of downscaling government closer to the people. Elected officials who know more of their constituents are likely to be more responsive to their needs. Too often the principal economies of scale that occur from municipal consolidations are economies of scale for lobbyists and special interests.

    Further, this small town governance structure is not limited to the United States. Metropolitan Paris has approximately 1,300 general-purpose local jurisdictions, more than any U.S. metropolitan area. Milan has more than 600. By comparison, Tokyo-Yokohama, the world’s largest metropolitan area, is a model of government consolidation, with more than 200 general-purpose governments.

    America’s small town government structure engenders a sense of community, even within larger metropolitan areas. It also saves a lot of money, principally because democracy tends to work better when government is closer to home. It is not surprising that, when given the chance, voters usually reject consolidation proposals.

    America needs both its small towns and its bigger cities. But make no mistake about it, even much of what we call a “metropolis” functions more effectively as a network of small towns.

    The view of Main Street, Bramwell, West Virginia was photographed by Sandy Sorlien as part of her twenty-year project, The Heart of Town: Main Streets in America.


    Resources:

    Report for the Pennsylvania State Association of Township Supervisors

    Report for the Association of Towns of the State of New York

    General-purpose governments by metropolitan area (2002)

    Wendell Cox is a Visiting Professor, Conservatoire National des Arts et Metiers, Paris and the author of “War on the Dream: How Anti-Sprawl Policy Threatens the Quality of Life.”

  • Heartland Infrastructure Investment Key to the Nation’s Growth

    By Delore Zimmerman and Matthew Leiphon

    Infrastructure investment in America is poised to jump to the front of the policy agenda over the next few years. With the election of the next President, new priorities and objectives are sure to be set on several key issues, including national infrastructure investment. Some of this will be addressed in a major new Congressional transportation funding bill that will include a push for all kinds of infrastructure.

    Infrastructure is one of the fundamental building blocks of economic opportunity, something increasingly recognized by pundits as well as political leaders in both parties. At NewGeography.com we hope to expand the discussion about infrastructure policy by examining its role in our communities, and exploring innovative new funding options for its provision.

    We have already looked at the history of infrastructure investment by focusing on the accomplishments of the New Deal. In the next few weeks we will examine current and future trends in infrastructure investment, both here and around the globe, and the fundamental role that infrastructure plays in promoting economic growth and driving innovation.

    Unlike earlier periods of infrastructure expansion, which were often uniformly national or regional in scope, today’s infrastructure needs related to economic development are often closely tied to the specific circumstances, resources, capabilities, and aspirations of the local economy. And, because federal resources alone will almost certainly be unable to meet skyrocketing needs, local and private resources must be mobilized to the greatest extent possible.

    One major initiative we are developing deals with the role of infrastructure in America’s Heartland, an often-overlooked, perhaps insufficiently understood part of our country’s economic landscape. Today’s Heartland is made up of thousands of rural small towns and hundreds of second and third tier cities scattered across America. They have deep roots in agriculture, forestry, mining or fishing, but many have made a steady and successful transition to a diversified economy that now includes strong, globally competitive manufacturing, energy or service industries.

    Heartland communities outside the major metropolitan areas possess many underutilized assets. These include relatively low housing costs and a good business climate, quality schools, a reasonably educated and productive workforce, and available land and other resources for expansion.

    More recently the resurgence of the Heartland can be traced to strong performance in the traditional pillars of small town and rural economies: food and energy. But as history shows, resource-based markets are often subject to the whims of global cycles that can rise and fall with little warning. The Financial Times recently noted the biggest drop in commodity prices in over 25 years, although from record highs. The drop points to the volatility of these markets and the risk of over-reliance on high prices in crops and livestock to keep the Heartland economies robust and growing.

    To avoid a return to what may be seen as the “commodity trap”, there needs to be a commitment to infrastructure that can help grow other sectors of the economy while also leveraging the commodity-based economy to best advantage. This includes standard infrastructure such as highways, airports, harbors, utility distribution systems, railways, water and sewer systems, and communications networks. New facilities to distribute energy resources to the rest of the country—including pipelines to supply the water necessary to propel both energy production and manufacturing—will also be needed.

    But we also see the need to pay attention to specialized infrastructure such as university and lab facilities, technology and training centers, multi-modal shipping facilities, and research parks. These infrasystems – integrated fusions of facilities, technology and advanced socio-technical capabilities – have emerged as key drivers of innovation and the locus of future higher-value industries and higher-paying jobs.

    Federal resources alone will probably not be available to meet these needs, as a 2006 GAO report concluded. For that reason, here and elsewhere around the world, cash-strapped governments are viewing private investment as an increasingly important piece of the infrastructure investment puzzle. Concurrently, banks, pension funds and other private investors are considering infrastructure as a new, long-term asset class that offers a combination of hard assets and visible long-term earning streams.

    This confluence of circumstances has given rise to a new set of private infrastructure funds that have attracted billions of dollars and euros from individual and institutional investors alike, moving beyond traditional equity investment and public utility bond issues into outright privatization of assets.

    The key question is whether the new private infrastructure investment vehicles will find their way to the Heartland or remain concentrated in the large metro areas like their venture capital counterparts. Smaller communities and second and third tier cities are, after all, often financially stressed because of a limited tax base, the high costs associated with size and scale, and difficulties adjusting expediently to population growth or decline.

    A possible solution lies in creating a Heartland Infrastructure Investment Bank. This institution would focus solely on infrasystem investments that would create higher-value opportunities in science and technology, manufacturing, energy and advanced services in smaller communities. The Bank would serve as a lead or secondary lender on projects of economic significance in the American Heartland and would be designed to leverage considerable co-investment from the private and public sectors.

    We first developed the concept of a development bank while working on a project for the Washington, DC-based New America Foundation. Now we are looking for practical advice from potential investors, communities and policy makers. Please help us build a better future for the American heartland.

    Delore Zimmerman is President of Praxis Strategy Group and publisher of NewGeography.com. Matthew Leiphon is a Research Analyst at Praxis Strategy Group.

  • The Social Function of NIMBYism

    Opposition to new development is fraught with so many acronyms that you need a lexicon to decode them. The catch-all term is NIMBYism, sufficiently well known to merit an entry in the Oxford English Dictionary, which identifies its first use in a 1980 Christian Science Monitor story. The term arose to describe opposition to large infrastructure projects undertaken by public agencies or utility companies, such as highways, nuclear power plants, waste disposal facilities, and prisons. (These are known as LULUs, or Locally Undesirable Land Uses.) It has now extended outward in concentric circles of opposition, each with its own acronym: NOTEs (Not Over There Either), NIABYs (Not In Anyone’s Backyard), BANANAs (Build Absolutely Nothing Anywhere Near Anyone), and even NOPEs (Not On Planet Earth!). It is also possible to find references to CAVE people (Citizens Against Virtually Everything) and NIMTOOs (Not In My Term Of Office).

    In any event, opposition to development has long since entered its second phase, targeting not just LULUs, but also ordinary development projects. It is now a standard feature of the development landscape, a form of ritual performance art. As a citizen activist and author of a NIMBY handbook unapologetically observes, “Everyone is a NIMBY, and no one wants a LULU.”

    The neighborhood meeting
    There are good reasons why NIMBYism is so pervasive (more about that later), but it is hard to witness firsthand, say at a neighborhood meeting about a proposed condominium project. First, people complain that they did not get notice of the meeting – yet they are in attendance, so what are we to make of that? Others, in voices quivering with outrage, air complaints that seem embarrassingly trivial to raise in public: the developer’s trucks are muddy, or the project description misspells the name of their street. General complaints emerge about neighborhood-wide conditions that are somehow now the developer’s responsibility to address. These throat-clearing denunciations are a way to limber up for the main event, which is to dismantle the actual proposal and its proponent in any way possible.

    The project-specific complaints follow familiar patterns too. The traffic in every neighborhood is, apparently, already intolerable, no matter what the transportation consultants say about “level of service.” The project will only worsen it, infringing upon residents’ inalienable right to uncongested streets.

    For large-scale urban projects, the second most prevalent objection is to building height, which often becomes the currency in which trades are made. For the neighbors, height is a signifier of all other impacts. For the developer, height is directly proportional to financial feasibility. So it rapidly becomes a zero-sum game, which in turn leads to gamesmanship. The developer leads with a proposal that is taller than needed, to have something to trade with; the neighbors come to understand and even expect this and accuse the developer of duplicity. Sometimes the developer overplays the opening hand by asking for a height that is deemed scandalous, thereby lighting a fire that can never be extinguished.

    A third leitmotif is view. Virtually all residents believe that the Constitution protects the view from every window of their homes. Sometimes the developer (or a public official in attendance) will note that views generally are not protected as a matter of property law or by zoning ordinance, but this only further inflames the aggrieved party. The neighbors often elevate their personal views and lifestyle preferences to universal policy imperatives and are incensed if public agencies do not vindicate them. They view public officials as complicit if they express support for the developer’s position, so the officials retreat to the sidelines until the combat subsides.

    Length of tenure in the neighborhood often shapes the neighbors’ advocacy. Longer-term residents will recite their credentials (“I was born and raised on _______ Street” or “I’ve lived here since ____”) to give their views more weight. Their opposition is often poignant: they seem to want to preserve their immediate surroundings in the condition in which they first encountered them, maybe in childhood. Newcomers, with the zeal of recent converts, are often the most vocal in resisting change to the neighborhood they have just discovered.

    Some projects attract attention from advocacy groups concerned about affordable housing, historic preservation, open space, waterfront access, or sustainable design, but most opposition comes from those with a close geographic interest. While issue-oriented advocates tend to be progressive in their politics, NIMBYs come in every political stripe. Some are progressives who see their advocacy as a form of environmental protection they are bestowing on their unempowered neighbors. Some are middle-class burghers protecting the safety and stability of the neighborhood. Even libertarians justify opposing development as an infringement on their right to be left alone. It is rare to encounter vocal neighbors whose political views or personal values counteract the visceral sense that their very way of life is being threatened. Nobody, it seems, is precluded from principled opposition, no matter what their principles are.

    The overwhelming majority are homeowners. If the project includes large-scale retail uses, neighborhood business owners will join the chorus. Renters rarely feel they have enough at stake, and those outside of the project’s zone of immediate impact may show up to the first meeting or two out of curiosity and then stay home, letting their more vigilant neighbors continue the fight. So eventually, the field is left to the opponents, and the most strident voices prevail. Public hearings become forums for the chronically aggrieved; in an increasingly fragmented culture, they are what pass for community.

    Ironically, while the neighbors often feel helpless against sinister forces, developers lament how influential the neighbors are, even when their complaints are not factually accurate or are not encompassed within the zone of interests protected by the land use regulation at issue. The result is a shrill, dysfunctional, and seemingly interminable public conversation. Nobody seems to learn anything from the last experience, so it gets repeated with each new project. As planning shrinks in importance as a means to establish advance consensus about growth, the public approval process has become the crucible in which cities are actually shaped, one project at a time, in the most laborious way imaginable. How did it get this way?

    Historical roots of NIMBYism
    NIMBYism is, in a sense, just a modern manifestation of the larger phenomenon of civic engagement, a part of our national foundation narrative remarked on by observers as far back as de Tocqueville. Its contemporary expression arose in the second half of the 20th century, and each decade has made its own contribution to its rise. After the complacent prosperity of the 1950s, the 1960s saw the rise of citizen activism, exemplified by crusaders such as Ralph Nader, Saul Alinsky, and Jane Jacobs. The epic battle between Jacobs and Robert Moses over urban renewal in New York City (whose battle lines are still being redrawn) was based not only on their divergent views of how cities work and how to accommodate their changing physical needs, but perhaps more fundamentally on how such decisions should be made, with Jacobs pioneering citizen activism as an antidote to “top-down” planning and development decisions.

    In the 1970s, environmental activism re-framed growth not just as an engine of progress but as a competition for resources and, most importantly, as a potential threat to human health and natural systems. Educated opinion leaders began to distrust technology and fear the future. This spawned landmark federal legislation, including particularly the National Environmental Policy Act, which in turn spawned many state and, eventually, local impact review programs. These institutionalized two important ideas: (1) that development projects are generators of impacts that must be assessed before development decisions are made, and (2) that citizen participation is an essential component of such reviews. Along with other protections for wetlands, endangered species habitat, air and water discharges, and, in particular, comprehensive zoning administered by local officials who are obliged to be responsive to citizen participation, these regulatory processes have created many more opportunities for public involvement. If, as its detractors say, NIMBYism is the last lawful blood sport, it is one which is not only permissible but explicitly encouraged by our legal system.

    The 1980s was the decade that gave birth to NIMBYism as an art form – its break-out decade. By glorifying private initiative and sowing distrust of government, the era saw the rise of grass-roots anti-development activism as a necessary counterbalance not only to government-initiated projects like prisons and highways, but particularly to private initiatives unrestrained by government. In the 1990s, the environmental justice movement, aiming to ensure that low-income communities are not disproportionately burdened by high-impact uses, added a progressive arrow to the NIMBY quiver. Also, revenue-strapped local governments began to pursue privatization strategies in earnest, creating additional impetus for citizen activism as a counterbalance.

    Finally, the first decade of the 21st century has established that the shift from manufacturing to “knowledge” as the driving economic force in American cities, and the reurbanization and gentrification this has spawned, have actually increased land-use friction in many ways. It is inherently contentious to graft new uses onto existing urban tissue, where there are so many incumbents whose rights are affected. Those incumbents, especially if they are knowledge workers, tend to have wider awareness of the power of the built environment to affect their own quality of life, have a reduced tolerance for development impacts, and are sophisticated about using public processes to vindicate their concerns.

    The scale of new development projects and our ability to measure their impacts have also increased over time. As the burgeoning land use regulatory regime has gradually supplanted planning, the effectiveness of public agencies in establishing publicly accepted templates for growth has also diminished. Perhaps more importantly, we have come increasingly to rely on private actors to build public infrastructure as a component of their large-scale development projects.

    These factors combine to almost mandate wider citizen participation in development decisions. While civic engagement may be dwindling generally, it has undoubtedly risen in the development arena. Filling the vacuum left by minimalist government, atrophied land-use planning, and an eroding social contract, NIMBYism is the bitter fruit of a pluralistic democracy in which all views carry equal weight.

    Emotional roots of NIMBYism
    NIMBYism starts with identification with one’s personal surroundings. “A sense of place” is of course not a bad thing, but it spawns a deep-seated resistance to changes to those physical surroundings. This at first seems based on a boundless sense of personal entitlement: People not only place their own needs above the public interest but come close to reframing the public interest as a social organization that vindicates their personal needs. No individual wants to accept the incremental burden of meeting a broader societal need. General reciprocity—the notion that I will accommodate your wishes because I know that someday I will want someone somewhere to accommodate mine, and that’s the only way society can move forward—has been replaced by specific reciprocity: I will accede to your wishes only in exchange for your agreement to accede to my wishes (for compensation or mitigation). So a general lubricating agent in society devolves into a series of negotiated (usually contentious) transactions.

    But, on closer inspection, this is not so hard to accept. Most people experience the world as an increasingly complex and bewildering place where most issues are well beyond their ability to influence and the pace of change seems to accelerate continually. This powerlessness leads them to yearn to control those things that they can, their immediate home and neighborhood being foremost, and to be tetchy if their efforts seem fruitless. But it is important to recognize that NIMBYs are not just reflexively change-averse; the average American will move twelve times in a lifetime, and, according to the National Association of Homebuilders, Americans have been spending record amounts on home renovation projects in the recent housing boom. The environmental change that NIMBYists resist is change imposed by others. This crucial distinction underlies virtually all opposition to development.

    It is also important to recognize the role of increases in homeownership – clearly a good thing insofar as it engenders economic security, personal autonomy, and community investment. As I noted above, the overwhelming majority of project opponents are homeowners. Their home is often their largest financial asset, unprotected by diversification of risk and subject to changes in value due to neighborhood circumstances over which the homeowner has little control. While many homeowners frame their objections to development with references to larger issues such as traffic impacts on the neighborhood or broader environmental consequences, if you listen carefully, there is an implicit calculus at work: will this project tend to increase or diminish the value of my house? Since the answer to the question is often unknowable or at least not commonly understood, homeowners’ rational impulse to protect their investment shades into an irrational fear of the unknown. As behavioral economists have demonstrated, most people fear losses more than they value gains, even reasonably certain ones. So it is only natural to fear the risk of the unknown more than you value the uncertain benefits of unasked-for change.

    Even when any reasonable calculus would suggest that property values would rise, say from a neighboring luxury residential development, neighbors want to capture that rise in value without suffering any impacts. After having been encouraged time and again by advertisers and public office seekers to expect greater benefits than the incumbent brand or officeholder offers without making any personal sacrifice and having been told by personal-injury lawyers that every wrong has a remedy, they have come to expect a kind of immaculate conception: zero-impact development. This is especially true when the neighbors are suburban migrants who have returned to urban neighborhoods in search of vitality and edginess, but, being used to their large-lot, single-family house, they find the density of urban life jarring. They want to lose the isolation of suburbia but not the insulation it provides from their neighbors.

    The fact is that the benefits of development in the form of jobs, real estate tax revenue, or housing production are diffuse and general, but the impacts are specific and local. Satisfying housing demand by building a new apartment or condominium building is inarguably a public good, but its tiny incremental benefit to the abutting homeowner is just as inarguably outweighed by its increased traffic, noise, and other impacts. In a world in which personhood is paramount, it does not warrant support from abutters.

    Constructive engagement
    So where does this leave us? Though shrill anti-development sentiment is beginning to seem quaint at a time when planners, public officials, and even project proponents are more likely to embrace the ideas of Jane Jacobs than those of Robert Moses, expecting public officials to “slay the NIMBY dragon” by standing up to their constituents is naive. Stealth, misinformation, bullying, and a host of other stratagems employed by frustrated project proponents generally backfire, deepening the sense of barely submerged outrage that fuels NIMBYism in the first place. It is possibly worthwhile and certainly laudable to work to rebuild public consensus about the broader societal benefits of development, but this is at best a long-term effort.

    In the meantime, any effective solution to NIMBYism must address its root causes: perceived powerlessness and actual impact risk. The time it takes to resolve development controversies is, in some measure, the time it takes abutting homeowners to evaluate the risk to their largest investment and adjust to impending changes to their personal domain imposed by others. This laborious process can be shortened by measures that address those root causes: control and compensation. First, although it goes against the grain of every project proponent’s deepest instincts, the neighbors must be invited to actually influence development outcomes, within the bounds of feasibility, in order to overcome their sense of oppression. Ceding some measure of control over the design of the project eliminates the “zero-sum game” negotiation that characterizes most approval processes. It often leads to creative solutions, and it empowers the problem-solvers and constructive participants more than the extremists.

    A second element is compensation. Every project has impacts, and most fall disproportionately on an identifiable subset of people within a narrow geographic radius, who generally believe, whether they state it publicly or not, that they are entitled to some special consideration for allowing some broader social need to be met at their expense. Since government often cannot or will not play this role, it falls to the project proponent to do so. Such compensation is generally indirect: the improvement of a neighborhood park, a contribution to a local charity, support for neighborhood crime watch efforts, and the like. It is better if there is some connection between the compensation and the area of impact (e.g., owners of the tall building that will shadow a park will contribute to park maintenance, owners of the fast food restaurant will augment the neighborhood litter patrol). Reasonable people could differ about whether it’s fair, but some specific benefit is generally necessary.

    NIMBYism serves many social functions. In an improvised and very democratic way, it forces mitigation measures to be considered, distributes project impacts, protects property values, and helps people adjust to change in their surroundings. It is a corrective mechanism that, if allowed to function properly, can even help to preserve a constituency for development. We owe the continued existence of many memorable places, from Washington’s Mt. Vernon to the Cape Cod National Seashore, to the efforts of past NIMBYs. In fact, if the forces that animate NIMBYism – attachment to place, increases in homeownership, and public participation in government decision-making – were waning, we would lament this more than we now bemoan NIMBYism. Though it’s not so easy to do, the only constructive approach is to accord development opponents the presumption of good faith and to engage with them. If it helps, think of it as Jane Jacobs’ revenge.

    This article originally appeared in Harvard Design Magazine, Spring/Summer 2008.

    Matthew J. Kiefer is a partner in the Boston law firm of Goulston & Storrs, P.C., practicing in the area of real estate development and land use law. A 1995-1996 Loeb Fellow at Harvard University, he teaches a course on the development approval process at the Harvard Graduate School of Design and is an active board member of private non-profit open-space, preservation, and design organizations.

  • Excavating The Buried Civilization of Roosevelt’s New Deal

    A bridge crashes into the Mississippi at rush hour. Cheesy levees go down in New Orleans and few come to help or rebuild. States must rely on gambling for revenue to run essential public services yet fall farther into the pit of structural deficits. Clearly we have gone a long way from the legacy of the New Deal.

    The forces responsible for this dismantling are what Thomas Frank calls “The Wrecking Crew,” the ideological (and sometimes genealogical) descendants of those who have waged war against Franklin Roosevelt’s New Deal since its birth 75 years ago. Few today articulate any vision of what Americans can achieve together because “the public” is the chief and intended casualty in that long war.

    Those whom the mass media routinely refer to as conservatives know themselves better as counterrevolutionaries against what FDR wrought. Ronald Reagan proclaimed that government is the problem as he made it so. Almost two generations after President Roosevelt’s death, President Reagan and his conservative surrogates counted on the amnesia of Americans who knew little about what the New Deal did, and what it still does for them, in order to undo parts of its legacy.

    I was not much more enlightened when I began what became the California Living New Deal Project four years ago. I thought that — with a generous seed grant from the Columbia Foundation — photographer Robert Dawson and I could document the physical legacy of the New Deal in California. Since the New Deal agencies were all about centralization, I thought, I would find their records neatly filed back in Washington at the National Archives and Library of Congress. I was wrong on all counts.

    I discovered, instead, a strange civilization buried beneath strata of forgetfulness, neglect, and even malice seventy-five years deep. Aborted by the Second World War and FDR’s sudden death, then covered with the congealed lava of the McCarthy reaction, the half dozen or so agencies that had created the physical and cultural infrastructure from which grew America’s post-war prosperity left few accessible records of their collective accomplishments. So many-pronged and multitudinous was the Roosevelt administration’s onslaught upon the Depression that even FDR’s Secretary of the Interior and head of the Public Works Administration (PWA), “Honest Harry” Ickes, admitted that he could not keep track of it all.

    With the help of hundreds of photographs scanned at the National Archives and other collections, journal articles of the period, historical surveys, mimeographed WPA reports, as well as local historians and other informants, an indispensable matrix of public works was revealed to me. Most of our urban airports and rural airstrips, it now appears, began as projects of the WPA and CCC (Civilian Conservation Corps), while California’s many community colleges are similarly New Deal creations. (Between two illustrated talks I recently gave to large audiences at Santa Rosa Community College, Professor Marty Bennett led the first New Deal tour of a campus almost entirely built by Ickes’ PWA.) Committed to public education in all of its manifestations, the WPA and PWA built and expanded literally hundreds of schools throughout the state to replace older buildings that were seismically unsafe or inadequate, or to serve communities that had no schools at all. Most are still in use.

    The availability of plentiful and cheap labor as well as PWA grants and loans made the Bay Area one of the most desirable regions in the country by giving it a vast network of public parks and recreational areas. A WPA report on that agency’s accomplishments in San Francisco noted that WPA workers had improved virtually every park in the city: that now appears to be true of most older towns in California where federally employed workers left a legacy of handsome stonework, public stadia, tennis courts, golf courses, swimming pools, baseball diamonds, and restrooms but few markers. Other federal employees built a network of all-weather farm-to-market roads enabling growers to get fresh produce to towns and tourists to visit every corner of the state. Still others completed and expanded public water supplies and electrical distribution systems as well as sewage treatment plants that, for the first time, ensured the majority of Americans safe and plentiful drinking water.

    As the scale and extent of that often forgotten civilization grew ever larger, cataloging and mapping it fast outpaced my organizational and technical skills. With the joint sponsorship of the California Historical Society, the California Studies Center, and the Institute for Research on Labor and Employment (IRLE) at UC Berkeley, the California Living New Deal Project morphed into an unprecedented collaborative effort to use informants throughout the State to inventory and map what New Deal agencies achieved and to suggest what might have been. In particular, I am grateful to the IRLE Library whose staff maintains and continually expands the CLNDP website with input from research assistants and informants.

    The Roosevelt Administration and those it brought to Washington envisioned a collectively built America whose immense productive capacities could benefit all. A profusion of splendid public spaces such as Mount Tamalpais State Park’s Mountain Theater and the Santa Barbara Bowl would, they believed, make citizens and community of a polyglot populace. Together with a plethora of well-built public schools, libraries, post offices, parks, water systems, bridges, airports, hospitals, harbors, city halls, county courthouses, zoos, art works and more, New Deal initiatives spread the wealth and enriched the lives of uncounted Americans.

    In his 1944 State of the Union address, FDR’s firm and confident voice enunciated the need for a second bill of economic rights that would ensure everyone a modicum of freedom, a freedom that his country promised but so often failed to deliver. If extended worldwide, Roosevelt suggested, that Bill of Rights could short-circuit future wars such as the one still raging as he spoke. “Necessitous men are not free men,” he told the nation, naming a condition that still afflicts the vast majority of people today.

    Gray Brechin is a Visiting Scholar at the U.C. Berkeley Department of Geography and the Project Scholar of the California Living New Deal Project. He is the author of “Imperial San Francisco: Urban Power, Earthly Ruin” and, with photographer Robert Dawson, “Farewell Promised Land: Waking from the California Dream.”

  • Public Investment, Decentralization and Other Economic Lessons from the New Deal

    The first lesson to be learned from this earlier era is that a large middle class requires an economy that generates a broad base of jobs paying middle-class wages. The New Dealers were not opposed to “rigging” the labor and financial markets to achieve this result. New Deal progressives believed the economy should exist to serve society, not the other way around, and that the government has a duty to shape the economy to meet middle-class aspirations. A high-wage, middle-class society would, in turn, be good for the economy: living wages would not only ensure adequate demand for the economy but in so doing would spur new investment and productivity growth, creating a virtuous circle of rising living standards.

    The belief of New Deal progressives in an economy that could create good middle-class jobs stemmed in part from their resistance to large social welfare subsidies to individuals, on the grounds that this would encourage an unhealthy dependence on the state. Moreover, even though they favored progressive taxation, New Dealers were skeptical of a society dependent upon the permanent redistribution of income. The principal goal of many New Deal programs was not to relieve the conditions of poverty – although they often did so – but to build physical and human capital that would allow people to escape permanently from poverty.

    Thus New Dealers emphasized government programs that expanded education, spread property ownership, invested in America’s common physical and knowledge capital, and seeded the industries of the future. The New Deal was not perfect, in large part because it preceded the civil rights revolution and thus left out millions of African-Americans, but it did build the largest and most secure middle class America has ever known.

    Today we see the consequences of a much different way of thinking about the economy and society. Over the past two decades we have been told that globalization is an immutable force and that we must bend to its demands, embracing the agenda of free trade, financial deregulation and less progressive taxation. The best we can do, we’re told, is to let globalization run its course and compensate the losers, even though no amount of new social welfare measures could compensate for the loss of millions of good-paying manufacturing jobs. Thus, without any real debate, America’s political elites have chosen for us a highly stratified, low-wage society with great costs to our middle-class way of life and to our productive economy.

    The second New Deal principle concerns how to achieve a high-wage economy while at the same time distributing more widely the capital and skills for wealth creation. The principal policy tool the earlier generation used was massive public investment and public building. The public investment programs they pursued not only created many new middle-class jobs but also laid the foundation for a more productive economy, which led to even more middle-class jobs.

    Agencies like the Tennessee Valley Authority in the 1930s and ’40s were followed by even more extensive public investment initiatives in the postwar years. From 1950 to 1970, the government spent more than three percent of GDP on public infrastructure alone. It built everything from highways to schools, power systems to parks.

    Throughout the New Deal era, public investment was America’s way of enacting industrial policy. It was understood that public investment paid for itself many times over. The GI Bill alone generated returns of up to $7 for every dollar invested. And because it generated returns to the economy and society, New Dealers in the postwar period were not afraid to raise taxes or to borrow in order to ensure adequate levels of public investment. And borrow they did, even though the national debt was a much larger percentage of GDP than it is now.

    For the past few decades, however, we have made a very different choice. As concerns over the budget deficit have grown, and as tax-cutting mania has taken hold, we have cut back on public investment. Since 1980 we have devoted less than 2 percent of GDP to public infrastructure and have allowed federal spending on basic research and development to decline as a percentage of GDP as well. As a result, a backlog of public investment needs – clogged roads and ports, collapsing bridges and levees, uneven broadband access, an antiquated air traffic system, an undersized energy infrastructure – has begun to cut into our economic growth and undermine our efficiency.

    A third principle of middle-class America that the New Deal offers us relates to the concentration of power and capital. Earlier progressive reformers distrusted such concentrations. Not only did they threaten democracy, they also warped the economy and distorted consumption and investment. Government therefore must be a strong countervailing force to big business and oligarchic power, and must be organized so that it cannot be captured by one economic group at the expense of another or the general public.

    The New Dealers were particularly concerned about the power of Wall Street and the financial community. They feared a national credit system that was dependent on Wall Street bankers, whose interests were not always aligned with the needs of homeowners, farmers and small and medium-sized producers. They therefore sought to democratize capital by creating myriad credit institutions that would ensure that all regions and sectors of the economy had access to capital. They created a variety of federally subsidized credit programs to enable people to construct homes and start businesses and to allow states and municipalities to build schools and modernize infrastructure. It was here that the New Deal was most creative – combining a strong federal state with the local and regional decentralization of capital and the local and regional control of these programs and institutions.

    As with other first principles of a middle-class America, we have seen a reversal of priorities over the past few decades, as big financial institutions have again asserted their influence over the economy and economic policy. The new power of Wall Street has been evident in its successful push for financial liberalization and deregulation, in the emphasis accorded the deficit and concerns about inflation as opposed to full employment, and until recently in Washington’s preference for a strong dollar, which favors financiers over real producers. This triumph of Wall Street over Main Street has been responsible in part for the hollowing out of the tradable-goods sector and for the asset bubbles and predatory lending that have wreaked havoc on the economy. Indeed, one of the first things the New Deal would have us do is re-regulate the financial system and put the interests of the productive economy over those of Wall Street.

    In all these respects, whether it be high wages, public investment or the decentralization of financial power, the New Deal succeeded because it changed the way the economy worked. And it did so by marrying progressive reforms with Americans’ preference for independence, whether from government subsidy or big-business paternalism. This is the enduring lesson of the New Deal.

    Sherle Schwenninger directs the New America Foundation’s Economic Growth Program and the Global Middle Class Initiative. He is also the former director of the Bernard L. Schwartz Fellows Program.

  • The New Deal at 75: An Inspiration, Not a Blueprint

    Whatever their political perspective, Americans should admire the New Deal for, if nothing else, its ambitious agenda. In a way unparalleled in the 20th century, the New Deal left us a legacy of achievement – one that we can still see in big cities like San Francisco and small towns like Wishek, North Dakota.

    The great genius of the New Deal lay not in ideology but in its pragmatism and practicality. People were out of work so it created jobs. The country’s infrastructure, particularly in the rural areas, was primitive, so it took on the task of modernization.

    In some ways, this paralleled what was also being done under the Communists in the Soviet Union as well as under Fascists in Italy and under the National Socialists in Germany. This has led some conservatives, such as “Liberal Fascism” author Jonah Goldberg, to conflate the New Deal legacy with fascism. But this assertion is belied by the fact that we still live under a democratic and liberal political structure, one that by the 1980s had turned to oppose much of that legacy.

    Yet I believe that even Ronald Reagan – himself once an avid New Dealer – would admit that the New Deal did much to expand America’s middle class. It did so not by promoting redistribution and welfarism or by moral cajoling – characteristics Mike Lind identifies with the more elite Progressives – but by practical actions that gave people the tools with which to build their own individual prosperity.

    Economically speaking, it is also true that the New Deal failed to recreate prosperity (at least until the onset of the Second World War). But it cannot be denied that it literally brought light to large parts of the country – particularly the Southeast and the rural Great Plains – bringing them into the 20th century. Among the New Deal’s great accomplishments, as Andy Sywak discusses, are its public works. A partial list of these accomplishments includes:

    • 22,428 road projects
    • 7,488 educational buildings
    • Over 7,000 sewer, water and other public buildings
    • Over 3,000,000 workers employed, who helped support 10,000,000 dependents
    • 125,000 engineers, social workers, accountants, superintendents, foremen and timekeepers employed in every state and community

    Ultimately, notes scholar Jason Scott Smith, the New Deal intimately touched the lives of more than fifty million people out of a total 1933 U.S. population of 125 million. Yet its legacy went well beyond the Roosevelt years, extending from Roosevelt and Truman all the way to Eisenhower, Kennedy, Johnson and, to some extent, even Richard Nixon.

    As Sherle Schwenninger points out, the New Deal created the basis for the great, and widely shared, national prosperity of the post-war period. Through infrastructure spending, housing programs, the GI Bill and government-funded scientific research, the New Deal directly and indirectly helped make the United States the premier power on the world scene and by far its strongest economy.

    America remains the preeminent country in the world, but there is a widely held belief that this status is slipping as other countries – China, Russia, Brazil, India – enact what amounts to their own New Deals. Our once vibrant middle class is under siege, our infrastructure is aging, and even “progressives” seem more interested in promoting avant-garde cultural values than in economic growth, upward mobility or maintaining technological excellence. Even in the field of conservation, a core value of the New Deal and progressive traditions, the focus is increasingly less about preserving resources and open space for people, and more about how to preserve and insulate nature from the ill effects of human, carbon-based life forms.

    Yet if we can be inspired by the New Deal, we cannot simply repeat it. For one thing, our crisis today is less palpable and immediate, making it all but impossible to mobilize resources in the same way. At the same time, the public sector, small at the onset of the New Deal, has already swollen to gargantuan size. The power of organized public employees, largely a non-factor in the 1930s and 1940s, threatens any government initiative by siphoning off too many local and federal resources through often extravagant demands in everything from salaries and work rules to pensions.

    This can be seen in the morphing of the New Deal legacy in large cities, including the greatest of all, New York. Under Mayor Fiorello La Guardia, a maverick Republican of the Theodore Roosevelt stripe, the city built new parks, playgrounds, swimming pools, roads, and sanitation systems with an almost messianic fervor. At one time, New York City was receiving one-seventh of all funds disbursed by the Works Progress Administration (WPA).

    Yet La Guardia’s expanded city government, notes Cooper Union historian Fred Siegel, still operated under an efficiency-oriented progressive administration. La Guardia and his parks commissioner, Robert Moses, fired political appointees and dismissed incumbents, leading some public employees to identify him with the Italian dictator Mussolini. Rejecting narrow ideology, La Guardia famously claimed: “There is no Republican or Democratic way to clean streets.”

    La Guardia’s successors, in New York and elsewhere, did not stick to this moral and administrative rigor. The share of government workers in New York’s workforce expanded from 10 percent in 1950 to over 17 percent by the 1970s, but with increasingly little accountability. If a new New Deal means a large expansion of the unionized public workforce, in New York or elsewhere, it will be largely doomed.

    So as we admire the achievements of the New Deal, we also need to keep in mind the shortcomings that grew out of its success. That we need a new, powerful commitment to infrastructure and economic growth is beyond doubt, but in pursuing it we need to make sure it does not serve primarily the public-employee lobbies and the well-organized rent-seeking private interests.

    New solutions, such as tapping abundant capital resources from both here and abroad, need to be tried out. And given the overconcentration of power already in Washington, and the spread of technical expertise to states and regions, a greater emphasis on locally based initiatives may work better this time around.

    Yet in the end, America still requires some form of broad initiative to overcome its current doldrums. That will take the same bold, innovative and pragmatic spirit that characterized the New Deal and that, three quarters of a century later, remains its most useful legacy.

    Joel Kotkin is the Executive Editor of www.newgeography.com.

    Other New Geography New Deal articles:

    The New Deal & the Legacy of Public Works
    New Deal Investments Created Enduring, Livable Communities
    Progressives, New Dealers, and the Politics of Landscape
    Public Investment, Decentralization and Other Economic Lessons from the New Deal
    Emerald City Emergence: Seattle and the New Deal
    Excavating The Buried Civilization of Roosevelt’s New Deal

    Other New Deal sites:

    New Deal Network (sponsored by the Franklin and Eleanor Roosevelt Institute)
    New Deal Cultural Programs
    California’s Living New Deal Project

  • The New Deal & the Legacy of Public Works

    Almost completely ignored in the press this year has been the 75th anniversary of the New Deal. Social Security, public housing, school lunches, deposit insurance, labor relations standards and banking regulations are among its many enduring legacies. On this anniversary, it is worth looking at the public works programs that constructed roads and buildings that still exist in every county in America.

    In a nation where a quarter of the adult population was unemployed, the immediate goal of the New Deal was to provide temporary relief for Americans who were destitute and put them back to work. The failure of the Hoover Administration to either curtail the Depression or inspire people created a political climate for dramatic action.

    During FDR’s first 100 days – called the “First New Deal” by historians – a truly impressive list of legislation was passed. Prohibition ended; the Tennessee Valley Authority was created, eventually bringing electricity and development to an impoverished area of the South; and controls were placed upon industrial practices, Wall Street, labor relations and farm output. The Civilian Conservation Corps, which ended up planting two billion trees across the country, was founded. A historian would be hard pressed to find a more energetic first 100 days of any administration.

    Yet among its most far-reaching accomplishments were the Federal Emergency Relief and National Industrial Recovery acts, which created the bureaucracy to institute public relief by funding large-scale public works. Under the system, states applied for grants from the federal government. Over the next ten years, the government would spend nearly $9 billion through the Civil Works Administration (CWA), Public Works Administration (PWA) and the Works Progress Administration (WPA).

    The depth of the Depression and the social unrest it created provided motivation for New Deal officials to act quickly and decisively. The official at the center of the action was Harry Hopkins. A hyper-competent social worker who had created a program to deliver services to mothers with dependent children in New York City and founded the American Association of Social Workers, Hopkins jumped into his role as head of federal relief with tremendous vigor. After a five-minute meeting with Roosevelt on his first day of work in May of 1933, he was dispatched to a cockroach-infested building on New York Avenue where, by the end of the day, he had disbursed $5.3 million in aid to eight states. In a year’s time, Hopkins had created a jobs program that spent a billion dollars and provided badly needed jobs to over three million people during the cold winter of 1933 (the average wage was $13 a week). He spent money quickly – perhaps too quickly, some maintained – but his focus was to respond to FDR’s demand to quickly create jobs and alleviate misery in the country.

    But Hopkins was not a welfare statist. His career as a social worker had taught him that individuals did not want to be “on the dole,” living off the largesse of the state. By finding work for unemployed breadwinners, Hopkins believed he could keep families strong and enable them to retain their pride despite the hard times.

    This psychological aspect should not be underestimated. The Depression was more than a huge decline in GDP, vast unemployment and lost industrial output – it was a great identity crisis for a nation that placed great value on self-sufficiency and self-reliance. Look at New Deal art (another of the New Deal’s achievements is the wealth of beautiful murals, created by government-funded artists, that still exist) and you will see a glorification of labor. Frescoes from San Francisco to New York depict colorful scenes of men hard at work.

    Today bureaucrats stress cost-effectiveness ratios, but New Deal reports were most concerned with how many jobs a project provided. Conservative critiques of the New Deal’s mixed record on economic growth often miss this critical point. The official report of WPA projects in San Francisco, for example, lists as its main achievement how “the program contributed to the continuance of the normal standards of living of the working man’s family in San Francisco and maintenance of the courage and morale of the ordinary citizens through a most distressing period.” Expenses for projects are listed not just in dollar amounts spent but also in the number of “man hours” provided to workers.

    When Roosevelt ran for re-election the first time in 1936 (“Four Years Ago and Now” was his campaign slogan), he could claim six million jobs had been created in the previous three years. He could point to a doubling of industrial output and the creation of a Farm Credit Administration that on an average day saved 300 farms from foreclosure. Still, eight million people remained out of work in 1936, and the public works programs, historically audacious as they were, did not solve many of the nation’s entrenched economic and social problems. Roosevelt himself did not want his public works programs to compete with private industry or to create dependency on the state.

    Yet, looking back at the WPA and its companion public works agencies, the list of lasting contributions to the nation’s infrastructure is indeed impressive: 78,000 bridges, 650,000 miles of roads, 700 miles of airport runways, 13,000 playgrounds, hundreds of airports and 125,000 military and civilian buildings. The roads and public works constructed by the WPA and PWA ended up being lasting infrastructure investments.

    However, perhaps the New Deal’s most enduring achievement was creating a sense of unity at a time of unparalleled economic crisis. Whereas the nation had previously elevated Horatio Alger-style self-reliance, the New Deal tapped into the creative industrial potential both of common unskilled laborers and of thousands of skilled and creative workers. It created a sense of pride among millions who for the rest of their lives could point to public buildings they helped design and build, as well as the roads they laid out and paved.

    The 1930s produced the Hoover and Grand Coulee dams, the Golden Gate and Bay bridges, La Guardia Airport and the San Antonio River Walk. Besides some luxury high-rises, high-tech sports stadiums with retractable roofs and edgy art museums, what great things have we achieved lately?

    Andy Sywak is the articles editor for Newgeography.com.

  • Progressives, New Dealers, and the Politics of Landscape

    One of the greatest ironies of our time is the fact that today’s leading progressives tend to despise the very decentralized landscape that an earlier generation of New Deal liberals created.

    Franklin Roosevelt and his successors from Harry Truman to John F. Kennedy and Lyndon Johnson sought to shift industry and population away from the crowded industrial centers of the Northeast and Midwest. They did this through rural electrification based on hydropower projects, factories supplying the military, and federal aid to citizens seeking to buy single-family homes in low-density suburbs.

    This is precisely the environment – which brought so much opportunity and improved living conditions to so many – that today’s progressives so often despise. Since the 1960s, environmentalists, for example, have waged a campaign against the great dams that symbolized New Deal economic development policies. Artificial lakes that generate electricity for millions of suburban homeowners and businesses, and have brought an end to devastating, cyclical floods, are condemned by progressives for having wiped out local fauna and flora. And it goes without saying that the middle-class swimmers, picnickers and motor-boaters that enjoy government-created lakes on weekends are… well, vulgar.

    Similarly, the defense plants that the Roosevelt, Truman and Kennedy-Johnson administrations scattered throughout the country are often lambasted as emblems of the fascistic “military-industrial complex,” part of a wicked “Gun Belt.” In fact, industry is increasingly seen as undesirable by today’s Arcadian progressives, who appear to believe that it would have been better to leave the farmers of rural America as quaint specimens of authentic folk life.

    But nothing riles the progressives of today more than the low-density, single-family-home suburbs made possible by New Deal liberal homeownership policies. Since the 1950s, intellectuals on the left have been bemoaning the alleged cultural sterility and conformity of the suburbs. Now anti-sprawl campaigners allege that the suburbs are also destroying the planet.

    So the question is: How did the American left, in a short period of time, come to repudiate the New Deal and the American landscape it created? The answer is simple: today’s center-left, which calls itself progressive rather than liberal, is not the heir of New Deal liberalism. It is the heir instead of early twentieth century elite Progressives, who were shoved aside and marginalized during the heyday of New Deal liberalism.

    The original Progressives were overwhelmingly professionals and patricians of old Anglo-American stock in the Northeast and Midwest, many of them the children of Protestant clergymen, teachers or professors. They despised the nouveau riche of the Gilded Age, but also tended to view European immigrants and white and black Southerners as benighted primitives.

    Their vision of the ideal society, influenced by the Hegelian Idealist culture of Bismarckian Germany, was one in which a university-trained elite ran everything with minimal interference by ignorant voters and crass politicians. As heirs of the moralistic Northern Protestant Whig and Republican traditions, these Progressives also had a strong interest in the social engineering of private behavior, from prohibition to eugenic sterilization.

    From Reconstruction until the Depression, Progressive moralism and elitism alienated European immigrants and rural Southerners and Westerners alike. This benefited the industrial capitalists of the dominant Republican party. Franklin Roosevelt created a powerful, but fundamentally unstable, Democratic majority by adding many former Republican Progressives to the old Democratic coalition of Northern white “ethnics” and white Southerners.

    Yet in the process Roosevelt helped undermine many of the signature initiatives of the Progressives, starting with the repeal of Prohibition, a policy loathed by German and Irish Catholic voters. The repeal signaled a repudiation of the Whig-Republican-Progressive ambition to use the federal government for moral reform and social engineering. (FDR’s appeasement of Southern segregation had a similar tactical logic.)

    Another goal of the Progressives, economic planning, died with the collapse of the National Recovery Administration (NRA) in the first Roosevelt term. Jettisoning the Progressive dream of a planned economy run by technocrats, the Roosevelt administration instead focused pragmatically on state-capitalist public infrastructure projects like the Tennessee Valley Authority (TVA) and the Lower Colorado River Authority (LCRA).

    Plans for an all-powerful executive civil service subordinate to the White House – a progressive reform that FDR unwisely favored – were rejected by a Congress jealous of its prerogatives and suspicious of executive power. Finally, nanny-state supervision of the poor, another Progressive theme, found little sympathy among New Deal Democrats, who preferred universal social insurance to means-tested public assistance, and preferred employing the able-bodied poor in public works to what FDR called “the narcotic” of the “dole.”

    The New Deal ultimately left little of the old Progressive project standing, but it created what could be considered a Golden Age for the white lower-middle-class majority, one that lasted until the 1970s. Progressive intellectuals and activists, however, sensed that they had been marginalized. Over-represented in the prestige press and the universities, they increasingly denounced what they saw as the vulgarity of the New Deal’s constituency.

    The assault on the suburbs was one of the most powerful expressions of this discontent, and it was led by two figures. One was Jane Jacobs, the romantic chronicler of dense urban life, who found her villain in New York’s highway-building Robert Moses. A rival school, headed by Jacobs’ enemy Lewis Mumford, sang the praises of planned “organic” villages – “highwayless towns” connected by “townless highways.” The Mumfordian strain of Progressive planning is represented today by the New Urbanism, with its hyper-regulated low-rise pedestrian communities.

    The resurgent progressives also clung to their vision of a society in which an enlightened, nonpartisan elite governs the ignorant masses from above. The Civil Rights Revolution, and the era of judicial activism that followed, permitted progressives to transfer power from the elected political class to the federal judiciary. By the 1970s and 1980s, federal judges were regulating practically all aspects of American life. Social engineering schemes like busing for racial balance and race-based affirmative action, which “color-blind” New Deal liberal opponents of segregation like Hubert Humphrey and Lyndon Johnson opposed, now became critical pillars of progressive ideology.

    The New Dealers had been ardent conservationists, but their conservationism focused not only on nature but also on the well-being of people. New Deal soil conservation and agricultural productivity policies allowed the amount of land in cultivation to decline, freeing up vast tracts of land for wilderness or habitation. Farmers, middle-class suburbanites and nature all gained.

    This approach is repudiated by most contemporary progressives, who know nothing about farms except that they are cruel to livestock. By the 1970s many progressives abandoned liberal conservationism for radical environmentalism, which seeks to protect nature by separating it from humanity and industry. Radical environmentalism tends to shade into misanthropy, as in the proposal by two New Jersey environmentalists to turn much of the Great Plains into a human-free “Buffalo Commons.” (Curiously, nobody seems to have proposed evacuating New Jersey in order to create a “Migratory Bird Park.”) The radical Green goal of “rewilding” North America by creating “wildlife corridors” from which humans are banned repudiates the New Deal liberal vision of allowing working-class Americans to enjoy the scenery of national parks.

    So in every respect except racism and opposition to immigration, today’s progressives are genuine heirs not of the New Deal liberals but of the capital-P Progressive economic planners and social engineers of the early twentieth century. Even their social base is the same as in 1908 – college-educated professionals, particularly those in the nonprofit sector and education, like public school teachers and academics.

    This class – ironically enlarged by New Deal liberal programs like the G.I. Bill and student loans – has been swelled further by upwardly mobile Americans to whom mass university education imparts a blend of the worldviews of old-fashioned Northeastern progressives and the old Bohemian left-intelligentsia. This enlarged college-educated professional class has allied itself with African-Americans and Latinos in the identity-centered, post-McGovern Democratic Party.

    With perfect symbolism, the two bases of the alliance of white progressives and nonwhite Democrats – college campuses and inner cities, allied against the middle-class and working-class suburbs – correspond to the alternate urban utopias of Lewis Mumford and Jane Jacobs respectively, if we consider the college campus to be a Mumfordian paradise.

    With good reason, then, today’s progressives despise the suburban, middle-class America created by yesterday’s New Deal liberals. Today’s progressives may invoke the New Deal, but they are the heirs not of mid-century liberals like Franklin Roosevelt and Lyndon Johnson but rather of the Progressive social engineers who believed that enlightened elites should alter both the built environment and human behavior to meet their social goals. Some things never change.

    Michael Lind is the Whitehead Senior Fellow at the New America Foundation. He is the author, with Ted Halstead, of “The Radical Center: The Future of American Politics” (Doubleday, 2001). He is also the author of “Made in Texas: George W. Bush and the Southern Takeover of American Politics” (New America Books/Basic, 2003) and “What Lincoln Believed” (Doubleday, 2005). Mr. Lind has been an editor or staff writer for The New Yorker, Harper’s Magazine, and The New Republic. From 1991 to 1994, he was executive editor of The National Interest.

  • New York’s Next Fiscal Crisis

    Mayor Bloomberg needs to prepare the city for the crash of the Wall Street gravy train.

    New York City, dependent on Wall Street for a quarter-century, has gotten used to harsh cyclical economic downturns, including the lending contraction in the early nineties and the bursting technology bubble in 2000. But today’s turmoil may be not a cyclical downturn for Wall Street but instead the beginning of an era of sharply lower profits as it rethinks its entire business model. If so, it will produce the biggest economic adjustment and fiscal challenge that New York has confronted in more than three decades. If the city’s leaders don’t recognize this challenge and move quickly to meet it, New York could soon face an acute fiscal crisis rivaling its near-bankruptcy in the mid-seventies.

    Such a fate—almost unthinkable to a city that has grown complacent about its world-class standing—could undo the colossal strides that Gotham has made over the past two decades in restoring its citizens’ quality of life. As Mayor Michael Bloomberg said in May, we must “pray that Wall Street does well.” But we’d better have a plan if it doesn’t.

    Wall Street bankrolled New York’s long recovery from the seventies because New York, through its long economic, fiscal, and social deterioration, managed to keep its position as the nation’s financial capital just as finance was about to take off. In the early eighties, the nation’s financial industry—particularly Wall Street—was feeling its way toward a sweet spot where it would stay for two decades. As Federal Reserve chief Paul Volcker brought inflation under control, creating a stable environment for financial innovation and a stable currency for the world’s savings, baby boomers and international investors flocked to U.S. markets. The Dow Jones Industrial Average tripled between 1982 and 1990, despite the ’87 crash, while the assets of securities brokers and dealers more than doubled as a share of America’s financial assets. The financial industry also saw a huge opportunity in Americans’ increasing love of debt, creatively packaging it into everything from mortgage-backed securities to junk bonds and then selling it to investors. Between the early 1980s and the early 1990s, the financial sector’s profits as a percentage of the nation’s income more than doubled. The sector’s pretax income as a percentage of all national income started a similar march upward. Profits at securities firms, while choppy, easily doubled between the early eighties and the end of the decade (all numbers are inflation-adjusted unless indicated otherwise).

    New York reaped massive rewards from Wall Street’s good fortune. The city’s financial-industry employment grew by 14 percent in the eighties—more than triple the job growth in its other private-sector industries. Jobs in the securities industry in particular, which had decreased in the seventies, grew by more than a third. Since these positions were high-paying, they had an outsize impact: by the late eighties, according to the Fed, financial services contributed nearly 23 percent of New Yorkers’ wages and salaries, up more than 60 percent from the previous decade. And financiers’ heavy spending supported other jobs, from restaurant workers and interior decorators to teachers and nurses.

    For evidence of how Wall Street started to lure newcomers to New York, look to Hollywood. Movies chronicling Gotham’s grim decline, like Taxi Driver (1976) and Escape from New York (1981), gave way to films portraying the heady excitement of making millions in the city, like Wall Street (1987) and Working Girl (1988). While much of the city remained grimy and dangerous, the excitement outweighed those factors for young, child-free baby boomers who paid high taxes without requiring many city services. The result: after hemorrhaging nearly 10 percent of its population between 1970 and 1980, New York gained nearly 4 percent back between 1980 and 1990. The city’s tax take in 1981 had been slightly lower than its take a decade before; but by 1991, it was raking in a third more than in 1981.

    This money allowed New York to reverse some of its bone-scraping seventies-era budget cuts and to invest in infrastructure without making the politically difficult choice of cutting deeply into social services. In the seventies, the city had laid off nearly 3,000 police officers and 1,500 sanitation workers; in 1985, Mayor Ed Koch hired 5,300 cops and almost 1,000 sanitation workers. In the 1990s, it was largely Wall Street’s breakaway success that gave Mayor Rudy Giuliani the financial resources to focus on making New York City safe again.

    If high finance found its sweet spot in the eighties, it reached dizzying sugar highs starting in the late nineties and continuing, after recovering from the tech bust and 9/11, until last year. The nation was awash in the world’s money, encouraging record lending and speculation as well as the creation of more financial products, which yielded banks massive profits. By 2006, the financial industry’s corporate profits as a percentage of the nation’s income had doubled once again.

    It seemed that nothing could go wrong for Wall Street once it had bounced back from the tech bubble’s burst. With the dollar serving as the expanding global economy’s reserve currency, banks had oodles of money to lend. Cheap Asian imports were keeping prices and inflation expectations low, allowing central bankers to justify low interest rates. Beginning in the nineties, traditional consumer banks—previously tightly regulated to protect government guarantees for their depositors—began taking investment risks that once had been confined to Wall Street. As time went on, investment banks became more dependent on fees from debt backed by home mortgages and other consumer products, further blurring traditional lines between investment and consumer banking.

    The financial world took advantage of the easy money and better technology. It booked high fees by designing ever more complicated “structured finance” products, backed by riskier home mortgages as well as corporate loans. Wall Street sold these products to international investors, who couldn’t get enough of American debt, by making a seductive pitch: the products were structured so intricately that even risky mortgages were as safe as government bonds, and they paid better interest rates. Further, if an investor ever had to sell a mortgage-backed security after he had purchased it from a bank, it was a cinch, since Wall Street had “securitized” individual loans—that is, taken thousands of them at a time, sliced them up, and turned them into easily tradable bonds of different risk levels.
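
    To make the mechanics of tranching described above concrete, here is a minimal, illustrative Python sketch of how slicing a loan pool into senior and junior bonds makes the senior slice look safe. The function name and all dollar figures are hypothetical, invented for illustration; this is not any firm’s actual model.

    ```python
    # Toy model of "tranching": a pool of loans is sliced into bonds of
    # different risk levels. Default losses hit the junior tranche first,
    # so the senior tranche appears safe even if the loans are risky.

    def tranche_payouts(pool_value, default_losses, senior_size):
        """Split what's left of the pool between senior and junior holders.

        Losses are absorbed bottom-up: juniors lose first, and seniors
        lose only once the junior slice is completely wiped out.
        """
        remaining = pool_value - default_losses
        senior_payout = min(senior_size, max(0.0, remaining))
        junior_payout = max(0.0, remaining - senior_size)
        return senior_payout, junior_payout

    # A hypothetical $100M pool split into an $80M senior and $20M junior
    # tranche. At 5% losses the senior bond is untouched; at 25% it is not.
    for losses in (5.0, 25.0):
        senior, junior = tranche_payouts(100.0, losses, senior_size=80.0)
        print(f"losses ${losses:.0f}M -> senior ${senior:.0f}M, junior ${junior:.0f}M")
    ```

    The seductive pitch rested on the assumption that losses would stay small enough never to reach the senior slice; once investors came to doubt the loss estimates, the apparent safety of the whole structure evaporated.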

    In addition to lending, Wall Street was borrowing at record levels so that it could take bigger and bigger risks with its shareholders’ money, making up for lower profit margins on businesses like equity underwriting and merger advisories. Wall Street’s borrowing as a multiple of its shareholders’ equity was 60 percent above its long-term average by the end of last year (with sharp increases over the past few years). Firms were taking even more risks than that figure indicates, setting up arcane, off-the-books “investment vehicles” with shareholders still vulnerable if something went wrong.
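
    A rough back-of-the-envelope sketch shows why that leverage figure matters; the 15x long-run average used below is an assumed, illustrative number, not one given in this article.

    ```python
    # Hypothetical arithmetic: how leverage magnifies small asset losses.
    # Assume borrowing at 15x shareholders' equity as the long-term average
    # (an illustrative figure); 60 percent above that average is 24x.
    for leverage in (15.0, 15.0 * 1.6):
        assets = leverage + 1.0   # $1 of equity plus the borrowed funds
        wipeout = 1.0 / assets    # asset decline that erases the $1 of equity
        print(f"{leverage:.0f}x leverage -> a {wipeout:.1%} fall in asset values wipes out equity")
    ```

    At those ratios, a market move of a few percent, easily within normal volatility, is enough to threaten a firm’s solvency, which is why the off-the-books vehicles mentioned above compounded the danger.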

    As banks and financiers got unimaginably rich, so did the city. The finance industry’s contribution to New Yorkers’ wages and salaries topped out at over 35 percent two years ago. Last year, the city took in 41 percent more in taxes than it did in 2000, capping off an era of unprecedented revenue growth. While the city’s stratospheric property market—itself a function of Wall Street bonuses and easy money—drove much of that increase through property-related taxes, corporate tax revenues rose by 52 percent, personal income tax revenues by nearly 20 percent, and banking tax revenues by nearly 200 percent.

    But today, the financial industry may be entering a wilderness period of lower profits, employment, and bonuses. “Whether it’s financials as a share of the stock market or financials as a share of GDP, we’ve peaked,” ISI Group analyst Tom Gallagher told the Wall Street Journal in April. One measure of how this downturn differs from those in the recent past: some Wall Street firms, after their disastrous miscalculations, are operating today only because the Fed, as Bear Stearns melted down in March, decided to start lending to investment banks, which it doesn’t normally regulate or protect.

    A new alignment of global demographics, inflation expectations, and interest rates may spell long-term trouble for the city’s premier industry. A decade ago, cheap Asian goods kept prices and inflation expectations down; today, Asia’s growth is pushing them up. Ballooning energy prices and too-low interest rates threaten to yield sustained inflation. America now faces intense competition—particularly from the euro—for the world’s savings and investment, meaning that it can’t depend on attracting as large a portion of the world’s nest egg to keep interest rates down. “It is not credible that the world will revert to the same level of capital flow to the U.S. after the credit crunch is over,” Jerome Booth, research head of U.K.-based Ashmore Investment Management, noted recently. The Fed can keep official rates low only at the risk of inflation and more capital flight. The end of cheap money means that the market for future debt may shrink, squeezed by tougher borrowing terms, cutting off a crucial profit line for banks.

    Regulators, too, will be harder on the banks. Because investment banks now benefit from taxpayer-guaranteed debt, taxpayers must be protected. The feds probably won’t let firms borrow from private lenders at the levels that they have over the past decade, and it’s unlikely that they’ll let banks rely so intensely on short-term debt—which is cheaper, but riskier, than long-term debt. (Short-term lenders can flee quickly, as the Bear Stearns crash showed, because they have the option of yanking their money out of investments, often overnight, while long-term lenders are stuck with the bets that they’ve made.) Less borrowing means lower profits, and not just temporarily. Regulation might also curtail Wall Street’s lucrative business of complex derivatives, another huge area of risk. Plus, international stock listings continue to bypass New York for Asia and Europe because of the six-year-old Sarbanes-Oxley law, which imposes an unnecessary regulatory burden on companies publicly traded in the U.S., and also because the world’s growth has moved east. Such losses could be ignored only when debt and derivatives were making up for them.

    The skepticism of Wall Street’s own investors and clients, though, is the real deal-breaker. The most startling news out of the current crisis is that Merrill Lynch, UBS, and others didn’t know that they had taken certain risks for shareholders, lenders, and clients until they were already reporting tens of billions in losses. Clients and investors shouldn’t mind losses when they understand the risks that they’re taking. They do mind if, after the firm that they’re investing in or doing business with has insisted that its careful models and safeguards protect them, it turns out that its only protection from bankruptcy is Uncle Sam.

    International investors will not again blindly trust Wall Street’s ability to assess and allocate risk. “Market participants now seem to be questioning the financial architecture itself,” Fed governor Kevin Warsh said recently. Don’t forget the stock market’s performance, either: it hasn’t been impressive over the past eight years.

    New York City, so dependent on the financial industry’s continued growth, should shudder.

    If Mayor Bloomberg and his successor view the current downturn as another short blip, rather than a long readjustment of the financial industry’s share of the economy, and they turn out to be wrong, the decisions that they make could prove ruinous. Over the past two and a half decades, whenever the financial industry underwent one of its periodic downturns, New York stuck to the same playbook: jack up taxes to make up for lower tax revenues, cut spending a bit, and wait for the financial industry to come roaring back. During the early nineties’ credit crunch, Mayor David Dinkins slapped two temporary surcharges on the income tax; one still persists. In 2002 and 2003, after the tech bust and 9/11, Bloomberg temporarily hiked income and sales taxes and permanently hiked the property tax.

    Those tax increases were never wise because they kept less profitable industries and their lower-paid employees out, making New York ever more dependent on finance. Even the financial industry didn’t ignore the tax hikes; partly in response, it sent back-office, five-figure-a-year jobs to cheaper cities, and as a result, New York today has less than one-fourth of the nation’s securities-industry jobs, down from one-third two decades ago. Still, the industry was growing so fast that it and its workers could withstand the higher costs posed by the tax increases.

    But what was once merely unwise could be calamitous today. Consider the last time that New York tried raising taxes when its premier industry was about to shrink—the mid-sixties, when the city’s leaders arrogantly believed that its record population of 7.9 million people, in the middle of a record economic boom, wouldn’t mind paying for a breathtaking array of Great Society social programs, as well as fattened public-employee benefits. In 1965, the New York Times had reminded city leaders that “New York City’s economy is prospering,” and its editorialists decreed a year later that “strong medicine, specifically higher taxes, is the remedy for restoring New York’s financial health.”

    Mayor John Lindsay, with state support, enacted the city’s first personal income taxes, as well as new business taxes, in 1966. New York went on to lose half of its 1 million manufacturing jobs between 1965 and 1975—a trauma as great as Wall Street’s troubles today, because in 1960, manufacturing had accounted for more than a quarter of New York’s jobs. At the same time, the city was also losing its collection of corporate headquarters and their legions of well-paid employees. By the end of the seventies, half of its 140 Fortune 500 companies had fled the city.

    New York didn’t anticipate this change or understand its significance as it was happening. Well into the early seventies, the city thought that it could keep taxing and spending because the future was bound to mirror the “Soaring Sixties.” City officials argued that fleeing companies were evidence of New York’s success because some companies just couldn’t afford to be here any longer. Worse, the city’s leaders didn’t understand how quickly urban quality of life could deteriorate: as they focused on social spending rather than vital public services like policing, murders shot up from 645 in 1965 to 1,146 just five years later. Nor did they realize how quickly middle-class residents would flee, taking their tax dollars with them.

    For a while, the city and its lenders found a way around these miscalculations. New York stepped up its borrowing against future tax revenue in the late sixties and early seventies, paying the banks back when the following year’s tax receipts rolled in. The foolishness of such a plan was always obvious: three years before the city skirted bankruptcy, the Times reported, Albany skeptics warned that large-scale temporary borrowing was folly. But even as economic and fiscal conditions worsened, the city kept spending and spending. In 1970, city leaders were heartened by the judgment of bond-rating agency Dun & Bradstreet, which noted New York’s “extraordinary economic strength . . . and long-range credit stability.” (Then, as now, ratings agencies weren’t good at predicting acute crises.) In 1972, as what had once seemed like a short downturn stretched on, Times editorialists encouraged complacency, noting that “after all the years of . . . warnings of imminent municipal bankruptcy, it is reassuring to find investors . . . bullish about the outlook for New York City’s long-term financial soundness.”

    By late 1974, however, as rising spending outpaced tax receipts, a crisis was inevitable. It came the following spring, when New York wrestled with a budget deficit that equaled 14 percent of its expected spending and creditors cut the city off. Forced to throw itself at the mercy of the state and federal governments for emergency funding, Gotham gutted trash pickup and policing, murders climbed to 1,500 annually, and more residents left.

    Millennial New York likes to think of itself as vastly superior to the troubled city of the 1970s. But once again, on the brink of what may be a major economic upheaval—this time, involving the financial sector rather than manufacturing—it is reacting with disturbing complacency. And yet again, the mayor has allowed the budget to swell dangerously during the good times, which could push leaders to repeat the mistakes of the sixties and seventies: raising taxes at precisely the wrong time and slashing vital services under pressure to keep up social and public-employee spending.

    During the past decade, New York used the cash that Wall Street was showering on the city not to ease its long-term problems but to make them worse. In 1974, under Lindsay, the city devoted one-quarter of its budget to social spending: welfare, health services, and charities. Today, the city continues to spend one-quarter of its budget on social services (not including the public schools’ vast social-services component). Nor has New York reformed the pensions and size of its still-huge public workforce, reduced debt costs, or cut Medicaid costs fueled by Albany’s powerful medical lobby, which helps ensure that New York’s per-capita Medicaid spending—rife with waste and fraud—is the highest in the nation. Even after adjusting for inflation and considerable population recovery, the city’s tax-funded budget for 2008 is 22 percent higher than it was at its Lindsay-era peak. While spending rose just 9 percent or so during the Giuliani era, it has risen three times as fast since—the highest rate since Lindsay left office.
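
    That 22 percent figure depends on doing the comparison in real, per-capita terms. Here is a minimal sketch of how such an adjustment works; the inputs below are placeholders chosen only to illustrate the method, not the actual 1975 and 2008 budget or price figures.

    ```python
    # How an inflation- and population-adjusted budget comparison works.
    # All inputs are hypothetical placeholders, not actual city figures.

    def real_per_capita(nominal_billions, price_index, population_millions):
        """Deflate a nominal budget to constant dollars, then scale by population."""
        return nominal_billions / price_index / population_millions

    # Placeholder inputs: a Lindsay-era peak budget and a 2008 budget.
    peak_1975 = real_per_capita(nominal_billions=9.0, price_index=1.0,
                                population_millions=7.9)
    today_2008 = real_per_capita(nominal_billions=45.0, price_index=4.0,
                                 population_millions=8.2)

    print(f"Real per-capita change vs. peak: {today_2008 / peak_1975 - 1:+.0%}")
    ```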

    Echoing a time when people said that New York was ungovernable, Mayor Bloomberg often calls these costs “uncontrollable.” But there was no better time to start controlling them than during the past half-decade, an era of unparalleled prosperity and public safety when Bloomberg had an opportunity available to no other modern mayor. If he had successfully bargained with Albany and union employees to require new workers to contribute more to their pensions and health benefits, we would have seen the results by now. Likewise, if he had worked with Albany to rein in Medicaid spending—now nearly $6 billion a year—the city could have spent some of that money to build schools and fix roads, reducing debt costs. Instead, we’ve got a politically powerful public workforce that commands benefits belonging to another era and that remains vulnerable to corruption despite this generosity, as recent construction investigations show.

    The mayor has also sharply increased spending in one area that was easily controllable: the city’s public schools budget, up by more than one-third since 2001 even though enrollment is down 4 percent. Much of that spending funds plusher teachers’ salaries and the higher pensions that follow, plus borrowing costs for school construction and rehab, making it harder to cut than it was to increase. Today, the education budget is nearly $21 billion: one-third of the entire budget, and more than police, fire, and sanitation combined.

    Bloomberg’s failure to control costs during the boom means that big trouble looms. The city projects that spending over the next three years will increase by more than 20 percent, while revenues will increase by just 13 percent (neither figure is adjusted for inflation). If that happens, a $5 billion–plus deficit—more than 11 percent of tax-funded spending—will result in two years’ time. Moreover, that’s the best-case scenario, based on the city comptroller’s prediction of low growth this year and next and a quick, though weak, recovery after that. But the mayor expects a 7.5 percent economic contraction this year, followed by a smaller contraction. If that happens, revenues might not rise as much as 13 percent; in fact, they might shrink, as they often did in the seventies (and again in 1990 and 2002).
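
    The mechanics are worth making explicit: because the gap comes from the difference between two growth rates, small revenue shortfalls compound it quickly. Below is a minimal sketch, assuming (purely for illustration) a $45 billion tax-funded base; the growth rates are the projections cited above, and the exact dollar outputs depend entirely on the assumed base.

    ```python
    # A rough sensitivity check on the projections above. The baseline is a
    # hypothetical placeholder, not an official budget figure; the growth
    # rates come from the projections cited in the text.

    baseline = 45.0             # assumed tax-funded spending and revenue today, $B
    spending = baseline * 1.20  # spending projected to rise more than 20 percent

    # Compare the city's revenue projection with a stall and a contraction.
    for label, growth in [("projected", 0.13), ("stalled", 0.00), ("shrinking", -0.05)]:
        revenue = baseline * (1 + growth)
        gap = spending - revenue
        print(f"{label:>9} revenue ({growth:+.0%}): gap ${gap:.1f}B "
              f"({gap / spending:.0%} of spending)")
    ```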

    This risk is especially acute because our progressive tax structure and the growth in wealth of our richest citizens over the past two decades make New York highly dependent on the rich, whose income is volatile. Two years ago, the top 1 percent of taxpayers paid nearly 48 percent of the city’s personal income taxes even after adjusting for the temporarily higher tax rate, up from 46 percent in 2000, 41 percent a decade ago, and 34 percent two decades ago, according to economist Michael Jacobs at the city’s independent budget office. A few bad years for the city’s wealthiest translate into a few terrible years for their home base.

    Cutting a $5 billion deficit—let alone an even larger one—is a formidable task even when done slowly. Cutting such a deficit in a hurry two years from now, under an inexperienced mayor, will endanger the city’s vitality. It’s not too late for Bloomberg to prepare the budget for a painful economic adjustment, and not just by cutting around the edges of the “controllable” budget, as he’s prudently done this year and last.

    The first principle is to do no harm on the tax side. Bloomberg will allow a temporary property-tax cut to expire, and he has told the Times: “If all else fails, we’re not going to walk away from providing services, and only then would I think about a tax increase, and my hope is that we’ll avoid it.” He’ll have to: while the city has proved that it can squeeze higher taxes out of a phenomenal growth industry, that trick won’t work on an industry that’s stagnant or in decline. New York’s sky-high income taxes for businesses and residents already put the city at a huge disadvantage, since they keep away lower-paying jobs from media, technology, and other industries that otherwise might be attracted by lower housing costs and commercial rents in the coming years. The city can’t afford to make this disadvantage any worse.

    Second, the mayor must carefully manage his budget cuts. This year, he proposed largely across-the-board cuts of about 6 percent in projected spending, covering everything from police and sanitation to homeless services and education. He also enacted a 20 percent slash to the long-term capital budget, which funds physical infrastructure. But this strategy won’t work for long. Vital services can’t withstand deep cuts. The mayor must not alienate the middle class, whose tax revenues he needs, and that means protecting the police department, cleaning streets, and keeping libraries open. (His decision in May to delay hiring 1,000 new police officers by more than a year, even as New Yorkers are becoming wary of crime again, is worrisome.) Further, failing to fix decaying infrastructure isn’t a way to save money. It’s no different from borrowing to pay for other expenses, since waiting will worsen deterioration and mean more expenses later.

    So as Bloomberg readies his final budget over the next year, he’ll have to choose the deepest cuts to projected spending carefully, even though it requires fighting the city council, which nixed half his proposed cuts this year and especially protected education. Rising education spending under both Bloomberg and Giuliani hasn’t improved scores on national tests, after all. And within the capital budget, the city should reduce its spending on economic-development and affordable-housing subsidies in order to fund things like roads and transit adequately. Furthermore, New York pols should stop regarding the operating and capital budgets as unrelated. Ten percent of Medicaid’s $6 billion annual take would go a long way toward upgrading the city’s roads and subways. Last, tens of millions of dollars in politically connected earmarks by both the mayor and the council are unsavory in good times and unconscionable in bad.

    But ultimately, the mayor can’t fix the city’s budget without addressing its “uncontrollable” half, whose growth will be responsible for three-fourths of the deficit in three years’ time. Bloomberg—and his successor—can use fiscal stress to advantage in bargaining for changes in city contracts. In the past, in fact, the city’s biggest bargaining gains have come during fiscal turmoil. As Charles Brecher and Raymond D. Horton noted in their 1993 book, Power Failure: New York City Politics and Policy Since 1960, the city won sanitation productivity gains in 1981, while it was suffering the fallout from the fiscal crisis of the 1970s, and a less costly pension tier two years later. While police officers won a raise this year that was necessary to attract recruits, the mayor must not let the city’s other unions bring home similar gains through contract renegotiation.

    The city’s contract with more than 100,000 non-uniformed workers expired this spring, presenting an opportunity. New York should negotiate to get this union, DC-37, to allow new employees to accept a pension plan in which the city contributes to workers’ private accounts rather than guaranteeing a pension for life. The independent budget office estimates sizable budget savings here—nearly $100 million annually—within half a decade. Requiring these workers and retirees to pay 10 percent of their health-insurance premiums would save half a billion dollars more; extending the workweek from 35 or 37 hours to 40 (imagine!) would net another half-billion, savings that the next administration will dearly need if Wall Street doesn’t roar back. The mayor (and his potential successors) must impress upon unions that their members won’t get a better deal if they wait.
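
    Taken together, those estimates add up to real money. A quick tally, using only the figures cited in the paragraph above (the labels and rounding are mine):

    ```python
    # Tallying the annual savings estimates cited above, in $ billions.
    # The figures are the article's estimates; the rounding is illustrative.
    savings = {
        "defined-contribution pensions for new hires (IBO estimate)": 0.1,
        "10 percent health-insurance premium contributions": 0.5,
        "40-hour workweek": 0.5,
    }
    for item, amount in savings.items():
        print(f"{item}: ~${amount:.1f}B/yr")
    print(f"total: ~${sum(savings.values()):.1f}B per year")
    ```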

    But why the urgency? After all, New York has huge advantages today. Half a century ago, suburban growth was driven by cheap fuel, fast commutes, and low crime. Today, suburbs are choked off by congestion, $5-a-gallon gas, and bad public schools. The city’s governance approach is also different. If crime starts to rise, we know what to do: aggressively police neighborhoods and prosecute and sentence defendants appropriately. And the city’s new citizens—many of whom have invested their lives’ savings in their homes—should help politicians keep some focus, counterbalancing to some extent the organized pressure to sacrifice all else for education spending. The city’s budget has safety latches, too. New York’s fiscal near-death in the seventies spurred the state to impose extraordinary oversight and brought about local changes. The city can’t borrow much today for operating spending. It must balance its budget annually and project four years’ worth of expected spending and revenues, submitting the results to a state board.

    Yet these advantages aren’t limitless, as recent high-profile shootings in Harlem and Far Rockaway indicate. If a mayor lets crime spiral out of control over a crucial one- or two-year period, it will be harder to control later. The middle class won’t be patient for long if its voice isn’t heard, and the city’s “global” upper class is much more transient than it was 40 years ago. Plus, with one-third of the population leaving every decade, New York must continually attract new residents. As for city finances: no amount of regulation can guard against complacency. The city couldn’t have balanced its budget this year and reduced next year’s deficit if not for the huge surplus that Wall Street provided last year, before it ran out of steam. Moreover, the city doesn’t have to default on its bonds, as it nearly did three decades ago, to get into trouble: sacrificing quality of life so that it can pay those bonds would do as much damage. Finally, if the city does need help, it can’t look to New York State to bail it out, as it did 33 years ago: this time around, Albany might be in equally dire straits.

    Even if we do all the hard work of fixing the budget, and in two years’ time Wall Street is defiantly humming along, once more channeling record tax revenues into the city’s coffers, the steps that we take today won’t have been wasted. By acting now, Bloomberg will enable his successor to consider income tax cuts and infrastructure investment. Just as we prepare for a terrorist attack that we hope will never come, we have to prepare for a fiscal and economic crisis that we hope will never come. The risk is real.

    Nicole Gelinas, a City Journal contributing editor and the Searle Freedom Trust Fellow at the Manhattan Institute, is a Chartered Financial Analyst. This article appeared in the Summer 2008 City Journal.