Category: Economics

  • Public Investment, Decentralization and Other Economic Lessons from the New Deal

    The first lesson to be learned from this earlier era is that a large middle class requires an economy that generates a broad base of jobs paying middle-class wages. The New Dealers were not opposed to “rigging” the labor and financial markets to achieve this result. New Deal progressives believed the economy should exist to serve society, not the other way around, and that the government has a duty to shape the economy to meet middle-class aspirations. A high-wage, middle-class society would, in turn, be good for the economy: living wages would not only ensure adequate demand for the economy but in so doing would spur new investment and productivity growth, creating a virtuous circle of rising living standards.

    The belief of New Deal progressives in an economy that could create good middle-class jobs stemmed in part from their resistance to large social welfare subsidies to individuals, on the grounds that this would encourage an unhealthy dependence on the state. Moreover, even though they favored progressive taxation, New Dealers were skeptical of a society dependent upon the permanent redistribution of income. The principal goal of many New Deal programs was not to relieve the conditions of poverty – although they often did so – but to build physical and human capital that would allow people to escape permanently from poverty.

    Thus New Dealers emphasized government programs that expanded education, spread property ownership, invested in America’s common physical and knowledge capital, and seeded the industries of the future. It was not perfect, in large part because it preceded the civil rights revolution and thus left out millions of African-Americans, but it did build the largest and most secure middle class America has ever known.

    Today we see the consequences of a much different way of thinking about the economy and society. Over the past two decades we have been told that globalization is an immutable force and that we must bend to its demands, embracing the agenda of free trade, financial deregulation and less progressive taxation. The best we can do, we’re told, is to let globalization run its course and compensate the losers, even though no amount of new social welfare measures could compensate for the loss of millions of good-paying manufacturing jobs. Thus, without any real debate, America’s political elites have chosen for us a highly stratified, low-wage society with great costs to our middle-class way of life and to our productive economy.

    The second New Deal principle is about achieving a high-wage economy and at the same time more widely distributing the capital and skills for wealth creation. The principal policy tool the earlier generation used was massive public investment and public building. The public investment programs they pursued not only created many new middle-class jobs but also laid the foundation for a more productive economy, which led to even more middle-class jobs.

    Agencies like the Tennessee Valley Authority in the 1930s and ’40s were followed by even more extensive public investment initiatives in the postwar years. From 1950 to 1970, the government spent more than 3 percent of GDP on public infrastructure alone. It built everything from highways to schools, power systems to parks.

    Throughout the New Deal era, public investment was America’s way of enacting industrial policy. It was understood that public investment paid for itself many times over. The GI Bill alone generated returns of up to $7 for every dollar invested. And because it generated returns to the economy and society, New Dealers in the postwar period were not afraid to raise taxes or to borrow in order to ensure adequate levels of public investment. And borrow they did, even though the national debt was a much larger percentage of GDP than it is now.

    For the past few decades, however, we have made a very different choice. As concerns over the budget deficit have grown, and as tax-cutting mania has taken hold, we have cut back on public investment. Since 1980 we have devoted less than 2 percent of GDP to public infrastructure and have allowed federal spending on basic research and development to decline as a percentage of GDP as well. As a result, a backlog of public investment needs – clogged roads and ports, collapsing bridges and levees, uneven broadband access, an antiquated air traffic system, an undersized energy infrastructure – has begun to cut into our economic growth and undermine our efficiency.

    A third principle of middle-class America that the New Deal offers us relates to the concentration of power and capital. Earlier progressive reformers distrusted such concentrations. Not only did they threaten democracy, they also warped the economy and distorted consumption and investment. Government therefore must be a strong countervailing force to big business and oligarchic power, and must be organized so that it cannot be captured by one economic group at the expense of another or the general public.

    The New Dealers were particularly concerned about the power of Wall Street and the financial community. They feared a national credit system that was dependent on Wall Street bankers, whose interests were not always aligned with the needs of homeowners, farmers and small and medium-sized producers. They therefore sought to democratize capital by creating myriad credit institutions that would ensure that all regions and sectors of the economy had access to capital. They created a variety of federally subsidized credit programs to enable people to construct homes and start businesses and to allow states and municipalities to build schools and modernize infrastructure. It was here that the New Deal was most creative – combining a strong federal state with the local and regional decentralization of capital and the local and regional control of these programs and institutions.

    As with other first principles of a middle-class America, we have seen a reversal of priorities over the past few decades, as big financial institutions have again asserted their influence over the economy and economic policy. The new power of Wall Street has been evident in its successful push for financial liberalization and deregulation, in the emphasis accorded the deficit and concerns about inflation as opposed to full employment, and until recently in Washington’s preference for a strong dollar, which favors financiers over real producers. This triumph of Wall Street over Main Street has been responsible in part for the hollowing out of the tradable-goods sector and for the asset bubbles and predatory lending that have wreaked havoc on the economy. Indeed, one of the first things the New Deal would have us do is re-regulate the financial system and put the interests of the productive economy over those of Wall Street.

    In all these respects, whether it be high wages, public investment or the decentralization of financial power, the New Deal succeeded because it changed the way the economy worked. And it did so by marrying progressive reforms with Americans’ preference for independence, whether from government subsidy or big-business paternalism. This is the enduring lesson of the New Deal.

    Sherle Schwenninger directs the New America Foundation’s Economic Growth Program and the Global Middle Class Initiative. He is also the former director of the Bernard L. Schwartz Fellows Program.

  • New Deal Investments Created Enduring, Livable Communities

    Growing appeals for more public infrastructure investment make two critical claims: that this would help stimulate the economy in the short run while making our country more productive over the long run. Unlike tax rebates and other short-term stimulus, a major infrastructure investment program can have powerful effects on community life beyond boosting spending at the local Wal-Mart.

    I thought about this recently when I visited my boyhood hometown of Wishek, North Dakota. Wishek is a small farming town of 1,200 people nestled in the gently rolling hills of the central Dakotas, about 17 miles from the South Dakota border. Its population is made up largely of people who trace their origins to German immigrants from Russia, whose forebears had been recruited by Catherine the Great to farm the steppes near the Black Sea.

    Seeing a greater opportunity in North America, these Germans started to arrive in 1885 to homestead the Dakotas’ deep sod prairie – a glacial moraine of earth and rock. They were lured by the romantic thrill of developing a “Territorial Empire” that later became the states of North and South Dakota.

    This dream was widely realized by the 1920s but all but dried up and almost blew away during the drought-ridden thirties. That dream would have been extinguished if not for the enlightened programs of the New Deal — from soil conservation to loans for farmers to the Works Progress Administration (WPA).

    Growing up in Wishek during the 50s and 60s, you rarely heard about the New Deal. Life was good, pretty much everything you might imagine small town childhood to be in Middle America. The pace of life was easy; everyone knew everyone and almost everything about anyone. The fortunes of the community rose and fell with farm prices, sometimes fluctuating wildly from year to year. Kids roamed freely on foot and in cheaply fueled cars and there were ample opportunities to participate in almost every facet of community life. With a K-12 school population of about 500 to 600, any child or young person who wanted to could play some kind of role in sports, arts and music, or church-related activities.

    Unknown to me — and not widely discussed by the 1960s — was how many of the community’s best and most used facilities were constructed by the WPA. During the drought years of the mid-’30s, the city park was enlarged and developed with a children’s playground, clay-surfaced tennis courts and a lighted skating rink, all paid for by the WPA. Later a $6,000 bond issue was floated to build a pool that was designed by WPA engineers and is still in use today. Then in 1942 a new auditorium — a truly landmark building for the community — was completed for use by the school district. The auditorium continues to be used today as a civic center for community and family events, including Wishek’s premier regional event, the annual Sauerkraut Days.

    This investment strategy in community infrastructure was played out across North Dakota. Elwyn B. Robinson, in his classic “History of North Dakota,” recounts the massive investment in North Dakota:

    “In North Dakota the W.P.A. alone, between July 1, 1935 and June 30, 1942, built 20,373 miles of highways and streets, 721 new bridges and railroads, 166 miles of sidewalks, 15,012 culverts, 503 new public buildings, 61 additions to public buildings, 680 outdoor recreation facilities, 809 water wells, 2 irrigation projects, 39 sewage treatment plants and 9 water treatment plants. It reconstructed 1,002 bridges and viaducts, 2,180 public buildings and 1,721 culverts.”

    To be sure, today is not the “dirty” thirties of the Dust Bowl. It is also far different from the serene place of my boyhood in the 50s and 60s. Some of the old infrastructure needs maintenance while other infrastructure needs have changed significantly. A proposed wind farm just south of town, for example, has been delayed because of the lack of electric transmission capacity throughout the region. In addition, like many rural communities, Wishek now finds its major employment base in manufacturing and health services, pointing to the increasing and essential importance of the broadband telecommunications, roads and air service that link places like Wishek with the national and international economy.

    Yet if we look about us, the legacy of the New Deal endures to this day. It provides clear evidence of the impact that infrastructure investment can make on even the smallest of communities. Much of the current discussion about infrastructure investment focuses too often on giant projects and national implications. However, the case for a renewed investment agenda can be made most persuasively by pointing out what such investments have done for local communities — city or small town — in the past, and what those communities might have failed to become if there had never been a New Deal.

    Delore Zimmerman is CEO of Praxis Strategy Group and Publisher of www.newgeography.com.

  • The New Deal & the Legacy of Public Works

    Almost completely ignored in the press this year has been the 75th anniversary of the New Deal. Social Security, public housing, school lunches, deposit insurance, labor relations standards and banking regulations are among its many enduring legacies. On this anniversary, it is worth looking at the public works programs that constructed roads and buildings that still exist in every county in America.

    In a nation where a quarter of the adult population was unemployed, the immediate goal of the New Deal was to provide temporary relief for Americans who were destitute and put them back to work. The failure of the Hoover Administration to either curtail the Depression or inspire people created a political climate for dramatic action.

    During FDR’s first 100 days – called the “First New Deal” by historians – a truly impressive list of legislation was passed. Prohibition ended, the Tennessee Valley Authority was created, eventually bringing electricity and development to an impoverished area of the South, and controls were placed upon industrial practices, Wall Street, labor relations and farm output. The Civilian Conservation Corps, which ended up planting two billion trees across the country, was founded. A historian would be hard pressed to find a more energetic first 100 days of any administration.

    Yet among its most far-reaching accomplishments were the Federal Emergency Relief and National Industrial Recovery acts, which created the bureaucracy to institute public relief by funding large-scale public works. Under the system, states applied for grants from the federal government. Over the next ten years, the government would spend nearly $9 billion through the Civil Works Administration (CWA), Public Works Administration (PWA) and the Works Progress Administration (WPA).

    The depth of the Depression and the social unrest it created provided motivation for New Deal officials to act quickly and decisively. The official at the center of the action was Harry Hopkins. A hyper-competent social worker who had created a program to deliver services to mothers with dependent children in New York City and founded the American Association of Social Workers, Hopkins jumped into his role as head of federal relief with tremendous vigor. After a five-minute meeting with Roosevelt on his first day of work in May of 1933, he was dispatched to a cockroach-infested building on New York Avenue where, by the end of the day, he had dispensed $5.3 million in aid to eight states. In a year’s time, Hopkins had created a jobs program that spent a billion dollars and provided badly needed jobs to over three million people during the cold winter of 1933 (the average wage was $13 a week). He spent money quickly – perhaps too quickly, some maintained – but his focus was to respond to FDR’s demand to quickly create jobs and alleviate misery in the country.

    But Hopkins was not a welfare statist. His career as a social worker had taught him that individuals did not want to be “on the dole,” living off the largesse of the state. By finding work for unemployed breadwinners, Hopkins believed he could keep families strong and enable them to retain their pride despite the hard times.

    This psychological aspect should not be underestimated. The Depression was more than a huge decline in GDP, vast unemployment and lost industrial output – it was a great identity crisis for a nation that placed great value on self-sufficiency and self-reliance. Look at New Deal art (another achievement of the New Deal is the wealth of beautiful murals still in existence, created by government-funded artists) and you will see a glorification of labor. Frescos from San Francisco to New York depict colorful scenes of men hard at work.

    Today bureaucrats stress cost-effectiveness ratios, but New Deal reports were most concerned with how many jobs a project provided. Conservative critiques that fault the New Deal for its mixed record on economic growth often miss this critical point. The official report of WPA projects in San Francisco, for example, lists as its main achievement how “the program contributed to the continuance of the normal standards of living of the working man’s family in San Francisco and maintenance of the courage and morale of the ordinary citizens through a most distressing period.” Expenses for projects are listed not just in dollar amounts spent but also in the number of “man hours” provided to workers.

    When Roosevelt ran for re-election the first time in 1936 (“Four Years Ago and Now” was his campaign slogan), he could claim six million jobs had been created in the last three years. He could point to a doubling of industrial output and the creation of a Farm Credit Administration that on an average day saved 300 farms from foreclosure. Still, eight million people remained out of work in 1936, and the public works programs, historically audacious as they were, did not solve many of the nation’s entrenched economic and social problems. Roosevelt himself did not want his public works programs to compete with private industry or to create dependency on the state.

    Yet, looking back at the WPA and its companion public works agencies, the list of lasting contributions to the nation’s infrastructure is indeed impressive: 78,000 bridges, 650,000 miles of roads, 700 miles of airport runways, 13,000 playgrounds, hundreds of airports and 125,000 military and civilian buildings. The roads and public works constructed by the WPA and PWA ended up being lasting infrastructure investments.

    However, perhaps the New Deal’s most enduring achievement was creating a sense of unity at a time of unparalleled economic crisis. Whereas the nation had previously elevated Horatio Alger-style self-reliance, the New Deal tapped into the creative industrial potential both of common unskilled laborers and of thousands of skilled and creative workers. It created a sense of pride among millions who for the rest of their lives could point to public buildings they helped design and build, as well as the roads they laid out and paved.

    The 1930s produced the Hoover and Grand Coulee dams, the Golden Gate and Bay bridges, La Guardia Airport and the San Antonio River Walk. Besides some luxury high-rises, high-tech sports stadiums with retractable roofs and edgy art museums, what great things have we achieved lately?

    Andy Sywak is the articles editor for Newgeography.com.

  • Progressives, New Dealers, and the Politics of Landscape

    One of the greatest ironies of our time is the fact that today’s leading progressives tend to despise the very decentralized landscape that an earlier generation of New Deal liberals created.

    Franklin Roosevelt and his successors from Harry Truman to John F. Kennedy and Lyndon Johnson sought to shift industry and population from the crowded industrial centers of the Northeast and Midwest. They did this through rural electrification based on hydropower projects, factories supplying the military and federal aid to citizens seeking to buy single-family homes in low-density suburbs.

    This is precisely the environment – which brought so much opportunity and improved living conditions to so many – that today’s progressives so often despise. Since the 1960s, environmentalists, for example, have waged a campaign against the great dams that symbolized New Deal economic development policies. Artificial lakes that generate electricity for millions of suburban homeowners and businesses, and have brought an end to devastating, cyclical floods, are condemned by progressives for having wiped out local fauna and flora. And it goes without saying that the middle-class swimmers, picnickers and motor-boaters who enjoy government-created lakes on weekends are… well, vulgar.

    Similarly, the defense plants that the Roosevelt, Truman and Kennedy-Johnson administrations scattered throughout the country are often lambasted as emblems of the fascistic “military-industrial complex,” part of a wicked “Gun Belt.” In fact, industry is increasingly seen as undesirable by today’s Arcadian progressives, who appear to believe that it would have been better to leave the farmers of rural America as quaint specimens of authentic folk life.

    But nothing riles the progressives of today more than the low-density, single-family-home suburbs made possible by New Deal liberal homeownership policies. Since the 1950s, intellectuals on the left have been bemoaning the alleged cultural sterility and conformity of the suburbs. Now anti-sprawl campaigners allege that the suburbs are also destroying the planet.

    So the question is: How did the American left, in a short period of time, come to repudiate the New Deal and the American landscape it created? The answer is simple: today’s center-left, which calls itself progressive rather than liberal, is not the heir of New Deal liberalism. It is the heir instead of early twentieth century elite Progressives, who were shoved aside and marginalized during the heyday of New Deal liberalism.

    The original Progressives were overwhelmingly professionals and patricians of old Anglo-American stock in the Northeast and Midwest, many of them the children of Protestant clergymen, teachers or professors. They despised the nouveau riche of the Gilded Age, but also tended to view European immigrants and white and black Southerners as benighted primitives.

    Their vision of the ideal society, influenced by the Hegelian Idealist culture of Bismarckian Germany, was one in which a university-trained elite ran everything with minimal interference by ignorant voters and crass politicians. As heirs of the moralistic Northern Protestant Whig and Republican traditions, these Progressives also had a strong interest in the social engineering of private behavior, from prohibition to eugenic sterilization.

    From Reconstruction until the Depression, Progressive moralism and elitism alienated European immigrants and rural Southerners and Westerners alike. This benefited the industrial capitalists of the dominant Republican party. Franklin Roosevelt created a powerful, but fundamentally unstable, Democratic majority by adding many former Republican Progressives to the old Democratic coalition of Northern white “ethnics” and white Southerners.

    Yet in the process Roosevelt helped undermine many of the signature initiatives of the Progressives, starting with the repeal of Prohibition, a policy loathed by German and Irish Catholic voters. It signaled a repudiation of the Whig-Republican-Progressive ambition to use the federal government for moral reform and social engineering. (FDR’s appeasement of Southern segregation had a similar tactical logic.)

    Another goal of the Progressives, economic planning, died with the collapse of the National Recovery Administration (NRA) in the first Roosevelt term. Jettisoning the Progressive dream of a planned economy run by technocrats, the Roosevelt administration instead focused pragmatically on state-capitalist public infrastructure projects like the Tennessee Valley Authority (TVA) and the Lower Colorado River Authority (LCRA).

    Plans for an all-powerful executive civil service subordinate to the White House – a progressive reform that FDR unwisely favored – were rejected by a Congress jealous of its prerogatives and suspicious of executive power. Finally, nanny-state supervision of the poor, another Progressive theme, found little sympathy among New Deal Democrats, who preferred universal social insurance to means-tested public assistance, and preferred employing the able-bodied poor in public works to what FDR called “the narcotic” of the “dole.”

    The New Deal ultimately left little of the old Progressive project standing, but it created what could be considered a Golden Age for the white lower-middle-class majority that lasted until the 1970s. Progressive intellectuals and activists, however, sensed that they had been marginalized. Over-represented in the prestige press and the universities, they increasingly denounced what they saw as the vulgarity of the New Deal’s constituency.

    The assault on the suburbs was one of the most powerful expressions of this discontent. It was led by two figures. One was Jane Jacobs, the romantic chronicler of dense urban life, who found her villain in New York’s highway-building Robert Moses. A rival school, headed by Jacobs’ enemy Lewis Mumford, sang the praises of planned “organic” villages – “highwayless towns” connected by “townless highways.” The Mumfordian strain of Progressive planning is represented today by the New Urbanism, with its hyper-regulated low-rise pedestrian communities.

    The resurgent progressives also clung to their vision of a society in which an enlightened, nonpartisan elite governs the ignorant masses from above. The Civil Rights Revolution, and the era of judicial activism that followed, permitted progressives to transfer power from the elected political class to the federal judiciary. By the 1970s and 1980s, federal judges were regulating practically all aspects of American life. Social engineering schemes like busing for racial balance and race-based affirmative action, which “color-blind” New Deal liberal opponents of segregation like Hubert Humphrey and Lyndon Johnson opposed, now became critical pillars of progressive ideology.

    The New Dealers had been ardent conservationists, but their conservationism focused not only on nature but also on the well-being of people. New Deal soil conservation and agricultural productivity policies allowed the amount of land in cultivation to decline, freeing up vast tracts of land for wilderness or habitation. Farmers, middle-class suburbanites and nature all gained.

    This approach is repudiated by most contemporary progressives, who know nothing about farms except that they are cruel to livestock. By the 1970s many progressives abandoned liberal conservationism for radical environmentalism, which seeks to protect nature by separating it from humanity and industry. Radical environmentalism tends to shade into misanthropy, as in the proposal by two New Jersey environmentalists to turn much of the Great Plains into a human-free “Buffalo Commons.” (Curiously, nobody seems to have proposed evacuating New Jersey in order to create a “Migratory Bird Park.”) The radical Green goal of “rewilding” North America by creating “wildlife corridors” from which humans are banned repudiates the New Deal liberal vision of allowing working-class Americans to enjoy the scenery of national parks.

    So in every respect except racism and opposition to immigration, today’s progressives are genuine heirs not of the New Deal liberals but of the capital-P Progressive economic planners and social engineers of the early twentieth century. Even their social base is the same as in 1908 – college-educated professionals, particularly those in the nonprofit sector and education, like public school teachers and academics.

    This class – enlarged, ironically, by New Deal liberal programs like the G.I. Bill and student loans – has been swelled by upwardly mobile Americans to whom mass university education imparts a blend of the worldviews of old-fashioned Northeastern progressives and the old Bohemian left-intelligentsia. This enlarged college-educated professional class has allied itself with African-Americans and Latinos in the identity-centered post-McGovern Democratic Party.

    With perfect symbolism, the two bases of the alliance of white progressives and nonwhite Democrats – college campuses and inner cities, allied against the middle-class and working-class suburbs – correspond to the alternate urban utopias of Lewis Mumford and Jane Jacobs respectively, if we consider the college campus to be a Mumfordian paradise.

    With good reason, then, today’s progressives despise the suburban, middle-class America created by yesterday’s New Deal liberals. Today’s progressives may invoke the New Deal, but they are the heirs not of mid-century liberals like Franklin Roosevelt and Lyndon Johnson but rather of the Progressive social engineers who believed that enlightened elites should alter both the built environment and human behavior to meet their social goals. Some things never change.

    Michael Lind is the Whitehead Senior Fellow at the New America Foundation. He is the author, with Ted Halstead, of “The Radical Center: The Future of American Politics” (Doubleday, 2001). He is also the author of “Made in Texas: George W. Bush and the Southern Takeover of American Politics” (New America Books/Basic, 2003) and “What Lincoln Believed” (Doubleday, 2005). Mr. Lind has been an editor or staff writer for The New Yorker, Harper’s Magazine, and The New Republic. From 1991 to 1994, he was executive editor of The National Interest.

  • New York’s Next Fiscal Crisis

    Mayor Bloomberg needs to prepare the city for the crash of the Wall Street gravy train.

    New York City, dependent on Wall Street for a quarter-century, has gotten used to harsh cyclical economic downturns, including the lending contraction in the early nineties and the bursting technology bubble in 2000. But today’s turmoil may not be a cyclical downturn for Wall Street but instead the beginning of an era of sharply lower profits as the industry rethinks its entire business model. If so, it will produce the biggest economic adjustment and fiscal challenge that New York has confronted in more than three decades. If the city’s leaders don’t recognize this challenge and move quickly to meet it, New York could soon face an acute fiscal crisis rivaling its near-bankruptcy in the mid-seventies.

    Such a fate—almost unthinkable for a city that has grown complacent about its world-class standing—could reverse the colossal strides that Gotham has made over the past two decades in restoring its citizens’ quality of life. As Mayor Michael Bloomberg said in May, we must “pray that Wall Street does well.” But we’d better have a plan if it doesn’t.

    Wall Street bankrolled New York’s long recovery from the seventies because New York, through its long economic, fiscal, and social deterioration, managed to keep its position as the nation’s financial capital just as finance was about to take off. In the early eighties, the nation’s financial industry—particularly Wall Street—was feeling its way toward a sweet spot where it would stay for two decades. As Federal Reserve chief Paul Volcker brought inflation under control, creating a stable environment for financial innovation and a stable currency for the world’s savings, baby boomers and international investors flocked to U.S. markets. The Dow Jones Industrial Average tripled between 1982 and 1990, despite the ’87 crash, while the assets of securities brokers and dealers more than doubled as a share of America’s financial assets. The financial industry also saw a huge opportunity in Americans’ increasing love of debt, creatively packaging it into everything from mortgage-backed securities to junk bonds and then selling it to investors. Between the early 1980s and the early 1990s, the financial sector’s profits as a percentage of the nation’s income more than doubled. The sector’s pretax income as a percentage of all national income started a similar march upward. Profits at securities firms, while choppy, easily doubled between the early eighties and the end of the decade (all numbers are inflation-adjusted unless indicated otherwise).

    New York reaped massive rewards from Wall Street’s good fortune. The city’s financial-industry employment grew by 14 percent in the eighties—more than triple the job growth in its other private-sector industries. Jobs in the securities industry in particular, which had decreased in the seventies, grew by more than a third. Since these positions were high-paying, they had an outsize impact: by the late eighties, according to the Fed, financial services contributed nearly 23 percent of New Yorkers’ wages and salaries, up more than 60 percent from the previous decade. And financiers’ heavy spending supported other jobs, from restaurant workers and interior decorators to teachers and nurses.

    For evidence of how Wall Street started to lure newcomers to New York, look to Hollywood. Movies chronicling Gotham’s grim decline, like Taxi Driver (1976) and Escape from New York (1981), gave way to films portraying the heady excitement of making millions in the city, like Wall Street (1987) and Working Girl (1988). While much of the city remained grimy and dangerous, the excitement outweighed those factors for young, child-free baby boomers who paid high taxes without requiring many city services. The result: after hemorrhaging nearly 10 percent of its population between 1970 and 1980, New York gained nearly 4 percent back between 1980 and 1990. The city’s tax take in 1981 had been slightly lower than its take a decade before; but by 1991, it was raking in a third more than in 1981.

    This money allowed New York to reverse some of its bone-scraping seventies-era budget cuts and to invest in infrastructure without making the politically difficult choice of cutting deeply into social services. In the seventies, the city had laid off nearly 3,000 police officers and 1,500 sanitation workers; in 1985, Mayor Ed Koch hired 5,300 cops and almost 1,000 sanitation workers. In the 1990s, it was largely Wall Street’s breakaway success that gave Mayor Rudy Giuliani the financial resources to focus on making New York City safe again.

    If high finance found its sweet spot in the eighties, it reached dizzying sugar highs starting in the late nineties and continuing, after recovering from the tech bust and 9/11, until last year. The nation was awash in the world’s money, encouraging record lending and speculation as well as the creation of more financial products, which yielded banks massive profits. By 2006, the financial industry’s corporate profits as a percentage of the nation’s income had doubled once again.

    It seemed that nothing could go wrong for Wall Street once it had bounced back from the tech bubble’s burst. With the dollar serving as the expanding global economy’s reserve currency, banks had oodles of money to lend. Cheap Asian imports were keeping prices and inflation expectations low, allowing central bankers to justify low interest rates. Beginning in the nineties, traditional consumer banks—previously tightly regulated to protect government guarantees for their depositors—began taking investment risks that once had been confined to Wall Street. As time went on, investment banks became more dependent on fees from debt backed by home mortgages and other consumer products, further blurring traditional lines between investment and consumer banking.

    The financial world took advantage of the easy money and better technology. It booked high fees by designing ever more complicated “structured finance” products, backed by riskier home mortgages as well as corporate loans. Wall Street sold these products to international investors, who couldn’t get enough of American debt, by making a seductive pitch: the products were structured so intricately that even risky mortgages were as safe as government bonds, and they paid better interest rates. Further, if an investor ever had to sell a mortgage-backed security after he had purchased it from a bank, it was a cinch, since Wall Street had “securitized” individual loans—that is, taken thousands of them at a time, sliced them up, and turned them into easily tradable bonds of different risk levels.

    In addition to lending, Wall Street was borrowing at record levels so that it could take bigger and bigger risks with its shareholders’ money, making up for lower profit margins on businesses like equity underwriting and merger advisories. Wall Street’s borrowing as a multiple of its shareholders’ equity was 60 percent above its long-term average by the end of last year (with sharp increases over the past few years). Firms were taking even more risks than that figure indicates, setting up arcane, off-the-books “investment vehicles” with shareholders still vulnerable if something went wrong.
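    The leverage figure cited here is simply borrowings expressed as a multiple of shareholders’ equity. A minimal sketch of the calculation, using hypothetical dollar figures chosen only to illustrate a ratio 60 percent above its long-term average (these are not any firm’s actual numbers):

    ```python
    def leverage_ratio(borrowings: float, equity: float) -> float:
        """Borrowings expressed as a multiple of shareholders' equity."""
        return borrowings / equity

    # Hypothetical figures, in billions, for illustration only.
    long_term_average = leverage_ratio(borrowings=200.0, equity=10.0)  # 20x
    end_of_last_year = leverage_ratio(borrowings=320.0, equity=10.0)   # 32x

    # A ratio 60 percent above the long-term average, as described above:
    excess = end_of_last_year / long_term_average - 1
    print(f"leverage is {excess:.0%} above the long-term average")  # prints 60%
    ```

    Note that the off-balance-sheet “investment vehicles” the article mentions would not appear in a ratio like this, which is precisely why the true risk exceeded what the reported figure indicated.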

    As banks and financiers got unimaginably rich, so did the city. The finance industry’s contribution to New Yorkers’ wages and salaries topped out at over 35 percent two years ago. Last year, the city took in 41 percent more in taxes than it did in 2000, capping off an era of unprecedented revenue growth. While the city’s stratospheric property market—itself a function of Wall Street bonuses and easy money—drove much of that increase through property-related taxes, corporate tax revenues rose by 52 percent, personal income tax revenues by nearly 20 percent, and banking tax revenues by nearly 200 percent.

    But today, the financial industry may be entering a wilderness period of lower profits, employment, and bonuses. “Whether it’s financials as a share of the stock market or financials as a share of GDP, we’ve peaked,” ISI Group analyst Tom Gallagher told the Wall Street Journal in April. One measure of how this downturn differs from those in the recent past: some Wall Street firms, after their disastrous miscalculations, are operating today only because the Fed, as Bear Stearns melted down in March, decided to start lending to investment banks, which it doesn’t normally regulate or protect.

    A new alignment of global demographics, inflation expectations, and interest rates may spell long-term trouble for the city’s premier industry. A decade ago, cheap Asian goods kept prices and inflation expectations down; today, Asia’s growth is pushing them up. Ballooning energy prices and too-low interest rates threaten to yield sustained inflation. America now faces intense competition—particularly from the euro—for the world’s savings and investment, meaning that it can’t depend on attracting as large a portion of the world’s nest egg to keep interest rates down. “It is not credible that the world will revert to the same level of capital flow to the U.S. after the credit crunch is over,” Jerome Booth, research head of U.K.-based Ashmore Investment Management, noted recently. The Fed can keep official rates low only at the risk of inflation and more capital flight. The end of cheap money means that the market for future debt may shrink, squeezed by tougher borrowing terms, cutting off a crucial profit line for banks.

    Regulators, too, will be harder on the banks. Because investment banks now benefit from taxpayer-guaranteed debt, taxpayers must be protected. The feds probably won’t let firms borrow from private lenders at the levels that they have over the past decade, and it’s unlikely that they’ll let banks rely so intensely on short-term debt—which is cheaper, but riskier, than long-term debt. (Short-term lenders can flee quickly, as the Bear Stearns crash showed, because they have the option of yanking their money out of investments, often overnight, while long-term lenders are stuck with the bets that they’ve made.) Less borrowing means lower profits, and not just temporarily. Regulation might also curtail Wall Street’s lucrative business of complex derivatives, another huge area of risk. Plus, international stock listings continue to bypass New York for Asia and Europe because of the six-year-old Sarbanes-Oxley law, which imposes an unnecessary regulatory burden on companies publicly traded in the U.S., and also because the world’s growth has moved east. Such losses could be ignored only as long as debt and derivatives were making up for them.

    The skepticism of Wall Street’s own investors and clients, though, is the real deal-breaker. The most startling news out of the current crisis is that Merrill Lynch, UBS, and others didn’t know that they had taken certain risks for shareholders, lenders, and clients until they were already reporting tens of billions in losses. Clients and investors shouldn’t mind losses when they understand the risks that they’re taking. They do mind if, after the firm that they’re investing in or doing business with has insisted that its careful models and safeguards protect them, it turns out that its only protection from bankruptcy is Uncle Sam.

    International investors will not again blindly trust Wall Street’s ability to assess and allocate risk. “Market participants now seem to be questioning the financial architecture itself,” Fed governor Kevin Warsh said recently. Don’t forget the stock market’s performance, either: it hasn’t been impressive over the past eight years.

    New York City, so dependent on the financial industry’s continued growth, should shudder.

    If Mayor Bloomberg and his successor view the current downturn as another short blip, rather than a long readjustment of the financial industry’s share of the economy, and they turn out to be wrong, the decisions that they make could prove ruinous. Over the past two and a half decades, whenever the financial industry underwent one of its periodic downturns, New York stuck to the same playbook: jack up taxes to make up for lower tax revenues, cut spending a bit, and wait for the financial industry to come roaring back. During the early nineties’ credit crunch, Mayor David Dinkins slapped two temporary surcharges on the income tax; one still persists. In 2002 and 2003, after the tech bust and 9/11, Bloomberg temporarily hiked income and sales taxes and permanently hiked the property tax.

    Those tax increases were never wise because they kept less profitable industries and their lower-paid employees out, making New York ever more dependent on finance. Even the financial industry didn’t ignore the tax hikes; partly in response, it sent back-office, five-figure-a-year jobs to cheaper cities, and as a result, New York today has less than one-fourth of the nation’s securities-industry jobs, down from one-third two decades ago. Still, the industry was growing so fast that it and its workers could withstand the higher costs posed by the tax increases.

    But what was once merely unwise could be calamitous today. Consider the last time that New York tried raising taxes when its premier industry was about to shrink—the mid-sixties, when the city’s leaders arrogantly believed that its record population of 7.9 million people, in the middle of a record economic boom, wouldn’t mind paying for a breathtaking array of Great Society social programs, as well as fattened public-employee benefits. In 1965, the New York Times had reminded city leaders that “New York City’s economy is prospering,” and its editorialists decreed a year later that “strong medicine, specifically higher taxes, is the remedy for restoring New York’s financial health.”

    Mayor John Lindsay, with state support, enacted the city’s first personal income taxes, as well as new business taxes, in 1966. New York went on to lose half of its 1 million manufacturing jobs between 1965 and 1975—a trauma as great as Wall Street’s troubles today, because in 1960, manufacturing had accounted for more than a quarter of New York’s jobs. At the same time, the city was also losing its collection of corporate headquarters and their legions of well-paid employees. By the end of the seventies, half of its 140 Fortune 500 companies had fled the city.

    New York didn’t anticipate this change or understand its significance as it was happening. Well into the early seventies, the city thought that it could keep taxing and spending because the future was bound to mirror the “Soaring Sixties.” City officials argued that fleeing companies were evidence of New York’s success because some companies just couldn’t afford to be here any longer. Worse, the city’s leaders didn’t understand how quickly urban quality of life could deteriorate: as they focused on social spending rather than vital public services like policing, murders shot up from 645 in 1965 to 1,146 just five years later. Nor did they realize how quickly middle-class residents would flee, taking their tax dollars with them.

    For a while, the city and its lenders found a way around these miscalculations. New York stepped up its borrowing against future tax revenue in the late sixties and early seventies, paying the banks back when the following year’s tax receipts rolled in. The foolishness of such a plan was always obvious: three years before the city skirted bankruptcy, the Times reported, Albany skeptics warned that large-scale temporary borrowing was folly. But even as economic and fiscal conditions worsened, the city kept spending and spending. In 1970, city leaders were heartened by the judgment of bond-rating agency Dun & Bradstreet, which noted New York’s “extraordinary economic strength . . . and long-range credit stability.” (Then, as now, ratings agencies weren’t good at predicting acute crises.) In 1972, as what had once seemed like a short downturn stretched on, Times editorialists encouraged complacency, noting that “after all the years of . . . warnings of imminent municipal bankruptcy, it is reassuring to find investors . . . bullish about the outlook for New York City’s long-term financial soundness.”

    By late 1974, however, as rising spending outpaced tax receipts, a crisis was inevitable. It came the following spring, when New York wrestled with a budget deficit that equaled 14 percent of its expected spending and creditors cut the city off. Forced to throw itself at the mercy of the state and federal governments for emergency funding, Gotham gutted trash pickup and policing, murders climbed to 1,500 annually, and more residents left.

    Millennial New York likes to think of itself as vastly superior to the troubled city of the 1970s. But once again, on the brink of what may be a major economic upheaval—this time, involving the financial sector rather than manufacturing—it is reacting with disturbing complacency. And yet again, the mayor has allowed the budget to swell dangerously during the good times, which could push leaders to repeat the mistakes of the sixties and seventies: raising taxes at precisely the wrong time and slashing vital services under pressure to keep up social and public-employee spending.

    During the past decade, New York used the cash that Wall Street was showering on the city not to ease its long-term problems but to make them worse. In 1974, under Lindsay, the city devoted one-quarter of its budget to social spending: welfare, health services, and charities. Today, the city continues to spend one-quarter of its budget on social services (not including the public schools’ vast social-services component). Nor has New York reformed the pensions and size of its still-huge public workforce, reduced debt costs, or cut Medicaid costs fueled by Albany’s powerful medical lobby, which helps ensure that New York’s per-capita Medicaid spending—rife with waste and fraud—is the highest in the nation. Even after adjusting for inflation and considerable population recovery, the city’s tax-funded budget for 2008 is 22 percent higher than it was at its Lindsay-era peak. While spending rose just 9 percent or so during the Giuliani era, it has risen three times as fast since—the highest rate since Lindsay left office.

    Echoing a time when people said that New York was ungovernable, Mayor Bloomberg often calls these costs “uncontrollable.” But there was no better time to start controlling them than during the past half-decade, an era of unparalleled prosperity and public safety when Bloomberg had an opportunity available to no other modern mayor. If he had successfully bargained with Albany and union employees to require new workers to contribute more to their pensions and health benefits, we would have seen the results by now. Likewise, if he had worked with Albany to rein in Medicaid spending—now nearly $6 billion a year—the city could have spent some of that money to build schools and fix roads, reducing debt costs. Instead, we’ve got a politically powerful public workforce that commands benefits belonging to another era and that remains vulnerable to corruption despite this generosity, as recent construction investigations show.

    The mayor has also sharply increased spending in one area that was easily controllable: the city’s public schools budget, up by more than one-third since 2001 even though enrollment is down 4 percent. Much of that spending funds plusher teachers’ salaries and the higher pensions that follow, plus borrowing costs for school construction and rehab, making it harder to cut than it was to increase. Today, the education budget is nearly $21 billion: one-third of the entire budget, and more than police, fire, and sanitation combined.

    Bloomberg’s failure to control costs during the boom means that big trouble looms. The city projects that spending over the next three years will increase by more than 20 percent, while revenues will increase by just 13 percent (neither figure is adjusted for inflation). If that happens, a $5 billion–plus deficit—more than 11 percent of tax-funded spending—will result in two years’ time. Moreover, that’s the best-case scenario, based on the city comptroller’s prediction of low growth this year and next and a quick, though weak, recovery after that. But the mayor expects a 7.5 percent economic contraction this year, followed by a smaller contraction. If that happens, revenues might not rise as much as 13 percent; in fact, they might shrink, as they often did in the seventies (and again in 1990 and 2002).
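    The mechanics of that projection are straightforward compound-growth arithmetic. The sketch below uses hypothetical base-year figures (in billions of dollars) chosen for illustration only—the article gives growth rates and the resulting gap, not the underlying budget base:

    ```python
    def project_gap(spending: float, revenue: float,
                    spending_growth: float, revenue_growth: float) -> tuple[float, float]:
        """Project the budget gap and its share of projected spending."""
        future_spending = spending * (1 + spending_growth)
        future_revenue = revenue * (1 + revenue_growth)
        gap = future_spending - future_revenue
        return gap, gap / future_spending

    # Hypothetical base figures, in billions of dollars, for illustration only.
    gap, share = project_gap(spending=47.0, revenue=44.0,
                             spending_growth=0.20, revenue_growth=0.13)
    print(f"gap: ${gap:.1f} billion, {share:.1%} of projected spending")
    ```

    The point the arithmetic makes vivid: when spending starts above recurring revenue and also grows faster, the gap compounds quickly—and if revenue shrinks rather than grows 13 percent, the deficit widens further still.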

    This risk is especially acute because our progressive tax structure and the growth in wealth of our richest citizens over the past two decades make New York highly dependent on the rich, whose income is volatile. Two years ago, the top 1 percent of taxpayers paid nearly 48 percent of the city’s personal income taxes even after adjusting for the temporarily higher tax rate, up from 46 percent in 2000, 41 percent a decade ago, and 34 percent two decades ago, according to economist Michael Jacobs at the city’s independent budget office. A few bad years for the city’s wealthiest translate into a few terrible years for their home base.
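    The revenue risk in that concentration is easy to quantify: when the top 1 percent supplies nearly half of income tax receipts, even a moderate hit to their incomes dents the total sharply. A toy calculation, using the article’s 48 percent share; the 30 percent shock is a hypothetical assumption, not a forecast:

    ```python
    # Share of personal income tax paid by each group (the 48 percent
    # figure is from the article; the shock below is hypothetical).
    top_1_share = 0.48
    everyone_else_share = 1 - top_1_share

    # Suppose top earners' tax payments fall 30 percent while
    # everyone else's stay flat.
    new_total = top_1_share * (1 - 0.30) + everyone_else_share
    print(f"total income tax receipts fall {1 - new_total:.1%}")  # prints 14.4%
    ```

    A 30 percent decline in the top group’s payments alone knocks roughly a seventh off total personal income tax collections—“a few bad years for the city’s wealthiest” made concrete.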

    Cutting a $5 billion deficit—let alone an even larger one—is a formidable task even when done slowly. Cutting such a deficit in a hurry two years from now, under an inexperienced mayor, will endanger the city’s vitality. It’s not too late for Bloomberg to prepare the budget for a painful economic adjustment, and not just by cutting around the edges of the “controllable” budget, as he’s prudently done this year and last.

    The first principle is to do no harm on the tax side. Bloomberg will allow a temporary property-tax cut to expire, and he has told the Times: “If all else fails, we’re not going to walk away from providing services, and only then would I think about a tax increase, and my hope is that we’ll avoid it.” He’ll have to: while the city has proved that it can squeeze higher taxes out of a phenomenal growth industry, that trick won’t work on an industry that’s stagnant or in decline. New York’s sky-high income taxes for businesses and residents already put the city at a huge disadvantage, since they keep away lower-paying jobs from media, technology, and other industries that otherwise might be attracted by lower housing costs and commercial rents in the coming years. The city can’t afford to make this disadvantage any worse.

    Second, the mayor must carefully manage his budget cuts. This year, he proposed largely across-the-board cuts of about 6 percent in projected spending, covering everything from police and sanitation to homeless services and education. He also enacted a 20 percent slash to the long-term capital budget, which funds physical infrastructure. But this strategy won’t work for long. Vital services can’t withstand deep cuts. The mayor must not alienate the middle class, whose tax revenues he needs, and that means protecting the police department, cleaning streets, and keeping libraries open. (His decision in May to delay hiring 1,000 new police officers for more than a year, even as New Yorkers become wary of crime again, is worrisome.) Further, failing to fix decaying infrastructure isn’t a way to save money. It’s no different from borrowing to pay for other expenses, since waiting will worsen deterioration and mean more expenses later.

    So as Bloomberg readies his final budget over the next year, he’ll have to choose the deepest cuts to projected spending carefully, even though it requires fighting the city council, which nixed half his proposed cuts this year and especially protected education. Rising education spending under both Bloomberg and Giuliani hasn’t improved scores on national tests, after all. And within the capital budget, the city should reduce its spending on economic-development and affordable-housing subsidies in order to fund things like roads and transit adequately. Furthermore, New York pols should stop regarding the operating and capital budgets as unrelated. Ten percent of Medicaid’s $6 billion annual take would go a long way toward upgrading the city’s roads and subways. Last, tens of millions of dollars in politically connected earmarks by both the mayor and the council are unsavory in good times and unconscionable in bad.

    But ultimately, the mayor can’t fix the city’s budget without addressing its “uncontrollable” half, whose growth will be responsible for three-fourths of the deficit in three years’ time. Bloomberg—and his successor—can use fiscal stress to advantage in bargaining for changes in city contracts. In the past, in fact, the city’s biggest bargaining gains have come during fiscal turmoil. As Charles Brecher and Raymond D. Horton noted in their 1993 book, Power Failure: New York City Politics and Policy Since 1960, the city won sanitation productivity gains in 1981, while it was suffering the fallout from the fiscal crisis of the 1970s, and a less costly pension tier two years later. While police officers won a raise this year that was necessary to attract recruits, the mayor must not let the city’s other unions bring home similar gains through contract renegotiation.

    The city’s contract with more than 100,000 non-uniformed workers expired this spring, presenting an opportunity. New York should negotiate to get this union, DC-37, to allow new employees to accept a pension plan in which the city contributes to workers’ private accounts, rather than guarantees a pension for life. The independent budget office estimates sizable budget savings here—nearly $100 million annually—within half a decade. Requiring health-insurance-premium payments of 10 percent from these workers and retirees would save half a billion dollars more; extending the workweek from 35 and 37 hours to 40 (imagine!) would net another half-billion, savings that the next administration will dearly need if Wall Street doesn’t roar back. The mayor (and his potential successors) must impress upon unions that their members won’t get a better deal if they wait.

    But why the urgency? After all, New York has huge advantages today. Half a century ago, suburban growth was driven by cheap fuel, fast commutes, and low crime. Today, suburbs are choked off by congestion, $5-a-gallon gas, and bad public schools. The city’s governance approach is also different. If crime starts to rise, we know what to do: aggressively police neighborhoods and prosecute and sentence defendants appropriately. And the city’s new citizens—many of whom have invested their lives’ savings in their homes—should help politicians keep some focus, counterbalancing to some extent the organized pressure to sacrifice all else for education spending. The city’s budget has safety latches, too. New York’s fiscal near-death in the seventies spurred the state to impose extraordinary oversight and brought about local changes. The city can’t borrow much today for operating spending. It must balance its budget annually and project four years’ worth of expected spending and revenues, submitting the results to a state board.

    Yet these advantages aren’t limitless, as recent high-profile shootings in Harlem and Far Rockaway indicate. If a mayor lets crime spiral out of control over a crucial one- or two-year period, it will be harder to control later. The middle class won’t be patient for long if its voice isn’t heard, and the city’s “global” upper class is much more transient than it was 40 years ago. Plus, with one-third of the population leaving every decade, New York must continually attract new residents. As for city finances: no amount of regulation can guard against complacency. The city couldn’t have balanced its budget this year and reduced next year’s deficit if not for the huge surplus that Wall Street provided last year, before it ran out of steam. Moreover, the city doesn’t have to default on its bonds, as it nearly did three decades ago, to get into trouble. Sacrificing quality of life so that it can pay those bonds would do as much damage. Finally, if the city does need help, it can’t look to New York State to bail it out, as it did 33 years ago: this time around, Albany might be in equally dire straits.

    Even if we do all the hard work of fixing the budget and in two years’ time, Wall Street is defiantly humming along, once more channeling record tax revenues into the city’s coffers, the steps that we take today won’t have been wasted. By acting now, Bloomberg will enable his successor to consider income tax cuts and infrastructure investment. Just as we prepare for a terrorist attack that we hope will never come, we have to prepare for a fiscal and economic crisis that we hope will never come. The risk is real.

    Nicole Gelinas, a City Journal contributing editor and the Searle Freedom Trust Fellow at the Manhattan Institute, is a Chartered Financial Analyst. This article appeared in the Summer 2008 City Journal.

  • Impending Doom for the Heartland?

    The Financial Times recently made note of the biggest drop in commodity prices in 28 years. This, of course, is a fall from record highs, and some analysts continue to issue bullish forecasts. The Reuters/Jefferies CRB Index has continued its decline over the past few days.

    It’s a trend to keep an eye on.

  • Questioning Conventional Wisdom: Should Poor Folks Stay Put?

    There is reason to think again about the currently fashionable idea of dispersing the population of poor folks in the Skid Row district of downtown Los Angeles and in similar precincts in other cities across the U.S.

    There’s cause to pause over notions such as mixing “affordable housing” that’s priced in the range of working-class or poor folks alongside spiffy market-rate units.

    There’s some research going on that combines data analysis in the law-enforcement profession with efforts in the social sciences, and it’s far enough along to raise questions about some commonplace assumptions among policy makers.

    One questionable assumption is the notion that it’s best to do away with old-fashioned, densely developed centers of subsidized housing – places such as Skid Row, or the many areas of cities across the U.S. known as “the projects.” Conventional wisdom currently holds that such clusters on the low end of the socio-economic scale are best relegated to history and replaced with scattered sites.

    Here’s a simpler way of putting it: Recent years have seen government authorities ditch the old “projects” model – literally blowing them up, in some cases – in favor of programs that shift poor residents from the inner city to residences in outlying areas. They don’t bunch the poor folks together, at least not in the cheek-by-jowl way of the old neighborhood. The idea is to mix things up and put a relatively small number of poor folks into any given middle-class neighborhood that is safer and has better schools. The presumption is that spreading poverty out will give the poor a greater chance to work their way up the socio-economic scale.

    Such thinking bears a similarity to efforts by some public officials in Los Angeles who aim to make similar shifts possible based on regulations requiring builders to subsidize lower rents for certain numbers of units in their developments.

    It’s not exactly the same, and you can argue the finer points. But the truth is that the efforts to change the residential patterns of poor folks – and the talk of dispersing the social service agencies that serve low-income residents of neighborhoods such as Skid Row – aim for a goal that’s similar to the top-down approach of blowing up the projects and moving folks to places beyond the city’s center.

    Also similar is the reason behind some of the efforts to move poor residents out of the downtown areas of many cities: gentrification. Cities want to spruce up their historic cores. They want new retail and residential developments that will generate more tax revenue than any densely populated housing project or collection of low-rent residence hotels will ever provide. Public officials have often presented such efforts with a two-birds-with-one-stone argument – poor folks get to go off to nicer, safer neighborhoods and the city gets a shiny new trophy in a redeveloped downtown.

    There’s an article in the current issue of the Atlantic that looks at recent developments in Memphis, Tennessee, where sociological researchers have been comparing law-enforcement data on crime trends to recent programs to relocate poor folks from the inner city to outlying areas. Some of the findings have the researchers leaning toward a different two-birds-with-one-stone argument on subsidized housing. They think it might just be that both the folks who were shifted from those hard-pressed areas and their new neighbors far away from the inner city are worse off for all the manipulations.

    The research has not reached any definitive conclusions, and there are plenty of variables that must be considered with care. Still, there seems to be enough to raise serious questions about a trend in urban planning and public policy that has gone nearly unexamined for some time.

    The Garment & Citizen yields to the Atlantic on this matter, urging anyone who is interested to give careful consideration to the piece, “American Murder Mystery.”

    We also urge all involved in the debate to ask themselves a few questions:

    What is a neighborhood? Do common economic circumstances bring a sense of community that is necessary to any neighborhood? Is a poor neighborhood necessarily a bad neighborhood? If so, why?

    Jerry Sullivan is the Editor & Publisher of the Los Angeles Garment & Citizen.

  • The Entrenchment of Urban Poverty

    How high urban housing costs and income inequality have exacerbated urban poverty

    A few years ago, on a drive from New York to Washington, I turned off I-95 in Baltimore to see H.L. Mencken’s home. Abandoned row houses lined the street, some boarded up with plywood, others simply gutted. Signs offering fast cash for houses and a number to call for unwanted cars outnumbered pedestrians. It was a landscape of rot and neglect with few signs of renewal and investment.

    Writers have expended vast amounts of ink about the recent resurgence of cities, yet pockets of great poverty like West Baltimore have proven disturbingly resilient. Maryland has one of the nation’s lowest poverty rates, but it is one of eight states where 70 percent of the poor are concentrated in a single city. In most of the city’s schools, close to 50 percent of students qualify for federally assisted meals.

    Data from the 2006 US Census American Community Survey show that many urban counties have poverty rates far exceeding the national level of 13.3 percent. Bronx County tops the list at 29.1 percent. The cities of St. Louis and Baltimore, along with Philadelphia, Wayne (Detroit), Kings (Brooklyn) and Denver counties, all have poverty rates hovering between 19 and 27 percent.

    The poverty in these communities testifies to a widening schism of income inequality distressingly common across America but most pronounced in the nation’s cities. Cost of living in cities is one key factor. The federal poverty threshold for a family of four in 2004 was only $19,157, but this number does not make an adjustment for the high rents that low-wage workers must pay to live in an urban environment.

    Deborah Reed of the Public Policy Institute of California found that the poverty rates in wealthy cities like San Francisco and Los Angeles were actually significantly higher than the official rate. In San Francisco, the poverty rate adjusted for housing costs was 19 percent, compared to the official 10 percent; Los Angeles had a 20 percent adjusted poverty rate compared to the official 16 percent.

    Furthermore, numerous studies have documented the “high cost of being poor” in many urban areas. Low-income neighborhoods like Compton in Los Angeles (where one third of the residents are in poverty) or the Tenderloin in San Francisco suffer from a paucity of services that are plentiful in surrounding communities. Manhattan Beach has one bank for every 4,000 residents. Residents of Compton, on the other hand, can access barely one for every 25,000. Residents must make do with corner stores that sell inferior food goods at higher prices and check cashing outlets that often deduct three percent of the customer’s paycheck.

    What is all this leading to? The unsettling contrast between rich and poor of John Edwards’ “Two Americas” narrative is all too real in many American cities. Walking down Minna Street in San Francisco this week, I saw a homeless man drying his socks in the sun, just twenty yards from restaurants with $30 entrees and nightclubs so discreet in their hipness that a single small letter serves as their sign.

    And although often more startling in affluent, white-collar havens like San Francisco, this contrast exists in almost every city. In Baltimore, the gap between high-earning skilled professionals living in gentrified neighborhoods with waterfront views and the procession of hard-pressed, violence-plagued communities nearby is equally striking.

    The celebratory accounts of the gentrification of small parts of cities like Baltimore – or larger sections of San Francisco or Chicago – need to be balanced with far greater concern for creating upward mobility for the large populations left behind. These lower-income populations need to be treated as potential assets, which will require investments in skills training and childcare subsidies, along with nurturing high-wage blue-collar industries and improving basic public infrastructure.

    In the past, poverty reduction never gained enough traction to become a major issue in the presidential campaign, partly because voter turnout in these communities is low and, as we suggested earlier this week, there is little doubt which party will win urban voters.

    But there is some reason, perhaps, to feel more optimistic this year. Senator Obama’s community organizing background in Chicago’s South Side has led him to adopt a broad anti-poverty platform targeting greater federal resources for working parents and low-income children. The presumptive Democratic nominee also proposes tripling the popular Earned Income Tax Credit that supplements low-income workers and supports pegging the minimum wage to the cost of living. Interestingly, Obama has also voiced support for creating a White House Office of Urban Policy.

    Coming from a party skeptical about increasing poverty spending, McCain has supported tax credits being used to attract businesses to low-income neighborhoods and also favors increasing childcare subsidies for low-income families.

    Mencken once wrote that his house in Baltimore “is as much a part of me as my two hands. If I had to leave it I’d be as certainly crippled as if I lost a leg.” However, given its current condition, it is highly unlikely today he would linger in his old neighborhood for long. Hopefully, after November, there may be reason to reassess that assumption.

    Andy Sywak is the articles editor for Newgeography.com.

  • Dayton, Ohio: The Rise, Fall and Stagnation of a Former Industrial Juggernaut

    What Dayton can tell cities about staying competitive in the global economy

    Few people would recognize the Dayton, Ohio of 2008 as the industrial powerhouse it was less than one hundred years ago. Once a beacon of manufacturing success, Dayton claimed more patents per capita than any other U.S. city in 1900. Its entrepreneurial climate nurtured innovators such as Charles Kettering, inventor of the automobile self-starter, and air travel pioneers Wilbur and Orville Wright. As the U.S. economy took off after World War II, Dayton was home to the largest concentration of General Motors employees outside of Michigan.

    The city also nurtured companies that would become stalwarts on the Fortune 500, including National Cash Register (NCR), Mead Paper Company, business forms companies Standard Register and Reynolds and Reynolds, Dayco and Phillips Industries. To put this in context, just 14 U.S. cities could claim six or more Fortune 500 headquarters in 2007. Not a bad performance for an urban area that peaked as the 40th largest city in the U.S. in 1940.

    These early industrialists were more than just businessmen. They were also visionaries. The founder of NCR, John H. Patterson, is widely credited with laying the foundation for the first modern factory system, pioneering the basic principles that still drive much of modern advertising, and redefining the relationship between labor and management.

    NCR may also have been America’s first truly global business. “The cash register,” writes Patterson biographer Samuel Crowther, “is the first American machine which can claim that on it the sun has never set.” Even as Patterson was toiling away in a little shop in Dayton, cash registers “were being sold in England and Australia.” The company’s first non-US sales office was established in England in 1885 and its first European factory was established in Germany in 1903.

    It’s difficult to overestimate Patterson’s influence on American industry. By 1930, an estimated one-sixth of all U.S. corporate executives had either been an executive at NCR or been part of Patterson’s management training programs. Among NCR’s alumni were IBM’s visionary CEO Thomas Watson as well as the presidents of Packard Motor Car Company, Toledo Scale, Delco (now Delphi) and dozens of others.

    What may have separated men like Patterson from their equivalents today in places like Silicon Valley was their intense civic involvement. Patterson was one of the first business leaders to try to apply scientific management to local government, testing out his ideas in rebuilding the city after a disastrous flood ruined downtown Dayton in 1913. He also helped create the Miami Conservancy District, one of the nation’s first flood control districts, which still manages a system of low-level dams and levees that keep downtown flood-free to this day. Perhaps one of Patterson’s most prescient civic innovations was making Dayton the first large U.S. city to adopt the city manager form of government.

    As significant as Patterson was as an individual, he was not alone. The Dayton area benefited from the entrepreneurial drive and civic commitment of hundreds of businessmen who built large companies, many publicly traded. Patterson was the most iconic of the icons.

    Dayton’s Economic Descent
    Today one would not expect such vision in Dayton, and one would be unlikely to find it. Since the early 1970s, nearly 15,000 manufacturing jobs have disappeared at NCR. Automobile plants cut payrolls as the economy restructured toward services, and foreign competition outsold domestic manufacturers. As late as 1990, five General Motors plants employed more than 20,000 people regionally. Now, fewer than 12,000 work in these factories and Delphi is on the cusp of closing two more plants. NCR’s world headquarters employs fewer than 3,000 people. Mead Paper Company has merged with a competitor, becoming MeadWestvaco, and its corporate headquarters has moved to Richmond, Virginia.

    As the economy has tanked, the city has shrunk. After peaking at more than 260,000 people in 1960, the city is barely clinging to a core city population of fewer than 160,000. In the 2000 census, Dayton ranked 147th in size nationwide. Its metropolitan area is now ranked 59th.

    Meanwhile, the suburbs have grown. Nearly 74 percent of Montgomery County’s population lived in Dayton in 1930. The growth of suburban cities shrank that proportion to less than a third by the mid 1980s. Now, less than 20 percent of the metropolitan area’s population lives in the city of Dayton.

    Lessons for Other Cities
    Dayton’s early dependence on traditional manufacturing, with a particular emphasis on assembly line work, put the region at a competitive disadvantage as growing international trade and dramatically reduced transportation costs allowed for the global dispersion of factory work.

    Yet perhaps most remarkable is not the region’s decline, but its resilience. Despite the ongoing decline of the manufacturing sector, the metropolitan area still knits together a population of over one million people. What accounts for this?

    First, the regional economy has diversified. Now, as in other metropolitan areas, the growth in employment is in services. Two local major health care networks – Premier Health Partners and Kettering Medical Network – employ 15,300 in facilities that are nationally recognized for their quality of care. Wright Patterson Air Force Base is a center for scientific research and development and employs another largely civilian workforce of 21,000.

    Second, some of the large industrial companies of the past have evolved to meet the needs of an information economy. NCR, while its presence has diminished, is now a high tech company. Reynolds & Reynolds, a former business forms manufacturer, now provides software in niche markets such as auto sales. The region is also home to the legal information services provider LexisNexis, now a division of Reed Elsevier, which grew out of the Mead Paper Company’s investment in data management services.

    Third, core parts of the traditional manufacturing base literally retooled to become globally competitive. In the early 1980s, more than 600 machine shops employed nearly 20,000 people. By the 1990s, that number had fallen by half. By the start of the 21st century, the tool and die shops had revived and employment had rebounded to close to 15,000. The shops remain small, but they are deeply invested in global trade. Productivity is up along with incomes.

    Fourth, the region remains at a strategic logistical and demographic location in the Midwest. The city of Dayton is at the crossroads of two major interstate highways – the major east-west link I-70 and the north-south connector of I-75. Combined with access to three major airports, the Dayton region can easily tap into economic growth in nearby metropolitan areas such as Columbus, Cincinnati, and Indianapolis. Ironically, many of the highway improvements some believed would “empty” the downtown – the interstates plus a partial beltway, I-675 – ended up tying the city and suburbs to other, larger urban areas and enhancing the region’s geographic importance.

    Dayton’s economy may no longer provide the flash and glitter of 20th century economic leadership, but the region has demonstrated a remarkable robustness that holds lessons for other cities striving to remain competitive in a global economy. All cities or economic regions pass through periods of growth and decline. The real question is whether they can adapt to changing economic circumstances.

    Dayton survived by building on the secrets of its past success. Its innovative manufacturing base has become more tech-centric and service-oriented. New areas of vitality such as health services have been enhanced. The city may no longer be what it was at its peak a century ago, but its future is far from grim.

    Sam Staley, Ph.D., is director of urban and land use policy at the Reason Foundation and teaches urban economics at the University of Dayton. He is a fourth generation native and current resident of the Dayton area.

  • Sacramento 2020

    Even in the best of times, Sacramento tends to be a prisoner to low self-esteem. The region’s population and economic growth have been humming along nicely for the past decade, drawing ever more educated workers from overpriced coastal counties, but the region’s leaders have often seemed defensive about their flourishing town.

    So perhaps it’s not surprising that the mortgage meltdown, which has hit the area hard, has sparked something of an identity crisis. Yet in trying to cope with hard times, it’s important that the region not lose its focus on what drove Sacramento’s past success: its ability to offer affordable, high-quality, largely single-family neighborhoods for middle class families.

    Sadly, the dominant narrative among many planners, politicians and developers in Sacramento today is to try to shed the family-friendly image. There’s a growing consensus that low-density neighborhoods are passé and that the region’s future success lies in retrofitting the region along a high-density, centralized model. Suburban areas like Rancho Cordova or Elk Grove, some believe, are destined to become “the next slums” as middle-income homeowners, fleeing high gas prices, flock to the urban core.

    Although a healthier downtown with reasonable density is good for the entire region, the high-density focus is not a good fit for a predominately middle-class, family-oriented region such as Sacramento. Unlike an elite city like San Francisco, Sacramento has grown through an influx of educated, family-oriented residents – the very populations that have been fleeing high-priced places where the housing supply is constrained.

    Long-term demographic trends, and perhaps common sense, suggest that most people do not move to Sacramento to indulge in a “hip and cool” urban lifestyle. If someone craves the excitement, bright lights and glamorous industries of a dense city, River City pales compared with places like San Francisco, New York or Los Angeles.

    The fact that Sacramento has fared far better than these cities over the past 15 years suggests the region’s recent problems lie not in a lack of downtown condos and nightlife, but in a housing market that, as in much of California, has been totally out of whack. Once a consistently affordable locale, Sacramento by the mid-1990s saw housing prices rise at almost nine times the rate of income growth, an unsustainable pace matched in only a few areas such as Riverside, Miami and Los Angeles.

    As a result, the refugees from the coastal counties who had been coming to Sacramento for affordable housing stopped arriving. Net migration to the region, more than 36,000 in 2001, fell to less than 1,000 in 2006.

    Ultimately only a housing market correction will again lure the people who have come to Sacramento seeking single-family houses – the type of home favored by about 80 percent of Californians – back to the region. Evidence that these people, or current suburbanites, might flock back to the core city is thin at best. The failures of such high-profile projects as The Towers and the region’s stagnant rental market do not suggest a seismic shift toward denser living.

    One key reason has to do with patterns of job growth. Since 2000, suburban communities in the largest metropolitan areas have added jobs at roughly six times the rate of the urban cores.

    This pattern has had profound and often counterintuitive effects on commuting distances. Planners and journalists tend to think of cities in traditional concentric rings, with distance from the core as the key measurement of distance from jobs. But in most regions, the vast majority of employment is outside the core. Even in Sacramento, a state capital, only about 1 in 10 jobs are in the city center. Exurban employment growth since 2000 has been the fastest regionally, expanding at nearly twice the rate of Sacramento County.

    This means commuting distance – and thus exposure to higher gas prices – reflects more than proximity to the central core. In such diverse regions as Los Angeles and Chicago, the shortest average commutes exist both in affluent inner-city neighborhoods and in those suburbs and exurbs where much of the employment growth has clustered. People who live in Irvine or Ontario in Southern California, or in the western suburbs of Chicago, for example, actually have shorter commutes than those residing in the barrios around downtown Los Angeles or in the Windy City’s fabled South Side.

    These trends suggest a radically different response to high gas prices than the knee-jerk downtown-centric approach now widely supported. Instead of cajoling people downtown, perhaps it would make more sense to accelerate employment growth in those suburban and exurban areas where the region’s skilled work force is increasingly concentrated.

    These suburban nodes, both in and outside of Sacramento County, may very well become more important in the near future. With the state facing a perpetual budget deficit, state government – the dominant employer in the central city – may not expand and could even contract in the years ahead. Perhaps a wiser approach would be to focus on the biotech, electronics and other firms, many concentrated in suburban areas, as the region’s best hope for the creation of new high-wage jobs.

    Does this mean the region should invite unbridled, uncontrolled growth to the periphery? Not in the least. Successful suburban communities – think of Clovis outside Fresno or Irvine or Valencia in Southern California – provide a high quality of life to their residents. This suggests the need for greater investments in such things as developing lively town centers, expansive parks, wildlife and rural preserves, as well as maintaining good schools, which are often the key factor for families deciding where to live.

    This vision focuses not on one selected geographic area but on a broad spectrum of places across the region. It concentrates not exclusively on dense urban neighborhoods but on fostering a series of thriving villages from close-in city neighborhoods to places like Folsom, Roseville and even Elk Grove. Ultimately the suburbs need to be not demonized, but transformed into something more than bedrooms for a central core.

    In terms of reducing vehicle miles driven, a greater emphasis on telecommuting, including by state employees, would likely also do more than an expanded, very expensive light-rail system. Although more than 12 percent of commuters to and from downtown take transit daily, less than 2 percent of those commuting elsewhere do so. Given the structure of the suburban regions, with multiple nodes of work and a weak bus-feeder system, notions of turning Sacramento into a transit mecca like New York or even San Francisco are far-fetched at best.

    The central city will continue to maintain important functions, not only as a state capital but as a physical and cultural hub. But there needs to be recognition that “hip and cool” dense urbanity does not constitute the core competence of this region. For the foreseeable future, Sacramento’s advantage against its coastal competitors will lie in providing affordable and highly livable modest-density neighborhoods for California’s increasingly diverse middle class.