Category: Policy

  • Special Report: Infill in US Urban Areas

    One of the favored strategies of current urban planning is “infill” development. This is development that occurs within the existing urban footprint, as opposed to development taking place on the fringe of the urban footprint (suburbanization). For the first time, the United States Bureau of the Census is producing data that readily reveals infill, as measured by population growth, in the nation’s urban areas.

    2000 Urban Footprint Populations

    The new 2007 estimates relate to urban areas or urban footprints as defined in 2000 and are produced by the American Community Survey program of the Bureau of the Census. Urban areas are the continuous urbanization that one would observe as the lights of a “city” on a clear night from an airplane. It is the extent of development from one side of the urban form to the other. Further, urban areas are not metropolitan areas, which are always larger and are defined by work trip travel patterns. Metropolitan areas always include adjacent rural areas, while urban areas never do.

    The Process of Infill

    Although embraced with often religious passion within the urban planning community, infill is neither good nor bad in terms of social or environmental impact. Infill always increases population densities, and that means more traffic. If road capacity is increased sufficiently, traffic congestion can be kept at previous levels. If, on the other hand, nothing is done, traffic congestion is likely to increase along with population. This means slower traffic and more stop-and-go operation, which inevitably increases the intensity of air pollution, with the potential to cancel out any reductions in greenhouse gas (GHG) emissions that might occur if average car trip lengths decline. Similar difficulties can occur with respect to other infrastructure systems, such as sewer and water. Expanding roads, sewer and water systems in already developed areas can be far more expensive than building new systems on greenfield sites. Regrettably, boosters of infill routinely ignore these issues.

    But infill has been going on for years, along with suburbanization, both in the United States and in other first world nations. This is indicated by the general densification trend that occurred in US urban areas between 1990 and 2000 and the longer term densification trends in a number of southwestern urban areas, such as Los Angeles, San Jose, Riverside-San Bernardino, Phoenix, Dallas-Fort Worth and Las Vegas. All these traditionally “sprawling” areas have, in fact, been densifying since 1960 or before. Since 2000, 33 of the nation’s 37 urban areas with populations exceeding 1,000,000 experienced population infill in their 2000 urban footprints.

    Infill in Traditionally Regulated Markets (More Responsive Markets)

    Infill is a natural consequence of the traditional post-World War II land use regulation, which tends towards accommodating both demographic growth and market forces. This has been replaced by more prescriptive (often called “smart growth”) land use regulation in some urban areas. Under traditional regulation, suburban development followed a “leap frog” process, moving ever further out. This is roundly condemned in today’s planning literature and among leading academics and policy makers.

    Leap frog development occurs where urban development skips over empty land and creates a less continuous urban fabric. Land is developed based upon the interplay between sellers and buyers. With fewer planning restrictions, no seller can be sure that their land will be purchased, since there is always plenty of land that buyers can otherwise purchase. This keeps land prices down. In the more responsive markets, it is typical for land and site infrastructure costs to be 20 percent of the total house-and-land price.

    Infill occurs as land that has been “leaped” over is subsequently purchased for development. Again, because buyers have plenty of choices, prices of the infill land remain low, so that land and infrastructure costs remain affordable in relation to the overall new house purchase price.

    The result is an urban area that is generally continuous, though with a transitional “ragged edge.” The ragged edge enabled the broad expansion of home ownership that occurred in the decades following World War II by keeping house prices low.

    Infill in More Prescriptive Markets (Smart Growth)

    The infill process is quite dramatically different in more prescriptive markets. Infill might be mandated as a percentage of total development or by severely limiting the development allowed to occur closer to the urban fringe. Sellers of land on which development is permitted have disproportionate power to charge higher prices because the planning regime seriously limits the availability of alternative sites for buyers. This, of course, flows through to house prices. The share of land and site infrastructure can rise to two-thirds of the house and land cost. The urban area may have a “clearer” edge, but at a significant loss in housing affordability.
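
    To see how the land share flows through to prices, consider a minimal sketch with hypothetical numbers (the $240,000 structure cost below is an assumption for illustration, not a figure from this article). Holding the cost of the building itself fixed, the land-and-infrastructure share implies the total price:

        # Hypothetical illustration: how the land/infrastructure share of the
        # total price translates into the final house price, holding the cost
        # of the structure itself constant.
        structure_cost = 240_000  # assumed construction cost (not from the article)

        for regime, land_share in [("responsive (~20% land share)", 0.20),
                                   ("prescriptive (~2/3 land share)", 2 / 3)]:
            total_price = structure_cost / (1 - land_share)
            land_cost = total_price - structure_cost
            print(f"{regime}: total ${total_price:,.0f} "
                  f"(land/infrastructure ${land_cost:,.0f})")

    Under these assumptions, the same house sells for $300,000 in the responsive market and $720,000 in the prescriptive one.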

    Infill Trends in the 2000s

    The new infill estimates indicate that American urban areas continue to densify. Between 2000 and 2007, 33 of the 37 urban areas of more than 1,000,000 population experienced densification in their 2000 urban footprints. The average population infill increase was 5.6 percent (see the following table).

    Population Infill in 2000 Urban Footprints
    2000-2007
    Urban Area          2000 Census    2007 Estimate    Change    % Change    Infill Rank    2007 Density (per sq. mi.)    Density Rank
    Riverside–San Bernardino, CA       1,506,816      1,800,117     293,301 19.5% 1         4,110 8
    Atlanta, GA       3,499,840      4,118,485     618,645 17.7% 2         2,100 36
    Austin, TX         901,920      1,051,962     150,042 16.6% 3         3,308 17
    Las Vegas, NV       1,314,357      1,518,835     204,478 15.6% 4         5,311 5
    Houston, TX       3,822,509      4,370,475     547,966 14.3% 5         3,377 16
    Portland, OR–WA       1,583,138      1,779,705     196,567 12.4% 6         3,755 12
    Phoenix, AZ       2,907,049      3,254,634     347,585 12.0% 7         4,078 9
    Dallas–Fort Worth, TX       4,145,659      4,549,281     403,622 9.7% 8         3,236 18
    Orlando, FL       1,157,431      1,267,976     110,545 9.6% 9         2,799 24
    San Antonio, TX       1,327,554      1,440,794     113,240 8.5% 10         3,540 14
    Tampa–St. Petersburg, FL       2,062,339      2,209,067     146,728 7.1% 11         2,754 25
    Sacramento, CA       1,393,498      1,488,647       95,149 6.8% 12         4,034 10
    Seattle, WA       2,712,205      2,896,844     184,639 6.8% 13         3,040 21
    Miami, FL       4,919,036      5,243,679     324,643 6.6% 14         4,703 6
    Washington, DC–VA–MD       3,933,920      4,174,187     240,267 6.1% 15         3,611 13
    Denver, CO       1,984,887      2,087,803     102,916 5.2% 16         4,192 7
    Indianapolis, IN       1,218,919      1,278,687       59,768 4.9% 17         2,316 34
    Columbus, OH       1,133,193      1,175,132       41,939 3.7% 18         2,960 22
    Kansas City, MO–KS       1,361,744      1,408,900       47,156 3.5% 19         2,413 31
    Virginia Beach, VA       1,394,439      1,442,494       48,055 3.4% 20         2,742 26
    San Jose, CA       1,538,312      1,588,544       50,232 3.3% 21         6,110 2
    Los Angeles, CA     11,789,487    12,171,625     382,138 3.2% 22         7,302 1
    Cincinnati, OH–KY–IN       1,503,262      1,546,730       43,468 2.9% 23         2,305 35
    Baltimore, MD       2,076,354      2,133,371       57,017 2.7% 24         3,128 19
    San Diego, CA       2,674,436      2,747,620       73,184 2.7% 25         3,514 15
    New York, NY–NJ–CT     17,799,861    18,223,567     423,706 2.4% 26         5,440 4
    Minneapolis–St. Paul, MN       2,388,593      2,438,359       49,766 2.1% 27         2,727 27
    Chicago, IL–IN       8,307,904      8,467,804     159,900 1.9% 28         3,992 11
    St. Louis, MO–IL       2,077,662      2,103,040       25,378 1.2% 29         2,540 30
    Milwaukee, WI       1,308,913      1,324,365       15,452 1.2% 30         2,719 28
    Boston, MA–NH–RI       4,032,484      4,077,659       45,175 1.1% 31         2,350 33
    Providence, RI–MA       1,174,548      1,183,622        9,074 0.8% 32         2,353 32
    Philadelphia, PA–NJ–DE–MD       5,149,079      5,178,918       29,839 0.6% 33         2,880 23
    San Francisco, CA       3,228,605      3,214,137      (14,468) -0.4% 34         6,099 3
    Detroit, MI       3,903,377      3,831,575      (71,802) -1.8% 35         3,041 20
    Pittsburgh, PA       1,753,136      1,687,509      (65,627) -3.7% 36         1,981 37
    Cleveland, OH       1,786,647      1,705,917      (80,730) -4.5% 37         2,641 29
    Total  116,773,113  122,182,066  5,408,953 5.6%
    Data from US Bureau of the Census
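
    The table’s percentages follow directly from the census and estimate columns: percent change equals the 2007 estimate minus the 2000 census, divided by the 2000 census. A minimal sketch of that arithmetic in Python, using three rows from the table:

        # Infill percentage = (2007 estimate - 2000 census) / 2000 census,
        # reproduced here for three rows of the table above.
        areas = {
            "Riverside-San Bernardino, CA": (1_506_816, 1_800_117),
            "Atlanta, GA":                  (3_499_840, 4_118_485),
            "Cleveland, OH":                (1_786_647, 1_705_917),
        }
        for name, (pop_2000, pop_2007) in areas.items():
            change = pop_2007 - pop_2000
            print(f"{name}: {change:+,} ({change / pop_2000:+.1%})")

    Note that the 5.6 percent average appears to be the unweighted mean of the 37 individual percentages; the aggregate change implied by the totals row (5,408,953 divided by 116,773,113) works out to roughly 4.6 percent.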

    Riverside-San Bernardino, long castigated as a “sprawl” market, had the largest population infill, at 19.5 percent. Atlanta ranked second, at 17.7 percent. This is a real surprise, since Atlanta was the least dense major urban area in the world in 2000. As a result, it is likely that Pittsburgh, often held up as a model of urban regeneration, is now the world’s least dense major urban area. On the other hand, if Atlanta’s infill rate continues, its 2000 urban footprint will be denser than that of Boston by 2015.

    Austin ranked third, adding 16.6 percent population to its 2000 urban footprint. Las Vegas ranked fourth, with a 15.6 percent increase in its 2000 urban footprint. The density of Las Vegas is increasing so rapidly that by the 2010 census its 2000 urban footprint will be more dense than the 2000 New York urban footprint, should the current rates continue.

    Perhaps most surprising of all, Houston ranked fifth, adding 14.3 percent to its 2000 urban footprint. This may surprise those who have denounced Houston’s largely deregulated regulatory environment, both in the city and in unincorporated county areas in the suburbs. Yet overall Houston’s infill exceeded that of smart growth model Portland. The Rose City stood sixth, adding 12.4 percent to its 2000 urban footprint.

    Perhaps equally surprising, Portland remains less dense than average for a western urban area. Its 2000 urban footprint density trails Los Angeles, San Jose, San Francisco, Las Vegas, Denver, Riverside-San Bernardino, Phoenix and Sacramento, while leading only San Diego and Seattle.

    The top ten were rounded out by Phoenix (7th), Dallas-Fort Worth (8th), Orlando (9th) and San Antonio (10th). It is worth noting that, like Houston, the unincorporated suburbs of Austin, Dallas-Fort Worth and San Antonio have largely unregulated land use, yet these urban areas ranked high in infill.

    Interestingly, some of the greatest infill growth also took place in the fastest growing, traditionally “sprawling” cities. Atlanta had the largest numeric increase in the population of its 2000 urban footprint, at more than 600,000. Houston was a close second, at nearly 550,000.

    In contrast, population losses since 2000 in the urban footprints of Cleveland, Pittsburgh, Detroit and San Francisco mean that these urban areas experienced no population infill. San Francisco’s loss enabled San Jose to move into second position nationally, after Los Angeles, in the population density of its 2000 urban footprint.

    How the Core Cities Fared

    The core cities (municipalities) attracted, on average, their population share: approximately 30 percent of the infill growth occurred inside the core cities. Even this figure may be a bit high, due to the impacts of annexation.

    All of the infill in Philadelphia, Baltimore, Chicago, Providence and Minneapolis-St. Paul occurred outside the core cities. The city of Portland attracted barely 10 percent of its urban area infill, despite highly publicized (and subsidized) infill projects such as the Pearl District. Core cities attracted the largest share of infill growth in such diverse cities as San Antonio, San Jose, Columbus, Phoenix and New York.

    Note: Additional information available at http://www.demographia.com/db-uzafoot2007.pdf

    Wendell Cox is a Visiting Professor, Conservatoire National des Arts et Metiers, Paris. He was born in Los Angeles and was appointed to three terms on the Los Angeles County Transportation Commission by Mayor Tom Bradley. He is the author of “War on the Dream: How Anti-Sprawl Policy Threatens the Quality of Life.”

  • Kauai, Hawaii: Local Merchants Make Waves

    Many have by now heard or read the story of the plucky group of Hawaiians on the island of Kauai who, when faced with the loss of their businesses due to the state government’s inability to open park roads to a popular beach and camping area, took care of it themselves for a fraction of the cost and in a fraction of the time. How very Tocquevillian. Or, better, how very American. The story brings a reflexive smile to everyone who hears it, but the events cast a spotlight on the way governments at all levels interact with their communities, and how, in light of significant budget cutbacks, roles are changing.

    In his magisterial commentary on 19th century democratic culture, Democracy in America, Alexis de Tocqueville compared the initial sources of public action in European countries with the United States: “Everywhere that, at the head of a new undertaking, you see the government in France and a great lord in England, count on it that you will perceive an association in the United States.”

    De Tocqueville was overwhelmed by this penchant of Americans to collaborate in common effort. The Frenchman attributed this unique, awe-inspiring American quality to the absence of a large government or aristocratic structure. “They can do almost nothing by themselves,” he wrote, “and none of them can oblige those like themselves to lend them their cooperation. They therefore all fall into impotence if they do not learn to aid each other freely.”

    After December floods washed out the park roads, bridges, and facilities at the Polihale State Park, Hawaii’s Department of Land and Natural Resources (DLNR) studied the damage and released a statement two months later, declaring, “We know that people are anxious to get to the beach. However, the preliminary cost estimate of repairs is $4 million.” An original timeline for the work was set for late summer, but, commented local resident and surfer Bruce Pleas, “It would not have been open this summer, and it probably wouldn’t be open next summer.”

    The DLNR’s natural response to this natural disaster was to go inward (look to its own capabilities) and upward (look for more State or Federal funds). The public’s role – if there was to be any – was to leave them alone to do the first task, and help them achieve the second; specifically, the main objective was to grab a fee-generated windfall for the department, ironically entitled the “Recreational Renaissance” fund. In February, DLNR’s Chair, Laura Thielen, pleaded, “We are asking for the public’s patience and cooperation to help protect the park’s resources during this closure, and for their support of the ‘Recreational Renaissance’ so we can better serve them and better care for these important places.” The department convened an “information meeting” in March to discuss… how residents could work with the department to open the roads? No, only to provide information on how to lobby the state for more funding.

    This approach did not sit well with area residents who depend on the park for their livelihood. It was reported that Ivan Slack, owner of Na Pali Kayaks, which operates from the beach in Polihale, summed up the community’s frustration: “We can wait around for the state or federal government to make this move, or we can go out and do our part.” Beginning in late March, business leaders and local residents organized — “associated” — to take the situation into their own hands. From food donated by local restaurants to heavy machinery offered by local construction companies, a project that was originally forecast to cost millions and take months (if not years) was nearly completed in a matter of weeks, all with donated funds, manpower, and equipment. As Troy Martin from Martin Steel, which provided machinery and five tons of steel at no charge, put it, “We shouldn’t have to do this, but when it gets to a state level, it just gets so bureaucratic; something that took us eight days would have taken them years. So we got together — the community — and we got it done.”

    This was not just a park clean-up, but a significant undertaking involving bridge-building, reconstructing rest rooms, and use of heavy equipment to clear miles of flood-damaged roadways.

    While unique in its scope, what is happening on the southwestern coast of Kauai is not completely anomalous. Due to the national budget crisis, states and cities around the country are having to take a hard look at the services they offer and find new ways to involve civil society. The organization I head up, Common Sense California, is working with cities and school districts that have to chart this new course. The failure of several revenue-raising ballot initiatives here in the Golden State has provided even more impetus to practice this outward-focused governance.

    In some respects, governments themselves are to blame for setting the service expectations of the past decades. Beginning in the mid-1980s, the “TQM” (Total Quality Management) craze in private industry found its way into the public sector, and a new language of “service provider” (government) and “customer” (citizen) followed. Government no longer was something to participate in, but something to pay for. Later in this transition, scholars like Northwestern University’s John McKnight could see that the results of this new relationship were heading towards a precipice. In an essay for The Essential Civil Society Reader, McKnight commented on this situation in terms reminiscent of de Tocqueville’s fears almost two centuries earlier: “The service ideology [in governments] will be consummated when citizens believe that they cannot know whether they have a need, cannot know what that remedy is, [and] cannot understand the process that purports to meet the need.” This, thankfully, is not the situation in Kauai.

    But we, as citizens, don’t get off the hook that easily. Certainly, we have too often taken on this role as “customer,” believing our taxes are just the prices we pay for the services we desire, from filling potholes to teaching our children. When government does not perform up to our expectations the usual response is either to decry its wastefulness or to acquiesce to higher taxes. These often unproductive reactions come from both the left and right on the ideological spectrum.

    The story in Kauai, and others bubbling up around the country, demonstrate that there is a “third way”: get some friends and pick up a shovel when the government can’t or won’t. Governments on the other side of this equation need to be open to this kind of direct participation; in fact, they should encourage it. What is happening in Polihale is not a syrupy, Rockwellian portrait. It is doubtful that this dramatic participation would have occurred without the dire financial consequences that loomed for many of the residents and businesses involved. It is a manifestation of de Tocqueville’s “self-interest rightly understood”.

    “All feel themselves to be subject to the same weakness and the same dangers,” De Tocqueville wrote, “and their interest as well as their sympathy makes it a law for them to lend each other mutual assistance when in need.” Ray Ishihara, manager of the local Ishihara Market, which has donated food for the volunteers, puts this in more concrete terms: “I think it’s great. Everybody needs help these days in this economy.”

    It is ironic that this should all be taking place in President Obama’s home state. The usually articulate Obama has sounded uncomfortable when attempting to define how he expects Americans to “sacrifice” during this financial crisis. From a policy perspective, the Administration’s only responses appear to be raising taxes on our wealthiest 5%, and, interestingly, increasing Federal funding for volunteer programs.

    One thing the President could do is travel out Kauai’s Route 50 to Polihale State Park during his next trip to Hawaii. There, he could see and celebrate what everyday Americans do when they gather in common purpose. Thanks to their hard work and sacrifice, surf’s up.

    Pete Peterson is executive director of Common Sense California, a multi-partisan organization that supports citizen participation in policymaking (his views do not necessarily represent those of CSC). He also lectures on State & Local Governance at Pepperdine’s School of Public Policy. An earlier version of this article appeared in City Journal.

  • The Fate of America’s Homebuilders: The Changing Landscape of America

    During the first ten days of October 2008, the Dow Jones dropped 2,399.47 points, losing 22.11% of its value and wiping out trillions of dollars in investor equity. The Federal Government pushed a $700 billion bail-out through Congress to rescue the beleaguered financial institutions. The collapse of the financial system in the fall of 2008 was likened to an earthquake. In reality, what happened was more like a shift of tectonic plates.

    History will record that the tectonic plates of our financial world began to drift apart in the fall of 2008. The scale of this change may be most evident in housing.

    PART TWO – THE HOME BUILDERS

    For decades, home ownership epitomized the American dream. For years, Americans saved their money for the required 20% down payment to purchase their dream home and become part of the great American Middle Class. They saved their money in a special account at the local savings & loan that paid a little more interest than the banks. Interest rates were fixed by law. A typical mortgage was written at a fixed rate for 30 years. Most American home owners stayed in their homes and celebrated the pay-off with a mortgage burning party.

    In this arrangement, it was understood that the savings & loans were allowed to pay more interest because they provided long term home mortgages. They paid depositors 4–5% and lent money at 6%, making a modest profit on the spread for their risk. With a 20% down payment, there was little risk. Mortgage bankers knew the homes they lent money on and, more importantly, they knew their clients. The mortgage stayed on the books at the local savings & loan until paid.
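
    As a rough illustration of that spread (the 4.5 percent deposit rate below is the midpoint of the passage’s range; the deposit base is a hypothetical figure):

        # Illustrative net-interest-margin arithmetic for a traditional S&L.
        deposits = 1_000_000   # hypothetical deposit base
        deposit_rate = 0.045   # paid to savers (midpoint of the 4-5% in the text)
        mortgage_rate = 0.06   # charged on 30-year fixed mortgages

        margin = deposits * (mortgage_rate - deposit_rate)
        print(f"Net interest margin: ${margin:,.0f} per year "
              f"({mortgage_rate - deposit_rate:.1%} spread)")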

    In this time, home builders were mostly small local shops known by their customers and the lenders. For decades the industry was quite stable. Homes averaged 1,400 square feet in 1970 according to the National Association of Homebuilders. A quality home could be purchased for under $20,000. Not everyone could afford to buy a home but almost everyone aspired to this. Savings & loans provided 60% of all home mortgages.

    The first crack in the dam appeared in the late 1970s. Under President Jimmy Carter, America suffered double-digit inflation. As the value of the dollar eroded, Americans sought investments that could protect their dollars from the ravages of inflation. Regulation Q prohibited banks from paying interest on checking accounts. A tiny bank in Massachusetts, the Consumers Savings Bank of Worcester, introduced the NOW Account (Negotiable Order of Withdrawal) and began paying a higher rate of interest than the savings & loans. Money flooded into the bank.

    The Depository Institutions Deregulation and Monetary Control Act of 1980 began the six-year process of phasing out limits on interest rates. Money flowed out of savings & loans and into NOW accounts and MMDAs (money market deposit accounts). The S&Ls, with long term fixed loans on their books and short term money leaving for higher rates at the banks, never fully recovered. The primary source of funding for America’s home building industry was changed forever.

    In the late 1980s the S&L industry attempted to recapture market share by entering the equity side of real estate development, with disastrous consequences. The government was forced to seize most of the S&Ls and sell off their assets through the Resolution Trust Corporation (RTC). In 1989, Congress passed FIRREA, the Financial Institutions Reform, Recovery, and Enforcement Act, which effectively outlawed direct ownership of property by S&Ls. It was a death blow to the industry and the end of the 30-year home mortgage as we knew it.

    This is where the seeds of the current housing disaster and financial meltdown were sown. Wall Street and politics entered the financial vacuum left by the demise of the savings & loan industry. The Garn-St Germain Depository Institutions Act of 1982 introduced the ARM (adjustable rate mortgage) which allowed rates paid to depositors to balance rates charged to borrowers. Our politicians, filled with good intentions, began down an irreversible path of using the home mortgage for social engineering.

    Seeking to increase homeownership, Congress began to unwind the financial safety net that protected the American dream for nearly 100 years. An ugly brew was concocted with the marriage of too much money and too much power. Congress began to consider housing as a right instead of a privilege.

    Over the ensuing quarter century, Wall Street and Congress conspired to turn the traditional 20% down, fixed 30 year mortgage on its ear. In 1977, they passed the Community Reinvestment Act that outlawed red-lining and forced lenders to make loans to poor neighborhoods. In 1982, they passed the Alternative Mortgage Transactions Parity Act (AMTPA) that expanded the funding and powers of Fannie Mae and Freddie Mac by lifting the restrictions on adjustable rate mortgages (ARM), balloon payment mortgages and the Option ARM (negative amortization loan). When a savings & loan made a mortgage in the past, they held it for 30 years or until paid. Freddie and Fannie became the new absentee owner of the majority of mortgages by purchasing them from the originators in the secondary market.

    Thus the die was cast. Mortgage bankers and brokers became salesmen and paper pushers, packaging applications for the secondary market and for financial investors who never saw the assets they lent money against or met the borrowers to whom they made loans. But this was not enough to satisfy the greed of Wall Street, which invented the CMBS (commercial mortgage backed security) in 1991. This was nothing more than a private label pool of mortgages that they sold off to equally unconnected financial investors in their own secondary market. Home mortgage lending by commercial banks went from nothing to 40% of the market in a matter of years.

    The market could have possibly tolerated this bastardization of the conventional mortgage but neither Congress nor Wall Street could control themselves. There was simply too much money to be made. Congress determined that the credit score was discriminatory and violated the rights of the poor and minorities. In 1994, Congress approved the formation of the Home Loan Secondary Market Program by a group called the Self-Help Credit Union. They asked for and received the right to offer loans to first time homebuyers who did not have credit or assets to qualify for conventional loans. Conventional 80% financing was replaced with 90% loans and then 95% and finally 100% financing that allowed a home buyer to purchase a home with no down payment. The frenzy climaxed with negative amortization loans that actually allowed homes to be purchased with 105% financing.
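
    To make the progression concrete, here is a small hypothetical sketch of what each financing level meant for a buyer’s down payment (the $200,000 price is assumed for illustration):

        # Down payment implied by each loan-to-value level described above.
        price = 200_000  # hypothetical purchase price

        for ltv in (0.80, 0.90, 0.95, 1.00, 1.05):
            down_payment = price * (1 - ltv)
            # A negative down payment means the loan exceeds the price:
            # the buyer finances more than 100% of the home's value.
            print(f"{ltv:.0%} financing: down payment ${down_payment:>10,.0f}")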

    In June of 1995, President Clinton, Vice President Gore, and Secretary Cisneros announced a new strategy to raise home-ownership to an all-time high. Clinton stated: “Our homeownership strategy will not cost the taxpayers one extra cent. It will not require legislation.” Clinton intended to use an informal partnership between Fannie and Freddie and community activist groups like ACORN to make mortgages available to those “who have historically been excluded from homeownership.”

    Historically, a good credit score was essential to receive a conventional mortgage. Under pressure from the politicians, lenders created a new class of lending called “sub-prime” and as these new borrowers flooded the market, housing prices rose. Lenders used “teaser rates”, a form of loss leader, to help the least credit worthy to qualify for loans.

    Congress instructed Fannie and Freddie to purchase mortgages even though there was no down payment and no proof of earnings by the applicant. An applicant could “state” his or her income and provide no proof of employment. Stated income loans eventually became known as “liar loans”. Sub-prime loans grew from 41% to 76% of the market between 2003 and 2005.

    This devilish brew caused a record 7,000,000 home sales in 2005, including more than 2,000,000 new homes and condominiums. Mortgage lending jumped from $150 billion in 2000 to $650 billion in 2005. Prices rose relentlessly, pushed by more and more buyers entering the market. The top 10 builders in the United States in 2005 were:

    1. D.R. Horton – 51,383 Homes Built
    2. Pulte Homes – 45,630 Homes Built
    3. Lennar Corp. – 42,359 Homes Built
    4. Centex Corp. – 37,022 Homes Built
    5. KB Homes – 31,009 Homes Built
    6. Beazer Homes – 18,401 Homes Built
    7. Hovnanian Enterprises –17,783 Homes Built
    8. Ryland Group – 16,673 Homes Built
    9. M.D.C. Holdings – 15,307 Homes Built
    10. NVR – 13,787 Homes Built

    Economists and pundits eventually began to identify the phenomenon as the housing bubble. And, bubbles burst. But Congress was not ready to confront reality. Rep. Barney Frank testified he “saw nothing that questioned the safety and soundness of Fannie and Freddie”. Fannie Mae Chairman Franklin Raines was paid $91.1 million in salary and bonuses between 1998 and 2004. In 1998 Fannie’s stock was $75/share. Today it is 67 cents.

    In 2007, as prices stopped rising, the flood of buyers entering the market ceased, putting market values into free-fall. Home building is not a nimble industry. It takes years of planning and development to bring a project to market. America’s homebuilders had hundreds of thousands of homes and condos under construction when the housing market came to a crashing halt in the fall of 2008. New home sales, which topped 2,000,000 units per year in 2005, fell to an annual level of under 400,000 units in early 2009. Prices have retreated to 2003 levels and in some markets even lower.


    What happens to America’s home builders? Do they follow General Motors and Chrysler into bankruptcy? Can they survive? New home sales are down 80% since 2005 – doing worse even than automobile sales. The tectonic plates of the housing industry are shifting rapidly and have not settled into any discernible pattern.

    Residential land has dropped precipitously in value; indeed, a case can be made that raw residential land now has a “negative residual value”. There are hundreds of thousands of completed but unsold, foreclosed, and vacant homes littering the countryside. Sales have fallen dramatically since their peak in 2005. This “overhang” inventory must be cleared out before any recovery can ensue. The prices of these units must be cut by draconian margins to attract the bottom fishers and speculators who will take the risk from the home builders and purchase the outstanding inventory. This will not happen quickly. This is not a market that can generate an early rebound.

    Has Congress learned from its mistakes? Apparently not. In March 2009, Democratic Representatives Green, Wexler and Waters introduced HR600 entitled “Seller Assisted Down Payments” that instructs FHA to accept 100% financing from those who cannot fund the required 3.5% down payment.

    A year from now the landscape of America will be forever changed. Five years from now, will American ingenuity have revolutionized the home building industry? The imperative is to find homebuilders who can speed production and lower costs. And government needs to learn from its own mistakes and realize that a successful housing sector depends on solid market fundamentals as opposed to pursuing an agenda of social engineering.

    ***********************************

    This is the second in a series on The Changing Landscape of America. Future articles will discuss real estate, politics, healthcare and other aspects of our economy and our society. Robert J. Cristiano PhD is a successful real estate developer and the Real Estate Professional in Residence at Chapman University in Orange, CA.
    PART ONE – THE AUTOMOBILE INDUSTRY (May 2009)

  • State of the Economy June 2009

    Nobel Prize-winning economist Paul Krugman was widely quoted as saying that the official recession will end this summer. Before you get overly excited, keep in mind that the recession he’s calling the end of officially started in December 2007. Now ask yourself this: when did you notice that the economy was in recession? Six months after it started? One year? Most people didn’t even realize the financial markets were in crisis until the value of their 401k crashed in September 2008. Count the number of months from December 2007 until you realized the economy was in recession, add that to September 2009 and you’ll have an idea of when you should expect to actually see improvements in the economy.
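
    That rule of thumb is simple date arithmetic. A minimal sketch, with the “noticed” date assumed to be September 2008 as in the passage:

        # The recognition-lag heuristic: your personal lag in noticing the
        # recession, added to the September 2009 anchor in the passage.
        from datetime import date

        recession_start = date(2007, 12, 1)
        noticed = date(2008, 9, 1)       # when you realized (assumed here)
        anchor = date(2009, 9, 1)        # the passage's September 2009

        lag_months = ((noticed.year - recession_start.year) * 12
                      + noticed.month - recession_start.month)
        years, months = divmod(anchor.month - 1 + lag_months, 12)
        felt = date(anchor.year + years, months + 1, 1)
        print(f"Recognition lag: {lag_months} months; "
              f"expect visible improvement around {felt:%B %Y}")

    With a nine-month lag, that points to roughly mid-2010.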

    Douglas Elmendorf, Director of the Congressional Budget Office (CBO), testified on “The State of the Economy” before the House Committee on the Budget U.S. House of Representatives at the end of May. CBO sees several years before unemployment falls back to around 5 percent, after climbing to about 10 percent later this year. Remember this phrase: Jobless Recovery; it happens every time we have a recession. Employment historically does not increase until 6 to 12 months AFTER GDP starts to improve. Even Krugman admits that unemployment will keep going up for “a long time” after the recession officially ends.

    While some of us are worrying about stagflation – a stagnant economy with rising prices – the CBO report does a good job of describing why deflation is worse than inflation. Deflation would slow the recovery by causing consumers to put off spending in expectation of lower prices in the future. The risk associated with high inflation is primarily that the Federal Reserve would raise interest rates too fast, stalling the economy – similar to what Greenspan did to prolong the recession in the early 1990s. We think the real conundrum is this: how do you deal with an asset bubble without deflating prices? Preventing deflation now simply passes the bubble on to some other asset class at some future time.

    CBO calculates that output in the U.S. is $1 trillion below potential, a shortfall that won’t be corrected until at least 2013. New GDP forecasts are coming in August from CBO. They say the August forecast will likely paint an even gloomier picture than this already gloomy report. Hard to imagine!

    There are plenty of reasons that Krugman and others are seeing encouraging signs in the economy. Social Security recipients received a large cost-of-living adjustment, payroll taxes were lowered so that employees are taking home bigger paychecks, tax refunds were larger and energy prices lower – all of these led to an uptick in consumer spending in the first quarter of 2009. I checked in with Omaha-area Realtor Rod Sadofsky last week. He has seen an improvement in sales of median-priced homes, which he attributes to the $8,000 tax credit available to first-time homebuyers (or those who have not owned for at least three years). Along with the uptick in that segment of the market, those sellers are able to move up to higher-priced homes, further improving home sales. However, the tax incentive is scheduled to expire at the end of 2009. When the stimulus winds down…well, there will be no more upticks. CBO agrees with Rod and warns of a possible re-slump in 2010 when the effects of the stimulus money begin to wane.

    CBO’s Dr. Elmendorf has a way to solve this problem: to keep up consumer spending, he suggests that people should work more hours and make more money. Duh! We think we hear Harvard calling – they want their PhD back! CBO seems undecided about which came first in the credit markets: problems in supply or problems in demand?

    “Growth in lending has certainly been weak, but a large part of the contraction probably is due to the effect of the recession on the demand for credit, not to the problems experienced by financial institutions.”

    “Indeed, economic recovery may be necessary for the full recovery of the financial system, rather than the other way around.”

    We shouldn’t be so hard on Elmendorf. The report makes it clear just how difficult it has been to figure out 1) what happened, 2) why it happened, 3) what to do about it and 4) what happens next. CBO seems to be reaching for answers, while to us it is obvious they are missing the point by not even considering that manipulation has wreaked havoc on the markets. Whenever things don’t make sense to someone like the Director of the CBO, experience tells us there’s a rat somewhere.

    Regardless of how overly-complicated financial products may become, the economy really shouldn’t be that hard to figure out. Still, no one seems to know how far down the banks can go – if banks don’t lend to businesses, businesses close, people lose their jobs, unemployed people default on loans, banks have less to lend, and banks can’t lend to businesses…Seems we are damned if we do and damned if we don’t: too much borrowing caused the crisis; too little spending worsens it. Do they want us to keep spending money we don’t have?

    While Krugman is admitting that the world economy will “stay depressed for an extended period” CBO is reporting that “in China, South Korea, and India, manufacturing activity has expanded in recent months.” The other members of the G8, however, aren’t faring any better than we are: GDP is down 10.4 percent in the European Union, 7.4 percent in the UK and 15.2 percent in Japan. Canada – whose banks are doing just fine without a bailout, thank you very much – saw GDP decline by just 3.4 percent in the last quarter of 2008.

    Undaunted by nearly 10 percent unemployment – after predicting it would rise no higher than 8 percent – President Obama announced today that the White House opened a website for Americans to submit their photos and stories about how the stimulus spending is helping them. If they can’t manage the economy, they can still try to manage our expectations about the economy.

    Susanne Trimbath, Ph.D. is CEO and Chief Economist of STP Advisory Services. Her training in finance and economics began with editing briefing documents for the Economic Research Department of the Federal Reserve Bank of San Francisco. She worked in operations at depository trust and clearing corporations in San Francisco and New York, including Depository Trust Company, a subsidiary of DTCC; formerly, she was a Senior Research Economist studying capital markets at the Milken Institute. Her PhD in economics is from New York University. In addition to teaching economics and finance at New York University and University of Southern California (Marshall School of Business), Trimbath is co-author of Beyond Junk Bonds: Expanding High Yield Markets.

  • Britain’s Labour Lessons For Obama

    LONDON – The thrashing of Britain’s New Labour Party – which came in a weak third in local and European Parliament elections this week – may seem a minor event compared to Barack Obama’s triumphal overseas tour. Yet in many ways the humiliation of New Labour should send some potential warning shots across the bow of the good ship Obama.

    Labour’s defeat, of course, stemmed in part from local conditions, notably a cascading Parliamentary expense scandal that appears most damaging to the party in power. Yet beyond those sordid details lies a more grave tale – of the possible decline of the phenomenon I describe as gentry liberalism.

    Gentry liberalism – which reached its height in Britain earlier this decade and is currently peaking in the U.S. – melded traditional left-of-center constituencies, such as organized labor and ethnic minorities, with an expanding class of upper-class professionals from fields like media, finance and technology.

    Under the telegenic Tony Blair, an Obama before his time, this coalition extended well into the middle-class suburbs. It made for an unbeatable electoral juggernaut.

    But today, this broad coalition lies in ruins. An urban expert at the London School of Economics, Tony Travers, suggests that New Labour’s biggest loss is due to the erosion of middle-class suburban support. The party also appears to be shedding significant parts of its historic working-class base, particularly those constituents who aren’t members of the public employee unions.

    Even some longstanding ethnic minorities, most notably the highly entrepreneurial South Asians, also show signs of drifting away from Labour. The only Labour supporters left, then, are the liberal gentry, the government apparatus and the most aggrieved minorities.

    This process started before the Parliamentary scandals, Travers adds. Last year a Conservative, Boris Johnson, was able to unseat the sitting Labour-ite mayor of London, Ken Livingstone, largely due to votes from the outer boroughs of the city.

    The shift reveals the weakening hold of gentry liberalism. At its core, gentry liberalism depends on massive profits in key sectors – largely finance and real estate – to maintain its affluence while both funding its environmentally friendly priorities and redistributing wealth to the long-term poor.

    This has also allowed for a massive expansion of both the scope and size of government. Today government-funded projects account for close to half of Britain’s gross domestic product (GDP), and this share is heading toward its highest level since the late 1940s. In some depressed parts of the country, like the north of England, it stands at over 60%.

    As long as the City of London was minting money – much of it recycled from abroad – the government could afford to pay its bills. But with the economy in a deep recession, Labour can no longer count on the same sources to finance expanding government.

    Although the liberal gentry are not much affected by diminished job opportunities, higher taxes or reduced services, those problems do afflict the tax-paying working and lower middle classes who dominate suburban areas. “We are not [just] dealing with upward mobility,” notes Shamit Saggar, a University of Sussex social scientist with close ties to the Labour Party, “but also the prospect of downward mobility.”

    Both in Britain and America, these middle-income suburban voters remain by far the largest electoral bloc. Last year they divided their votes about evenly between Obama and John McCain, which helped the Democrats, along with the huge supermajorities Obama racked up in the urban core, forge an easy victory.

    In Britain, however, these suburban as well as small-town voters are now tilting to the right, notes Sarah Castells of the Ipsos MORI survey organization. This is in large part because they no longer believe the Labour Party supports their aspirations. “This is where we see a shift to the Tories,” Castells explains.

    The now-diminished Labour base of public employees, minorities and these gentry liberals is not a sustainable electoral coalition. In total, Labour can’t count on more than one-quarter of the electorate.

    Although vastly different in their class status, these groups share a common interest in an ever-more-expansive state. For public sector workers and the welfare-dependent poor, there is the reasonable motive of self-interest. In contrast, the liberal gentry’s enthusiasm for expanded government stems increasingly from their embrace of environmental regulation, which has become something of a religion among this set.

    You have to wonder what average Brits must make of the likes of Jonathon Porritt, the head of the government’s Sustainable Development Commission – a member of the gentry in both attitude and lineage. The Eton-educated Porritt’s recent pronouncements include such gems as a call to restrict the number of children per family to two to reduce Britain’s population from 60 to 30 million. He also has scolded overweight people for causing climate change.

    These do not seem like sure electoral winners. Today extreme green policies that were once merely odd or eccentric are becoming increasingly oppressive, leading to even more actions that disadvantage suburban lifestyles. Environmental activists’ solution for the country’s severe housing shortage – particularly in the London region – is to cram the working and middle classes into dense urban units resembling sardine cans and force even more suburbanites off the road.

    Even so, large-scale house production over the past decade has lagged behind demand and, as a result, the tidy single-family home with a nice back garden so beloved by the British public may soon be attainable only by the highly affluent – and, ironically, that includes much of the gentry. What an odd posture for a party supposedly built around working-class aspirations.

    “New Labour has brought in ‘New Urbanism,’ and the results are not pretty,” suggests University of Westminster social historian Mark Clapson, as he showed me some particularly tiny, surprisingly expensive new houses outside of London.

    This kind of approach has gained some proponents among the Obama crowd. Recent administration pronouncements endorse such things as “coercing” Americans from their cars, fighting suburban “sprawl” and even imposing restrictions on how much they can drive. It makes you wonder what future they have in mind for our recently bailed-out auto companies.

    It’s possible that America’s middle-income voters will eventually be turned off by such policies, as is the case in Britain. President Obama’s remarkable genius for political theater may insulate him now, but it won’t for eternity. Over time, some of the Democrats’ hard-won, suburban middle-class support could erode.

    The key here may be the quality of the opposition. In Britain, the Conservatives may have found at least an adequate leader in David Cameron. People see him as a viable prime minister. Right now, the Republicans have no such figure, allowing themselves to be led by gargoyles like Rush Limbaugh and Newt Gingrich.

    Yet the president cannot count on Republicans’ continued ineptitude. There’s only so much tolerance in the U.S. – both for cascading public debt and ever-expanding government regulation.

    Of course, Obama still has time to get it right. But if he remains the prisoner of the gentry, he and his party could experience some of the pain now being inflicted upon their ideological counterparts across the pond.

    This article originally appeared at Forbes.

    Joel Kotkin is executive editor of NewGeography.com and is a presidential fellow in urban futures at Chapman University. He is author of The City: A Global History. His next book, The Next Hundred Million: America in 2050, will be published by Penguin early next year.

  • Painting the Town White: Technology and Greenhouse Gas Emissions

    “Paint the world white to fight global warming” was the astonishing headline from The Times of London. The paper was referring to a presentation made by United States Secretary of Energy Dr. Steven Chu at the St. James Palace Nobel Laureate Symposium last week. Chu was reported as saying that this approach could have a vast impact. By lightening paved surfaces and roofs to the color of cement, it would be possible to cut carbon emissions by as much as taking all the world’s cars off the roads for 11 years. That would be no small accomplishment.

    Chu makes considerable sense and his underlying approach is wise: emphasizing inexpensive, simple and unobtrusive ways to reduce greenhouse gas (GHG) emissions. This is at the same time that Secretary of Transportation Ray LaHood has suggested “coercing” people out of cars and a bill by Senators Jay Rockefeller and Frank Lautenberg would require annual reductions in per capita driving. Strategies such as these are not inexpensive, not simple and not unobtrusive. Indeed, given the close association between personal mobility, employment and economic growth, such policies could have serious negative effects.

    The biggest problem with coercive strategies is that they are simply unnecessary. As Secretary Chu has indicated, huge reductions can be achieved in GHG emissions, without interfering in people’s lives or threatening the economy. There’s more to this story than paint.

    The Cascade of Technology

    There is a virtual cascade of technological advances that have been spurred by the widely accepted public policy imperative to reduce GHG emissions. Here are just a few.

    Vehicle Technology

    Some of the most impressive advances are in vehicle technology. GHG emissions from cars are directly related to fuel consumption. Thus, as cars require less fuel, GHG emissions go down at the same rate.

    By now, everyone is aware that the Administration has advanced the 2020 vehicle fuel efficiency (CAFE) standards to 2016, matching the California requirements. These requirements apply to the overall fleet, both cars and light trucks (which are predominantly sport-utility vehicles). Recently published research by Robert Puentes of the Brookings Institution finds that per capita automobile use had fallen off even before gasoline prices exploded, so it seems reasonable to suggest that future vehicle travel will rise at approximately the population growth rate, rather than the robust growth rates previously forecast. At the new 35.5 miles per gallon, the nation could be on a course to reduce GHG emissions from cars and light trucks by more than 20 percent by 2030, despite the increase in driving as population increases.
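
    Since emissions from cars scale with miles driven divided by fleet fuel economy, the claim can be checked with simple arithmetic. A minimal sketch (the 22 mpg baseline fleet and 20 percent growth in driving are assumptions for illustration, not figures from this article):

        # Emissions scale with vehicle-miles traveled (VMT) divided by fleet mpg.
        def fleet_emissions_change(mpg_now, mpg_future, vmt_growth):
            """Fractional change in emissions if driving grows by vmt_growth
            while the fleet average improves from mpg_now to mpg_future."""
            return (1 + vmt_growth) * (mpg_now / mpg_future) - 1

        # Assumed: ~22 mpg on-road fleet today, driving tracking ~20% population
        # growth by 2030, and the fleet converging on the 35.5 mpg standard.
        change = fleet_emissions_change(22.0, 35.5, 0.20)
        print(f"Emissions change by 2030: {change:+.0%}")  # about -26%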

    This is just the beginning. There are advances well beyond the 35.5 mile per gallon standard. The most efficient hybrid cars now achieve 50 miles per gallon. The European Parliament has adopted a nearly 70 mile per gallon standard for 2020. The President has often spoken of his commitment for the nation to develop 150 mile per gallon cars, while Volkswagen has already developed a 235 mile per gallon car.

    A French company plans to market a car powered by compressed air at city traffic speeds, producing almost no GHG emissions, while at higher speeds it uses gasoline to get more than 100 miles per gallon.

    Fuel Technology

    Progress is also being made on alternative fuels and on making present fuels cleaner.

    Technologies are being developed to produce gasoline from carbon dioxide.

    There are even substantial advances in air travel emissions. Air New Zealand has announced tests that show the feasibility of using biofuels based upon the jatropha plant. The airline reports that, gallon for gallon, the biofuel reduced GHG emissions 60 to 65 percent relative to jet fuel. Jatropha is a non-food crop, and therefore its use would have little or no impact on food prices.

    Carbon Neutral Housing

    We have previously reported on the development of a carbon neutral, single-story, 2,150 square foot suburban house in Japan. The resulting 100 percent reduction in GHG emissions means that there is no reason such housing cannot continue to be available to those who prefer it.

    Electricity Generation

    One of the most intractable challenges will be producing sufficient supplies of electricity while considerably reducing GHG emissions. Obviously, one approach with great potential is nuclear power, which the environmentally conscious French have successfully used to produce approximately three-quarters of their demand.

    Further, substantial advances are coming in solar power. For example a Massachusetts Institute of Technology team has developed a solar concentrator system that increases power production “by a factor of 40.” The process is now under commercial development.

    Even Buck Rogers seems to be getting into the game. California’s Pacific Gas and Electric Company is partnering with a startup firm to produce solar energy in space and to beam it to earth by microwaves. This process could produce as much as 10 times the energy of ground-based solar collectors.

    Further, international efforts continue toward developing nuclear fusion power generation. This non-polluting technology, still largely theoretical, could revolutionize power production in decades to come.

    The Color of Paint

    Some of the technological advances above may not, in fact, make a substantial contribution to reducing GHG emissions in the longer run. However, these developments and others likely to come underscore the fact that technology (that is, human ingenuity) can materially reduce GHG emissions while permitting people and the economy to go about their business. Serious attempts to force behavior modification backwards to the past seem likely to fail.

    So, there is no reason to retreat to an idealized yesterday to meet the thinly disguised social engineering goals of the few while leaving the many worse off. Secretary Chu has caught the spirit of the right approach. We should be painting the town white with innovation and should reject the coercion that has been embraced by those who naively (or perhaps even purposefully) would paint the future a more somber color. As in the past, human ingenuity appears up to the challenge, if we give it the chance.

    Wendell Cox is a Visiting Professor, Conservatoire National des Arts et Metiers, Paris. He was born in Los Angeles and was appointed to three terms on the Los Angeles County Transportation Commission by Mayor Tom Bradley. He is the author of “War on the Dream: How Anti-Sprawl Policy Threatens the Quality of Life.”

  • Salinas and Self-Governance

    “Man is the only kind of varmint who sets his own trap, baits it, then steps in it.” — John Steinbeck

    Though probably not intended as a political commentary, Steinbeck’s utterance perfectly describes the current California budget crisis. And, given the revenue and service delivery relationship between cities and the state, traps can be set and baited in Sacramento, leaving mayors, city councils and city managers to step in them.

    This is what is happening today in Steinbeck’s hometown of Salinas, where the city faces a structural deficit of nearly $20 million out of a $97 million general budget. Given the dramatic scope of the decisions it faces, the city government is taking a unique approach to finding solutions: gathering residents together in a series of facilitated discussions about the budget crisis. I attended one of these workshops in early April, where I watched around a hundred Salinas residents participate in a three-hour dialogue, and learned anew the challenges to self-governance, and its power.

    The first hurdle attendees encountered was informational. From the size of the deficit, to utility users’ tax revenues, to what portion of the budget is spent on cops versus parks, it was evident that most attendees had little understanding about how their city government actually functions. This is not to cast aspersions on Salinas: lack of basic civic knowledge, especially of local government, is a national tragedy, contributing to uninformed discussions that easily turn partisan. Several participants came to the workshop with single-issue views about the police chief’s salary, or the amount spent on maintenance, but when faced with the full budget picture, and other residents with contrary opinions, they soon moderated their judgments.

    Participants were forced to wrestle with the same difficult trade-offs as their elected representatives, and in so doing, learned that governing – even at the local level – is a complex process of moving interlocking levers. Using a program template developed by San Diego’s Viewpoint Learning, participants were presented with a set of three “visions” of Salinas, each with related service and revenue frameworks. A budget cut in a certain area has specific ramifications, as do tax and fee increases, but rarely do any of us participate in conversations where we have to confront such decisions. As Mayor of Salinas Dennis Donohue told me, “The gap between service expectations by the public and the public sector’s inability to deliver those services needs to be bridged.” This can only happen effectively when the public both understands and legitimately weighs its options.

    Finally, as the dialogues reached the final hour, I began to sense a change in the attitude of those hundred or so Salinans gathered in a community college cafeteria. What began as a crash course in local government civics, and moved to the plate-balancing act that is a budget process, concluded with participants taking ownership of their city. A debate at one table about a sales tax increase moved into a discussion of, “What can we do to keep our young people from moving out of Salinas after high school?” When presented to the full group, this thought was echoed, with others extolling what is great about Salinas, wondering how it could be communicated, and asking what role they might play in improving their community.

    Salinas is one of several cities around California, and around the country, employing this “participatory budgeting” process in response to painful fiscal decisions. Even a city as large as Philadelphia, with its “Tight Times, Tough Choices” project, has involved over 4,000 residents in budget deliberations. Each program has different elements depending on the size of the city and the scope of the budget challenge, but those with the greatest impact do the following: accurately inform the public, engage residents in a conversation that requires making legitimate trade-offs, and create a space in which residents can not only offer informed opinions but actually participate in the building of their city.

    It seems that budget deficits are yielding surpluses in local involvement.

    Pete Peterson is Executive Director of Common Sense California, a multi-partisan non-profit organization that supports civic participation around California. He also lectures on civic engagement at Pepperdine’s School of Public Policy.

  • San Jose, California: Bustling Metropolis or Bedroom Community?

    Dionne Warwick posed the question more than 40 years ago, yet most Americans still don’t know ‘The way to San Jose’. Possessing neither the international cachet of San Francisco nor the notoriety of Oakland, San Jose continues to fly under the national radar in comparison to its Bay Area compatriots. Even with its self-proclaimed status as the ‘Heart of Silicon Valley’, many would be hard pressed to locate San Jose on a map of California.

    Better-known American cities may have to brand themselves as interesting places to gain population, but San Jose does not struggle to attract newcomers. Sprawling over 178 square miles at the southern end of the San Francisco Bay, the city exceeded the 1 million population mark this year for the first time.

    So what makes this city, the 10th-largest in the United States, appealing? Unlike its precious neighbor 50 miles to the north, San Francisco, people move to San Jose primarily for jobs – especially those related to the coveted technology sector. Whereas San Francisco balances its role as playground for the independently wealthy and welfare state for the lumpenproletariat, San Jose remains favored among families and those looking for a safe environment in which to raise children – not to mention, the weather is better.

    San Jose does not stimulate a sense of urban exaltation. Aside from a commercial downtown core with a collection of mediocre high-rises (limited in height due to downtown’s adjacency to the San Jose Airport), the city is unapologetically suburban in character.

    San Jose’s pattern of development can be traced back to its origins as an agricultural community supporting the early Spanish settlers of the fertile Santa Clara Valley. It remained a modest-size agrarian community until the end of World War II, when it underwent a period of rapid expansion not unlike that of Los Angeles to the south. During the 1950s, with the emergence of semiconductor technology derived from silicon, San Jose and the greater Santa Clara Valley exploded into a center for the evolution of computer technology.

    Today, San Jose can best be understood by its ambivalent relationship with neighboring Silicon Valley cities. Mid-size suburbs such as Cupertino, Sunnyvale, Mountain View and Palo Alto, all located west/northwest of San Jose as one travels up the peninsula towards San Francisco, are very distinct and separate entities. Home to some of Silicon Valley’s heaviest hitters (Cupertino has Apple, Sunnyvale has Yahoo!, Mountain View has Google, Palo Alto has Hewlett-Packard, Facebook and Stanford University), these cities largely define the technology-focused region. To be sure, San Jose has its share of big players, including eBay and Adobe as well as the ‘Innovation Triangle’, an industrial area of north San Jose that is home to the headquarters of large companies like Cisco Systems and Cypress Semiconductor.

    Yet, despite the presence of these firms, San Jose has become ever more a residential community, with among the worst jobs-to-housing balances in the region. A whopping 59% of the city’s developed land is in residential use – 78% of that single-family detached housing. In this sense, despite being the region’s largest city, San Jose essentially serves as a ‘bedroom community’ for the rest of Silicon Valley.

    This has been a burden for the city, which, unlike its neighbors, lacks enough large information technology companies to fill its tax coffers. In contrast, job-rich ‘green’ cities like Palo Alto have remained staunchly ‘anti-growth’ regarding residential development and consequently have very high housing prices.

    This pattern poses fiscal problems for San Jose. City officials have long been aware of the need to stimulate economic development instead of continuing to lose out to the city’s neighbors, yet San Jose seems determined to further expand its role as a dormitory for them. Indeed, the city’s development agenda has in recent years shifted to a relentless focus on high-density, multi-family residential in the downtown core and along transit corridors. In 2007, 79% of all new housing built in San Jose was multi-family – a staggering deviation from its history of low density development.

    Though well-intentioned, the slant towards densification has yielded a glut of empty condo units throughout the city. Those who have purchased units in new developments often find themselves with underwater mortgages. During a recent visit to one of the flashy new downtown condo buildings, The 88, I entered a desolate sales office and was greeted by a skittish sales agent. When I asked how sales were going, my question was deflected without a direct answer in an act of not-so-quiet desperation.

    Although it’s clear most people in San Jose prefer lower density living, the city government continues betting tax dollars on a future in which newcomers will want to live in a high-density setting. Outside of downtown, low- to mid-rise multi-family housing has been built along the city’s light-rail lines in what are conceived to be ‘transit villages’. The popularity of such a lifestyle is questionable given the high price point and unreasonable HOA dues of these condo units, particularly when single-family detached houses can be purchased at comparable prices.

    Despite these issues, San Jose seems hell-bent on its path towards densification. The city has major plans to develop the area around its Diridon Train Station, just west of downtown, as California High-Speed Rail and BART are projected to make their way to San Jose. Furthermore, the city government is counting on the Oakland A’s baseball team making a move to San Jose.

    From the Champs-Élysées to Tiananmen Square, grand urban visions have historically defined cities. As a product of the Silicon Valley ethos as well as an observer of planning trends, I would argue that this model is no longer valid – especially for any city with hopes of a prosperous future. Rather, in democratic societies, it will be the idiosyncrasies of individual actors and the prospect of upward mobility that define a sense of place.

    Obsessed with density and urban form, planners don’t seem to grasp the chicken-and-egg conundrum – the notion that lifestyle amenities follow on the heels of economic opportunity. San Jose needs to stake its future on nurturing its entrepreneurs instead of trying to become something it is not yet ready to become.

    Adam Nathaniel Mayer is a native of the San Francisco Bay Area. Raised in the town of Los Gatos, on the edge of Silicon Valley, Adam developed a keen interest in the importance of place within the framework of a highly globalized economy. He currently lives in San Francisco where he works in the architecture profession.

  • Project Development: Regulation and Roulette

    The site plan logically should be the key to approval of a development project. Yet in reality, the plan is secondary to the presentation. My conclusions are based upon experience with well over a thousand developments over four decades, most in the mainland USA. And what I’ve observed is that the best site plan is only as good as the presentation that will convince the council or planning commission to vote “Yes” on it. No “yes” vote, no deal, no development.

    Each presenter deals with the dog-and-pony show in his own way. There’s an endless variety of styles (or lack of styles). All of these public meetings have one thing in common: The neighbors (if there are any) will be there to oppose the new development.

    Not Too Long Ago…
    In the old days there were three factions: The developer presenting the plan, the neighbors opposing the plan, and the council listening to both sides. If the development was high profile, someone from the local press might also show up. The planning commission and council are fully aware that all plans will be met with neighborhood opposition, and they will have to listen to lengthy complaints along the route to approving (possibly) the plan.

    In the past, the citizens sitting on these boards would most likely dismiss Elwood and Betsy Smith’s complaint about how a development in their back yard would invade their privacy, and would vote in favor of the new master planned community instead.

    How It’s Different Today
    Today there is often an additional audience. Televised meetings supply an entire region of neighbors. The on-screen council listens to the neighbors’ objections, no matter how absurd they may be, then answers directly to the camera, showing the general community watching at home that they really care about every citizen’s opinion. The council member must never appear too much in favor of the developer, as that could be misconstrued as not caring about the citizens he or she represents. A televised council member hears the Smiths’ complaint with a very concerned on-camera look, explains how maybe we have too many new homes in this town, and proceeds to tell viewers that the developer might want to consider a buffer and a drop in density. Concerns have shifted from developing economically sensible neighborhoods to “please elect me Mayor when I’m on the ballot”.

    Planning Outside The USA
    Our first large site plan outside the States was in Freeport, Bahamas. In 2000, when we were first contacted to design Heritage Village, we asked about doing presentations to the city council and planning commission to help move the approval process along. We were told that the development company and the regulating entity were the same, and if they liked the plan it would be built! That is exactly what happened.

    Our next attempt outside the USA was not so easy. In Mexico City, when we asked to sit down with government officials to change policy to create better neighborhoods, the developer said… No. At the time, we did not understand why it was so critical that we not suggest changes.

    We Discover A Superior Foreign System
    We wrongly assumed that all planning outside the USA could have similar problems, with restrictions that were absurdly prohibitive for designing great neighborhoods. It was only when we worked in Bogotá, Colombia last year that we had the opportunity to work within a system that may not be so backwards after all. Our request to meet with the authorities to show them new ways to design neighborhoods was met, as it had been in Mexico City, with an absolute… No.

    We then asked for an opportunity to present the plan, and were told that was not necessary. Being that it was Colombia, you can imagine our first thoughts. Cartels? Maybe corruption? The reality was much simpler. Since our plans met the minimums (they actually exceeded them), they were automatically considered approved. Imagine that – no neighbors to complain! If everything conforms, it should be approved… right? Just plain common sense.

    Zoning-Compliant Projects Should Be Exempt From Public Meetings
    When you think about it, why wouldn’t this work in the USA? If the development plan being submitted meets or exceeds the zoning and subdivision regulation minimums, why does it need to go through any public approvals at all? The American developer often faces months or years of delays, enormous interest payments, and tens or perhaps hundreds of thousands of dollars spent on consultants and legal help to re-create plans that conform. Those massive sums could go towards better neighborhoods, better architecture, better landscaping, lower environmental impacts, and more affordable housing.

    We’d Still Need Public Meetings
    The public would still have plenty of input on regulation and on zoning exemptions, where citizen input is valuable. If a developer is proposing something that falls below minimums or does not conform to zoning regulations, then it is reasonable to go through the more time consuming process that we currently have. This brings up the question of how the developer would introduce something different from the written law. This could be a problem under typical PUD (Planned Unit Development) regulations, which allow blanket changes to the minimums when alternative designs are not covered by conventional zoning.

    This PUD Pandora’s box, once opened, can have devastating results if the regulators and the neighbors both agree that the plan is simply not good enough. The developer thinks the plan is just dandy as is, but in reality most PUD proposals are too vague to be functional. A battle of wills that can last years often ensues.
    In the end, these expensive delays increase lot costs, and the home buyer ultimately pays. If a special ordinance such as PUD, Cluster Conservation, or Coving were specifically spelled out in a rewards-based system instead of a minimums-based one, developers could earn benefits (typically density and setback relaxations) for great plans complete with open space and connectivity.

    While writing Prefurbia, we began to ask ourselves: how did we take something so simple and let it get so out of control? These “third world” countries are progressive enough to let developers who comply with the rules quickly build their neighborhoods. Maybe they are not so far behind us after all.

    Perhaps our regulations and planning approach are intended to keep the system “busy” with billable hours. Imagine if we could get a conforming plan stamped and begin construction the next day. How many billable hours would be eliminated, and how much construction cost and land-holding interest saved? That would be very hard to calculate, but it’s likely significant.

    “It is difficult to get a man to understand something when his salary depends upon his not understanding it…” — Upton Sinclair, quoted by Al Gore in An Inconvenient Truth

    This inconvenient truth won’t win us many friends among consultants whose incomes depend upon generating billing time in meetings. But can we afford to continue down the path we are presently on? We need to take a hard look at the regulations. Are they written solely to provide the highest living standards? Or do they generate the highest billable hours for the consultants who propose them?

    Rick Harrison is President of Rick Harrison Site Design Studio and author of Prefurbia: Reinventing The Suburbs From Disdainable To Sustainable. His websites are rhsdplanning and prefurbia.com.

  • Portland: A Model for National Policy?

    United States Secretary of Transportation Ray LaHood and Washington Post columnist George Will have been locked in debate over transit. Will dubbed LaHood the “Secretary of Behavior Modification” for policies intended to reduce car use that hold up Portland’s strong transit and land use planning measures as a model for the nation. In turn, the Secretary defended the policies in a National Press Club speech and “upped the ante” by describing them as “a way to coerce people out of their cars.”

    These are just the latest in a series of media accounts about Portland, usually claiming success for its policies that have favored transit over highway projects as well as its “progressive” land use policies. Portland has also become the poster child for those who advocate planning restrictions and subsidies favoring higher density development in parts of the urban core.

    Indeed if Secretary LaHood has his way, Portland could become The Model for federal transportation policy. So perhaps it is appropriate to review what it has accomplished.

    Portland’s Mediocre Results

    Portland’s record of transit emphasis began more than 30 years ago, when the area “traded in” federal money that was available for an east side freeway to build its first light rail line, which opened in 1986. Since that time, Portland has significantly expanded its transit service, notably by opening three more light rail lines (West Side, North Side and Airport) as well as a downtown “streetcar.”

    Portland’s Static Transit Market Share: With these new lines and expanded service, Portland has experienced a substantial increase in transit ridership. Passenger miles have increased more than 130 percent since 1985, the last year before the first light rail line was opened. This is an impressive figure.

    However, over the same period, automobile use increased just as impressively. In 1985, approximately 2.1 percent of motorized travel in the Portland urban area was on transit and it remained 2.1 percent in 2007, the latest year for which data is available.

    Portland’s Declining Transit Work Trip Market Share: One of transit’s two most important contributions to a community is providing an alternative to the automobile for the work trip (the other is mobility for low income citizens). Attracting work trip riders matters because much of this travel occurs during peak periods, when roadways are operating at or above full capacity. United States Bureau of the Census data indicates that in 1980, the last year for which data is available before the first light rail line opened, transit’s work trip market share was 9.5 percent in the Portland area counties of Clackamas, Multnomah and Washington, which are covered by Portland’s strong land use policies. Despite the transit improvements since then, the work trip market share has not grown. By 1990, transit’s market share had dropped by a third, to 6.3 percent. It rose to 7.6 percent in 2000 and by 2007 had fallen back to 6.8 percent, despite the opening of two new light rail lines since 2000 (Figure 1). Remarkably, transit’s 2007 work trip market share was 28 percent below its 1980 share and had fallen 10 percent since 2000.

    Figure 1: Transit’s share of work trips in the Portland area, 1980–2007.

    Yes, Portland did increase its transit use, but it failed to increase the share of travel on transit, and the proportion of people riding transit to work declined.
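
    These relative comparisons are easy to verify. Here is a minimal sketch of the arithmetic, using only the market shares cited above:

    ```python
    # Portland-area transit work trip market share (percent), from the
    # Census figures cited above.
    share = {1980: 9.5, 1990: 6.3, 2000: 7.6, 2007: 6.8}

    def pct_change(old, new):
        """Relative change from an old share to a new one, in percent."""
        return (new - old) / old * 100

    print(round(pct_change(share[1980], share[1990])))  # -34: "dropped by a third"
    print(round(pct_change(share[1980], share[2007])))  # -28: below the 1980 share
    print(round(pct_change(share[2000], share[2007])))  # -11: roughly the "10 percent" decline since 2000
    ```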

    Driving the Portland Evangelism: GHG Emissions

    Secretary LaHood’s affection for Portland appears to rest principally on the belief that its policies can materially assist in reducing greenhouse gas (GHG) emissions. The data is available to test that claim.

    We examined GHG emissions per passenger mile for transit in Portland and for the urban personal vehicle fleet, including cars and personal trucks (principally sport utility vehicles). Overall, including upstream emissions (such as refining and power production), transit in Portland is about 50 percent more GHG friendly per passenger mile than the 2007 vehicle fleet. If all of the increase in transit passenger miles from 1985 to 2007 replaced automobile passenger miles, then transit expansion can be credited with a reduction of approximately 50,000 GHG tons in 2007 (though as is indicated below, things are not that simple).

    That sounds like a large number, until you consider that Portland traffic produces more than 8,000,000 GHG tons per year. After 22 years of expansion, transit’s contribution amounts to an annual GHG emissions reduction of approximately 0.6 percent. This pales in comparison to the 83 percent national reduction over a 45 year period that would be required by the Waxman-Markey bill being considered by Congress.
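
    The 0.6 percent figure follows directly from the two totals. A back-of-the-envelope sketch, using only the figures above (and recalling that the 50,000-ton credit assumes every added transit passenger mile displaced a car passenger mile):

    ```python
    # Figures cited above.
    transit_reduction_tons = 50_000      # estimated 2007 GHG savings from transit growth
    traffic_emissions_tons = 8_000_000   # annual GHG tons produced by Portland traffic

    share_reduced = transit_reduction_tons / traffic_emissions_tons
    print(f"{share_reduced:.1%}")  # 0.6% -- the payoff of 22 years of transit expansion
    ```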

    The Cost of GHG Emission Reduction

    Moreover, GHG emission reduction requires a context. Not all GHG emission reduction strategies make sense. Given the widely held principle that GHG emission removal must not hobble the economy, it is crucial that cost per ton of GHG removed be a principal criterion. If excessively costly strategies are employed, the result will be wasted financial resources, which will translate into diminished economic growth and higher levels of poverty. According to the United Nations Intergovernmental Panel on Climate Change (IPCC), between $20 and $50 per ton is the maximum cost necessary to accomplish deep reversal of CO2 concentrations between 2030 and 2050. It is fair to characterize any amount above $50 per ton as wasteful and likely to impose unnecessary economic disruption.

    Even that cost may be high. The current “market rate” is about $14 per ton, which appears to approximate the amount that figures such as former Vice President Al Gore, Speaker of the House Nancy Pelosi and California Governor Arnold Schwarzenegger pay to offset their GHG emissions from flying.

    Portland Costs of GHG Emission Reduction

    This $14 to $50 range provides the context for evaluating the cost of GHG emission reduction through transit expansion in Portland. Annual transit costs in Portland more than tripled from 1985 to 2007 (including inflation adjusted operating costs and the annual capital costs of the light rail lines), an annual increase of more than $325 million. This figure is then reduced to reflect the consumer cost savings from lower automobile gasoline and maintenance spending. The final result is a cost of approximately $5,500 per ton of GHG removed.

    This is 110 times the IPCC $50 maximum and nearly 400 times the Gore-Pelosi-Schwarzenegger standard. If the United States were to spend as much to remove each ton of the likely 83 percent national reduction target, the cost would be $30 trillion annually, more than double the gross domestic product. To call Portland’s GHG reduction cost extravagant would be an understatement.
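
    The cost multiples and the national extrapolation can be reproduced the same way. A sketch using the figures above; the ~7 billion ton estimate of annual US GHG emissions is an assumption of this sketch, not a figure from the text:

    ```python
    # Portland's cost per ton of GHG removed, per the figures above.
    added_annual_cost = 325_000_000   # increase in annual transit cost since 1985 (USD)
    tons_removed = 50_000             # estimated annual GHG tons removed
    print(added_annual_cost / tons_removed)  # $6,500/ton before netting out consumer savings

    net_cost_per_ton = 5_500             # the text's figure, net of gasoline/maintenance savings
    print(net_cost_per_ton / 50)         # 110x the IPCC $50 ceiling
    print(round(net_cost_per_ton / 14))  # ~393x the ~$14 offset "market rate"

    # National extrapolation at Portland's net cost per ton.
    us_annual_tons = 7_000_000_000    # assumed ~7 billion US GHG tons/year (not from the text)
    cost_trillions = us_annual_tons * 0.83 * net_cost_per_ton / 1e12
    print(cost_trillions)  # ~32 -- on the order of the $30 trillion per year cited above
    ```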

    Traffic Congestion Increases GHG Emissions

    There is not a one-to-one relationship between reduced driving and reduced GHG emissions. As traffic congestion increases, urban travel speeds decline and “stop-and-start” driving increases, so fuel efficiency falls (miles per gallon declines) and fuel consumption rises. Some or even all of the supposed gain from reduced driving can be negated by the higher GHG emissions from traveling in greater traffic congestion.

    Portland’s traffic congestion has increased substantially since before light rail. Further, by 2007 Portland’s traffic congestion had become worse than average for a middle-sized urban area and worse than in much larger Dallas-Fort Worth, Atlanta, Philadelphia and Phoenix.

    Further, according to the Texas Transportation Institute’s annual Urban Mobility Report, the amount of gasoline wasted due to peak period traffic congestion in Portland rose 18,000,000 gallons from 1985 to 2005 (the latest data available, adjusted for population growth), simply due to greater traffic congestion. The increase in GHG emissions from this excess fuel consumption is estimated at approximately 200,000 tons annually – four times the estimated reduction in GHG emissions assumed to have resulted from the increase in transit ridership.
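
    The conversion from wasted gasoline to GHG tonnage can be sanity-checked with standard emission factors. In the sketch below, the per-gallon factor and the upstream allowance are assumptions of this sketch, not figures from the text:

    ```python
    # Converting congestion-wasted gasoline into GHG tonnage.
    wasted_gallons = 18_000_000    # added annual gallons wasted, 1985-2005 (from the text)
    kg_co2_per_gallon = 9.0        # approximate tailpipe CO2 per gallon of gasoline (assumed)
    upstream_factor = 1.25         # allowance for refining/distribution emissions (assumed)

    tons = wasted_gallons * kg_co2_per_gallon * upstream_factor / 1000
    print(f"{tons:,.0f} metric tons")  # ~202,500 -- consistent with the ~200,000 tons above
    ```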

    The bottom line: The Portland model inherently produces more congestion and increases GHG emissions. Failure to expand roadways to meet demand and forced densification increase traffic congestion.

    Better Models

    The ineffectiveness of Portland’s model strategies in reducing GHG emissions stands in contrast to other approaches. Between 2000 and 2007, the share of people working at home in Portland rose by more than one quarter. If transit and working at home continue their post-2000 trends, transit’s work trip share will be less than that of working at home by 2015. Working at home eliminates the work trip entirely, producing substantial GHG emission reductions at a cost of $0.00 per ton.

    Another approach is the Obama Administration’s automobile fuel efficiency strategy. About the time the LaHood-Will debate was heating up, the President announced that automobile manufacturers would be required to increase their corporate average fuel efficiency for cars and light trucks to 35.5 miles per gallon by 2016, a 75 percent improvement over the present fleet. If this fuel efficiency could be achieved in Portland today, the reduction in GHG emissions would be more than 40 percent. This new policy would eventually close 90 percent of the per-passenger-mile GHG gap between personal vehicles and transit in Portland.
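
    The “more than 40 percent” figure follows from the mpg arithmetic, since per-mile fuel use (and hence GHG) scales inversely with fuel economy. A sketch; the implied current fleet average is derived here from the 75 percent improvement claim rather than stated directly in the text:

    ```python
    # Per-mile GHG reduction from a fleet-wide fuel economy improvement,
    # holding miles driven constant.
    target_mpg = 35.5
    current_mpg = target_mpg / 1.75   # ~20.3 mpg, implied by the "75 percent improvement"

    reduction = 1 - current_mpg / target_mpg   # fuel (and GHG) per mile falls by this share
    print(f"{reduction:.0%}")   # 43% -- "more than 40 percent"
    ```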

    President Obama indicated that this strategy is costless. The higher costs that consumers will pay for cars will be more than made up by the fuel cost savings. Thus, according to the President, this policy costs $0.00 per ton of GHG emissions removed, less than the IPCC’s $50 and less than Portland’s $5,500. Of course, it is not possible to achieve 35.5 miles per gallon now, but it will be (Figure 2).

    Figure 2: Closing the GHG gap between personal vehicles and transit in Portland through fuel efficiency gains.

    The best hybrid cars now achieve 50 miles per gallon, which makes them less GHG intensive than transit in Portland. President Obama has gone further, indicating the potential for developing 150 mile per gallon cars. The curtain could be rising on a future of cars that emit less GHG per passenger mile than transit. People and officials genuinely concerned about GHG emissions should applaud these advances. On the other hand, people and officials who value coercive behavior modification more than GHG emission reduction are likely to resist.

    The Consequences of Coercing People Out of Cars

    Moreover, Portland policies ignore a crucial factor: how automobiles facilitate economic growth and employment. Generally, the research indicates that the economic performance of metropolitan areas is enhanced by greater mobility. Yet no transit system provides the extensive mobility made possible by the automobile, not in America and not even in Europe. Coercing people out of cars coerces some out of employment and into poverty.

    Even where transit service is available, it generally takes longer than traveling by car. In 2007, travel to work by transit took 3:50 (three hours and 50 minutes) per week longer than driving in the nation’s largest metropolitan areas. Even with all of Portland’s transit improvements, commuting by transit there still takes approximately 3:15 longer per week than driving. It appears that Secretary LaHood would add more than three hours (time many don’t have) to our commuting each week.
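
    Put in per-trip terms, the weekly penalty is easier to grasp. A sketch assuming ten one-way commute trips per week (that assumption is mine, not the text’s):

    ```python
    # Converting the weekly transit time penalty into minutes per one-way trip.
    def extra_minutes_per_trip(hours, minutes, trips_per_week=10):
        return (hours * 60 + minutes) / trips_per_week

    print(extra_minutes_per_trip(3, 50))  # 23.0 -- extra minutes each way, national average
    print(extra_minutes_per_trip(3, 15))  # 19.5 -- extra minutes each way, Portland
    ```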

    The Land Use Cost

    The second plank of The Model is strong land use regulation (smart growth), which economic research shows materially increases housing costs, leading to a lower standard of living.

    Time to Turn Off the Ideological Autopilot

    The policies of The Model have no serious potential for reducing GHG emissions and could even increase them. On the other hand, the rapidly developing advances possible from improved vehicle technology, which the Administration espouses, show great promise. Behavior modification a la The Model turns out to be not only undesirable but also unnecessary.

    Wendell Cox is a Visiting Professor, Conservatoire National des Arts et Metiers, Paris. He was born in Los Angeles and was appointed to three terms on the Los Angeles County Transportation Commission by Mayor Tom Bradley. He is the author of “War on the Dream: How Anti-Sprawl Policy Threatens the Quality of Life.”