Category: Economics

  • State of the Economy June 2009

Nobel Prize-winning economist Paul Krugman was widely quoted as saying that the official recession will end this summer. Before you get overly excited, keep in mind that the recession he’s calling the end of officially began in December 2007. Now ask yourself this: when did you notice that the economy was in recession? Six months after it started? One year? Most people didn’t even realize the financial markets were in crisis until the value of their 401(k)s crashed in September 2008. Count the number of months from December 2007 until you realized the economy was in recession, add that many months to September 2009, and you’ll have an idea of when you should expect to actually see improvements in the economy.
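That back-of-envelope rule is easy to run yourself. A minimal sketch in Python, assuming the September 2009 reference point the column uses; the September 2008 "noticed" date is just the 401(k)-crash example, so substitute your own:

```python
from datetime import date

def add_months(d, months):
    """Shift a date forward by a whole number of months."""
    total = d.year * 12 + (d.month - 1) + months
    return date(total // 12, total % 12 + 1, 1)

recession_start = date(2007, 12, 1)
reference_end = date(2009, 9, 1)     # the column's September 2009 benchmark

# Suppose you first noticed the recession in September 2008 (the 401(k) crash):
noticed = date(2008, 9, 1)
lag_months = (noticed.year - recession_start.year) * 12 + \
             (noticed.month - recession_start.month)
felt_recovery = add_months(reference_end, lag_months)
# A 9-month lag puts your "felt" recovery around June 2010.
```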

Douglas Elmendorf, Director of the Congressional Budget Office (CBO), testified on “The State of the Economy” before the House Committee on the Budget at the end of May. CBO sees several years passing before unemployment falls back to around 5 percent, after climbing to about 10 percent later this year. Remember this phrase: Jobless Recovery. It happens every time we have a recession: employment historically does not increase until 6 to 12 months AFTER GDP starts to improve. Even Krugman admits that unemployment will keep going up for “a long time” after the recession officially ends.

    While some of us are worrying about stagflation – a stagnant economy with rising prices – the CBO report does a good job of describing why deflation is worse than inflation. Deflation would slow the recovery by causing consumers to put off spending in expectation of lower prices in the future. The risk associated with high inflation is primarily that the Federal Reserve would raise interest rates too fast, stalling the economy – similar to what Greenspan did to prolong the recession in the early 1990s. We think the real conundrum is this: how do you deal with an asset bubble without deflating prices? Preventing deflation now simply passes the bubble on to some other asset class at some future time.

CBO calculates that output in the U.S. is $1 trillion below potential, a shortfall that won’t be corrected until at least 2013. CBO will release new GDP forecasts in August, and says they will likely paint an even gloomier picture than this already gloomy report. Hard to imagine!

There are plenty of reasons that Krugman and others are seeing encouraging signs in the economy. Social Security recipients received a large cost-of-living adjustment, payroll taxes were lowered so that employees are taking home bigger paychecks, tax refunds were larger, energy prices were lower – all of this led to an uptick in consumer spending in the first quarter of 2009. I checked in with Omaha-area Realtor Rod Sadofsky last week. He has seen an improvement in sales of median-priced homes, which he attributes to the $8,000 tax credit available to first-time homebuyers (or those who have not owned for at least three years). With that segment of the market moving, those sellers are able to trade up to higher-priced homes, further improving home sales. However, the tax incentive is scheduled to expire at the end of 2009. When the stimulus winds down…well, there will be no more upticks. CBO agrees with Rod and warns of a possible re-slump in 2010 when the effects of the stimulus money begin to wane.

    CBO’s Dr. Elmendorf has a way to solve this problem: to keep up consumer spending, he suggests that people should work more hours and make more money. Duh! We think we hear Harvard calling – they want their PhD back! CBO seems undecided about which came first in the credit markets: problems in supply or problems in demand?

    “Growth in lending has certainly been weak, but a large part of the contraction probably is due to the effect of the recession on the demand for credit, not to the problems experienced by financial institutions.”

    “Indeed, economic recovery may be necessary for the full recovery of the financial system, rather than the other way around.”

We shouldn’t be so hard on Elmendorf. The report makes it clear just how difficult it has been to figure out 1) what happened, 2) why it happened, 3) what to do about it, and 4) what happens next. CBO seems to be reaching for answers, while to us it is obvious they are missing the point by not even considering that manipulation has wreaked havoc on the markets. Whenever things don’t make sense to someone like the Director of the CBO, experience tells us there’s a rat somewhere.

    Regardless of how overly-complicated financial products may become, the economy really shouldn’t be that hard to figure out. Still, no one seems to know how far down the banks can go – if banks don’t lend to businesses, businesses close, people lose their jobs, unemployed people default on loans, banks have less to lend, and banks can’t lend to businesses…Seems we are damned if we do and damned if we don’t: too much borrowing caused the crisis; too little spending worsens it. Do they want us to keep spending money we don’t have?

    While Krugman is admitting that the world economy will “stay depressed for an extended period” CBO is reporting that “in China, South Korea, and India, manufacturing activity has expanded in recent months.” The other members of the G8, however, aren’t faring any better than we are: GDP is down 10.4 percent in the European Union, 7.4 percent in the UK and 15.2 percent in Japan. Canada – whose banks are doing just fine without a bailout, thank you very much – saw GDP decline by just 3.4 percent in the last quarter of 2008.

    Undaunted by nearly 10 percent unemployment – after predicting it would rise no higher than 8 percent – President Obama announced today that the White House opened a website for Americans to submit their photos and stories about how the stimulus spending is helping them. If they can’t manage the economy, they can still try to manage our expectations about the economy.

    Susanne Trimbath, Ph.D. is CEO and Chief Economist of STP Advisory Services. Her training in finance and economics began with editing briefing documents for the Economic Research Department of the Federal Reserve Bank of San Francisco. She worked in operations at depository trust and clearing corporations in San Francisco and New York, including Depository Trust Company, a subsidiary of DTCC; formerly, she was a Senior Research Economist studying capital markets at the Milken Institute. Her PhD in economics is from New York University. In addition to teaching economics and finance at New York University and University of Southern California (Marshall School of Business), Trimbath is co-author of Beyond Junk Bonds: Expanding High Yield Markets.

  • Rewriting The Oil Stock Story

    Could oil price manipulation have created the rerun of the Great Depression that we are currently enduring?

    Think about it. The doubling of gas prices had a profound effect on disposable income and the affordability of housing, whose subsequent downturn set the stage for economic collapse.

We now know that Wall Street speculation drove oil from $69 a barrel to nearly $150. This article attempts to explain why.

    Back in early 2004, the nation’s investment banks began making large investments in oil stocks, which became the so-called “story stocks” of the era. The story was obvious. Emerging nations like China and India were driving up demand for oil, and supplies weren’t keeping pace.

    The investment banks had their analysts write papers espousing the profits to be made from oil, and they promoted the commodity itself as an asset class like real estate, stocks, and bonds, suggesting that it was suitable for long-term investment.

To prove their point, the investment banks began investing in oil in the futures market. But their real reason had nothing to do with what they were telling investors. It had to do with the long positions they held in oil stocks, which were certain to appreciate with the rise in the value of oil as a commodity. ExxonMobil stock, for example, went from around $40 in the spring of 2004 to a high of $95 on December 24, 2007. Merry Christmas and a Happy New Year!

    It was around this time that the Petroleum Marketers Association, which represents more than 8,000 retail and wholesale home heating oil companies and gas station owners, began getting hate mail. They were being blamed for gouging the public, even though their costs had more than doubled.

    Early in 2008, I received a call from a former stock brokerage client of mine, who is the CEO of a concern with factories and production facilities in China. “Tim, I keep getting these investment letters from the banks telling me how China is slurping up all this oil. But it simply isn’t true. Sure, the country is growing quickly, but no faster than last year, and certainly not enough to double the price of oil in less than a year.”

Around the same time, Art Rosen, the former president of the National Committee on U.S.-China Relations, also told me that Chinese demand could not account for the price spikes in oil. From what he could tell, there was plenty of product readily available at supply terminals throughout the Middle Kingdom.

    Now we know how this happened. The investment banks went to regulators to obtain permission to increase their leverage from a factor of 12 to a factor of 40 times capital. Much of that leverage was being applied to the already heavily leveraged oil market, where $10,000 controls over $100,000 of product. In the new scenario, $10,000 in the hands of an investment bank controlled $4 million in product.
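The compounding of the two leverage layers is simple arithmetic: bank capital levered up to 40 times, then posted as futures margin that controls roughly 10 times its value in product. A sketch using the figures cited above (the function name is just illustrative; the 12x "before" case is an inference from the old leverage cap, not a number given in the text):

```python
def notional_controlled(capital, bank_leverage, futures_margin_ratio):
    """Notional value of product controlled when a bank's levered funds
    are themselves posted as futures margin."""
    return capital * bank_leverage * futures_margin_ratio

# An ordinary trader: $10,000 of margin at roughly 10-to-1 controls ~$100,000.
trader = notional_controlled(10_000, 1, 10)

# Under the old 12x capital rule, the same $10,000 would control $1.2 million.
before = notional_controlled(10_000, 12, 10)

# Under the new 40x rule, it controls $4 million of product.
after = notional_controlled(10_000, 40, 10)
```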

The levered effect on the price of oil was such that it began drawing huge amounts of money from stock and bond funds into the commodities markets, and specifically the market for oil. Institutional investors ranging from the Harvard endowment to sovereign wealth funds got in on the oil action; commodities transactions rose from $13 billion to over $300 billion in just three years. At one point the markets were trading 27 barrels of crude oil for each barrel actually being consumed in the United States – positions so large that they moved the market in the cash commodity. In a single day the price of oil jumped by more than $25, and yet there were no hurricanes or other supply disruptions that might have accounted for it.

    A report out of the MIT Center for Energy and Environmental Policy Research clearly showed that the dynamics of supply and demand for the cash commodity could not have been responsible for such a run-up in oil prices, which reached its steepest levels during an interval when supply was going up and demand was falling.

By this time the price of gas was rising to five dollars a gallon. The owner of a $400,000 house who commuted by car suddenly discovered that the price of gasoline had doubled and his commute was costing more than his mortgage payment. Something had to give, and it was his mortgage. Suddenly, the $400,000 houses were worth $200,000, the mortgages were underwater, and the banks were drowning in red ink. The cascade in housing prices was soon mirrored in the price of oil. The money on Wall Street was now pulling out of the oil patch to drive down the bank shares and their mortgage-backed assets, setting the stage for the deepest economic contraction since the Great Depression.

There’s plenty of blame to go around. But once again (as in Frontrunning and Finance, NewGeography.com), most of it should be borne by the people on Wall Street, best described as a bunch of crumbs held together by dough.

    Tim Koranda is a former stockbroker who now works as a professional speechwriter. He can be reached at koranda@alum.mit.edu.

  • The Best Places to Avoid a Recession

    Would you like to avoid recessions altogether?

    You can come close if you live in the right place.

This report looks at the period January 1991 through April 2009 – a period of 220 months that includes three recessions. Since employment rises and falls monthly because of seasonal trends (the school year, holiday retail and more), this report uses 12-month employment growth rates as the measurement criterion – the employment in a given month compared to the employment 12 months earlier. This eliminates seasonality and allows us to compare, if you will, apples with apples.

    The metric in this analysis is the percent of months where the 12-month employment growth rate is positive.

Using employment growth rates as the measurement criterion:

Alaska is 99.1% recession-proof: employment grew in 218 of the 220 months.

Michigan is 51.8% recession-proof: employment grew in only 114 of the 220 months.

All the states are shown in the graphic, color-coded as follows:

    • Green is 90% or more
    • Grey is 80% to 90%
    • Red is 70% to 80%
    • Black is less than 70%

    Some metropolitan areas are also relatively recession-proof:

Area – share of months where the 12-month job growth rate is positive:

Grand Junction, CO – 100.00%
McAllen-Edinburg-Mission, TX – 99.50%
Olympia, WA – 99.10%
Bismarck, ND – 98.60%
Anchorage, AK – 97.70%
Fargo, ND-MN – 97.70%
Tyler, TX – 97.30%
Greeley, CO – 96.80%
Iowa City, IA – 96.40%
Sioux Falls, SD – 96.40%
Cheyenne, WY – 95.90%
Columbia, MO – 95.90%
Coeur d’Alene, ID – 95.50%
College Station-Bryan, TX – 95.50%
Billings, MT – 95.00%
Fayetteville-Springdale-Rogers, AR-MO – 94.50%
Laredo, TX – 94.50%
Las Cruces, NM – 94.50%
Valdosta, GA – 94.50%
Killeen-Temple-Fort Hood, TX – 94.10%
Rapid City, SD – 94.10%
Bellingham, WA – 93.60%
Ogden-Clearfield, UT – 93.60%
Knoxville, TN – 93.20%
St. George, UT – 93.20%

And, unfortunately, some metropolitan areas are not very recession-proof:

Area – share of months where the 12-month job growth rate is positive:

Baltimore City, MD – 17.70%
Flint, MI – 28.60%
Detroit-Livonia-Dearborn, MI Metro – 34.10%
Philadelphia City, PA – 35.50%
Dayton, OH – 37.30%
Mansfield, OH – 38.20%
Youngstown-Warren-Boardman, OH-PA – 41.80%
Muncie, IN – 42.70%
Kingston, NY – 43.60%
Waterbury, CT NECTA – 45.50%
Binghamton, NY – 47.30%
Lima, OH – 47.30%
Springfield, OH – 48.20%
Detroit-Warren-Livonia, MI – 49.10%
Lansing-East Lansing, MI – 50.00%
Saginaw-Saginaw Township North, MI – 50.50%
Ann Arbor, MI – 51.40%
Cleveland-Elyria-Mentor, OH – 52.70%
Decatur, IL – 52.70%
Terre Haute, IN – 53.60%
Canton-Massillon, OH – 54.10%
Battle Creek, MI – 54.50%
Jackson, MI – 55.00%
Niles-Benton Harbor, MI – 55.00%

You can’t necessarily judge a metropolitan area by its state’s employment growth rate. For example, Georgia is only 73.6% recession-proof, yet Valdosta is 94.5%. Indiana is 74.5%, yet Indianapolis is 90.0%. Missouri is 72.3%, yet Columbia is 95.9%.

    A complete list of states and metropolitan areas is available at http://jobbait.com/a/rpa.htm.

The data in this report present only part of a recession-proof picture of states and metropolitan areas. Think of them as a long-term picture, from January 1991 through April 2009; they do not necessarily represent what’s happening today. For example, Olympia, WA – the second-most recession-proof metropolitan area long term – has seen employment decline in the last two months of the data, March and April 2009. And this will change next month, and the month after.

    This report was written by Mark Hovind, President of JobBait. Mark helps six and seven figure executives find jobs by going directly to the decision-makers most likely to hire them. Mark can be reached through www.JobBait.com or by email at Mark@JobBait.com.

  • A Look at the Information Sector

Between economic development strategies targeting software firms, the deflation of the tech bubble, talk of “broadband,” and recent consternation about failing publishing business models, we seem to hear a lot about the information sector. Given all that attention, it’s striking that the sector comprises only about 2.2% of total employment in the US.

On top of that, after a big decline from the tech bubble peak in 2001, by February the sector had receded to just over 2.9 million jobs, a level not seen since April 1996.

    The telecommunications subsector accounts for just more than 1/3 of information employment, and saw the biggest boom and bust. Publishing has declined since 2000, and motion picture and sound recording industries are larger than either software publishing or data processing.

Looking at percent change, software has recovered from the tech bust, while the movie business has held steady since topping out in 2000. Worse off are telecom and data processing, which continue their post-bust slide.

    One fifth of the jobs in the publishing industry have vanished since 2001.

    This is not to say technology occupations are not a key part of the nation’s economy and productivity gains over the past decade, but the importance of the information sector itself is overstated. High-tech industries that produce products generally fall into manufacturing sectors while things like systems design, web design, or even custom programming are business services.

    The next post will look at regional shifts in information employment, but until then check out Ross Devol’s more comprehensive study on regional tech poles.

“Other information services” includes news syndicates, libraries, archives, exclusive Internet publishing and/or broadcasting, and web search portals.

  • Is Your City Safe From The Tech Bust?

    A decade ago, the path to a successful future seemed sure. Secure a foothold in the emerging information economy, and your city or region was destined to boom.

    That belief, as it turned out, was misguided.

In the decade between 1997 and 2007, the information sector – which includes jobs in fields from media, publishing and broadcasting to computer programming, data processing, telecommunications and Internet publishing – created barely a single net new job, while some 16 million jobs were created in other fields.

The biggest losses have been in the telecommunications sub-field, which has shed 400,000 jobs nationwide since its peak in 2000. Not surprisingly, the media and publishing industries have also lost ground, while employment in other arenas such as motion pictures, software and data processing has remained stagnant for much of the decade.

    Equally critical, it seems clear that simply being a high-tech magnet does not make a region a prodigious job creator. The San Jose metropolitan area, better known as the heart of Silicon Valley, boasted over 960,000 jobs in 1997. Last year, even after the ballyhooed Version 2.0 of the dot-com boom, that number had actually declined–to barely 900,000. According to figures from economic-strategy firm Praxis Strategy Group, other traditionally tech-heavy areas, including San Francisco and Boston, also did poorly in terms of growth through the balance of this decade.

    Perhaps most disturbing, many areas are also losing their share of the information industry. For example, the information-sector job count, notes the Public Policy Institute of New York, has actually been stagnant or in decline in places like New Jersey, Connecticut, Illinois, Massachusetts, Minnesota and New York.

    The same pattern also affects so-called “cool” cities that were supposed to be ideal for high-tech jobs, according to a recent study by my colleagues at Praxis. The biggest declines in information jobs since 2000 have occurred in San Francisco (which lost 31,800 jobs), Northern Virginia (35,200) and Washington, D.C. (40,700).

    Silicon Valley dropped 5,400 positions since 2000, which amounts to 11.6% of all its information-sector jobs. The only bright spot for blue states is in Washington, where growth is driven by big employers Microsoft and Boeing. Los Angeles, buoyed by the relatively stable entertainment sector, has also managed to hold its own.

    Faced with all these cities that are merely struggling not to lose any jobs, just where is the tech-sector growth? It’s in less-celebrated areas of the country, like Idaho, New Mexico, North Carolina, Nevada–and in parts of Florida, South Dakota and South Carolina. By region, the fastest gainers turned out to be places like Orlando, Fla. (with 2,176 new information jobs since 2000), Madison, Wis. (2,400), Boise, Idaho (1,500), Wilmington, N.C. (1,267) and Charleston, S.C. (1,033).

What distinguishes most of these places is a set of factors beyond prominent employers: prosaic things like tax rates (particularly on incomes), the cost of housing and the overall climate toward business. Information-sector jobs, it turns out, follow the same basic rules of economic development as other industries.

    Of course, this is not to say tech jobs don’t matter. As the Milken Institute’s Ross DeVol argues in his new study of high-tech centers, technology jobs pay better than most, and their presence can boost other parts of local economies. And although they may not be multiplying fast, in some centers, like Silicon Valley, Boston and Southern California, whatever employment already exists has enough inertia to allow them to remain the largest tech centers in the country.

    Yet the problem is that the information economy, by itself, simply doesn’t reliably spur broader economic growth. That may be due to changes within the sector itself. From the 1980s to the mid-1990s, tech firms largely focused on creating productivity-enhancing products. Many of them also used on-shore manufacturing. Aerospace was a smaller industry, but it was still vital.

    These catalysts helped create dynamic companies that both employed large numbers of people directly and used contractors (whose numbers increased). The Silicon Valley I reported on in the mid-1980s housed an essentially industrial economy with many good jobs for middle- and working-class people. It was both a hotbed for pioneering entrepreneurs and a society that offered and encouraged opportunity.

    Today, however, tech has become increasingly software- and media-oriented. New companies tend to emerge from a small pool, and they are financed by a relative handful of local venture capitalists. Once launched, they may conduct some research and development at home, but marketing and customer service are either off-shored or moved to remote locations like the Great Plains or the “Intermountain West,” between the Cascades and the Rockies.

    As a result, even star companies like Google create a far smaller number of jobs than predecessor firms like Hewlett Packard, Intel or IBM. And even newer companies like venture darling San Francisco-based Twitter may go public, valued at $250 million or more, with only 45 employees.

This, of course, represents very good news for a select few: investors and a handful of highly educated software engineers. But the Bay region’s broader economy and society aren’t as lucky.

    That’s because most segments of the information sector that do create lots of jobs tend to take place elsewhere. For example, when Intel considers opening a new chip plant, which could open up 7,000 new positions, it won’t build it in the Valley of its birth but rather in farther-flung locales like Oregon, Arizona and New Mexico. California has become too expensive; businesses there are heavily regulated and taxed for most industrial activity.

    So maybe it’s time to unlearn some of the assumptions we developed during the first tech boom. In the 1990s and early 2000s, many held that the information revolution would tame the business cycle, guarantee constant high returns and create widespread prosperity. Now we know better.

    The model of Silicon Valley, as DeVol suggests, cannot be easily duplicated. Another well-promoted formula, linking great universities to up-and-coming hip cities for the so-called “creative class,” has proved very limited when it comes to creating new jobs. And, anyway, trends in tech growth suggest that basic economic conditions like general affordability, taxes and the regulatory environment play an important role.

    Just as troubling may be the class divisions on display in places like Silicon Valley. As manufacturing and middle management jobs have fled, its capital, San Jose, has become more of a backwater. As local blogger Adam Mayer has pointed out, San Jose increasingly serves as a dormitory for the bottom-feeders of the Silicon Valley food chain.

    In contrast, tech power and influence is shifting to those areas that have always been well-to-do and are likely to stay that way–academically-oriented places like Cambridge, Palo Alto and San Francisco. They are becoming ever-more-exclusive reserves for the restless young and those with the greatest talent within the media and software industries. Meanwhile, the service class commutes in from the surrounding periphery to tidy up and run restaurants, while high housing costs and an overall lack of opportunities for other kinds of workers drive away much of the middle class, particularly families.

    In geographic terms, the real losers in this brave new tech world may be the communities on the fringes of those high-end tech areas. Take Lowell, Mass. Lowell, a former mill town widely celebrated for its tech-led revival in the 1980s, has seen little job growth since the late 1990s. But why pick Lowell, when it’s far cheaper and easier to expand in Boise or, even better, Bangalore, India?

    The time has come to let go of vintage fantasies about tech that date from the 1990s. Key regions–and the country as a whole–need to understand that the information sector is best seen not as an end in itself but as an industry that derives its value from how it works with other parts of the economy, such as finance and business services, agriculture, energy, manufacturing, warehousing and engineering. (Manufacturing alone employs 25% of the U.S.’s scientists and 40% of its engineers–and their related technicians.) We have to nurture a broad industrial base so that innovations in this sector do not simply end up boosting off-shore industry.

    Techies won’t save us from the folly of deindustrialization; in essence, we can no longer believe that it’s possible to Google our way to prosperity.

    This article originally appeared at Forbes.

    Joel Kotkin is executive editor of NewGeography.com and is a presidential fellow in urban futures at Chapman University. He is author of The City: A Global History and is finishing a book on the American future.

  • San Jose, California: Bustling Metropolis or Bedroom Community?

    Dionne Warwick posed the question more than 40 years ago, yet most Americans still don’t know ‘The way to San Jose’. Possessing neither the international cachet of San Francisco nor the notoriety of Oakland, San Jose continues to fly under the national radar in comparison to its Bay Area compatriots. Even with its self-proclaimed status as the ‘Heart of Silicon Valley’, many would be hard pressed to locate San Jose on a map of California.

Better-known American cities may try to gain population by branding themselves as interesting places, but San Jose does not struggle to attract newcomers. Sprawling over 178 square miles, San Jose sits at the southern end of the San Francisco Bay. This year the city exceeded the 1 million population mark for the first time.

    So what makes this city, the 10th-largest in the United States, appealing? Unlike its precious neighbor 50 miles to the north, San Francisco, people move to San Jose primarily for jobs – especially those related to the coveted technology sector. Whereas San Francisco balances its role as playground for the independently wealthy and welfare state for the lumpenproletariat, San Jose remains favored among families and those looking for a safe environment in which to raise children – not to mention, the weather is better.

San Jose does not stimulate a sense of urban exaltation. Aside from a commercial downtown core with a collection of mediocre high-rises (limited in height due to downtown’s adjacency to the San Jose Airport), the city is unapologetically suburban in character.

San Jose’s pattern of development can be traced back to its origins as an agricultural community supporting early Spanish settlers who chose to settle in the fertile Santa Clara Valley. It remained a modest-size agrarian community until the end of World War II, when it underwent a period of rapid expansion – not unlike that of Los Angeles to the south. During the 1950s, with the emergence of semiconductor technology derived from silicon, San Jose and the greater Santa Clara Valley exploded into a center for the evolution of computer technology.

Today, San Jose can best be understood by its ambivalent relationship with neighboring Silicon Valley cities. Mid-size suburbs such as Cupertino, Sunnyvale, Mountain View and Palo Alto, all located west/northwest of San Jose as one travels up the peninsula towards San Francisco, are very distinct and separate entities. Home to some of Silicon Valley’s heaviest hitters (Cupertino has Apple, Sunnyvale has Yahoo!, Mountain View has Google, Palo Alto has Hewlett-Packard, Facebook and Stanford University), these cities largely define the technology-focused region. To be sure, San Jose has its share of big players, including eBay and Adobe, as well as the ‘Innovation Triangle’, an industrial area of north San Jose that is home to the headquarters of large companies like Cisco Systems and Cypress Semiconductor.

    Yet, despite the presence of these firms, San Jose has become ever more a residential community, with among the worst jobs to housing balances in the region. Furthermore, a whopping 59% of the city’s developed land constitutes residential use – 78% of that being single-family detached housing. In this sense, despite being the largest city, San Jose essentially serves as a ‘bedroom community’ for the rest of Silicon Valley.

This has been a burden for the city, which, unlike its neighbors, lacks enough large information technology companies to fill its tax coffers. In contrast, job-rich ‘green’ cities like Palo Alto have remained staunchly ‘anti-growth’ regarding residential development and consequently have very high housing prices.

This pattern poses fiscal problems for San Jose. City officials have long been aware of the need to stimulate economic development instead of continuing to lose out to the city’s neighbors, yet the city seems determined to deepen its role as a dormitory for them. Indeed, amazingly, the city’s development agenda has in recent years shifted to a relentless focus on high-density, multi-family residential construction in the downtown core and along transit corridors. In 2007, 79% of all new housing built in San Jose was multi-family – a staggering deviation from its history of low-density development.

Though well-intentioned, the slant towards densification has yielded a glut of empty condo units throughout the city. Those who have purchased units in new developments often find themselves with underwater mortgages. During a recent visit to one of the flashy new downtown condo buildings, The 88, I entered a desolate sales office and was greeted by a skittish sales agent. When I asked how sales were going, my question was deflected without a direct answer in an act of not-so-quiet desperation.

Although it’s clear most people in San Jose prefer lower-density living, the city government continues betting tax dollars on a future in which newcomers will want to live in a high-density setting. Outside of downtown, low- to mid-rise multi-family housing has been built along the city’s light-rail lines in what are conceived as ‘transit villages’. The appetite for such a lifestyle is questionable given the high price point and steep HOA dues of these condo units, particularly when single-family detached houses can be purchased at comparable prices.

    Despite these issues, San Jose seems hell-bent on its path towards densification. The city has major plans to develop the area around its Diridon Train Station, just west of downtown, as California High-Speed Rail and BART are projected to make their way to San Jose. Furthermore, the city government is counting on the Oakland A’s baseball team making a move to San Jose.

    From the Champs-Élysées to Tiananmen Square, grand urban visions are what have defined cities historically. As a product of the Silicon Valley ethos as well as an observer of planning trends, I would argue that this is no longer valid – especially for any city with the hopes of a prosperous future. Rather, in democratic societies, it will be the idiosyncrasies of individual actors and the prospect of upward mobility that will define a sense of place.

    Obsessed with density and urban form, planners don’t seem to grasp the chicken and egg conundrum – the notion that lifestyle amenities follow on the heels of economic opportunity. San Jose needs to cast its future on nurturing its entrepreneurs instead of trying to become something it is not yet ready to become.

    Adam Nathaniel Mayer is a native of the San Francisco Bay Area. Raised in the town of Los Gatos, on the edge of Silicon Valley, Adam developed a keen interest in the importance of place within the framework of a highly globalized economy. He currently lives in San Francisco where he works in the architecture profession.

  • Project Development: Regulation and Roulette

    The site plan logically should be the key to approval of a development project. Yet in reality, the plan is secondary to the presentation. My conclusions are based upon experience with well over a thousand developments over four decades, most in the mainland USA. And what I’ve observed is that the best site plan is only as good as the presentation that will convince the council or planning commission to vote “Yes” on it. No “yes” vote, no deal, no development.

    Each presenter deals with the dog-and-pony show in his own way. There’s an endless variety of styles (or lack of styles). All of these public meetings have one thing in common: The neighbors (if there are any) will be there to oppose the new development.

    Not Too Long Ago…
    In the old days there were three factions: The developer presenting the plan, the neighbors opposing the plan, and the council listening to both sides. If the development was high profile, someone from the local press might also show up. The planning commission and council are fully aware that all plans will be met with neighborhood opposition, and they will have to listen to lengthy complaints along the route to approving (possibly) the plan.

    In the past, the citizens sitting on these boards would most likely dismiss Elwood and Betsy Smith’s complaint about how a development in their back yard would invade their privacy, and would vote in favor of the new master planned community instead.

    How It’s Different Today
    Today there is often an additional audience: televised meetings bring in an entire region of neighbors. The on-screen council listens to the neighbors’ objections, no matter how absurd they may be, then answers directly to the camera, showing the general community watching at home that they really care about every citizen’s opinion. The council member must never appear too favorable to the developer, as that could be misconstrued as not caring about the citizens he or she represents. A televised council member hears the Smiths’ complaint with a very concerned on-camera look, explains how maybe we have too many new homes in this town, and proceeds to tell viewers that the developer might want to consider a buffer and a drop in density. Concerns have shifted from developing economically sensible neighborhoods to “please elect me Mayor when I’m on the ballot”.

    Planning Outside The USA
    Our first large site plan done outside the States was in Freeport, Bahamas. In 2000, when we were first contacted to design Heritage Village, we asked about doing presentations to the city council and planning commission to help move the approval process along. We were told that the development company and the regulating entity were the same, and if they liked the plan it would be built! And that is exactly what happened.

    Our next attempt outside the USA was not so easy. In Mexico City when we asked to sit down with government officials to change policy to create better neighborhoods, the developer said… No. At the time, we did not understand why it was so critical that we were not to suggest changes.

    We Discover A Superior Foreign System
    We wrongly assumed that all planning outside the USA would have similar problems, with restrictions absurdly prohibitive for designing great neighborhoods. It was only when we worked in Bogotá, Colombia last year that we had the opportunity to work within a system that may not be so backwards after all. Our request to meet with the authorities to show them new ways to design neighborhoods was met, as it had been in Mexico City, with an absolute… No.

    We then asked for an opportunity to present the plan, and were told that was not necessary. Being that it was Colombia, you can imagine our first thoughts. Cartels? Maybe corruption? The reality was much simpler. Since our plans met the minimums (they actually exceeded them), they were automatically considered approved. Imagine that – no neighbors to complain! If everything conforms, it should be approved … right? Just plain common sense.

    Zoning-Compliant Projects Should Be Exempt From Public Meetings
    When you think about it, why wouldn’t this work in the USA? If the development plan being submitted meets or exceeds the zoning and subdivision regulation minimums, why does it need to go through any public approvals at all? The American developer often faces months or years of delays, enormous interest payments, and tens or perhaps hundreds of thousands of dollars spent on consultants and legal help to re-create plans that conform. Those massive sums could go towards better neighborhoods, better architecture, better landscaping, fewer environmental impacts, and more affordable housing.

    We’d Still Need Public Meetings
    The public would still have plenty of input on regulation and zoning exemptions, where citizen input is valuable. If a developer is proposing something that goes below minimums or does not conform to zoning regulations, then it is reasonable to go through the more time-consuming process that we currently have. This brings up the question of how the developer would introduce something different from the written law. That can be a problem under typical PUD (Planned Unit Development) regulations, which allow blanket changes to the minimums when alternative designs are not covered by conventional zoning.

    This PUD Pandora’s box, once opened, can have devastating results if the regulators and the neighbors both agree that the plan is simply not good enough. The developer thinks the plan is just dandy as is, but in reality most PUD proposals are simply too vague to be functional. A battle of wills that can last years often ensues.

    In the end, these expensive delays increase lot costs, and the home buyer ultimately pays. If a special ordinance such as PUD, Cluster Conservation, or Coving were specifically spelled out in a rewards-based – instead of a minimums-based – system, developers could earn benefits for great plans complete with open space and connectivity, typically in the form of density and setback relaxations.

    While writing Prefurbia, we began to ask ourselves: how did we take something so simple and let it get so out of control? Third-world countries are progressive enough to actually allow developers who comply with the rules to build their neighborhoods quickly. Maybe they are not so far behind us after all.

    Perhaps our regulations and planning approach are intended to keep the system “busy” with billable hours. Imagine if we could get a conforming plan stamped, and the next day construction could begin. How many billable hours would be eliminated, how much construction cost and land-holding interest saved? That would be very hard to calculate, but it’s likely significant.

    “It is difficult to get a man to understand something when his salary depends upon his not understanding it…” Upton Sinclair, as quoted by Al Gore in An Inconvenient Truth

    The inconvenient truth won’t win us many friends in the consulting industry whose incomes depend upon generating billing time in meetings. But can we afford to continue down the path we are presently on? We need to take a hard look at the regulations. Are they written solely to provide the highest living standards? Or do they generate the highest billable hours for the consultants who propose them?

    Rick Harrison is President of Rick Harrison Site Design Studio and author of Prefurbia: Reinventing The Suburbs From Disdainable To Sustainable. His websites are rhsdplanning and prefurbia.com.

  • Frontrunning and Finance: Left Foot Forward

    This month, the Obama administration moved to regulate the so-called ‘invisible’ financial instruments that have come to rule the world of finance. Variations of the ‘shadow’ banking system — or, in the preferred language of financiers, market ‘risk management tools’ — have increasingly taken the spotlight during the current crises.

    Jim Cramer, on one of those CNBC webcasts which he must have thought would never be seen by anyone who counts, appeared to admit in December to something illegal when he said, “A lot of times when I was short (stocks) at my hedge fund, I would create a level of activity beforehand that would drive the futures.”

    Might he have been referring to self-frontrunning, an egregious flim-flam that takes place on two separate exchanges almost simultaneously so that one regulatory eye can’t see what the other one sees? On one exchange, the hedge fund manager sells the index future, and on another, he executes a series of short sales in the stocks of which the index is composed. The net effect is to drive the future down to profitable levels. Or, in the case of Mr. Cramer, who goosed the futures after having shorted the stocks, to draw investors in to an arbitrage that he himself created.

    It is strange and striking that a practice responsible for the lion’s share of the trading profits of the nation’s hedge funds and investment banks should remain a secret… even an open secret. But every morning on CNBC’s Squawk Box, commentators comfortably predict that the market will open up or down based on the movement of the futures. And nine times out of ten they are right.

    This type of thing can go on ad infinitum: after having closed out the short position, one might readily go long the index future and likewise the composite stocks and make money on the upside as well. While not foolproof – a critical mass of fools could upend such plans in a jittery trading environment – one can achieve a comfortable margin of safety by working with other hedge funds to go long or short the identical stocks and futures in concert. The effect is momentum investing in the truest sense of the term. And lofty expectations are sure to be met because the law of one price will force the futures in line with the cash every time. Add computers and a little leverage, and your hedge fund will not only spectacularly outperform the market averages, but take on far less risk in the bargain.
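    The arithmetic behind the pattern described above can be reduced to a toy sketch. Everything here is hypothetical – the weights, prices, and the assumed 2% price impact are invented purely to show why pre-selling the future and then depressing the component stocks is profitable when the law of one price pulls the future back in line with the cash index:

    ```python
    # Toy model of the self-frontrunning pattern described in the text.
    # All names and numbers are hypothetical illustrations, not real market data.

    def index_level(prices, weights):
        """A stylized index: the weighted sum of its component stock prices."""
        return sum(p * w for p, w in zip(prices, weights))

    weights = [0.5, 0.3, 0.2]
    prices = [100.0, 200.0, 50.0]

    fair_value = index_level(prices, weights)   # 120.0

    # Step 1: on one exchange, sell the index future at today's fair value.
    future_sold_at = fair_value

    # Step 2: on another exchange, short the components, pushing each down 2%.
    depressed = [p * 0.98 for p in prices]
    new_fair_value = index_level(depressed, weights)   # 117.6

    # The law of one price drags the future toward the new cash level,
    # so the future sold earlier can be bought back cheaper.
    profit_per_future = future_sold_at - new_fair_value
    print(round(profit_per_future, 2))   # 2.4
    ```

    The same loop run in reverse (buy the future, then bid up the components) sketches the long side the author mentions; the edge comes entirely from the trader creating the price move the arbitrage then "discovers".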

    Of course, Wall Street firms which execute trades for hedge funds often have an advantage over the funds because they have inside knowledge of the trading plans. And they can and often do trade in advance of these moves to the detriment of the hedge fund customers. Recently, a jury convicted three former stockbrokers at Bank of America, Merrill Lynch and Smith Barney for placing open telephone lines next to the internal speaker systems to eavesdrop on block orders by hedge funds and other institutional clients.

    The hedge funds are run by bright people. They caught onto this scam quickly. And rather than miss out, they joined forces with the Wall Street firms themselves to combine their financial power in concerted transactions, which makes the markets even more volatile. Mighty orchestrations of computer-driven buy and sell orders then exploit the minute-to-minute differentials of the stocks and their derivatives. Those differentials add up to trillions of dollars.

    Such bold moves trigger wild price swings and send skittish investors to the exits. But the solipsistic trading strategy is so wonderfully profitable to the insiders that any thought of calming the waters prompts snickers. Regulators don’t seem to care; they think these moves improve efficiency, seemingly without realizing that the traders create the conditions under which index arbitrage makes sense.

    A variant of this practice played a major role in sinking the banks during the credit crisis that began last year. Hedge funds began by shorting the banks, and then forced them into the toilet by shorting the same mortgage pools that banks carried on their balance sheets.
    Mark-to-market accounting created the impression that the banks were insolvent. This not only ensured that the short positions were profitable, but forced the Financial Accounting Standards Board to rush rule changes.

    Years ago, when commodity firms first adopted it, marking to the market seemed like a good idea, as investors need to know not only the cost basis of an asset, but also what it would fetch in the marketplace. Today it’s clear that the market transactions may have less to do with an asset’s actual valuation in a normal trading environment than with its desired valuation in a manipulated one.
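    The gap between the two accounting views can be made concrete with a minimal sketch (all figures hypothetical): the same bond portfolio carried at historical cost versus marked to a depressed market.

    ```python
    # Toy illustration of mark-to-market accounting on a bond portfolio.
    # Figures are hypothetical; in a distressed or manipulated market the
    # "market" leg can diverge sharply from any normal-environment valuation.

    def book_value_cost(positions):
        """Historical-cost accounting: carry each asset at what was paid."""
        return sum(cost for cost, _ in positions)

    def book_value_marked(positions):
        """Mark-to-market: carry each asset at its current market price."""
        return sum(market for _, market in positions)

    # (cost_basis, current_market_price) for each bond position
    portfolio = [(100.0, 40.0), (100.0, 55.0), (100.0, 70.0)]

    print(book_value_cost(portfolio))    # 300.0 under historical cost
    print(book_value_marked(portfolio))  # 165.0 marked to a depressed market
    ```

    The marked balance sheet shows the same assets at barely half their carrying cost, which is how a forced mark in a manipulated market can make a bank look insolvent regardless of what the bonds would fetch in normal trading.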

    Having ruined the banks, these same swindlers turned on the insurance companies whose short interest skyrocketed in tandem with the crashing of their shares because their annuity products were backed by the same triple-A rated mortgage bonds that reposed on the banks’ balance sheets. Ironically, some firms, such as Lincoln National, ended up buying banks to qualify for bailout money so that they could continue in business.

    The stakes to the economy may seem smaller when insurers — as opposed to banks — appear insolvent, but many alarmed customers were quick to move their business elsewhere at merely the whiff of insolvency. The consequences to both industries were such that for the first time in 16 years the finance and insurance sectors of the economy actually shrank by 16 percent. They now have to raise more equity just to keep their customers.

    The transactions at the source of these woes were the result of what one financial writer termed “regulatory somnambulism,” in that it allowed for the elimination of the up-tick rule — which stipulated that short sales be entered at a price that is higher than the price of the previous trade — and of naked short selling, which can sink a flagship faster than a broadside beneath the water line.

    Naked short selling is a vicious twist on the usual. Normal short selling occurs when investors borrow shares and sell them, hoping the stock will fall and they can buy back the shares at a lower price. Naked short selling artificially increases the supply of a security as one can sell them without first borrowing them and thereby might technically sell more shares than actually exist. This utterly speculative practice has no bearing on the efficiency of the markets, contrary to what its practitioners claim. Its only purpose is to flood the market with sell orders and drive down share prices.

    In doing so, it contributes to an inaccurate picture of financial stringency that plays a major role in the price and allocation of credit and capital, which is central to the proper running of the world economy. It’s a true tail wagging the dog phenomenon that enriches well-placed gamblers at the expense of everyone else.

    Tim Koranda is a former stockbroker who now works as a professional speechwriter. He can be reached at koranda@alum.mit.edu.

  • Can California Make A Comeback?

    These are times that thrill some easterners’ souls. However bad things might be on Wall Street or Beacon Hill, there’s nothing more pleasing to Atlantic America than the whiff of devastation on the other coast.

    And to be sure, you can make a strong case that the California dream is all but dead. The state is effectively bankrupt, its political leadership discredited and the economy, with some exceptions, doing considerably worse than almost anywhere outside Michigan. By next year, suggests forecaster Bill Watkins, unemployment could nudge up towards an almost Depression-like 15%.

    Despite all this, I am not ready to write off the Golden State. For one thing, I’ve seen this movie before. The first time was in the mid 1970s. The end of the Vietnam War devastated the state’s then powerful defense industry, leaving large swaths of unemployment and generating the first talk about the state’s long-term decline.

    An even scarier remake came out in the 1990s. Everything was going wrong, from the collapse of the Soviet Union and the unexpected deflating of Japan to a nearly Pharaonic set of plagues, ranging from earthquakes and fires to the awful Los Angeles riots of 1992.

    Yet each time California came roaring back, having reformed itself and discovered new ways to create wealth. In the wake of the early ’70s decline came the first full flowering of Silicon Valley as well as other tech regions, from the west San Fernando Valley to Orange and San Diego counties. Much of the spark for this explosion of growth came from those formerly employed in the defense and space sectors.

    The ’90s recovery was even more remarkable. Amazingly, the politicians actually were part of the solution. Aware the state’s economy was crashing, the state’s top pols–Assembly Speaker Willie Brown, Sen. John Vasconcellos, Gov. Pete Wilson–made a concerted effort to reform the state’s regulatory regime and otherwise welcomed businesses.

    The private sector responded. High-tech, Hollywood, international trade, fashion, agriculture and a growing immigrant entrepreneurial culture all generated jobs and restored the state’s faded luster.

    These sectors still exist and still excel even under difficult conditions. The problem this time is that the political class seems clueless about how to meet the challenge.

    Politics have not always been a curse to California. In the 1950s and 1960s, the Golden State’s growth stemmed in large part from what historian Kevin Starr describes as “a sense of mission” on the part of leaders in both parties. Starr chronicles this period in his forthcoming book, Golden Dreams: California in an Age of Abundance, 1950-1963.

    Under figures like Earl Warren, Goodwin Knight and Pat Brown, Starr notes, California “assembled the infrastructure for a great commonwealth.” Their legacy–the great University system, the California Water Project, the freeways and state park system–still undergirds what’s left of the state economy.

    Perhaps the best thing about these investments was that they helped the middle class. Sure, nasty growers, missile makers and rapacious developers all made out like bandits–which is why many of them also backed Pat Brown. But the ’50s and ’60s also ushered in a remarkable period of widespread prosperity.

    Millions of working- and middle-class people gained good-paying jobs, and could send their children to what was widely seen as the world’s best public university system. People who grew up in New York tenements or dusty Midwest farm towns now could enjoy a suburban lifestyle complete with single-family homes, cars, swimming pools and drive-through hamburger stands.

    “This was an epic success story for the middle class,” historian Starr notes. It’s one reason why, when people ask me about my politics, I proudly identify myself as a Pat Brown Democrat.

    That’s why California’s current decline is so bothersome. A state that once was home to a huge aspirational middle class has become increasingly bifurcated between a sizable overclass, clustered largely near the coast, and a growing poverty population.

    Over the past 40 years California’s official poverty rate grew from 9% to nearly 13% in 2007, before the recession. Three of its counties–Monterey, San Francisco and Los Angeles–boast large populations of the über rich but, adjusted for cost of living, also suffer some of the highest percentages of impoverished households in the nation.

    Most worrisome has been the decline of the middle – the increasingly diverse ranks of homeowners, small business people and professionals. The middle has been heading out of state for much of the past decade. Politically, they have proven no match for the power of the wealthy trustfunders of the left, the powerful public employee unions, as well as a small but determined right wing.

    The good news is that the middle class shows signs of stirring. The nearly two-to-one rejection of the governor’s budget compromise reflected a groundswell of anger toward both the Terminator and his allies in the legislature.

    Simply put, California voters sense we need something more than an artful quick fix built to please the various Sacramento interest groups. Required now is a more sweeping revolutionary change that takes power away from the state’s most powerful lobby, the public employees, whose one desired reform would be ending the two-thirds rule for approval of new taxes and budgets.

    Middle-class Californians are asking, with justification, why we should be increasing taxes – we’re ranked sixth-highest in the nation – to pay for gold-plated state employee pensions as well as an ever-expanding social welfare program. Although state spending has grown at an adjusted 26% per capita over the past 10 years, it is hard to discern any improvement in roads, schools or much of anything else.

    As an opening gambit, the right’s solution–strict limits on state spending–makes perfect sense. However, long-lasting reform needs to be about more than preserving property and low taxes. To appeal to the state’s increasingly minority population, as well as the younger generation, a reform movement also has to be about economic growth and jobs.

    Not surprisingly, local leaders of the “tea party” movement gained some profile from last week’s vote. Yet the right, which has exhibited strong nativist tendencies, is not likely to win over an increasingly diverse state.

    In my mind, California’s revival depends on three key things. First, the lobbyist-dominated Sacramento cabal needs to be shattered, perhaps turning the legislature into a part-time body, as proposed by one group. Perhaps the cleverest plan has come from Robert Hertzberg, a former Speaker of the Assembly who heads up the reformist California Forward group.

    Hertzberg proposes a radical decentralization of power to the state’s various regions, as well as cities and even boroughs in urban areas like Los Angeles. This would break the power of the Sacramento system by devolving tax and spending authority to local governments.

    Second, California needs to develop a long-term economic growth strategy. Over the past decade, California’s growth has become ever more bubblicious, dependent first on the dot-com bubble and then on one in housing. The basic economy–manufacturing, business services, agriculture, energy–has been either ignored or overly regulated. Not surprisingly, we could see 20% unemployment, or worse, in places like Salinas and Fresno by next year.

    Third, both political reform and an economic strategy aimed at restoring upward mobility depend on a revival of middle-class politics in this state. That revival would include building an alliance between the more reasoned tea partiers and the saner elements of the progressive community.

    The new alliance would not be red or blue, liberal or conservative, but would represent what historian Starr calls “the party of California.” At last there could be a political home for Californians who are angry as hell but still not yet ready to give up on the most intriguing, attractive and potentially productive of all the states.

    This article originally appeared at Forbes.

    Joel Kotkin is executive editor of NewGeography.com and is a presidential fellow in urban futures at Chapman University. He is author of The City: A Global History and is finishing a book on the American future.

  • Sweden’s Taxes – The Hidden Costs of The Welfare State

    By Nima Sanandaji and Robert Gidehag

    Sweden is a nation with extraordinarily high tax rates. The average worker not only pays 30 percent of her or his income in visible taxes, but, additionally, close to 30 percent in hidden taxes. The defenders of this punishing tax burden argue that it is needed to maintain Sweden’s generous welfare system. While that claim may seem reasonable on its surface, a deeper look suggests that it is based on flawed analysis.

    Some level of taxation is, of course, required to fund the public sector. At the same time, a high level of taxation does not necessarily translate into an equally high level of welfare:

    Taxes discourage work and encourage tax avoidance. There is strong evidence that Sweden’s high rates of individual and capital taxation actually reduce public revenue. For this reason, some taxes, such as the wealth tax, have recently been reduced. The result is estimated to be a net increase in tax revenues.

    When Swedish municipalities receive increased funding from the state, the money is used to expand the local bureaucracy, a government survey has shown, instead of going to educators and health care workers.

    Municipalities provide much of the welfare in Sweden. The Swedish Association of Local Authorities and Regions has shown in a study that funding for Swedish municipalities grew dramatically between 1980 and 2005. Despite this, the general public consensus is that the quality of welfare has declined during the same period.

    Welfare provisions don’t necessarily correspond with taxation levels. A 2005 research paper examines the efficiency of the public sector in 23 industrialized countries. The researchers found that Sweden only reaches a mediocre 12th place when it comes to how much the public sector provides in terms of welfare services. When the level of welfare is related to the level of taxation, Sweden falls to the last position in the index.

    There is high variation in how effectively public money is spent within Sweden. The Swedish Taxpayers Association has, in a number of surveys, shown that the cost of identical welfare services, such as care of the elderly, can vary quite dramatically across Sweden.

    There are two important reasons why the average Swedish worker pays a large portion of her or his income in taxes, without necessarily receiving an equally high level of welfare.

    First, much of the money is spent on administrative costs at various levels of government. Although a small nation, Sweden has over a hundred public authorities. Vast sums are spent on political projects which fall outside the frames of general welfare. It is, for instance, not unusual for Swedish municipalities to fund bowling alleys, swimming pools, or camping places.

    Second, a large fraction of the population is living on benefits rather than working, due to the combination of high taxes, a rigid labour market and generous welfare benefits. Even before the economic crisis hit, for example, almost one out of five children in Sweden’s third largest city, Malmö, were living in a family supported by social security. Sweden has 105 local districts where the majority of the population lives off of various public benefits, and does not work. This unintended consequence of the welfare state has taken a heavy toll on public services, since an increasing share of tax revenue must be diverted to fund welfare payments, rather than social services.

    Many are immigrant-dense neighborhoods; others are situated in the northern part of Sweden, where many cities with stagnating economies have suddenly experienced a boom in the fraction of the population unable to work due to disability.

    The famous Swedish welfare state is to a large degree a notion of the past. Many feel that its glory days occurred during the late 1950s and early 1960s, when Sweden successfully combined welfare policies with an expanding economy. At that time, however, Swedish taxes were 27 percent of the GDP, compared to 47 percent today. The golden days of Swedish welfare did not coincide with the high tax regime we know today.

    How could Sweden fund a prospering welfare system with relatively low taxes in the past? As the researcher Erik Moberg documents in a book for the Ratio Institute, public money was spent much differently back then. The share of public revenues spent on health care and education at the end of the 1950s was greater than it is today.

    And, compared to the 1950s, close to three times as much of public revenue is now spent on public bureaucracy. Four times as much is spent on welfare payments and social insurance. As the level of taxation has increased, so has the share of taxes going to public bureaucracy and various government handouts.

    The historical comparison with the 1950s and 1960s is worth thinking about. It shows that a high quality of welfare can be achieved with a much lower tax level than we have today. If politicians slim down public bureaucracy and cut wasteful spending, resources can be opened up for increasing welfare and reducing taxes at the same time. If the system rewards work to a greater degree than it does living off the state, fewer will be dependent on the public for their daily living, again opening up tax revenues for better use.

    Sweden has long been a small homogeneous country with a high degree of economic equality. Strong norms related to work and responsibility made it possible to enact an effective welfare system early on. With time, however, welfare dependence has reduced the very norms that formed the foundation of Swedish welfare, and wasteful spending has increased.

    Many of the social problems that the welfare state aims to address, and that Sweden was long famous for avoiding, such as crime, have worsened in recent decades, concurrent with the expansion of the welfare state. Even income inequality has increased in Sweden compared to, for example, the 1980s, despite similar or higher public expenditure.

    Swedish decision makers are doing their best to reduce public spending and lower taxes. The reforms have been highly successful so far. As taxes have decreased from 57 percent of GDP in 1989 to 47 percent of GDP in 2009, the incentives to work have improved, with Swedish growth rates benefiting. The convergence of lower taxes and lower public spending is likely to continue. After all, experience has made it quite apparent to many Swedes that extraordinarily high taxes are not the key to quality welfare services and a well-functioning society.

    Nima Sanandaji is president of think tank Captus and a fellow at the Swedish Taxpayers Association. Robert Gidehag is president of the Swedish Taxpayers Association.