Tag: Environment

  • Homebuilding Recovery: How CAD Stifles Solutions

    The Recovery Blueprint is a multipart series on homebuilding. Part II addresses how a reliance on CAD software and a lack of collaboration stifle sustainable land development solutions.

    The front cover of Engineering News-Record on March 12, 2012, featured a technology survey conducted a few weeks earlier. Of the 18 issues surveyed, the need for better software was mentioned most frequently. Under the heading “Software Shortfall – Better, Simpler, Cheaper,” the editors noted that “dissatisfaction with current products cuts across all responses” and labeled the area “Needs Improvement.”

    Better Software: Until a few decades ago, the development of the world was represented by hand-drawn plans; Computer Aided Drafting (CAD) did not exist. There was an intimacy between the design of buildings and the land development task at hand. Since the introduction of CAD, the typical American city has seen little change in the way housing is designed. Virtually no advancement in land development design can be attributed to this era of software-enabled design. If anything, it could be argued that CAD technology has resulted in worse design of the cities in which we dwell.

    During a recent lunch, a prominent architect explained to me how easy multifamily design has become: simply create one interior unit and one end unit, then repeat with minor modifications for the first-floor units. There was no mention of how to improve views, of perceived space (versus actual space), or of efficiencies that could make everyday living better for residents. The only point was that CAD made things so much faster and ‘easier’ for the architect.

    Several software companies boast in their literature that hundreds of lots can be generated in a minute. The attitude that technology is a tool for speed rather than quality feeds complacency and dumbs design down to a series of ‘typicals’ or ‘blocks’ that can be instantly duplicated.

    CAD was intended as a drafting tool serving hundreds of purposes within a multibillion-dollar software industry. To serve every industrial use, CAD has become a ‘jack of all trades but master of none’. This is most apparent in land-based design, which requires calculations based upon coordinate geometry; CAD requires a separate data structure to perform them. As an industry core technology, CAD compromises and limits land development design. Land-based calculations for environmental and economic reporting require precision spatial analysis, and CAD fails to deliver it. If CAD were a spatial platform, there would be no need for a separate GIS technology (another industry problem) for analytical data.
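
    To make the distinction concrete, here is a minimal sketch of the kind of coordinate-geometry (COGO) calculation that land design depends on and that raw CAD line work does not carry natively: computing a new survey point from a bearing and a distance. The function name and values are illustrative only, not taken from any CAD or surveying product.

    ```python
    import math

    def traverse(northing, easting, bearing_deg, distance):
        """Compute the next traverse point from a bearing (degrees
        clockwise from north) and a distance, as in surveying COGO."""
        b = math.radians(bearing_deg)
        return (northing + distance * math.cos(b),
                easting + distance * math.sin(b))

    # Example: 100 ft due east (bearing 90 degrees) from the origin
    # lands at northing 0, easting 100.
    n, e = traverse(0.0, 0.0, 90.0, 100.0)
    ```

    A drafting tool stores only the resulting line; a coordinate-geometry engine stores the bearing, distance, and point identity, which is what precision spatial analysis and legal descriptions actually require.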

    CAD Saturation: The hand-drafting tools used just a few decades ago simply do not exist today. In a saturated market, CAD companies must generate fees through updates, support, and training. If these systems were easy (see the complaints above) and quick to learn, support and training income would plummet. Thus intentional complexity assures CAD companies an income stream, at the expense of limiting progress and stifling design advancements.

    Pre-packaged software results in pre-packaged solutions. Imagine, for example, that an engineer schooled in a particular software package is given the task of designing a storm sewer for a 100-acre subdivision. Using add-on software to CAD, designing and producing the required drawings and reports for the multimillion-dollar storm sewer system might take only a day or so. A more natural alternative using surface flow is likely viable, potentially reducing infrastructure expense by tens of thousands, and in some cases millions, of dollars. But there is no ‘button press’ for surface flow. If consulting fees are based upon a percentage of construction costs, the situation becomes worse.

    Many architects intelligently use technology in ways that are not possible through CAD. Some of these more intelligent software solutions have even been acquired by leading CAD companies. GIS (Geographic Information System) technology is generally based upon polygons, that is, shapes formed from a series of straight lines. That makes it largely useless for the precision engineering and surveying of irregular, real-world sites.
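
    A small worked example, with assumed values, shows why straight-line polygons fall short for curved site geometry: the offset between a circular arc and the chords that approximate it can be large enough to matter for construction staking.

    ```python
    import math

    def max_chord_offset(radius, chord_segments, arc_angle_deg):
        """Maximum offset (the 'sagitta') between a circular arc and a
        polyline approximating it with equal chords."""
        half = math.radians(arc_angle_deg / chord_segments) / 2
        return radius * (1 - math.cos(half))

    # A 500 ft radius curve turning through 90 degrees, drawn as only
    # 4 chords, strays from the true arc by roughly 9.6 ft at mid-chord.
    err = max_chord_offset(500.0, 4, 90.0)
    ```

    Surveying and engineering work with true arcs (radius, delta angle, tangent bearings); a polygon-only data model can only shrink this error by piling on vertices, never eliminate it.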

    Technology-Inhibited Collaboration: Architects, engineers, surveyors, and planners — the consultants given responsibility for designing and producing the plans for our world’s growth — have historically been uncollaborative. Technology has done little to change this and foster collaboration.

    Only a few decades ago, it was a given that hand-drawn sketches would need to be calculated for construction. Today, a planner using CAD can ‘sketch’ thousands of inaccurate lines and arcs that look like a finished plan but are useless for engineering and surveying. Data transferred to the CAD system of an engineer or surveyor does not magically become accurate, and therefore usable. As CAD is commonly used, it destroys collaboration instead of building it.

    This isn’t the fault of CAD technology, which can in fact create precise drawings; the blame falls on those who teach its use. One way to build collaboration would be for schools of engineering, architecture, planning, and surveying to work on common projects, teaching each other’s needs in a way that reduces time and workload, allowing more time for better decision making.

    Unsustainable Sustainability: It’s human nature to find comfort at a certain stage of equilibrium. What does this mean? We relent to the flow of everyday life. In the case of land development, methods and technology that go with the flow lead down an unsustainable path.

    Those involved in the development industry, whether working for private or for public entities, know our growth is not sustainable. Instead of seeking better methods, we have reduced planning to either mindlessly automating design, or to creating stricter design models that promise progress by providing a better architectural façade.

    Instead of becoming more efficient and reducing the physical elements required for development, we have added solutions that often increase installation and maintenance costs. An example is permeable paving, a wonderful idea: pavement that allows rainwater to pass into the ground instead of running off the pavement’s edge and flooding the surrounding area. The problem is not the pavement but the fact that the base layer supporting it must also be permeable. Achieving this is often prohibitively expensive, and if it is not done properly, the base traps water that can freeze (in colder climates) and expand, and may not hold up to heavy loads.

    Despite the promise of permeable pavement, design innovations that can reduce the volume of street surface by 30% or more without reducing functionality make more sense. Eliminating an excessive amount of street surface is an efficient solution that costs less to install and maintain than permeable pavement.

    Funding Sources For Innovation: Would it be possible for someone to discover a way to create an affordable base for permeable pavement? Probably. There are hundreds of millions of dollars available from private foundations and government grants for solutions leading to sustainable growth. However, foundation grants fund only 501(c) nonprofits. Should future solutions to development be tied only to nonprofit or politically connected entities, or should they come from private firms, which may be more capable of innovation?

    There is no technology that can create a better design; we can only create better designers. Instead of educating CAD users on how to automate design, we need to create a generation of designers who use technology to create wonderful neighborhoods instead of quick subdivision plans.

    The consultant needs to concentrate on the best solution, not just the solution that is a mere button press away. Today, there is no excuse for creating designs that are not precise. Architects, engineers, planners, and surveyors need to learn to fulfill each other’s basic needs. This would go a long way towards creating a new era of collaborative design.

    Rick Harrison is President of Rick Harrison Site Design Studio and Neighborhood Innovations, LLC. He is author of Prefurbia: Reinventing The Suburbs From Disdainable To Sustainable and creator of Performance Planning System. His websites are rhsdplanning.com and pps-vr.com.

    Flickr Photo: Designing tools by evrenozbilen.

  • The New Class Warfare

    Few states have offered the class warriors of Occupy Wall Street more enthusiastic support than California has. Before they overstayed their welcome and police began dispersing their camps, the Occupiers won official endorsements from city councils and mayors in Los Angeles, San Francisco, Oakland, Richmond, Irvine, Santa Rosa, and Santa Ana. Such is the extent to which modern-day “progressives” control the state’s politics.

    But if those progressives really wanted to find the culprits responsible for the state’s widening class divide, they should have looked in a mirror. Over the past decade, as California consolidated itself as a bastion of modern progressivism, the state’s class chasm has widened considerably. To close the gap, California needs to embrace pro-growth policies, especially in the critical energy and industrial sectors—but it’s exactly those policies that the progressives most strongly oppose.

    Even before the economic downturn, California was moving toward greater class inequality, but the Great Recession exacerbated the trend. From 2007 to 2010, according to a recent study by the liberal-leaning Public Policy Institute of California, income among families in the 10th percentile of earners plunged 21 percent. Nationwide, the figure was 14 percent. In the much wealthier 90th percentile of California earners, income fell far less sharply: 5 percent, only slightly more than the national 4 percent drop. Further, by 2010, the families in the 90th percentile had incomes 12 times higher than the incomes of families in the 10th—the highest ratio ever recorded in the state, and significantly higher than the national ratio.

    It’s also worth noting that in 2010, the California 10th-percentile families were earning less than their counterparts in the rest of the United States—$15,000 versus $16,300—even though California’s cost of living was substantially higher. A more familiar statistic signaling California’s problems is its unemployment rate, which is now the nation’s second-highest, right after Nevada’s. Of the eight American metropolitan areas where the joblessness rate exceeds 15 percent, seven are in California, and most of them have substantial minority and working-class populations.

    When California’s housing bubble popped, real-estate prices fell far more steeply than in less regulated markets, such as Texas. The drop hurt the working class in two ways: it took away a major part of their assets; and it destroyed the construction jobs important to many working-class, particularly Latino, families. The reliably left-leaning Center for the Continuing Study of the California Economy found that between 2005 and 2009, the state lost fully one-third of its construction jobs, compared with a 24 percent drop nationwide. California has also suffered disproportionate losses in its most productive blue-collar industries. Over the past ten years, more than 125,000 industrial jobs have evaporated, even as industrial growth has helped spark a recovery in many other states. The San Francisco metropolitan area lost 40 percent of its industrial positions during this period, the worst record of any large metro area in the country. In 2011, while the country was gaining 227,000 industrial jobs, California’s manufacturers were still stuck in reverse, losing 4,000.

    Yet while the working and middle classes struggle, California’s most elite entrepreneurs and venture capitalists are thriving as never before. “We live in a bubble, and I don’t mean a tech bubble or a valuation bubble. I mean a bubble as in our own little world,” Google CEO Eric Schmidt recently told the San Francisco Chronicle. “And what a world it is. Companies can’t hire people fast enough. Young people can work hard and make a fortune. Homes hold their value.” Meanwhile, in nearby Oakland, the metropolitan region ranks dead last in job growth among the nation’s largest metro areas, according to a recent Forbes survey, and one in three children lives in poverty.

    One reason for California’s widening class divide is that, for a decade or longer, the state’s progressives have fostered a tax environment that slows job creation, particularly for the middle and working classes. In 1994, California placed 35th in the Tax Foundation’s ranking of states with the lightest tax burdens on business; today, it has plummeted to 48th. Only New York and New Jersey have more onerous business-tax burdens. Local taxes and fees have made five California cities—San Francisco, Los Angeles, Beverly Hills, Santa Monica, and Culver City—among the nation’s 20 most expensive business environments, according to the Kosmont–Rose Institute Cost of Doing Business Survey.

    Still more troubling to California employers is the state’s regulatory environment. California labor laws, a recent U.S. Chamber of Commerce study revealed, are among the most complex in the nation. The state has strict rules against noncompetition agreements, as well as an overtime regime that reduces flexibility: unlike other states, where overtime kicks in after 40 hours in a given week, California requires businesses to pay overtime to employees who have clocked more than eight hours a day. Rules for record-keeping and rest breaks are likewise more stringent than in other states. The labor code contains tough provisions on everything from discrimination to employee screening, the Chamber of Commerce study notes, and has created “a cottage industry of class actions” in the state. California’s legal climate is the fifth-worst in the nation, according to the Institute for Legal Reform; firms face far higher risks of nuisance and other lawsuits from employees than in most other places. In addition to these measures, California has imposed some of the most draconian environmental laws in the country, as we will see in a moment.

    The impact of these regulations is not lost on business executives, including those considering new investments or expansions in California. A survey of 500 top CEOs by Chief Executive found that California had the worst business climate in the country, and the U.S. Chamber of Commerce calls California “a difficult environment for job creation.” Small wonder, then, that since 2001, California has accounted for just 1.9 percent of the country’s new investment in industrial facilities; in better times, between 1977 and 2000, it had grabbed 5.6 percent.

    Officials, including Governor Jerry Brown, argue that California’s economy is so huge that it can afford to lose companies to other states. But for the local economy to be hurt, firms don’t have to leave entirely. Business consultant Joe Vranich, who maintains a website that tracks businesses that leave the state, points out that when California companies decide to expand, often they do so in other parts of the U.S. and abroad, not in their home environment. Further, Brown is too cavalier about the effects of businesses’ departure. As Vranich notes, many businesses leave California “quietly in the night,” generating few headlines but real job losses. He cites the low-key departure in 2010 of Thomas Brothers Maps, a century-old California firm, which transferred dozens of employees from its Irvine headquarters to Skokie, Illinois, and outsourced the rest of its jobs to Bangalore.

    The list of companies leaving the state or shifting jobs elsewhere is extensive. It includes low-tech companies, such as Dunn Edwards Paints and fast-food operator CKE Restaurants, and high-tech ones, such as Acacia Research, Biocentric Energy Holdings, and eBay, which plans to create 1,000 new positions in Austin, Texas. Computer-security giant McAfee estimates that it saves 30 to 40 percent every time it hires outside California. Only 14 percent of the firm’s 6,500 employees remain in Silicon Valley, says CEO David DeWalt. The state’s small businesses, which account for the majority of employment, are harder to track, but a recent survey found that one in five didn’t expect to remain in business in California within the next three years.

    Apologists for the current regime also claim that the state’s venture capitalists will fund and create new companies that will boost employment. It’s certainly true that in the past, California firms funded by venture capital tended to expand largely in California. But as Jack Stewart, president of the California Manufacturing and Technology Association, points out, a different dynamic is at work today: once a company’s start-up phase is over, it tends to move its middle-class jobs elsewhere, as the state’s shrinking fraction of the nation’s industrial investment indicates. “Sure, we are getting half of all the venture capital investment, but in the end, we have relatively small research and development firms only,” Stewart argues. “Once they have a product or go to scale, the firms move [employment] elsewhere. The other states end up getting most of the middle-class jobs.”

    Radical environmentalism has been particularly responsible for driving wedges between California’s classes. Until fairly recently, as historian Kevin Starr says, California’s brand of progressivism involved spurring economic growth—particularly by building infrastructure—and encouraging broad social advancement. “What the progressives created,” Starr says, “was California as a middle-class utopia. The idea was if you wanted to be a nuclear physicist, a carpenter, or a cosmetologist, we would create the conditions to get you there.” By contrast, he says, today’s progressives regard with suspicion any growth that requires the use of land and natural resources. Where old-fashioned progressives embraced both conservation and the expansion of public parks, the new green movement advocates a reduced human “footprint” and opposes cars, “sprawl,” and even human reproduction.

    The Bay Area has served as the incubator for the new green progressivism. The militant Friends of the Earth was founded in 1969 in San Francisco. Malthusian Paul Ehrlich, author of the sensationalist 1968 jeremiad The Population Bomb and mentor of President Obama’s current science advisor, John Holdren, built his career at Stanford. Today, more than 130 environmental activist groups make their headquarters in San Francisco, Berkeley, Oakland, and surrounding cities.

    The environmentalist agenda emerged in full flower under nominally Republican governor Arnold Schwarzenegger, who initially cast himself as a Milton Friedman–loving neo-Reaganite. On his watch, California’s legislature in 2006 passed Assembly Bill 32, which, in order to cut greenhouse-gas emissions, imposes heavy fees on using carbon-based energy and severely restricts planning and development. One analysis of small-business impacts prepared by Sacramento State University economists indicates that AB 32 could strip about $181 billion per year, or nearly 10 percent, from the state’s economy. At the same time, land-use regulations connected to the climate-change legislation hinder expansion for firms.

    Another business-hobbling mandate is the law requiring that 30 percent of California’s electricity be generated by “renewable” sources by 2020. The state’s electricity costs are already 50 percent above the national average and the fifth-highest in the nation—yet state policies make the construction of new oil- or gas-fired power plants all but impossible and offer massive subsidies for expensive, often unreliable, “renewable” energy. The renewable-fuel laws will simply boost electricity costs further. The cost of electricity from the new NRG solar-energy facility in central California, for instance, will be 50 percent higher than the cost of power from a newly built gas-powered facility, according to state officials. For providing this expensive service, NRG will pay no property taxes on its facilities. By some estimates, green mandates could force electricity prices to rise 5 to 7 percent annually through 2020.

    The renewable-fuel regulations are driving even green jobs out of the state. Cereplast, a thriving El Segundo–based manufacturer of compostable plastic, last year moved its manufacturing operations to Indiana, where electricity costs are 70 percent lower. Fuel-cell firm Bing Energy cited cost and regulatory factors when announcing its move from California to Florida. “I just can’t imagine any corporation in their right mind would decide to set up in California right now,” the firm’s CFO, Dean Minardi, told the Inland Valley Daily Bulletin. Still more rules, aimed at improving water quality and protecting endangered species, could have a devastating effect on the construction and expansion of port facilities, which tend to sustain high-wage blue- and white-collar jobs.

    The political class largely ignores the economic consequences of these policies. Indeed, Governor Brown and others insist that they will create jobs—upward of 500,000 of them—while establishing California as a green-energy leader. To turn Brown’s green dreams into reality, the state has approved enormous subsidies and tax breaks for solar and other renewable-energy producers to supplement those dispensed by the Obama administration. Yet for all this, California has barely 300,000 “green jobs,” many of which are low-wage positions, such as weather-stripping installers. And the solar industry, in California and abroad, is imploding.

    Bill Watkins, head of the economic forecasting unit at California Lutheran University, notes that California’s green policies affect the very industries—manufacturing, home construction, warehousing, and agribusiness—that have traditionally employed middle- and working-class residents. “The middle-class economy is suffering since there is no real opposition to the environmental community,” says Watkins. “You see the Democrats, who should worry about blue-collar and middle-income jobs, give in every time.”

    Progressives and many Occupy protesters mourned the death of high-tech innovator and multibillionaire Steve Jobs. They also tend to view social-networking firms like Facebook more as allies than as class enemies. This embrace of Silicon Valley is nearly as strange as the Occupy movement’s decision to target the ports of Los Angeles and Oakland—large employers of well-paid blue-collar workers. Activists portrayed the attempted port shutdowns as attempts to “disrupt the profits of the 1 percent,” but union workers largely saw them as impositions on their livelihood. As former San Francisco mayor and state assembly speaker Willie Brown wrote in the San Francisco Chronicle: “If the Occupy people really want to make a point about the 1 percent, then lay off Oakland and go for the real money down in Silicon Valley. The folks who work on the docks in Oakland or drive the trucks in and out of the port are all part of the 99 percent.”

    The explanation for the progressives’ hypocritical friendliness to Silicon Valley is simple: money and politics. Venture capitalists and highly profitable, oligopolistic firms like Google (with its fleet of eight private jets) invest heavily in green companies; they were also among the primary bankrollers of the successful opposition to a 2010 ballot initiative aimed at reversing AB 32. The digital elite has become more and more involved in local politics, with executives from Facebook, Twitter, and gaming website Zynga contributing heavily to the recent campaign of San Francisco mayor Ed Lee, for example. Lee has, in turn, been extremely kind to the digerati, extending a payroll-tax break to Twitter and a stock-option break to Zynga and other firms that may soon go public.

    Hollywood manages to outdo even Silicon Valley in its class hypocrisy. Former actor Schwarzenegger doesn’t let his green zealotry stop him from owning oversize houses and driving fuel-gorging cars. Canadian-born director James Cameron, who contents himself with a six-bedroom, $3.5 million, 8,300-square-foot Malibu mansion, talks about the need to “stop industrial growth” and applauds the idea of a permanent recession. “It’s so heretical to everybody trying to recover from a recession economy—‘we have to stimulate growth!’ ” says Cameron. “Well, yeah. Except that’s what’s gonna kill this planet.”

    According to the Tax Foundation, California residents already pay the nation’s sixth-highest state tax rates, and they are likely to keep rising. Three tax-raising measures have already been proposed for the November 2012 ballot. Governor Brown’s proposal, which would boost both income and sales taxes, stands a good chance of passage. Hedge-fund manager Tom Steyer, an investor in environmental firms, has floated a measure that would raise taxes on out-of-state companies that conduct any operations in California and use some of the revenue to subsidize green-friendly building projects. And Molly Munger, a civil rights attorney and daughter of Warren Buffett’s longtime business partner, is pressing a measure to raise income taxes to fund schools. The so-called Think Long proposal, financed by nomadic French billionaire Nicolas Berggruen and overseen by a committee including Google’s Schmidt and billionaire philanthropist Eli Broad, proposes a mild cut in income-tax rates for the highest earners (like themselves) but new taxes on services provided by architects, accountants, business consultants, plumbers, gardeners, and others—the sole proprietors and microbusinesses that represent the one growing element in the state’s beleaguered private-sector middle class.

    More money for social services or education might help alleviate some of the recession’s impact, but it cannot break the vicious cycle from which California currently suffers: weak growth leading to low tax revenues, government boosting taxes to make up the shortfall, and those higher taxes driving businesses and jobs away, resulting in continued weak growth. What California’s middle and working classes need above all is broad, private-sector job growth—and that, fortunately, is a goal still well within reach. The Golden State may be run stupidly, but it retains enormous assets: its position on the Pacific Rim, large numbers of aspiring immigrants, unparalleled creative industries, fertile land, and a treasure trove of natural resources.

    The most promising opportunity is in the contentious area of fossil-fuel energy, a mainstay of the state’s economy since the turn of the twentieth century. California still ranks as the nation’s fourth-largest oil-producing state. Traditional energy has long provided good jobs; nationally, the industry pays an average annual salary of $100,000. And elsewhere, from the Great Plains to eastern Ohio, an oil and gas boom is driving growth.

    But California has thus far excluded itself from the party. Even as production surges in other parts of the country, California companies like Occidental Petroleum report diminishing oil production. The drop-off proves, some environmentalists say, that “peak oil” has been reached, but the evidence shows otherwise: the last few years have seen a fourfold increase in applications for drilling permits in California, largely because of the discovery of the massive Monterey shale deposits—containing a potential 15 billion barrels of oil—and of an estimated 10 billion barrels near Bakersfield. The real reason for the reduced production is that California has rejected most of the drilling applications since 2008. “I asked Jerry Brown about why California cannot come to grips with its huge hydrocarbon reserves,” recalls John Hofmeister, former president of Shell Oil’s U.S. operations. “After all, this could turn around the state. He answered that this is not logic, it’s California. This is simply not going to happen here.”

    The anti-fossil-fuel stance, according to the Los Angeles County Economic Development Corporation, has placed some $1 billion in investment and 6,000 jobs on hold. The sense of wasted opportunity can be palpable. If you travel to Santa Maria, a hardscrabble town near the Monterey formation, you pass empty industrial parks and small, decaying shopping centers. As economist Watkins put it at a recent conference there: “If you guys were in Texas, you’d all be rich.”

    California doesn’t even need to abandon its progressive tradition to narrow the class divide. Homebuilding, manufacturing, and warehousing could expand if regulatory burdens other than those associated with fighting climate change were merely modified—not repealed, but relaxed sufficiently to make it possible to do business, put people to work, and make a profit. New energy production could take place under strict regulatory oversight. Future industrial and middle-class suburban development could be tied to practical energy-conservation measures, such as promoting home-based businesses and better building standards. California’s agriculture industry—currently thriving, thanks to exports—could be less burdened by the constant threat of water cutbacks and new groundwater regulations.

    Even from an environmental perspective, increased industrial growth in California might be a good thing. The state’s benign climate allows it to consume fossil-fuel energy far more efficiently than most states do, to say nothing of developing countries such as China. Keeping industry and middle-class jobs here may constitute a more intelligent ecological position than the prevailing green absolutism.

    More important still is that a pro-growth strategy could help reverse California’s current feudalization. The same Public Policy Institute of California study shows that during the last broad-based economic boom, between 1993 and 2001, the 10th percentile of earners enjoyed stronger income growth than earners in the higher percentiles did. The lesson, which progressives once understood, is that upward mobility is best served by a growing economy. If they fail to remember that all-important fact, the greens and their progressive allies may soon have to place the California dream on their list of endangered species.

    This piece originally appeared in City Journal.

    Joel Kotkin is executive editor of NewGeography.com, a distinguished presidential fellow in urban futures at Chapman University, and a contributing editor of City Journal in New York. He is author of The City: A Global History. His newest book is The Next Hundred Million: America in 2050, released in February 2010.

    Los Angeles aqueduct photo by BigStockPhoto.com.

  • Why Emissions Are Declining in the U.S. But Not in Europe

    It wasn’t that long ago that the U.S. was cast as the global climate villain, refusing to sign the Kyoto accord while Europe implemented cap and trade. 

    But, as we note below in a new article for Yale360, a funny thing happened: U.S. emissions started going down in 2005 and are expected to decline further over the next decade, while Europe’s cap and trade system has had no measurable impact on emissions. Even the supposedly green Germany is moving back to coal.

    Why? The reason is obvious: the U.S. is benefiting from a 30-year, government-funded technological revolution that massively increased the supply of unconventional natural gas, making it cheap even compared to coal.

    The contrast between what is happening in Europe and what is happening in the U.S. challenges anyone who still thinks pricing carbon and emissions trading are more important to emissions reductions than direct and sustained public investment in technology innovation. 

    — Ted and Michael

    Yale 360

    Beyond Cap and Trade: A New Path to Clean Energy

    Putting a price and a binding cap on carbon is not the panacea that many thought it to be. The real road to cutting U.S. emissions, two iconoclastic environmentalists argue, is for the government to help fund the development of cleaner alternatives that are better and cheaper than natural gas.

    by Ted Nordhaus and Michael Shellenberger

    A funny thing happened while environmentalists were trying and failing to cap carbon emissions in the U.S. Congress. U.S. carbon emissions started going down. The decline began in 2005 and accelerated after the financial crisis. The latest estimates from the U.S. Energy Information Administration now suggest that U.S. emissions will continue to decline for the next few years and remain flat for a decade or more after that.

    The proximate cause of the decline in recent years has been the recession and slow economic recovery. But the reason that EIA is projecting a long-term decline over the next decade or more is the glut of cheap natural gas, mostly from unconventional sources like shale, that has profoundly changed America’s energy outlook over the next several decades.

    Gas is no panacea. It still puts a lot of carbon into the atmosphere and has created a range of new pollution problems at the local level. Methane leakage resulting from the extraction and burning of natural gas threatens to undo much of the carbon benefit that gas holds over coal. And even were we to make a full transition from coal to gas, we would then need to transition from gas to renewables and nuclear in order to reduce U.S. emissions deeply enough to achieve the reductions that climate scientists believe will be necessary to avoid dangerous global warming.

    But the shale gas revolution, and its rather significant impact on the U.S. carbon emissions outlook, offers a stark rebuke to what has been the dominant view among policy analysts and environmental advocates as to what it would take in order to begin to bend down the trajectory of U.S. emissions, namely a price on carbon and a binding cap on emissions. The existence of a better and cheaper substitute is today succeeding in reducing U.S. emissions where efforts to raise the cost of fossil fuels through carbon caps or pricing — and thereby drive the transition to renewable energy technologies — have failed.

    In fact, the rapid displacement of coal with gas has required little in the way of regulations at all. Conventional air pollution regulations do represent a very low, implicit price on carbon. And a lot of good grassroots activism at the local and regional level has raised the political costs of keeping old coal plants in service and bringing new ones online.

    But those efforts have become increasingly effective as gas has gotten cheaper. The existence of a better and cheaper substitute has made the transition away from coal much more viable economically, and it has put the wind at the back of political efforts to oppose new coal plants, close existing ones, and put in place stronger EPA air pollution regulations.

    Yet if cheap gas is harnessing market forces to shutter old coal plants, the existence of cheap gas from unconventional places is by no means the product of those same forces, nor of laissez faire energy policies. Our current glut of gas and declining emissions are in no small part the result of 30 years of federal support for research, demonstration, and commercialization of non-conventional gas technologies without which there would be no shale gas revolution today.

    Starting in the mid-seventies, the Ford and Carter administrations funded large-scale demonstration projects that proved that shale was a potentially massive source of gas. In the years that followed, the U.S. Department of Energy continued to fund research and demonstration of new fracking technologies and developed new three-dimensional mapping and horizontal drilling technologies that ultimately allowed firms to recover gas from shale at commercially viable cost and scale. And the federal non-conventional gas tax credit subsidized private firms to continue to experiment with new gas technologies at a time when few people even within the natural gas industry thought that firms would ever succeed in economically recovering gas from shale.

    The gas revolution now unfolding — and its potential impact on the future trajectory of U.S. emissions — suggests that the long-standing emphasis on emissions reduction targets and timetables and on pricing has been misplaced. Even now, carbon pricing remains the sine qua non of climate policy among the academic and think-tank crowds, while much of the national environmental movement seems to view the current period as an interregnum between the failed effort to cap carbon emissions in the last Congress and the next opportunity to take up the cap-and-trade effort in some future Congress.

    And yet, the European Emissions Trading Scheme (ETS), which has been in place for almost a decade now and has established carbon prices well above those that would have been established by the proposed U.S. system, has had no discernible impact on European emissions. The carbon intensity of the European economy has not declined at all since the imposition of the ETS. Meanwhile green paragon Germany has embarked upon a coal-building binge under the auspices of the ETS, one that has accelerated since the Germans shut down their nuclear power plants.

    Even so, proponents of U.S. emissions limits maintain that legally binding carbon caps will provide certainty that emissions will go down in the future, whereas technology development and deployment — along with efforts to regulate conventional air pollutants — do not. Certainly, energy and emissions projections have proven notoriously unreliable in the past — it is entirely possible that future emissions could be well above, or well below, the EIA’s current projections. But the cap-and-trade proposal that failed in the last Congress, like the one that has been in place in Europe, would have provided no such certainty. It was so riddled with loopholes, offset provisions, and various other cost-containment mechanisms that emissions would have been able to rise at business-as-usual levels for decades.

    Arguably, the actual outcome might have been much worse. The price of the environmental movement’s demand for its “legally binding” pound of flesh was a massive handout of free emissions allocations to the coal industry, which might have slowed the transition to gas that is currently underway.

    Continuing to drive down U.S. emissions will ultimately require that we develop low- or no-carbon alternatives that are better and cheaper than gas. That won’t happen overnight. The development of cost-effective technologies to recover gas from shale took more than 30 years. But we’ve already made a huge down payment on the technologies we will need.

    Over the last decade, we have spent upwards of $200 billion to develop and commercialize new renewable energy technologies. China has spent even more. And those investments are beginning to pay off. Wind is now almost as cheap as gas in some areas — in prime locations with good proximity to existing transmission. Solar is also close to achieving grid parity in prime locations. And a new generation of nuclear designs that promises to be safer, cheaper, and easier to scale may ultimately provide zero-carbon baseload power.

    All of these technologies have a long way to go before they are able to displace coal or gas at significant scale. But the key to getting there won’t be more talk of caps and carbon prices. It will be to continue along the same path that brought us cheap unconventional gas — developing and deploying the technologies and infrastructure we need from the bottom up.

    When all is said and done, a cap, or a carbon price, may get us the last few yards across the finish line. But a more oblique path, focused on developing better technologies and strengthening conventional air pollution regulations, may work just as well, or even better.

    For one thing should now be clear: The key to decarbonizing our economy will be developing cheap alternatives that can cost-effectively replace fossil fuels. There simply is no substitute for making clean energy cheap.

    © 2010 Yale Environment 360

  • California Declares War on Suburbia II: The Cost of Radical Densification

    My April 9 Cross Country column in The Wall Street Journal (California Declares War on Suburbia) outlined California’s determination to virtually outlaw new detached housing. The goal is clear: force most new residents into multi-family buildings at densities of 20 to 30 or more units per acre. California’s overly harsh land use regulations had already driven housing costs from fairly typical levels to twice or even three times those of much of the nation. California’s more recent tightening of land use restrictions (under Assembly Bill 32 and Senate Bill 375) has been justified as necessary for reducing greenhouse gas (GHG) emissions.

    It is All Unnecessary: The reality, however, is that all of this is unnecessary and that sufficient GHG emission reductions can be achieved without interfering with how people live their lives. As a report by McKinsey & Company and The Conference Board put it, there would need to be "no downsizing of vehicles, homes or commercial space," while "traveling the same mileage." Nor, as McKinsey and the Conference Board found, would there be a need for a "shift to denser urban housing." All of this has been lost in California’s crusade against the lifestyle most California households prefer.

    Pro and Con: As is to be expected, there are opinions on both sides of the issue. PJTV used California Declares War on Suburbia as the basis for a satirical video, Another Pleasant Valley Sunday, Without Cars or Houses? Is California Banning Suburbia?

    California’s Increasing Demand for Detached Housing? A letter to the editor in The Wall Street Journal suggested that there are more than enough single-family homes to accommodate future detached housing demand in California for the next 25 years. That’s irrelevant, because California has no intention of allowing any such demand to be met.

    The data indicates continuing robust demand. In California’s major metropolitan areas, detached houses accounted for 80 percent of the additions to the occupied housing stock between 2000 and 2010, which slightly exceeds the national trend favoring detached housing (Figure 1). If anything, the shift in demand was the opposite of that predicted by planners, since only 54 percent of growth in occupied housing in the same metropolitan areas was detached in 2000 (Figure 2).


    Watch What They Do, Not What They Say: It does no good to point to stated preference surveys indicating that people prefer higher density living. Recently, Ed Braddy noted in newgeography.com (Smart Growth and the New Newspeak) that a widely cited National Association of Realtors survey had been "spun" to show that people preferred higher density living, based on a question about an "unrealistic scenario," while ignoring an overwhelming preference for detached housing – roughly eighty percent – in other questions in the same survey. People’s preferences are revealed not by what they say they will do, but by what they actually do.

    Off-Point Criticism: There was also "off-point" criticism, which can be more abundant than criticism that is "on-point." Perhaps the most curious was by Brookings Institution Metropolitan Policy Program Senior Researcher Jonathan Rothwell (writing in The New Republic) in a piece entitled "Low-Density Suburbs Are Not Free-Market Capitalism." I was rather taken aback by this, since none of these three words ("free," "market" or "capitalism") appeared in California Declares War on Suburbia. I was even more surprised at the claim that I defend "anti-density zoning and other forms of large lot protectionism." Not so.

    Indeed, I agree with Rothwell on the problems with large lot zoning. However, it is a stretch to suggest, as he does, that the prevalence of detached housing results from large lot zoning. This is particularly true in places like Southern California, where lots have historically been small and overall density is far higher than that of greater New York, Boston, or Seattle, and double that of the planning mecca of Portland.

    Rothwell’s own Brookings Institution has compiled perhaps the best inventory of metropolitan land use restrictions, which indicates that the major metropolitan areas of the West have little in the way of large lot zoning. Yet detached housing is about as prevalent in the West as in the rest of the nation (60.4 percent in the West compared to 61.9 percent in the rest of the nation, according to the 2010 American Community Survey). Further, there has been little or no large lot zoning in Canada and Australia, where detached housing is dominant, nor in Western Europe and Japan (yes, Japan; see the Note below).

    On-Point: Urban Growth Boundaries Do Increase House Prices: However, to his credit, Rothwell points out the connection between urban growth boundaries and higher house prices. This is a view not shared by most in the urban planning community, who remain in denial of the economic evidence (or more accurately, the economic principle) that constraining supply leads to higher prices. This can lead to disastrous consequences, as California’s devastating role in triggering the Great Recession indicates.

    The Purpose of Urban Areas: From 1900 to 2010, the urban population increased from 40 percent to 80 percent of the US population. Approximately 95 percent of the population growth over that period was in urban areas. People did not move to the cities for "togetherness" or to become better citizens. Nor did they move out of an insatiable desire for better urban design or planning. The driving force was economic: the desire for higher incomes and better lives. Alain Bertaud, a former World Bank principal urban planner, stated the economic justification directly: "large labor markets are the only raison d’être of large cities."

    And for the vast majority of Americans in metropolitan areas, including those in California, those better lives mean living in suburbs and detached houses. All the myth-making in the world won’t change that reality, even if it pushes people out of the Golden State to other, more accommodating pastures.

    The performance of urban areas is appropriately evaluated by results, such as economic outcomes, without regard to inputs, such as the extent to which an area conforms to the latest conventional wisdom in urban planning.

    • Land use policies should not lead to higher housing costs relative to incomes, as they already have in California, Australia, Vancouver, Toronto and elsewhere. If they do, residents are less well served.
    • Transport policies should not be allowed to intensify traffic congestion by disproportionately funding alternatives (such as transit and bicycles) that have little or no potential to improve mobility, as seems the likely outcome of radical densification. If they do, residents will be less well served.

    This gets to the very heart of the debate. The “smart growth on steroids” policies now being implemented in California are likely to lead to urban areas with less efficient personal and job mobility, where economic and employment growth is likely to be less than would otherwise be expected. The issue is not urban sprawl. The issue is rather sustaining the middle-income quality of life, which is now endangered by public policy in California, and for no good reason.

    Wendell Cox is a Visiting Professor, Conservatoire National des Arts et Metiers, Paris and the author of “War on the Dream: How Anti-Sprawl Policy Threatens the Quality of Life.”

    —-

    Note: Despite its reputation for high density living, Japan’s suburbs have many millions of detached houses. In 2010, 47 percent of the occupied housing in Japan’s major metropolitan areas was detached (Tokyo, Osaka-Kobe-Kyoto, Nagoya, Sapporo, Sendai, Hiroshima, Kitakyushu-Fukuoka, Shizuoka and Hamamatsu).

    Photo: An endangered species: Detached houses in Ventura County (Photo by author)

  • The Return of the Monkish Virtues

    “[The author of Leviticus] posits the existence of one supreme God who contends neither with a higher realm nor with competing peers. The world of demons is abolished; there is no struggle with autonomous foes, because there are none. With the demise of the demons, only one creature remains with ‘demonic’ power – the human being. Endowed with free will, human power is greater than any attributed to humans by pagan society. Not only can one defy God but, in Priestly language, one can drive God out of his sanctuary. In this respect, humans have replaced demons… [The author of Leviticus] also posits that the pollution of the sanctuary leads to YHWH’s abandonment of Israel and its ejection from the land… Israel pollutes the land; the land becomes infertile; Israel is forced to leave.” – Jacob Milgrom, Leviticus


    “Pollution ideas are the product of an ongoing political debate about the ideal society. All mysterious pollutions are dangerous, but to focus on the physical danger and to deride the reasoning that attaches it to particular transgressions is to miss the lesson for ourselves… Pollution beliefs trace causal chains from actions to disasters… Pollution beliefs uphold conceptual categories dividing the moral from the immoral and so sustain the vision of the good society.” – Mary Douglas and Aaron Wildavsky, Risk and Culture


    “Celibacy, fasting, penance, mortification, self-denial, humility, silence, solitude, and the whole train of monkish virtues; for what reason are they everywhere rejected by men of sense, but because they serve to no manner of purpose; neither advance a man’s fortune in the world, nor render him a more valuable member of society; neither qualify him for the entertainment of company, nor increase his power of self-enjoyment? We observe, on the contrary, that they cross all these desirable ends; stupify the understanding and harden the heart, obscure the fancy and sour the temper. We justly, therefore, transfer them to the opposite column, and place them in the catalogue of vices.” – David Hume, An Enquiry Concerning the Principles of Morals

    The era of the 100 watt incandescent light bulb came to an end in America on January 1st. Lower wattages will follow in a phaseout over time. As I noted previously, this will mean factory shutdowns in the United States and the migration of the light bulb manufacturing industry to China. The most common replacement bulbs, compact fluorescents (CFLs), are not “instant on,” generally fail to provide a proper light spectrum, contain poisonous mercury, and burn out sooner than advertised. CFL boosters claim none of these are real problems and that CFLs are a slam dunk for benefit/cost reasons, but the cold reality is that despite significant promotion, they never achieved widespread voluntary consumer adoption. Given how eagerly consumers slurp up even genuinely more expensive products like Apple computers when they are perceived to be superior, I’m inclined to think the consumers are on to something. I’ve tried out CFLs myself and thought they basically sucked.

    The supposed rationale for imposing an inferior product that did not receive the desired traction in the the marketplace is to prevent climate change. I went searching to try to find exactly what the impact of light bulbs on greenhouse gas emissions was and have found it quite difficult to obtain. The various sites touting CFLs all note the high output of CO2 from electricity generation generally, how much CO2 changing this or that bulb will save, etc, but as for what a wholesale elimination of light bulbs would achieve, that’s harder to find.

    According to the EPA, residential electricity accounted for 784.6 million metric tons of CO2 in 2009, or 11.8% of total US human greenhouse gas emissions. How much of that is from light bulbs? It’s not broken out in the EPA’s report (even the detailed version), but I’ll attempt an estimate of aggregate CO2 savings. (If someone has a direct link to this information, please let me know).

    The Guardian reported that an Australian incandescent ban would save that country 800K tons of CO2 emitted per year and a UK ban would save 2-3 million tons. It also reported that China could save 48 million tons per year by banning incandescents.

    The US is bigger than Australia and the UK, but similarly advanced developmentally. China is a bigger emitter than the US, has far more people, is less advanced developmentally, and is a bigger user of coal for electricity generation. However, all three countries project similar per capita emissions reductions from incandescent elimination. If the US were at the upper end of that per capita range, its savings would come to around 15 million tons of CO2 a year. That’s only 0.2% of total US greenhouse gas emissions. Even if the US saved the same 48 million tons as China, it’s only 0.7%. I’d be skeptical of anyone claiming the US would save a lot more CO2 per capita than these countries. Some, maybe; a lot, no.
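    The arithmetic behind those percentages can be checked in a few lines. This is a minimal back-of-envelope sketch in Python using only the EPA and Guardian figures quoted above; the inputs are the article’s numbers, not independently verified data.

    ```python
    # Back-of-envelope check of the percentages quoted above.
    # Inputs are the figures cited in the article (EPA 2009, Guardian),
    # not independently verified data.

    residential_electricity_mt = 784.6  # million metric tons CO2 from
                                        # residential electricity, 2009 (EPA)
    residential_share = 0.118           # its share of total US GHG emissions

    # Implied total US greenhouse gas emissions (million metric tons):
    total_us_mt = residential_electricity_mt / residential_share  # ~6,649

    for savings_mt in (15, 48):  # the two incandescent-ban scenarios
        print(f"{savings_mt} Mt/year saved = "
              f"{savings_mt / total_us_mt:.1%} of total US emissions")
    ```

    The implied total of roughly 6.6 billion tons is consistent with the 0.2% and 0.7% figures in the text.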

    In short, swapping out incandescent light bulbs is not going to be a major contributor to solving the problem of climate change. I’m not aware of anyone claiming it is. So why pass a law that is unpopular in many quarters and cram CFLs and other types of bulbs consumers haven’t chosen to buy on their own down their throats? It seems a purely provocative move of a mostly symbolic nature, with little real substance, that is sure only to harden opposition to the real changes needed to make material reductions in GHG emissions. (One might say the same of other measures like mandatory recycling or bans on plastic grocery bags.)

    The answer is that the symbolism is the substance.

    The sad reality is that rather than make policy cases based on benefit/cost or other technical considerations, for political or personal reasons sustainability advocates have decided to model their cause on the template of religion. In it we have an Edenic state of nature, now fallen because of man’s sin (pollution), for which we will experience a coming apocalyptic judgment (damage from climate change). Avoiding the consequences thus becomes fundamentally a problem of sin management. The proposed solution is again taken from traditional Christianity: confession and repentance, followed by penance, restoration to right standing with God (nature), and commitment to a holier life.

    There are two basic problems with this. The first is that while the religion template taps into a deep psychological vein in the human spirit – some have suggested humanity may even carry a so-called “God gene” – most people already have a religion and aren’t likely to convert to a new one without a major outreach effort.

    But more importantly, the notion of penance, and perhaps of asceticism more generally, has never sold with the public, even in more religious eras. David Hume (a vigorous religious skeptic, it should be noted) referred to the values resulting from this lifestyle as the “monkish virtues” and noted that they are “everywhere rejected by men of sense.” Or as Carol Coletta put it more recently, people don’t want to be told to “eat their spinach.”

    It strikes me that while perhaps environmentalists don’t really want to force a particular lifestyle on people, there is a fundamental desire to see people engage in some sort of public penance for our environmental sins. I believe this to be the root logic underlying a lot of feel-good (or perhaps more accurately, “feel-bad”) initiatives like getting rid of incandescent light bulbs. It is a form of penance and embrace of the monkish virtues.

    I can’t help but notice that even Christianity itself has moved away from promoting the monkish virtues. While things like humility are of course still preached and expected to be modeled, modern Christianity mostly rejects the notion of an ascetic life. Most Evangelical churches actually preach that God wants humans to be happy. The idea is of a God who wants us to be unselfish, but not unhappy. A not insignificant number of churches actually preach the so-called “prosperity gospel,” in which God will provide earthly blessings to His followers. In the Catholic tradition, monasticism itself has been in decline for some time. (I liken the reports of upticks in interest in joining monasteries to the perennial “return of the suit” articles in fashion magazines.)

    Whether these theological points are accurate or not is beside the point of this article. They appear to be attractional. For example, well-known prosperity gospel preacher Joel Osteen runs the largest church in the United States, with over 40,000 attending weekly.

    What might the environmental movement have looked like based on a different template? I’ll refer again to the work of Bruce Mau. If you’ve ever seen him present on this topic, he likes to start by noting that if we brought the entire world up to US standards of living, it would take four Earths’ worth of resources, given our current technologies and approaches, to make it happen. He thinks that’s a good thing, because the patent impossibility of that “takes that option off the table.” He then goes on to talk about all the super-cool new stuff we are going to have to invent and scale up to address the challenges of the future. If you haven’t, I might suggest getting his book Massive Change, which I reviewed a while back. It’s difficult to come away from one of Mau’s books or lectures without being excited about the possibilities of the future.

    I don’t think Mau has any different view of the fundamentals of climate change than your typical orthodox environmentalist. But his approaches to solutions (which are admittedly not always short term practical action plans) and the sales job on them is very different. As a designer, he knows he needs to create something that’s aspirational and attractional in order to get people to want it. It’s a shame too few people have followed that lead.

    The monkish virtues are just never going to sell. Perhaps you can get a room full of the sustainability in-crowd to buy into them, or even score a top-level political success, as with the bulb ban. But ultimately I think this approach is self-defeating.

    In the short term I’d suggest ending any efforts to impose direct consumer mandates. I don’t think that’s where the money is, so to speak, in GHG reductions. Instead, let’s focus on the producer side of the equation in ways that are largely transparent to consumers and don’t involve significant costs. More fuel efficient vehicles might be one. Replacing coal with natural gas is another possibility. (The EPA report I linked earlier cited this as a big contributor to the decline in GHG emissions in recent years.) New technologies are clearly needed and should be invested in, even though, as we know, this will lead to many failures along the way.

    As the financial crisis in Greece and elsewhere shows, people rarely confront structural problems, no matter how serious, until the crisis actually comes, at least not when “austerity” (a monkish virtue if ever there was one) is the major part of the proposed solution.

    If an environmental equivalent of austerity is required to save the planet, then I’m afraid we should prepare for the deluge. I personally don’t think we’re at that point, given that we’ve had huge gains in energy efficiency for many decades now while our lifestyles have actually improved. More of that, not the promotion of monkish solutions like CFL lightbulbs, is what it will really take to drive further environmental improvements.

    PS: If you don’t think people are really promoting or embracing monkish lifestyles in support of environmentalism, read this article from the Guardian about people giving up daily showers. Or think about the people trying to go completely “off the grid.” Even if CFLs don’t fit for you, clearly there are plenty of examples. I pick CFLs because they are an institutionalization of the monkish virtues, not just the passion of the small minority that has always existed.

    Aaron M. Renn is an independent writer on urban affairs based in the Midwest. His writings appear at The Urbanophile, where this essay originally appeared.

    Photo by BigStockPhoto.com.

  • Shale Revolution Challenges the Left and the Right

    In his State of the Union address, President Obama invoked the 30-year history of federal support for new shale gas drilling technologies to defend his present day investments in green energy. Obama stressed the value of shale gas—which will create thousands of jobs and billions in profits—as part of his "all of the above" approach to energy, and defended the critical role government investment has always played in developing new energy technologies, from nuclear to solar panels to wind turbines.

    The president’s remarks unsurprisingly sparked a strong response from some conservatives (here, here, here, and here), who have downplayed and even attempted to deny the important role that federal investments in hydrofracking, geologic mapping, and horizontal drilling played in the shale gas revolution.

    This is an over-reaction. In acknowledging the critical role government funding played in shale gas, conservatives need not write a blank check for all government energy subsidies. Indeed, a closer look at the shale gas story challenges liberal policy preferences as much as it challenges those of conservatives, and points to much-needed reforms for today’s mash of state and federal clean energy subsidies and mandates.

    The Government’s Role

    Some have pointed to the fact that fracturing rock to stimulate wells dates back to the 19th century and hydraulic fracturing to the 1940s as evidence that federal funding for today’s fracking technologies was unimportant. But dismissing the importance of federal support for new shale gas technologies in the ’70s and ’80s because private firms had succeeded in fracking for oil in the ’40s and ’50s is like suggesting that postwar military investments in jet engines were unnecessary because the Wright Brothers invented the propeller plane in 1903.

    Enhancing oil recovery from existing wells in limestone formations by injecting various combinations of water, sand, and lubricants, as private firms did starting in the 1940s, is a vastly different and less complicated technical challenge than recovering widely dispersed methane from rock formations like shale that are porous but not highly permeable.

    Recovering gas from shale formations at a commercial scale requires injecting vastly more water, sand, and lubricants at vastly higher pressures throughout vastly larger geological formations than anything that had been attempted in earlier oil recovery efforts. It requires having some idea of where the highly diffused pockets of gas are, and it requires both drilling long distances horizontally and being able to fracture rock under high pressure multiple times along the way.

    The oil and gas industries had no idea how to do any of this at the time that federal research and demonstration efforts were first initiated in the late 1960s—indeed, throughout the 1970s the gas industry made regular practice of drilling past shale to get to limestone gas deposits.

This is not just our opinion; it was the opinion of the natural gas industry itself, which explicitly requested assistance from the federal government in figuring out how to economically recover gas from shale starting in the late 1970s. Indeed, shale gas pioneer George Mitchell was an avid and vocal supporter of federal investments in developing new oil and gas technologies, and regularly advocated on behalf of Department of Energy fossil energy research throughout the 1980s to prevent Congress from zeroing out research budgets in an era of low energy prices.

    Early Efforts

The first federal efforts to demonstrate shale gas recovery at commercial scales did not immediately result in commercially viable technologies, and this too has been offered as evidence that federal research efforts were ineffective. In two gas stimulation experiments in 1967 and 1969, the Atomic Energy Commission detonated atomic devices in New Mexico and Colorado in order to crack the shale and release large volumes of gas trapped in the rock. The experiments succeeded in recovering gas, but due to concerns about radioactive tritium in the gas, the project was abandoned.

    These projects are easy to ridicule. They sound preposterous to both anti-nuclear and anti-government ears. But in fact, the experiment demonstrated that it was possible to recover diffused gas from shale formations—proof of a concept that had theretofore not been established.

    A few years later, the just-established Department of Energy demonstrated that the same result could be achieved by pumping massive amounts of highly pressurized water into shale formations. This process, known as massive hydraulic fracturing (MHF), proved too expensive for broad commercialization. But oil and gas firms, with continuing federal support, tinkered with the amount of sand, water, and binding agents over the following two decades to achieve today’s much cheaper formula, known as slickwater fracking.

    Early federal fracking demonstrations can be fairly characterized as big, slow, dumb, and expensive. But when it comes to technological innovation, the big, slow, dumb, and expensive phase is almost always unavoidable. Innovation typically proceeds from big, slow, dumb, and expensive to small, fast, smart, and cheap. Think of building-sized computers from the 1950s that lacked the processing power to run a primitive, 1970s digital watch.

    Private firms are really good at small, fast, smart, and cheap, but they mostly don’t do big, slow, dumb, and expensive, because the benefits are too remote, the risks too great, and the costs too high. But here’s the catch. You usually can’t do small, fast, smart, and cheap until you’ve done big, slow, dumb, and expensive first. Hence the reason that, again and again, the federal government has played that role for critical technologies that turned out to be important to our economic well-being.

    Drilling Down into Innovative Methods

    In fact, virtually all subsequent commercial fracturing technologies have been built upon the basic understanding of hydraulic fracturing first demonstrated by the Department of Energy in the 1970s. That included not just demonstrating that gas could be released from shale formations, but also the critical understanding of how shale cracks under pressure. Scientists learned from the large federal demonstration projects in the 1970s that most shale in the United States fractures in the same direction. This led government and industry researchers to focus their efforts on technologies that would allow them to drill long distances horizontally, in a direction that situated the well hole perpendicular to the directions that fractures would run, which allowed firms to capture much more gas from each well.

Government and industry researchers also focused on developing the ability to create multiple fracks from each horizontal well, and in 1986 a joint government-industry venture demonstrated the first multifrack horizontal well in Devonian Shale. During the same period, government researchers at Sandia National Laboratory developed tools for micro-seismic mapping, a technique that would prove critical to the development of commercially viable fracking. Micro-seismic mapping allowed firms to see precisely where the cracks in the rock were, and to modulate pressure, fluid, and proppant in order to control the size and geometry of each frack.

    George Mitchell, who is widely credited with having pioneered the shale gas revolution, leaned heavily upon these innovations throughout the 1990s, when he finally put all the pieces together and figured out how to extract gas from shale economically. Mitchell had spent over a decade consolidating his position in the Barnett Shale before he asked for technical assistance from the government. “By the early 1990s, we had a good position, acceptable but lacking knowledge base,” Mitchell Energy Vice President Dan Steward told us recently.

    Mitchell turned to the Gas Research Institute and federal laboratories for help in 1991. GRI paid for Mitchell to attempt his first horizontal well. The Sandia National Laboratory provided Mitchell with the tools and a scientific team to micro-seismically map his wells. It was only after Mitchell turned to GRI and federal laboratories for help that he finally cracked the shale gas code.

    A Counterfactual?

But so what? Federal investments in new gas technologies may have proved critical to the shale gas revolution, but could the revolution have happened without them? Where is the counterfactual?

    Constructing a counterfactual can be a useful analytical method, but it can be abused. In this case, the counterfactual has been asserted as a kind of faith-based defense against the inconvenient history of the shale gas revolution. Nobody has offered a real world example—for instance, a country where private firms developed economical shale gas technology without any public support.

    Nor has anyone offered a detailed historical analysis to justify the claim that private entrepreneurs would have done the critical applied research, developed the fracking technologies, funded the explorations in new drill bits and horizontal wells, and created the micro-seismic mapping technologies that were all required to make the shale revolution possible. A close look at the development of those technologies reveals private sector entrepreneurs, like Mitchell, who were loudly and clearly asking for help because they knew they had neither the technical knowledge nor the ability to finance such risky innovations on their own.

    The Implications for Renewable Energy Subsidies

In the end, though, we are mostly having this debate now because historical federal investments in shale gas are being compared to current investments in renewables. Much is in fact comparable: the federal role in the shale gas revolution went well beyond basic research, contrary to what some have claimed, and matches up with current renewables programs virtually demonstration for demonstration, tax credit for tax credit, and dollar for dollar in the scale and nature of federal support. But that doesn’t mean that President Obama’s subsidies for green energy are immune to criticism.

    Indeed, once we acknowledge the shale gas case as a government success, not a failure, it offers a powerful basis for reforming present clean energy investments and subsidies. Federal subsidies for shale gas came to an end, and so should federal wind and solar subsidies, at least as blanket subsidies for all solar and wind technologies. In many prime locations, where there is good wind, proximity to transmission, state renewable energy purchase mandates, and multiple state and federal subsidies, wind development is now highly profitable.

If federal investments in wind and solar are really like those in unconventional gas, then we ought to set a date certain when blanket subsidies for wind and solar energy come to an end. Imposing a phase-out of production subsidies would encourage sustained innovation and absolute cost declines. We might want to extend continuing support for some newer classes of wind and solar technologies, those that are innovating new technological methods to generate energy, or those that are specifically designed to perform better in lower wind or marginal solar locations. But in the ’80s and ’90s we did not provide a tax credit to all gas wells, only those using new technologies to recover gas from new geologic formations—and we should not continue to provide subsidies to wind and solar technologies that are already proven and increasingly widely deployed with no end in sight.

    Another key lesson is that many of the most important research and demonstration projects in new shale gas technologies were funded and overseen by the Gas Research Institute, a partnership between Department of Energy laboratories and the natural gas industry that was funded through a small Federal Energy Regulatory Commission-administered fee on gas prices. GRI had both independence from Congress and the federal bureaucracy, and strong representation from the natural gas industry, which allowed it to focus research and dollars on solving key technical problems that pioneers like George Mitchell were struggling with. Federal investments in applied research and demonstration of new green energy projects ought to be similarly insulated from political meddling and rent seeking.

    These and other lessons from the shale gas revolution point to far-reaching reforms of federal energy innovation and subsidy programs. If the history of the shale gas revolution challenges the tale of a single lone entrepreneur persevering without help from the government, it also challenges the present federal approach to investing in renewables in important respects. The history of federal support for shale gas offers as much a case for reform of current federal clean energy investments as it does for their preservation.

    This piece originally appeared at The American.

    Shellenberger and Nordhaus are co-founders of the Breakthrough Institute, a leading environmental think tank in the United States. They are authors of Break Through: From the Death of Environmentalism to the Politics of Possibility.

  • Is Energy the Last Good Issue for Republicans?

    With gas prices beginning their summer spike to what could be record highs, President Obama in recent days has gone out of his way to sound reassuring on energy, seeming to approve an oil pipeline to Oklahoma this week after earlier approving leases for drilling in Alaska. Yet few in the energy industry trust the administration’s commitment to expanding the nation’s conventional energy supplies given his strong ties to the powerful green movement, which opposes the fossil-fuel industry in a split that’s increasingly dividing the country by region, class, and culture.

    But Republicans, other than the increasingly irrelevant Newt Gingrich, have failed to capitalize on the potent issue, instead lending the president an unwitting assist by focusing the primary fight on vague economic plans and sex-related side issues like abortion, gay marriage, and contraception. The GOP may be winning over the College of Cardinals, but it is squandering its chance of gaining a majority in the Electoral College, holding the House, and taking the Senate.

    No single sector affects more people and industries than energy, and none is more deeply affected by the disposition of government. Energy divides the nation into two camps. On one side there are the regions and industries dependent on the development and use of energy. They include the increasingly expansive energy-producing region stretching from the Gulf Coast and the Great Plains to parts of Ohio, Pennsylvania, and the Appalachian range.

    The centers of energy growth, including areas stretching from the Gulf Coast through the Great Plains to the Canadian border, have generated the highest levels of job and income growth over the past decade (along with parasitic Washington, D.C.).

    Nine of the 11 fastest-growing job categories are related to energy production, according to an analysis by Economic Modeling Systems Inc. Energy jobs pay an average of $100,000 annually, about the same as software engineers earn in Silicon Valley.

    Perhaps more important politically, this bonanza is now spreading to historical battleground states Ohio, Pennsylvania, and Michigan. Long-depressed areas like western Pennsylvania are reversing decades of decline as new finds and advances in natural-gas drilling have opened up vast new stores of domestic energy. The new energy wealth has created new jobs, enriched property owners, and provided states with potential huge new sources of revenue.

On the other side of the energy divide stand a handful of dense, mostly coastal metropolitan areas with either little in the way of energy resources or, in the case of California’s most affluent urban pockets, little interest in exploiting them. With a shrinking industrial base and less dependence on automobiles, these areas now constitute the political base for both the Democratic Party and the growing green-industrial complex, which boasts strong ties to Silicon Valley’s well-heeled venture-capital “community” and their less celebrated, but even wealthier, Wall Street allies.

    In these places, the current fossil-energy boom is regarded less as a boon than as an environmental disaster in the making, a view captured in the unrelenting attack on shale development in the news pages of The New York Times and other outlets in broad sympathy with the Obama administration. New production of low-cost, low-emission natural gas also threatens the viability of politically preferred renewables such as solar and wind. But unlike fossil fuels, such “green” initiatives have created very few jobs; overall, the promise of “green jobs,” as even The New York Times has noted, has failed to live up to its hype.

Given the success in the other energy states, California—with double-digit unemployment—might reconsider its policies, but this is unlikely. “I asked [Gov.] Jerry Brown about why California cannot come to grips with its huge hydrocarbon reserves,” John Hofmeister, a former president of Shell Oil’s American operations and a member of the U.S. Department of Energy’s Hydrogen and Fuel Cell Technical Advisory Committee, told me recently. “After all, this could turn around the state.”

Brown’s answer, according to Hofmeister: “This is not logic, it’s California. This is simply not going to happen here.”

But elsewhere in the U.S., new technologies such as hydraulic fracturing and horizontal drilling have vastly increased estimates of North America’s energy resources, particularly natural gas. By 2020, the United States, according to the consultancy PFC Energy, will surpass Russia and Saudi Arabia as the world’s leading oil and gas producer.

    As President Obama has acknowledged, this surge of production boasts some great economic benefits. American imports of raw petroleum have fallen from a high of 60 percent of the total to less than 46 percent. Overall, according to Rice University’s Amy Myers Jaffe, U.S. oil reserves now stand at more than 2 trillion barrels; Canada has slightly more. She pegs North America’s combined reserves at more than three times the total estimated reserves of the Middle East and North Africa.

    At the same time, energy exploration is sparking something of an industrial revival. The demand for new rigs, pipelines, and a series of new petrochemical facilities has created a burst of industrial production across much of the country. Steel mills, makers of earth-moving equipment, and construction suppliers all have benefited. A recent study by PricewaterhouseCoopers suggests shale gas could lead to the development of 1 million industrial jobs. Not surprisingly, some of the biggest backers of shale-gas exploration are prominent CEOs from industrial firms.

    Energy policy may also be critical for the future of the Great Lakes–based American auto industry. Despite expensive PR ventures like the electric Chevy Volt, the Big Three depend for profits largely on SUVs and trucks. High oil prices will only help their competitors from Japan, South Korea, and Germany, all of which are ramping up in the emerging Southeastern auto corridor. Rising oil prices could also raise the costs of food production, which relies heavily on energy-intensive fertilizers and machinery.

    Aware of the negative consequences for a still-weak recovery, President Obama has started to mount a defense for his energy policies. Last month he launched several preemptive strikes, claiming credit for rising U.S. production while ridiculing Republicans for their “drill, baby, drill” response to rising energy prices.

    Obama is correct in asserting that increases in domestic production will not solve the energy price issue overnight, or even in the near future. But it was disingenuous for him to then take credit for the current energy boom, which resulted largely from policies adopted during the Bush years, while Obama’s policies have, if anything, slowed exploration and development.

    It’s fairly clear that the president and his team—notably Energy Secretary Steven Chu and Interior Secretary Ken Salazar—are at best ambivalent about greater fossil-fuel development. Obama, for example, recently proposed cutting tax breaks and subsidies for the oil industry, which he estimated at $4 billion annually—a new expense for the companies that would in large part be passed on to consumers at the pump.

This is not necessarily a bad thing in its own right, but along with the effective tax hike, Obama proposed doubling down on the much larger and, to date, far less productive giveaways to the green-industrial complex, which received $80 billion in loans and subsidies in the 2009 stimulus. According to various studies, including one by the Energy Information Administration, solar firms enjoy rates of subsidization per kilowatt hour at least five times those gained by fossil-fuel firms.

    If all energy subsidies were removed, the fossil-fuel industry likely could shrug off the hit, while the heavily subsidized green-industrial complex would markedly diminish. Yet even if Congress refuses to continue the green subsidies, it’s probable that administration regulators would find ways to slow fossil-fuel expansion in a second Obama term. Responding largely to the Democratic environmental lobby, they have already overruled the State Department to delay the Keystone XL pipeline from Canada. Plans for new multibillion-dollar petrochemical plants on the Gulf will make easy pickings for federal regulators from agencies now controlled by environmental zealots.

    “The energy states feel they are being persecuted for their good deeds,” says Eric Smith, director of the Tulane Energy Institute in New Orleans. “There is a sense there are people in the administration who would like this whole industry to go away.”

    In the short run, Obama’s political exposure in the energy wars is somewhat limited. Most of the big-producing states—Oklahoma, Wyoming, Utah, Texas, Louisiana, Alaska, and North Dakota—are unlikely to vote for him anyway. Nor does he have to worry about too much pressure from inside his party; Democratic ranks in Congress from energy-producing states have thinned considerably in recent years, removing contrary voices inside the party.

A dicier issue relates to contestable states like Ohio, Pennsylvania, and Michigan, where many see the energy boom as a source of economic recovery. To make their case in these and other swing states, Republicans first have to make energy, and the overall revival of the American economy, the key issue for this November’s election. If they insist on campaigning primarily as stolid defenders of rigid social values and election-year promises of painless tax cuts, they will have only themselves to blame for a drubbing in November.

This piece originally appeared at The Daily Beast.

    Joel Kotkin is executive editor of NewGeography.com and is a distinguished presidential fellow in urban futures at Chapman University, and contributing editor to the City Journal in New York. He is author of The City: A Global History. His newest book is The Next Hundred Million: America in 2050, released in February, 2010.


  • Time to Rethink This Experiment? Delusion Down Under

The famous physicist Albert Einstein was noted for his powers of observation and rigorous observance of the scientific method. It was insanity, he once wrote, to repeat the same experiment over and over again and to expect a different outcome. With that in mind, I wonder what Einstein would make of the last decade and a bit of experimentation in Queensland’s urban planning and development assessment?

Fortunately, we don’t need Einstein’s help on this one, because even the most casual of observers would conclude that after more than a decade of ‘reform’ and ‘innovation’ in the fields of town planning and the regulatory assessment of development, it now costs a great deal more and takes a great deal longer to do the same thing, for no measurable benefit. As experiments go, this is one we might think about abandoning, or at the very least trying something different.

First, let’s quickly review the last decade or so of change in urban planning and development assessment. Up until the late 1990s, development assessment was relatively straightforward under the Local Government (Planning and Environment) Act of 1990. Land already zoned for industrial use required only building consent to develop an industrial building. Land zoned for housing likewise required compliance with building approvals for housing. These were usually granted within a matter of weeks or, at the outside, months.

There were small headworks charges, which essentially related to connection costs of services to the particular development. Town planning departments in local and state governments were fairly small in size and focussed mainly on strategic planning and land use zoning. It was the building departments that did most of the approving. Land not zoned for its intended use was subject to a process of development application (for rezoning), but here again the approach was much less convoluted than today. NIMBYs and hard left greenies were around back then, but they weren’t in charge. Things happened, and they happened far more quickly, at lower cost to the community, than now.

    In the intervening decade and a bit, we’ve seen the delivery and implementation of an avalanche of regulatory and legislative intervention. It started with the Integrated Planning Act (1997), which sought to integrate disparate approval agencies into one ‘fast track’ simplified system. It immediately slowed everything down.  It promised greater freedom under an alleged ‘performance based’ assessment system, but in reality provoked local councils to invoke the ‘precautionary principle’ by submitting virtually everything to detailed development assessment. The Integrated Planning Act was followed, with much fanfare, by the Sustainable Planning Act (2009). Cynics, including some in the government at the time, dryly noted that a key performance measure of the Sustainable Planning Act was that it used the word ‘sustainable’ on almost every page. 

Overlaying these laws has been a constant flow of land use regulations in the form of regional plans, environmental plans, acid sulphate soil plans, global warming, sky-is-falling, seas-are-rising plans – plans for just about everything, all of which affect what can and can’t be done with individual pieces of private property.

But it wasn’t just the steady withdrawal of private property rights as state and local government agencies gradually assumed more control over permissible development on other people’s land. There was also a philosophical change on two essential fronts.

First, there was the notion that we were rapidly running out of land and desperately needed to avoid becoming a 200 kilometre wide city. Fearmongers warned of ‘LA type sprawl’ and argued the need for densification, based largely on innocuous sounding planning notions like ‘Smart Growth’ imported from places like California (population 36 million, more than 1.5 times all of Australia, and Los Angeles, population 10 million, roughly three times the population of south east Queensland). The first ‘South East Queensland Regional Plan 2005-2026’ was born with these philosophical changes in mind, setting an urban growth boundary around the region and mandating a change to higher density living (despite broad community indifference to density). It was revised by the South East Queensland Regional Plan 2009-2031, which formally announced that 50% of all new dwellings should be delivered via infill and density models (without much thought, clearly, for how this was to be achieved or whether anyone particularly wanted it). Then there was the South East Queensland Regional Infrastructure Plan 2010-2031, which promised $134 billion in infrastructure spending to make this all possible (without much thought to where the money might come from), and a host of state planning policies to fill in any gaps which particular interest groups or social engineers may have identified as needing to be filled.

The significant philosophical change, enforced by the regional plan, was that land for growth instantly became scarcer because planning permission would be denied in areas outside the artificially imposed land boundary. Scarcity of any product, particularly during a time of rising demand (as it was back then, when south east Queensland still had a strong economy), results in rising prices. That is just what happened to any land capable of gaining development permission within the land boundary: raw land rose in price, much faster than house construction costs or wages.

The other significant philosophical change that took root was the notion of ‘user pays’ – which became a byword for buck-passing the infrastructure challenge from the community at large to new entrants, via developer levies. Local governments state-wide took to the notion of ‘developer levies’ with unseemly greed and haste. ‘Greedy developers’ could afford to pay (they argued), and the notion of ‘user pays’ gave them some (albeit shaky) grounds for ideological justification. Soon, developers weren’t just being levied for the immediate cost of infrastructure associated with their particular development, but were being charged with the costs of community-wide infrastructure upgrades well beyond the impact of their proposal or its occupants.

Levies rose faster than Poseidon shares in the ’70s. Soon enough, upfront levies went past the $50,000 per lot mark, and although caps of $28,000 per dwelling have recently been introduced, many observers seem to think that councils are now so addicted that they’ll find alternate ways to get around the caps.

So the triple whammy of ‘reform’ in just over a decade was that regulations and complexity exploded, supply became artificially constrained to meet some deterministic view of how and where we mere citizens might be permitted to live, and the costs and charges levied on new housing (and new development generally) soared.

At no point during this period, and this has to be emphasised, can anyone honestly claim that this has achieved anything positive. It has made housing prohibitively expensive, and less responsive to market signals. Simply put, it takes longer, costs more, and is vastly more complicated than it was before, for no measurable gain.

An indication of this was given to me recently in the form of the Sunshine Coast Council’s budget for its development assessment ‘directorate.’ (How apropos is that term? It would be just as much at home in a Soviet planning bureau.) Their budget (the documents had to be FOI’d) for the 2009-10 financial year included a total employee costs budget of $17.4 million. For the sake of argument, let’s assume the average directorate comrade was paid $80,000 per annum. That would mean more than 200 staff in total. Now they might all be very busy, but it surely says something about how complexity and costs have poisoned our assessment system if the Sunshine Coast Council needs to spend over $17 million of its ratepayers’ money just to employ people to assess development applications in a down market.

    If there had been any meaningful measures attached to these changes in approach over the last decade, we’d be better placed to assess how they’ve performed. But there weren’t, so let’s instead retrospectively apply some:

    Is there now more certainty? No. Ask anyone. Developers are confused. The community is confused. Even regulators are confused and frequently resort to planning lawyers, which often leads to more confusion. The simple question of ‘what can be done on this piece of land’ is now much harder to answer.

    Is there more efficiency? No. Any process which now takes so much longer and costs so much more cannot be argued to be efficient.

    Is the system more market responsive? No. Indeed the opposite could be argued – that the system is less responsive to market signals or consumer preference. Urban planning and market preference have become gradually divorced to the point that some planners actively view the market preferences of homebuyers with contempt.

Are we getting better quality product? Many developers will argue that even on this criterion, the system has dumbed down innovation such that aesthetic, environmental or design initiatives have to fight so much harder to get through that they’re simply not worth doing.

Is infrastructure delivery more closely aligned with demand? One of the great promises of a decade of ‘reform’ was that infrastructure deficits would be addressed if urban expansion and infrastructure delivery were aligned. Well, it’s been done in theory via countless reports and press releases, but it’s hardly been delivered in execution. And when the volumes of infrastructure levies collected by various agencies have been examined, it’s often been found that the money has been hoarded rather than spent on the very things it was collected for.

Is the community better served? Maybe elements of the green movement would say so, but for young families trying to enter the housing market, the answer is an emphatic (and expensive) no. How can prohibitively expensive new housing costs be good for the community? For communities in established urban areas, there is more confusion about the impact of density planning, which has made NIMBYs even more hostile than before.

Has it been good for the economy? South east Queensland’s economy was once driven by strong population growth – the very reason all this extra planning was considered necessary. But growth has stalled, arguably due to the very regulatory systems and pricing regimes that were designed around it. We now have some of the slowest rates of population growth in recent history, and our interstate competitiveness – in terms of land prices and the costs of development – is at an all-time low. That’s hardly what you’d call a positive outcome.

Is the environment better served? If you believe that the only way the environment can be better served is by choking off growth under the weight of regulation and taxation, you might say yes. But then again, studies repeatedly show that the density models proposed under current planning philosophies promote less environmentally efficient forms of housing, and can cause more congestion, than the alternative. So even if the heroic assumptions for the scale of infill and high density development contained in regional plans were actually, by some miracle, achieved, the environment might be worse off, not better, for it.

    All up, it’s a pretty damning assessment of what’s been achieved in just over a decade. Of course the proponents of the current approach might warn that – without all this complexity, cost and frustration – Queensland would be subject to ‘runaway growth’ and a ‘return to the policies of sprawl.’ The answer to that, surely, is that everything prior to the late 1990s was delivered – successfully – without all this baggage. Life was affordable, the economy strong, growth was a positive and things were getting done. Queensland, and south east Queensland in particular, was regarded as a place with a strong future and a magnet for talent and capital. Now, that’s been lost.

    Einstein would tell us to stop this experiment and try something else if we aren’t happy with the results. To persist with the current frameworks and philosophies can only mean the advocates of the status quo consider these outcomes to be acceptable. Is anyone prepared to put up their hand and say that they are?

    Ross Elliott has more than 20 years experience in property and public policy. His past roles have included stints in urban economics, national and state roles with the Property Council, and in destination marketing. He has written extensively on a range of public policy issues centering around urban issues, and continues to maintain his recreational interest in public policy through ongoing contributions such as this or via his monthly blog The Pulse.

    Photo by Flickr user Mansionwb

  • Who Stands The Most To Win – And Lose – From A Second Obama Term

    As the probability of President Barack Obama’s reelection grows, state and local officials across the country are tallying up the potential ramifications of a second term. For the most part, the biggest concerns lie with energy-producing states, which fear stricter environmental regulations, and those places most dependent on military or space spending, which are both likely to decrease under a second Obama administration.

    On the other hand, several states, and particularly the District of Columbia, have reasons to look forward to another four years. Under Obama the federal workforce has expanded — even as states and localities have cut their government jobs. The growing concentration of power has also swelled the ranks of Washington‘s parasitical enablers, from high-end lobbyists to expense-account restaurants. While much of urban America is struggling, Washington is currently experiencing something of a golden age.

    So what states have the most to lose from a second Obama term? The most obvious is Texas, the fastest-growing of the nation’s big states. Used to owning the inside track in Washington during the long years of Bush family rule, the Lone Star state now has less clout in Congress and the White House than in recent memory. Texans are particularly worried about restrictions on fossil fuel energy development, which is largely responsible for robust growth throughout the state.

    “Obama now wants to take credit for the increased production that has happened, but [increased production] has been opposed in every corner by the administration,” says John Hofmeister, founder of the Houston-based Citizens for Affordable Energy and former CEO of Shell USA. Hofmeister fears that in a second term, with no concern for reelection, Obama could exert even greater controls on fossil fuel development. This would have dramatic, negative implications not only for Texas but for the entire national energy grid, which includes North Dakota, Wyoming, Montana, West Virginia, Oklahoma, Alaska and Louisiana. These states fear that the nation’s recent energy boom, which has generated some of the nation’s strongest job and income growth, could implode in Obama’s second term.

    Take Louisiana, which is still recovering from Hurricane Katrina in 2005 and the BP oil spill in 2010. The administration’s moratorium on offshore drilling, sparked by the spill, has had a deleterious effect on the state’s energy economy, according to a recent study, with half of offshore oil and service companies shifting their operations to other regions and laying off employees.

    Since the moratorium was lifted in 2010, companies have faced long delays for new wells, growing from 60-day delays in 2008 to more than 109 days last year. “The energy states feel they are being persecuted for their good deeds,” says Eric Smith, director of the Tulane Energy Institute in New Orleans. “There is a sense there are people in the administration who would like this whole industry to go away.”

    Many of these same states also worry about the administration’s proposed downsizing of the military. Obama’s move to cut roughly $500 billion in defense spending may make sense, but it threatens places with large military presences such as Texas, Florida, Oklahoma, Virginia, Georgia, South Carolina and New Mexico.

    The D.C. metro area might also be hit by defense cuts, but overall it has many reasons to genuflect toward the Obama Administration. Federal wages, salaries and procurement account for 40% of the district’s economic activity, roughly four times the percentage of any state. Expanding regulation on energy, health care and financial services has sparked a steady job boom in lobbying, think tanks and other facets of the persuasion industry — including among Republicans — at a time when employment growth has been sluggish elsewhere.

    D.C. partisans hail their city as the leader of a national urban boom. The district clearly benefits from diminished job opportunities in more market-based economies, particularly for educated 20-somethings.

    No place has flourished as much as the capital, but a second term would also be favorable to states such as Maryland, which depend heavily on research spending directed from Washington and where federal spending accounts for 15% of the local economy, over seven times the national average. Maryland agencies such as the National Institutes of Health will likely expand under an increasingly federalized health care system — particularly if Democrats gain more seats in Congress with an Obama win.

    Other big states that may benefit from a second term include New York, California and Illinois. New York benefits largely from the administration’s Wall Street leanings, despite the president’s recent attacks on the financial elite. Even for the non-conspiracy theorists, the administration’s ties to Goldman Sachs appear unusually intimate. Powerful allies like Democratic Sen. Charles Schumer, D.C.’s greatest Wall Street booster, suggest big money has little to fear from a second term.

    Overall the administration’s basic policy approach has favored the financial giants. Support for bailouts, seemingly permanent low interest rates, few prosecutions for miscreant investment bankers, the institutionalization of “too big to fail” and easy loans for renewable fuel firms all have benefited the big Wall Street players.

    Of course, a Republican victory would not be a disaster for these worthies. Companies like Goldman Sachs are hedging their bets by sending loads of cash to the likely Republican choice, former Massachusetts Gov. Mitt Romney.

    But other New York interests, such as mass transit funding, would benefit from the current administration’s generally pro-urban, green sensibilities. Tight regulations on carbon emissions — increasing the price of fossil fuels — may help the competitive position of New York City, which has little industry left and relatively low carbon emissions per capita, in part due to a greater reliance on hydroelectric and nuclear power.

    California also has reasons to root for an Obama victory. Although among the richest states in fossil fuels, particularly oil, the Golden State has become a bastion of both climate change alarmism and renewable energy subsidization. It adamantly refuses to develop its traditional energy resources — which would help boost the state’s still weak economy — and Silicon Valley venture firms have eagerly grabbed subsidies and loans for start-ups from Energy Secretary Steven Chu’s seemingly bottomless cornucopia.

    Furthermore, a more powerful EPA would make California’s current “go it alone” energy and environmental policies less disadvantageous compared to more fossil-fuel-friendly states, leveling what is now a tortuous economic playing field.

    Similarly, attempts to push the state’s troubled high-speed rail line — recently described in Mother Jones as “jaw-droppingly shameless” — will succeed only with strong backing by the federal government. Under a Republican administration and Congress, Brown’s beloved high-speed line would depend entirely on state and private funding, likely terminating the project.

    But no state needs an Obama victory more than his adopted home state of Illinois. To be sure, having a native son in the White House has not prevented the Land of Lincoln from suffering one of the weakest economies in the nation. The state has one of the highest rates of out-migration in the country, according to recent United Van Lines data and Census results.

    Even worse, the Land of Lincoln faces a fiscal crisis so great that it makes California look well-managed. Without a good friend in the White House, and allies in Congress, Illinois could end up replacing long-struggling, now-improving Michigan as the Great Lakes’ new leading basket case. Count Illinois’ 20 electoral votes in the Obama column.

    This piece originally appeared in Forbes.com.

    Joel Kotkin is executive editor of NewGeography.com and is a distinguished presidential fellow in urban futures at Chapman University, and contributing editor to the City Journal in New York. He is author of The City: A Global History. His newest book is The Next Hundred Million: America in 2050, released in February, 2010.

    Photo from BigStockPhoto.com.

  • Britain Fears a Developer’s Charter

    The UK Government’s Department for Communities and Local Government (DCLG) announced that there were only 127,780 new housing completions last year in Britain. British house building activity is down to levels last seen after the First World War, when reliable industrial records began, and still falling. In 1921 the British population was nearly back up to 43 million following the slaughter of the First World War. In 2011 the population of England, Wales, and Scotland was approaching 61 million people. By 2031 the British population is expected to be closer to 70 million. With such existing unmet and growing demand for new housing, the DCLG, the Government department that runs the Planning System, should be busy finding ways to allow developers to build.

    Many feared that the National Planning Policy Framework (NPPF), prepared by the DCLG for an expected release in January 2012 would be a developer’s charter. We wish it was a developer’s charter! The NPPF continues planning policies, supported by all Parliamentary political parties, which continue to frustrate volume housebuilding. Developers have to prove that their proposals for house building are not merely about building useful homes at a profit, but are “sustainable development” when measured against disputable social and environmental criteria. No developer is free to build on their own land without first having to obtain planning approval from an array of third party interests all insisting on their interpretation of the moral idealism of sustainability.

    This makes the NPPF an anti-development charter for all those who oppose house building and population growth. Anyone can claim that more house building and more households are unsustainable in their area, in the effort to stop a project which they don’t approve of.

    The NPPF will do nothing to challenge the power of contemporary anti-development campaigners, who are well known. Anne Power, Lord Richard Rogers and other members of New Labour’s Urban Task Force (UTF) have correctly identified themselves as allied to the “Hands off Our Land” campaign run by The Daily Telegraph, the Conservative-supporting newspaper. The UTF favors a continuing commitment to ‘… reclaiming brownfield sites and re-densifying cities.’ To build only on previously developed land is the green ideal of the UTF and the “Hands off Our Land” campaign.

    We all know where these policies lead. Not to a golden age of regeneration for all, but to lucrative property investment for those with access to sufficient capital and the right connections to steer themselves through the planning system to obtain approvals. The volume of Greenfield land developed declined dramatically under New Labour. The present Conservative led Coalition Government continues the practice of obstructing development on Greenfield land.

    Between 2000 and 2006 the total area of land built on for new housing fell by 23%, with a 42% fall in the annual amount of Greenfield land used. In 2010 76% of all housing was built on previously developed Brownfield land, a slight decrease from the 80% in 2009. Only 2% of housing was built on the Green Belts around major cities and towns. The Green Belt in England covers 13% of the land, or twice the area already developed for housing. Small wonder that the price of the shrinking supply of land with a prospect of being approved for sustainable development remains inflated.

    House building was only increased from the low point of 2001 by increasing the density of development in the cities. Average densities rose from 25 dwellings per hectare (dph) in 2000, to 43 dph by 2010. In London the average density for new housing is much higher, at 115 dph in 2010.

    Densification policies considered sustainable have meant that the majority of the working British public can no longer buy a new house with a garden, in ways that previous generations may have taken for granted. Instead the plan has been to squeeze more new households into less space. UTF supporters and the DCLG imagined they were regenerating cities and saving the planet for all of society. Like traditional Conservatives they mean to keep developers and the population off Britain’s ample supply of otherwise redundant farmland.

    The Daily Telegraph’s campaign, best articulated by the conservative anti-growth philosopher Roger Scruton, is clearly the flip side of the UTF’s densification argument. He is happy as long as the population is kept away from the countryside he loves. ‘Thank God for obstacles to economic growth,’ says Scruton.

    Scruton speaks for the comfortable who already enjoy plenty of space. The Daily Telegraph’s campaign is ultimately concerned that existing housing markets are protected, sustained through the division between Town and Country, and moralised as a concern for environment and heritage. New Labour supporters are more likely to read The Guardian, but its more middle-class readership finds nothing to object to in The Daily Telegraph’s campaign to restrict the “sprawl” of suburbia and halt the imagined damage this will do to the environment and urban communities. The Guardian’s readership formed the bedrock of New Labour’s support, and backs Next Labour. The working class may have deserted Labour, but is depoliticized and passive. The Guardian and The Daily Telegraph – still supposed by many to be at opposite ends of the old-fashioned and defunct ideological spectrum of Left and Right – prove closer than either cares to think.

    Labour Members of Parliament have traditionally feared the “flight to the suburbs” lest they lose voters and the associated tax revenue. The planning system has proved very effective in maintaining the political geography of Britain. Labour politicians negotiate their political dependency on urban containment with a Red-Green stance in urban areas, without threatening the Blue-Green interests of those who want to keep development out of the countryside. All depend on the denial of development rights that date from the 1947 Town and Country Planning Act, and which the NPPF reinforces.

    Meanwhile working class families are squeezed into what little Twentieth Century suburbia is still affordable, competing unsuccessfully with the more affluent for ownership of this increasingly scarce and valued commodity. What new housing is built is at higher density, usually on the least attractive sites. That is land previously occupied by factories, old infrastructure, and utilities, or by council housing estates re-developed at higher densities. Yet even these unpopular sites enter the inflated British housing market, sustained through a chronic lack of house building.

    The working class is caught in a political crusher made manifest through the planning system. The Red-Greens, who may imagine themselves on a new Left, gentrify towns and cities with “sustainable redevelopment”, and the Blue-Greens, who persist with being on the Right, protect their landscape for their exclusive enjoyment. Meanwhile the majority of home owners have come to depend on the inflated and unaffordable housing market. New Labour needed this house price inflation to allow the owner occupying majority to supplement inadequate wages by withdrawing equity from their homes. So does the Coalition. Deliberate or not, The Daily Telegraph’s commitment to building fewer new homes will stabilise what we have called the Housing Trilemma.

    Our current predicament may be thought of as a Trilemma, in which house price inflation supports burdensome mortgage lending and private debt, while households in the owner occupied sector accept low quality housing conditions. High rents shadow private sector housing costs, and private rental housing is often of the lowest quality. Many in Britain, including the majority of the home owning middle class, are dependent on the Housing Trilemma remaining stable.

    The planning system serves well in protecting the interests of existing home owners. Behind the NPPF’s moral idealism of sustainability, the immediate instrumental objective is to restrict new housing supply to avoid destabilising housing markets. Appearing as a moral mission to save the planet from developers, the NPPF and the denial of development rights sustain the Housing Trilemma. Debt is secured, but housing remains unaffordable, quality low, and house building activity is at an all-time industrial low. This is not a conspiracy. It is a predicament.

    When Britain’s elites talk about wanting to revive economic growth, they don’t mean a massive surge in new house building or an expansion of infrastructure. What they have in mind is a revival of financial services in The City, subject to uncertainties in the fragmenting Euro Zone, and the maintenance of high housing prices in the hope of more inflation to come. Meanwhile the countryside is kept pristine for the few who can afford access to it as a weekend retreat for the wealthy, including the pro-urban intelligentsia, in all their Red-Green-Blue moral plumage.

    The Coalition could have challenged the Housing Trilemma. Instead they have reinforced it.

    The result is predictable. Planning applications are falling in number and ambition. Only 25,000 new homes were approved in the second quarter of 2011 compared to 32,000 in the second quarter of 2010. This will be read by The Daily Telegraph campaign members as “proof” that there is no demand for development, inverting the causality. Money is being made out of an environmentally sanctioned scarcity rather than through increased productivity and innovation in a sector like house building and the wider construction industry. Britain’s already backward construction industry is further retarded, and it is becoming commonplace for social elites, and not only crazed nationalists, to blame immigration for housing shortages.

    Britain’s economy needs growth, but is unlikely to get it from the house building sector. Britain too needs a dose of political reality while the pro-urban intelligentsia preen their green morality.

    The Coalition cannot afford to confront the political problem of the Housing Trilemma if it is to sustain its fragile political base. Increasingly, only the elderly bother to vote, and this equity-rich group will be mostly satisfied with modest house price inflation as a hedge against general inflation, while savings in banks attract little return. Meanwhile an influential propertied elite still enjoys sustained house price inflation at the top of the market. They are anxious that environmental and heritage designations operate to enhance the exclusivity and enjoyment of their investments. The unelected charities, agencies and Non-Governmental Organisations that were aligned against the draft of the NPPF in July 2011 represent these elite interests. They may now back the redrafted 2012 NPPF with all its demands for sustainability. Their “Hands off Our Land” campaign has worked for them.

    The NPPF means that house builders face a future in which building on Greenfield land is effectively considered an eco-crime. Only those who can develop Town Centre sites, perhaps as rental housing, or as luxury homes for the equity rich will thrive. Basically Britain is no longer building homes with gardens for sale to young working families on modest incomes.

    If you are in a young working family, or hope to start one, the question is: What are you going to do about the housing predicament you and your friends face?

    We have to face a stark reality. Sadly, there is no contemporary habit of young working families organising to demand housing collectively. Meanwhile the 2011 to 2012 production figures look set to be lower again, and the developmental uncertainties about to be articulated in a redraft of the NPPF in pursuit of sustainable development will further the decline in production.

    Anticipating this feature of Britain’s ratcheting austerity does not make for a Happy New Year. Much depends on what the people of Britain, and particularly the young, do to demand that family houses are built at modest prices in places they want to live together. At present Britain fears a developer’s charter, even though the National Planning Policy Framework is nothing of the sort. Parliament might yet instead be in fear of people demanding cheap land on which to build a better place to live.

    James Stevens is Strategic Planner at the Home Builders Federation, www.hbf.co.uk. Email him at james.stevens@hbf.co.uk. The views expressed are his own and not those of Home Builders Federation. Ian Abley is a site architect and runs the pro-development website audacity, www.audacity.org. Email him at abley@audacity.org. Together they organise the 250 New Towns Club, www.audacity.org/250-New-Towns-index.htm.