Tag: Environment

  • Fracktivists for Global Warming: How Celebrity NIMBYism Turned Environmentalism Against Natural Gas

    Over the last year, celebrities such as Yoko Ono, Sean Lennon, Robert Redford, Mark Ruffalo, Mario Batali, Scarlett Johansson, Alec Baldwin, and Matt Damon have spoken out against the expansion of natural gas drilling. “Fracking kills,” says Ono, who has a country home in New York. “It threatens the air we breathe,” says Redford. 

    In fact, “gas provides a very substantial health benefit in reducing air pollution,” according to Daniel Schrag, director of Harvard University’s Center for the Environment. There have been “tremendous health gains” from the coal-to-gas switch, MIT economist Michael Greenstone told The Associated Press. Indeed, air pollution in Pennsylvania has plummeted in recent years thanks to the coal-to-gas switch. "Honestly," added Greenstone, "the environmentalists need to hear it."

    Fracktivism might be dismissed as so much celebrity self-involvement had it not reversed the national environmental movement’s longstanding support of natural gas as a bridge to zero-carbon energy — and kept shale drilling out of New York state. Last week, Governor Andrew Cuomo was set to green-light 40 demonstration gas wells in a depressed part of New York until Natural Resources Defense Council attorney Bobby Kennedy Jr. called him and asked him not to.

    Bill McKibben and his organization 350.org have made common cause with the anti-fracking movement, as has the Sierra Club. NRDC went from being supportive of a coal-to-gas switch to opposing the expansion of gas production. Even the Environmental Defense Fund’s chief, Fred Krupp, said in a debate last month that he opposes the expansion of natural gas.

    All of this comes at a time when carbon emissions are declining more in the US than in any other country in the world. The United States is the global climate leader, while Europe, Germany included, is returning to coal. The main reason is gas, which increased last year by almost the exact same amount that coal declined.

    Just a few years ago, environmental leaders were saying that we faced a climate emergency, that emissions must start declining rapidly, and that enemy number one was coal. Now the same leaders are saying we have to stop shale fracking even though it is crushing coal and driving down American carbon emissions.

    Of course, the fracktivism isn’t really about the fracking. Matt Damon’s anti-natural gas movie was originally an attack on wind farms. In 2005, Bobby Kennedy Jr. helped lead a campaign to stop the Cape Wind farm from being built because it would be visible from the Kennedy compound. Meanwhile, he was championing the construction of a massive solar farm in the Mojave Desert, 3,000 miles away — itself opposed by local environmentalists.

    Fracktivists like Mark Ruffalo protest that their NIMBYism isn’t pro-coal. Ruffalo told AP that we don’t need natural gas; we can easily switch from coal directly to solar panels, like the ones he installed on his Catskills house.

    But when the sun isn’t shining on Ruffalo’s roof, he’s mostly getting his electricity from natural gas. In order to accommodate the intermittent nature of solar and wind, utilities rely on natural gas plants, which can be quickly ramped up and down to keep the lights on. Contrary to the claims Gasland director Josh Fox made about using "compressed air" in a recent debate with Ted at Salon.com, cheap, utility-scale energy storage simply doesn’t exist.

    Privately, scientists and analysts within national environmental organizations are appalled that celebrity fracktivism could get in the way of the coal-to-gas shift. They say the fracktivists undermine green credibility, and they are disturbed by the failure of their movement’s leadership.

    But there’s little reason to expect national green leaders will become, well, leaders. They will likely continue to follow donors who demonstrate time and again that what matters most to them — whether in the case of a nuclear plant on Long Island, a wind farm off Cape Cod, or a gas well in the Catskills — is the view from their solar-paneled eco-compounds, not the potentially catastrophic impact of global warming on the planet.

    This post first appeared at TheBreakthrough.org.

  • Gas Crushes Coal

    Coal electricity declined by 12.5 percent in 2012, mostly driven by the switch to natural gas, which increased by almost the exact same amount (217 terawatt-hours) as coal declined (216 TWh), according to new annual numbers released by the US Energy Information Administration.

    Wind electricity increased as well — by about one-tenth as much as gas (20.5 TWh). Solar increased by a little more than one-hundredth as much as gas (2.5 TWh).
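
    As a quick sanity check, the sketch below simply recomputes those ratios from the EIA figures quoted in this post (a minimal illustration; the only inputs are the numbers cited above).

    ```python
    # Back-of-the-envelope check of the 2012 generation shifts cited above,
    # using the EIA figures quoted in this post (terawatt-hours).
    coal_decline_twh = 216
    gas_increase_twh = 217
    wind_increase_twh = 20.5
    solar_increase_twh = 2.5

    print(f"Gas increase vs. coal decline: {gas_increase_twh / coal_decline_twh:.2f}x")
    print(f"Wind increase as a share of the gas increase:  {wind_increase_twh / gas_increase_twh:.1%}")
    print(f"Solar increase as a share of the gas increase: {solar_increase_twh / gas_increase_twh:.1%}")
    # Prints roughly 1.00x, ~9%, and ~1% -- i.e., about one-tenth and one-hundredth.
    ```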

    The figures come at a time when renewable energy advocates have claimed that wind and solar have been responsible for the big declines in coal — claims that do not stand up to scrutiny, according to a new Breakthrough Institute analysis.

    Indeed, the new numbers highlight the key difference between gas and solar and wind. Where taxpayers subsidized unconventional gas exploration from 1980 to 2002 to the tune of $10 billion, natural gas in recent years has been replacing coal without subsidies.

    Wind and solar, by contrast, remain almost wholly dependent on public support. Uncertainty last year over whether Congress would renew the key wind subsidy meant that less than half as much new wind will be installed in 2013 as was installed in 2012.

    Where the problem for wind has been its high cost, the problem for gas is that it has become too cheap. Natural gas production slowed last year in the face of unprofitably low prices caused by overproduction.

    This does not mean that subsidies for solar and wind should be cut, only that they should be reformed. Instead of subsidizing the production of electricity from the same old technologies, we need the kind of innovation that allowed natural gas to become cheaper than coal.

    This piece first appeared at The Breakthrough.

  • Natural Gas Boom: The “Janus” Effect

    The last five years have seen a revolution in the amount of inexpensive U.S. natural gas made available for power generation, for road fuels, and as a feedstock for new and expanded petrochemical plants. We are now even debating the advisability of large-volume natural gas exports in the form of liquefied natural gas (LNG).

    This bonanza has created euphoria in the fossil energy and industrial communities, but it has also created something of a “Janus effect” within the environmental community. To the Romans, Janus (the two-faced god) provided a cohesive view of the present as well as an uncertain view of the future. In Rome, the temple of Janus was opened only when Rome was at war. During peacetime, presumably because the future was more certain, the doors of the temple remained closed. They were last opened in AD 531, immediately prior to an invasion by the Goths. We all know how well that turned out.

    Environmentalists are reacting to the natural gas bonanza in three ways. The first group, which we may define as the “pragmatists,” sees a hopeful face based on solid evidence that natural gas helps achieve multiple environmental goals by reducing particulate emissions, sulfur emissions, NOx levels and CO2 emissions. They acknowledge that natural gas-fueled generators emit approximately 40% less CO2 per kilowatt hour than the older coal-fired units they are largely replacing. Although the aftermath of the recession has reduced the use of most other fuels, natural gas now rivals coal as the major fuel source for power generation in the US.

    A second group, the “environmental fatalists,” is less impressed with the displacement effect on coal but appreciates that natural gas plants provide crucial backup for mandated, intermittent renewable power options such as solar and wind. Once renewables represent approximately 10% of aggregate capacity, the negative side effects of these intermittent sources become problematic; too much dependence on them can cause grid instability or, in a worst case, cascading power failures and massive blackouts.

    Then there’s the third group, which we’ll call the “ideologues.” Often the loudest, this group views natural gas as an implacable enemy for undermining the economic viability of renewable energy projects. They oppose the use of natural gas on principle and call for ever more restrictive regulations and production constraints on natural gas-fueled power production. In their view, increasing the costs of generating electric power from natural gas will allow renewable generation finally to achieve cost parity. This “logic” explains at least some of the objections to fracking, an essential requirement for shale gas production, which, if restricted, would seriously undermine production and consumption of additional natural gas in the U.S.

    The ideologues believe in “leveling the playing field” so that renewables such as solar and wind can be made economically viable. They see themselves fostering a new economy based on renewable energy. The rest of society’s role is to “shut up” and allow them unimpeded access to scarce and valuable assets (e.g. subsidized prices and preferential access to the grid) in order to wipe fossil fuels off the grid. 

    Natural gas-based power generation represents the ideologues’ worst nightmare. They know that increasing the use of natural gas for power generation undermines the economic value of renewable-based generating companies. It’s not hard to imagine that for those individuals and businesses profiting from renewable subsidies and mandates, natural gas represents a great threat. The argument therefore does make a certain amount of sense if you accept the initial premise.

    Renewable mandates generally represent a commandment that “Thou shalt generate, say, 10% of a given utility’s power output using approved renewable resources,” regardless of the costs to ultimate consumers. Requiring utilities to purchase high-priced renewable power under so-called feed-in tariffs results in those higher prices simply being “rolled in” to the aggregate cost of power delivered to all consumers and duly covered by an aggregate rate requirement.
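
    To see how the “rolling in” works, here is a minimal sketch of the blended-rate arithmetic. The prices and volumes are hypothetical placeholders chosen for illustration, not figures from this article.

    ```python
    # Illustrative sketch of how a feed-in tariff is "rolled in" to everyone's rates.
    # All numbers below are hypothetical assumptions, not data from the article.
    total_sales_mwh = 1_000_000      # a utility's annual sales
    renewable_share = 0.10           # a 10% renewable mandate
    fit_price = 150.0                # $/MWh paid for mandated renewable power
    conventional_price = 50.0        # $/MWh for all other generation

    renewable_mwh = total_sales_mwh * renewable_share
    conventional_mwh = total_sales_mwh - renewable_mwh
    blended = (renewable_mwh * fit_price + conventional_mwh * conventional_price) / total_sales_mwh

    print(f"Blended cost: ${blended:.0f}/MWh vs. ${conventional_price:.0f}/MWh without the mandate")
    # With these assumptions, the mandate adds $10/MWh (20%) to every customer's power cost.
    ```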

    Such initiatives to support an artificial market for renewable power generation are politically vulnerable, since the public tends to reject the mandates, leaving investors in renewable energy projects to face bankruptcy as a distinctly possible outcome. Government-guaranteed loans supporting construction of the plants manufacturing new PV solar cells or wind turbines have already outraged a public forced to pay for their bankruptcies.

    What is the future of America if the renewable mandate regime expands under state or federal programs? That future is now on display in Germany, a trailblazer in applying subsidies and preferential access to the grid to support the adoption of solar and wind power. The country has not only restricted the construction of new coal and nuclear power units, but also limited the operations of natural gas fueled generation by providing preferential prices and access to the grid for renewables. To be fair, the Germans are also groaning under the cost of imported natural gas supplies, primarily from Russia.

    Unfortunately, as a result Germany does not have adequate load-following capacity to absorb the ups and downs of renewable power generation. The result is grid instability. These policies are creating potential dangers for an economy heavily dependent on power-intensive manufactured exports. Already German petrochemical manufacturers, such as BASF and Bayer, have warned that the country faces grave threats to its manufacturing base due to lower-cost competition in the natural gas-rich US. Volkswagen has been equally blunt about its need to manufacture car parts outside of Germany. Remember that roughly 24% of Germany’s work force is engaged in export-focused activity.

    The Germans avoid discussing their lack of enthusiasm for searching out low-cost coal gas and shale gas deposits in the fatherland. The country now endures an aggregate price of 32 cents/kilowatt-hour vs. a US price of about 10 cents/kWh. The bad news is that this already elevated German rate is slated to increase further in the next year, by another 50%, to a level of 48 cents/kWh.

    Making it through presumes the good will of Germany’s neighboring countries, which face their own energy challenges. Approximately 20% of Germany’s power is currently provided by renewable sources, primarily wind and solar. Germany’s neighbors complain that the country is exporting the grid instability associated with its “green” policies. It’s gotten so bad that the country, which loathes nuclear power, is actually expanding the use of coal-fired generation. In essence, coal-fired generation is growing in Germany at the expense of higher-cost natural gas generation. (The silver lining is that the U.S. is supplying the extra low-cost coal required.) Naturally, Germany’s CO2 and particulate targets are not being met, while the equivalent US targets are being met ahead of schedule.

    Not surprisingly, the German government is now backtracking, because its economy cannot support, from a technical or economic perspective, the current level of installed renewables. Angela Merkel has recently called for a more balanced approach to power generation. That will probably mean a policy of diverting subsidies and preferential treatment from solar and wind to natural gas and hydro.

    The Current Status in the US

    Back here in the US, we’ve managed to spend $97 billion or so on government-funded wind and solar projects that certainly will not survive without operating subsidies, feed-in tariffs, preferential access to the grid and production mandates.

    Fortunately, the US is upgrading our power generation fleet by building new, unsubsidized, gas-fired generation plants throughout the country. We are also seeing new pipeline and grid infrastructure coming to market along with significant expansions of our refining and petrochemical manufacturing facilities, exploiting nonconventional hydrocarbon resources. The bulk of this expenditure is being managed with minimal federal financial support.

    However, adverse government regulation of fracking could bring the shale gas bandwagon to a sudden halt. (Beyond that, a measurable, multi-year slowdown in permits for new gas pipelines is also having a deleterious effect.)

    Recognizing the risks, shale gas proponents are taking another approach. Having apparently convinced the pragmatists and the fatalists of the benefits of natural gas, they are now beginning to spend significant sums in an effort to educate the general electorate and thereby isolate the diehard ideologues.

    Fortunately, the majority of the environmental community is not made up of latter-day Luddites bent on destroying western civilization, just as the majority of the oil and gas industry is not made up of barbarians seeking to plunder the environment. The majority of the population consistently supports measured progress on both the environmental and economic fronts.

    The challenge now is to grow support for environmental compromises that produce favorable results for everyone. We still live in a democracy where everyone gets to vote and to have his or her say. However, we do not live in an “Alice in Wonderland” world where everyone can create his own reality. Germany is already facing the downside of listening to its ideological enthusiasts. Let’s take the German lesson to heart, and embrace a more pragmatic approach. It is, after all, the American way.

    Eric Smith is a Professor of Practice at the A.B. Freeman School of Business at Tulane University. He serves as the Associate Director of the Tulane Energy Institute. He is a Chemical Engineer and has an MBA from the A. B. Freeman School at Tulane University. 

  • How Green Are Millennials?

    Besides his history-making embrace of full equality for gays and lesbians, the most surprising part of President Barack Obama’s Second Inaugural Address may have been the emphasis placed on dealing with the challenge of climate change. The president devoted almost three whole paragraphs, more than for any other single issue, to the topic. His remarks suggested that America’s economic future depended on the country leading the transition to sustainable energy sources and that “the failure to do so would betray our children and future generations.”

    Different generations reacted differently to the speech. The President’s rhetoric seemed like standard liberal fare to many Baby Boomers (born 1945-1965), who either vehemently agreed or disagreed with what Obama had to say depending on their political ideology. But members of the Millennial Generation (born 1982-2003) were in almost unanimous agreement with the way the President defined the context of this challenge. It was as if he was channeling the thinking of Millennials such as David Weinberger at the Roosevelt Institute’s Campus Network (RICN) who wrote, almost a year ago, “Millennials view environmental protection more as a value to be incorporated into all policymaking than as its own, isolated discipline. We are concerned with economic growth, job creation, enhancing public health, bolstering educational achievement, and national security and diplomacy. Young people recognize that each of these concerns is inextricably tied to the environment.”

    President Obama was also right, from a Millennial perspective, to emphasize the need for America to become a leader in sustainable energy technologies. Seventy-one percent of Millennials believe America’s energy policy should focus on developing “alternative sources of energy such as wind, solar and hydrogen technology”; only a quarter believes that it should focus on “expanding exploration and production of oil, coal and natural gas.” Similarly, the RICN’s “Blueprint for a Millennial America,” a report prepared by thousands of Millennials who participated in their “Think 2040” project, placed the development and usage of renewable sources of energy above all other environmental initiatives.

    The participants’ proposed solutions to the challenge, however, were not focused on the kind of top-down change so common to Boomers. Instead the proposals emphasized taking action at the community level. No one, the RICN blueprint said, should be asked to “make sacrifices without fully considering the cost to communities” whose “texture” is most likely to be impacted in dealing with the challenge.

    Many politicians fail to notice this unique Millennial perspective. Members of the generation disagree sharply with their elders on the best way to address environmental challenges, preferring to tackle them through individual initiative and grassroots action rather than a heavy-handed, top-down bureaucratic approach.

    Of course, Millennials are the most environmentally conscious generation in the nation’s history. Almost two-thirds of Millennials believe global warming is real and 43% of them think that it is caused by human activity, levels much higher than among all other generations. But, as Weinberger also wrote, “While environmentalists of years past were primarily aiming to bring clean air and clean water concerns into the national policymaking calculus, environmentalists today are far more worried about solving global problems like climate change by using local environmental solutions.”

    Adopting a Millennial approach to dealing with global warming would mark a major change for the Administration. All four of Obama’s first-term environmental policy heavyweights were Boomers, whose preference for top-down dictates was evident in almost every decision they made. Secretary of the Interior Ken Salazar established new controls on offshore oil drilling that satisfied neither side. Secretary of Energy Steven Chu tried to jump-start the development of renewable energy technologies in the United States by funding startups with dubious chances of marketplace success. And, most conspicuously, EPA Administrator Lisa Jackson’s plans for regulating smog were rejected by the President. Fortunately, all of them have announced plans to leave their posts. They will follow in the footsteps of environmental czar Carol Browner, who left two years ago after a less than stellar performance during the Deepwater Horizon drilling disaster.

    There is talk within the administration of subtle changes in policy. The departure of this quartet of ideologically-driven Boomers gives the President an excellent opportunity to appoint a new team to execute his vision for meeting the environmental challenges of our time.

    President Obama’s new team will have to continue to link the need to develop U.S. energy production to both environmental concerns and economic development. It will need to couch the call for progress on reducing carbon dioxide emissions in the context of strengthening, not weakening, local communities and preserving the nation’s natural resources. Just who the president finds to take on this politically nuanced task will say a great deal about his sensitivity to his Millennial Generation supporters’ attitudes and beliefs. It will also foretell a great deal about how successful he will be in matching the lofty rhetoric of his Second Inaugural Address with today’s political realities during his final term in office.

    Morley Winograd and Michael D. Hais are co-authors of the newly published Millennial Momentum: How a New Generation is Remaking America and Millennial Makeover: MySpace, YouTube, and the Future of American Politics and fellows of NDN and the New Policy Institute.

    Photo by gfpeck

  • The California-China-CO2 Connection

    Michael Peevey, President of the California Public Utilities Commission, is sincere and concerned about CO2 emissions. At a recent presentation at California State University Channel Islands, he spoke about California’s efforts to limit emissions. He mentioned green jobs, but, to his credit, he did not repeat the debunked claim that restricting CO2 emissions will be a net job creator. He also acknowledged that it doesn’t much matter what California does if China doesn’t change its behavior. It turns out that if California were to reduce its carbon emissions to zero, global CO2 emissions would be back to today’s level in about a year and a half, just because of the growth in China’s emissions.
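
    The “year and a half” claim is a simple ratio of California’s annual emissions to the annual growth in China’s emissions. The sketch below works through that arithmetic with round illustrative figures of our own; they are assumptions for the sake of the example, not numbers from Peevey or this article.

    ```python
    # Rough sketch of the "about a year and a half" arithmetic.
    # Both figures are illustrative assumptions, not official inventory numbers.
    california_annual_emissions_mt = 350   # assumed California CO2 emissions, million metric tons/year
    china_annual_growth_mt = 230           # assumed annual growth in China's CO2 emissions, Mt/year

    years_to_offset = california_annual_emissions_mt / china_annual_growth_mt
    print(f"China's growth alone would replace a zeroed-out California in ~{years_to_offset:.1f} years")
    # With these assumptions: about 1.5 years, which is the point Peevey concedes above.
    ```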

    Peevey talked about California’s increasingly ambitious plans for carbon reduction in the future. The goals include returning to 1990-level CO2 emissions by 2020, and then an 80 percent reduction by 2050, regardless of population changes.

    This is going to be expensive. And the price of some of the potential technology — such as capturing atmospheric CO2 and pumping it underground — will include a lot more than the direct cost. The ultimate costs will, unfortunately, include increased global CO2 emissions.

    Some readers will remember the first time Larry Summers, the former US Treasury Secretary (under Bill Clinton), put his public career at risk because of his bluntness. In 1991, while Chief Economist at the World Bank, Summers gained international notoriety by saying in a memo, “I’ve always thought that under-populated countries in Africa are vastly under-polluted.”

    That was the first of many times that lots of people demanded his head. He’s since claimed that it was sarcasm, but I don’t believe it. I believe he meant that environmental quality is a luxury good; that poor people need things like food and shelter, and they don’t much care if they trash the environment in the process. So, if pollution were localized, the poor would gain jobs and the wealthy would have an improved environment. Presumably, each would be happier.

    Of course, that sounds terrible to most people. But that’s precisely what we are doing here in California, only we’re doing it worse.

    California, by making production so very expensive, is chasing producers to places with weak pollution controls. It’s worse than the situation Summers describes, because carbon dioxide emissions do not remain local. They spread throughout the atmosphere. Perversely, California is causing a global increase in CO2 emissions through its regulations limiting CO2 emissions in California.

    The problem is the result of acting on the concept of Think Globally and Act Locally (TGAL). TGAL works when pollution is local. But when air pollution is free to float around the world, you have to have a different strategy, and get the most reduction for your investment.

    And you don’t get the most for your investment in California. In terms of carbon efficiency — the ability to generate output while emitting less CO2 — California is one of the world’s most efficient economies. Each new reduction in CO2 becomes increasingly expensive. That is, reducing emissions is subject to increasing marginal costs. Reducing carbon emission in California is really expensive because we’re so carbon efficient already. Reaching the 2050 goal will be incredibly expensive. Worse, it won’t do any good.

    It’s not as if California can really afford it. Last month, I participated in the Southern California Association of Governments (SCAG) Third Annual Economic Summit. This great event provided lots of information about the economic challenges facing Southern California. For example, we learned that Los Angeles County’s economy will probably not reach its pre-recession level of jobs until at least 2018, and perhaps not until 2020.

    That’s a sobering thought.

    California State Sen. Roderick Wright, D-Los Angeles, a powerful speaker, documented California’s industrial decline and made an emotional appeal for policies that produce jobs. The audience gave Wright a rousing ovation, something quite rare at economic conferences. The problem is that the audience was composed of economic development people. Too bad no one else was listening. The event was poorly attended by policy makers; there were only a handful of elected officials.

    California’s economy is struggling, even if many in the political class refuse to acknowledge the fact. Because of that, our investments need to be wise. The correct strategy for California is global. We need to go looking for the low-hanging fruit.

    The low-hanging fruit is mostly in developing countries like China, India and Brazil. We’ve tried to get them to cut their emissions at Kyoto and the like, but they refused, pointing out that they are much poorer than the West, and that we were able to develop with lower-cost polluting industries. They have a point.

    We should help them cut their carbon emissions. Reducing a ton of CO2 emissions is far cheaper in China than in California. So, let’s reduce it there.
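
    The case rests on the gap in marginal abatement costs. Here is a minimal sketch using hypothetical per-ton costs (placeholders for illustration, not estimates from this article):

    ```python
    # Illustrative comparison of how much abatement a fixed budget buys at different
    # marginal costs. The per-ton costs below are hypothetical placeholders.
    budget_usd = 1_000_000
    cost_per_ton_california = 100.0   # assumed marginal cost in an already carbon-efficient economy
    cost_per_ton_china = 20.0         # assumed marginal cost where low-hanging fruit remains

    tons_california = budget_usd / cost_per_ton_california
    tons_china = budget_usd / cost_per_ton_china
    print(f"$1M buys {tons_california:,.0f} tons of CO2 reduction in California")
    print(f"$1M buys {tons_china:,.0f} tons in China ({tons_china / tons_california:.0f}x more)")
    ```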

    There are political problems with this proposal. California’s carbon regulations were sold to the people on the absurd claim that the regulations would be profitable: better than low cost, better than a free lunch.

    The bigger problem would be convincing California voters to tax themselves to clean up Chinese factories. That seems to me to be an information dissemination problem. If Californians knew the true cost of the existing program, and how little reduction in global CO2 concentrations it brings, they might logically be willing to look at other approaches. If they knew how much more effective a dollar spent on Chinese emissions was than a dollar spent on California emissions, they might seriously consider the proposal. The proposal could always be sweetened by requiring that all the work be done by California companies.

    It would be good for Californians. It would be a big step towards restoring California’s economic vigor. It would make a serious dent in global CO2 concentration. It would be less costly than our current plan.

    Let’s do it.

    Bill Watkins is a professor at California Lutheran University and runs the Center for Economic Research and Forecasting, which can be found at clucerf.org.

    Flickr photo by doc tobin: Smog on the Great Wall.

  • What Stifles Good Housing Development?

    We can’t afford outmoded attitudes in housing development anymore – not as businesses, not as citizens, and certainly not as development professionals. As development consultants, we’re often asked to provide detailed input on project design and the marketing of developments throughout the United States and Canada. We usually work with a local team of engineering consultants that provides construction drawings and serves as an intermediary for the project with local governments. We have concluded that the selection of the engineering consultant is one of the pivotal decisions for the success of a development. The developer has to be the one to hold the engineers accountable. Otherwise, all design will continue to be done to minimum standards instead of excellence.

    Problems with the consulting engineer generally fall into two broad groups: complacency and undisclosed conflicts of interest. To illustrate, we’ll look at two recent examples from projects owned by clients of Rick Harrison Site Design Studio.

    The first involves a small proposed neighborhood in Texas. The initial design was drafted before either the site boundaries or floodplain were accurately surveyed, and yielded a total of 35 lots of 0.6 acres or more. Rick prepared an initial revision of the original design, resulting in a more aesthetically pleasing and efficient neighborhood, while maintaining the 0.6 acre minimum lot size. Accurate boundary lines, contours and floodplain were eventually furnished to create a precision plat for submittal. The developer requested that Rick update the revised design, and indicated that he was willing to sacrifice one of the lots in order to allow a more spacious entrance.

    While preparing the precision plat, Rick realized that he didn’t know why the lots were at least 0.6 acres instead of the more common 0.5 acres for lots without city sewer. In two rounds of questioning, the consulting engineer indicated that the minimum lot size was 0.6 acres, or 26,000 ft². The area of six-tenths of an acre is actually 26,136 ft², so Rick questioned the engineer again. This time, the engineer explained that the minimum lot size was actually 0.5 acres, but his firm had developed a “rule of thumb” that a 26,000 ft² lot would contain 0.5 acres net of easement areas. However, in the specific case of this development, the only easement required was a 12’-wide utility easement along the front lot line. The extra 0.1 acres per lot was a “fudge factor,” developed over time to compensate for the well-known difficulty of computing precise lot sizes using existing CAD software.

    The “land surface based” technology Rick used to create the revised design requires no additional time to obtain precise areas, so he was able to easily design each lot to meet the actual 21,780 ft² (half-acre) minimum exclusive of the 12’ easement. The new design eliminated the fudge factor and yielded 37 lots, including the more open entrance area (three more than expected). Furthermore, the total length of street was reduced.
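
    The arithmetic behind the fudge factor is easy to check. A minimal sketch, using only the unit conversion and the figures already given above:

    ```python
    # Lot-size arithmetic behind the "fudge factor" described above.
    SQFT_PER_ACRE = 43_560

    half_acre_sqft = 0.5 * SQFT_PER_ACRE        # 21,780 ft^2 -- the actual minimum
    six_tenths_acre_sqft = 0.6 * SQFT_PER_ACRE  # 26,136 ft^2
    rule_of_thumb_sqft = 26_000                 # the engineer's rounded rule of thumb

    fudge_per_lot = rule_of_thumb_sqft - half_acre_sqft
    print(f"Half acre = {half_acre_sqft:,.0f} ft^2; six-tenths = {six_tenths_acre_sqft:,.0f} ft^2")
    print(f"Fudge factor: {fudge_per_lot:,.0f} ft^2 per lot (~{fudge_per_lot / SQFT_PER_ACRE:.2f} acre)")
    # About 4,220 ft^2 of extra land per lot -- roughly the 0.1 acre described above.
    ```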

    “Fudge factors” are rules of thumb intended to make the engineer’s work easier, and to provide enough margin in the plans to account for omissions or miscalculations. The problem with fudge factors is that they adversely impact the profitability of their clients’ projects. The table below demonstrates the differences:

     

    |                | Initial Plan        | Revised Plan | Difference                 |
    |----------------|---------------------|--------------|----------------------------|
    | Lot size       | 0.6 acres (minimum) | 0.5 acres    | At least 4,356 ft² per lot |
    | Number of lots | 34                  | 37           | 3                          |
    | Lot value      | $75,000             | $75,000      |                            |
    | Gross sales    | $2,550,000          | $2,775,000   | $225,000                   |
    | Pavement area  | 89,479 ft²          | 81,509 ft²   | 7,970 ft²                  |
    | Estimated cost | $447,400            | $407,550     | $39,850                    |

    Eliminating an imprecise fudge factor would yield a $225,000 increase in gross sales. Since the only increase in costs was per-lot consulting fees, almost all of that added revenue would drop straight to the bottom line. In addition, the community would benefit from a more attractive neighborhood with substantially less street pavement to maintain in perpetuity, and a higher property tax base. Had the developer stuck with the original plan and been unwilling to sacrifice profits, the price of each lot to the consumer would have had to increase by about $6,600.
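
    The revenue arithmetic is straightforward; a minimal sketch using only the figures shown in the table above:

    ```python
    # Revenue arithmetic behind the table above.
    lot_value = 75_000
    lots_initial, lots_revised = 34, 37

    added_gross = (lots_revised - lots_initial) * lot_value   # $225,000 in extra gross sales
    pavement_savings = 447_400 - 407_550                      # $39,850 less estimated paving cost

    # Price increase needed on the 34-lot plan to match the revised plan's gross sales
    price_increase_per_lot = added_gross / lots_initial
    print(f"Added gross sales: ${added_gross:,}; paving savings: ${pavement_savings:,}")
    print(f"Equivalent price increase on 34 lots: ~${price_increase_per_lot:,.0f} per lot")
    # Roughly $6,600 per lot -- the pass-through to the consumer described above.
    ```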

    The second example concerns another proposed residential development, this one in North Dakota, in a city prone to severe flooding. As most people know, paved areas do not absorb rainfall, so it would seem logical that the more pavement area in a new development here, the bigger the potential for runoff, which leads to more flooding. In addition, the wider the streets, the more surface area the city has to snowplow and maintain. All these issues – the snowplowing, the road maintenance and the increased water runoff – are burdens to current and future taxpayers, with no discernible benefits to offset the burden. So imagine Rick’s surprise when the consulting engineer refused to even submit a plan for 50-foot-wide rights-of-way with 28-foot-wide street sections, instead of the 66-foot-wide rights-of-way with 37-foot-wide street sections, as specified by existing city regulations.

    To understand the issue, look at the origin of the standard street width requirement. Centuries ago, roads were unpaved, and were built with wide ditches to handle drainage alongside them. The 66 foot length reflected a land surveyor’s chain, developed in the year 1620 by a British clergyman interested in developing a system that would use easily available tools to survey land in the British countryside. His system caught on, and was brought to the New World by British immigrants and used for hundreds of years. Perhaps as recently as 100 years ago it made sense to use a single surveyors’ chain as the width of community streets, and so many towns did so. Today, most cities have eliminated drainage ditches in modern subdivisions, replacing them with storm sewers and more efficient design. These changes have allowed narrower street and pavement widths, with positive cost and environmental impacts.

    So — the minimum street right-of-way in this modern North Dakota city is the result of a decision to make roads 66 feet wide, due to the fact that 400 years ago an English clergyman connected a hundred links, each 7.92 inches long, to make a convenient, 66-foot-long “chain.” To our knowledge, there is no other reason.

    Given that surrounding cities have adopted modern standards, and that the logic behind narrower streets is solid, Rick could have presented a compelling case. But the engineer refused to even make the proposal. Why not propose a common-sense solution? Complacency? Perhaps. The desire to comply with every regulation to avoid conflict? More likely. Are the engineer’s fees based on a percentage of construction cost, with wider streets guaranteeing higher fees? Also likely.

    Unsustainable? Absolutely.

    In the first example, outmoded rules of thumb related to inadequate CAD technology would have cost Rick’s client at least $250,000, and would have burdened the local county government with a significantly diminished potential property tax base. In the second example, the engineer’s lack of concern for the long-term benefit of his client (with whom he has a contractual or fiduciary relationship) and for the public (to whom he has a professional responsibility) has burdened the community with exaggerated flooding problems and approximately 33% more pavement to be snowplowed and maintained for as long as the community exists.

    We can’t keep fudging to hide poor practices. If we are ever to achieve a more sustainable world and create better communities and housing products, we simply cannot accept mediocre design, technology and attitude.

    Rick Harrison is President of Rick Harrison Site Design Studio and Neighborhood Innovations, LLC. He is author of Prefurbia: Reinventing The Suburbs From Disdainable To Sustainable and creator of Performance Planning System. His websites are rhsdplanning.com and pps-vr.com. Skip Preble, MAI, CCIM is a real estate analyst and land development consultant specializing in market analysis, feasibility studies, project value optimization and market value opinions. He can be reached through his website, landanlytics.com.

    Flickr Photo by Billy Hunt: “This is from my photo essay observing the course of development in Charlottesville, Virginia“.

  • Uniting a Fractured Republic: Innovation, Pragmatism, and the Natural Gas Revolution

    Over the last four years, emissions in the United States declined more than in any other country in the world. Coal plants and coal mines are being shuttered. That’s not from increased use of solar panels and wind turbines, as laudable as those technologies are. Rather, it’s due in large measure to the technological revolution allowing for the cheap extraction of natural gas from shale. By contrast, Europe, with its cap-and-trade program and price on carbon, is returning to coal-burning.

    Could President Obama, during his second term in office, turn this homegrown success story into a paradigm-shifting climate strategy? In a speech we gave to the Colorado Oil and Gas Association yesterday, we argued that, after a season of ugly ideological polarization, politicians, environmentalists, and the gas industry have a chance to hit the reset button on energy politics.

    This will require the natural gas industry to clean up its act, accepting better regulations, cracking down on bad actors, and preventing the leakage of methane, a potent greenhouse gas. It will require environmentalists to consider whether there might be a different path to significant emissions reductions from the one they have pursued over the last 20 years. And it will require Left and Right to put a halt to the tribalism that has characterized the national debate over climate and energy. 

    — Michael and Ted

    Uniting a Fractured Republic

    Innovation, Pragmatism, and the Natural Gas Revolution

    by Ted Nordhaus and Michael Shellenberger

    In 1981, George Mitchell, an independent Texas natural gas entrepreneur, realized that his shallow gas wells in the Barnett were running dry. He had millions of dollars in sunk investment in equipment and was looking for a way to generate more return on it. Mitchell was then a relatively small player in an industry that, by its own reckoning, was in decline. Conventional gas reserves were limited and were getting increasingly played out.

    As he considered how he might save his operation, Mitchell turned his attention to shale. Drillers had been drilling shale since the early 19th Century, but mostly they drilled right through it to get to limestone and other formations. Dan Jarvey, a consultant to Mitchell at the time, told us, "When you look at a [gas drilling] log from the 1930s or 1950s or 1970s it is noted as a ‘gas kick’ or ‘shale gas kick.’ Most categorized it as ‘It’s just a shale gas kick’ – as in, ‘to be expected, but to be ignored.’"

    As Mitchell embarked on his 20-year quest to crack the shale gas code, most of his colleagues in the gas industry thought he was crazy. But Mitchell persisted and his efforts would ultimately culminate in today’s natural gas revolution.

    In doing so, Mitchell upended longstanding assumptions about the future of energy. Just a few years ago, the conventional wisdom was that no source of electricity could be cheaper than coal. Today, in the U.S., natural gas is cheaper. As a result, coal’s share of electricity generated went from over 50 percent in 2005 to 36 percent in 2012. While global coal use continues to rise, the U.S. is at present leaving much of it in the ground. Meanwhile, estimates of recoverable natural gas resources in the United States have grown by three quarters, from 200 trillion cubic feet in 2005 to 350 trillion cubic feet today.

    The implications for those of us concerned about climate change are also significant. Leaving coal in the ground has been the longstanding goal of those of us concerned about global warming. Natural gas emits 45 percent less carbon dioxide than coal. In large part due to the glut of natural gas, U.S. carbon dioxide emissions will have declined more between 2008 and 2012 than those of any other country in the world — an astonishing 500 million metric tons out of 6 billion, according to the Energy Information Administration.

    While we don’t imagine that any of this is news to most of you in this audience, there is another part of the story that might be. That is the story of the ways in which both the gas industry and the federal government helped Mitchell along the way. In these intensely polarized times, when it seems that almost everyone imagines that either government or corporations are the enemy, and it seems impossible to imagine that the two might actually work together to further the public interest, there are important lessons here too.

    1.
    As Mitchell considered trying his hand at shale, he cast about to see what was known at the time about how to get gas out of shale. A geophysicist who worked with Mitchell recalled telling him that, "It looks similar to the Devonian [shale back east], and the government’s done all this work on the Devonian."

    The work Mitchell’s geophysicist was referring to was the Eastern Gas Shales Project, which was started in 1976 by President Ford. The Shales Project was just one of several aggressive government-led efforts to accelerate technology innovation to increase oil and gas production. Already in 1974 the Bureau of Mines was funding the study of underground fracture formations, enhanced recovery of oil through fluid injection, and the recovery of oil from tar sands. One year later, the government funded the first massive hydrofracking at test sites in California, Wyoming and West Virginia, as well as "directionally deviated well-drilling techniques" for both oil and gas drilling.

    The mandate from Congress was for government scientists and engineers to hire private contractors rather than do the work in-house. This was consistent with the tradition of the Bureau of Mines, which would set up trailers around the country to support oil, coal and gas entrepreneurs. This strategy contrasted with the government’s nuclear energy R&D work, which had been hierarchical since its birth in the military’s Manhattan project. This decentralization proved wise, as it ensured that the information would rapidly reach entrepreneurs in the field and not gather dust inside of a federal bureaucracy.

    From early on, Mitchell and his team relied heavily on information coming out of the Eastern Gas Shales project. "We were all reading the DOE papers trying to figure out what the DOE had found in the Eastern Gas Shales," Mitchell geologist Dan Steward told us, "and it wasn’t until 1986 that we concluded that we don’t have open fractures, and that we were making production out of tight shales."

    Through the 1980s, Mitchell didn’t want to ask the government – or the Gas Research Institute, which was funded by a fee on gas pipeline shipments to coordinate government research with experiments being conducted by entrepreneurs in the field – for help because he worried that he wouldn’t be able to take full advantage of the investment he was making in innovation.

    But by the early 1990s Mitchell had concluded that he needed the government’s help, and turned to DOE and the publicly-funded Gas Research Institute for technical assistance. The Gas Research Institute, which had worked with other industry partners to demonstrate the first horizontal fracks, subsidized Mitchell’s first horizontal well. Sandia National Labs provided high-tech underground mapping and supercomputers and a team to help Mitchell interpret the results. Mitchell’s twenty-year quest was also made possible by a $10 billion, 20-year tax credit provided by Congress to subsidize unconventional gas, which was too expensive and risky for most private firms to experiment with otherwise.

    By 2000, the combination of technologies needed to cheaply frack shale was firmly in place. The final piece of the puzzle was the sale of Mitchell Energy to Devon Energy, which scaled up the use of horizontal wells. Over the next ten years this combination of technologies would spread across the country, resulting in today’s natural gas glut.

    Though the collaboration between Mitchell and the government was one of the most fruitful public-private partnerships in American history, it was mostly unknown until we started interviewing the key players involved around this time last year.

    After our findings were verified by other researchers and reporters, including the New York Times and the Associated Press, some in the oil and gas industry, like T. Boone Pickens, have tried to downplay the government’s role.

    But the pioneers of this technology have been forthright. "I’m conservative as hell," Mitchell’s former Vice President Dan Steward told us, but DOE "did a hell of a lot of work and I can’t give them enough credit… You cannot diminish DOE’s involvement." Fred Julander said, “The Department of Energy was there with research funding when no one else was interested and today we are all reaping the benefits." 

    2.
    Today marks the end of one of the most divisive chapters in American political history. There is more partisan polarization in Congress than at any time since Reconstruction. There are vanishingly few swing voters. And the ideological divide between liberals and conservatives at times appears unbridgeable.

    One of the most insidious aspects of today’s political polarization is the way gross exaggerations turn into ossified caricatures. Left and Right view the other as ignorant, insane, or immoral.
    From the Right we have heard that President Obama is taking the country toward socialism, and that Big Government is destroying the American dream. From the Left we have heard that Governor Romney would have exported all our jobs to China and turned Congress over to Big Business. Where this downward spiral takes us is to the conclusion that America is fundamentally broken. The two great institutions of American life — business and government — are viewed by one side or the other as corrupt and nefarious.

    Few issues have become more polarizing than energy. Both sides have taken ever more extreme positions. Prominent conservatives have exaggerated both the size of Obama’s clean energy investments and the number of bankruptcies. They have described global warming and other environmental problems as either not happening or not worth worrying about. Some environmentalists have taken the opposite tack, exaggerating the negative impacts of gas drilling, downplaying the benefits, and accusing anyone who disagrees with them of being on the take.

    As we say in California — everyone needs to chill out. There is too much at stake for America, our environment, and our economy, for such hyper-partisanship to continue.

    In our rush to point fingers and interpret everything in catastrophic terms, we have lost sight of the fact that we are the richest nation on earth, and one with improving environmental quality, precisely because the private sector and the government have worked so well together. The failures of Big Business and Big Government should be put in their appropriate historical context.

    When the Colorado Oil and Gas Association asked us to give this speech at its conference the day after the election, we agreed on two conditions: that we pay our own way and that COGA invite local environmental and elected leaders to attend. We are glad to see them in the audience, because we need a common dialogue.

    As two individuals who came out of the environmental movement, where we spent most of our careers, we are best known for our writings calling for reform and renovation of green politics. In particular, we have advocated that environmentalists drop their apocalyptic rhetoric, which is self-defeating and obscures the very real environmental problems we face.

    And we have argued that environmentalists have been overly focused on regulations, when our focus should also be on revolutionary technological innovation, which is needed to make clean energy and other environmental technologies much cheaper, so that all seven going on 10 billion humans can live modern, prosperous lives on an ecologically vibrant planet.

    But our work has also focused on reminding private investors and corporate executives of the critical role played by the government in creating our national wealth. While economists have long recognized that innovation is responsible for most of our economic growth, few realize that many of our world-changing innovations would have been unlikely to occur without government support. A short list of recognizable technological innovations includes interchangeable parts, computers, the Internet, jet engines, nuclear power and every other major energy technology.

    Consider the information revolution. The government funded the R&D and bought 80 percent of the first microchips. The Internet started out as a federally funded program to connect government computer networks. Every major technology in the iPhone can be traced to some connection with government funding. The driverless car that Google has invented relies on technologies that came out of government innovation programs.

    While high-tech executives our age or younger are largely unaware of the government roots of the IT revolution, the old-timers of Silicon Valley are not, and they frequently express their gratitude for it.

    While interviewing the participants of the shale gas revolution, we were struck by how much respect and deference each side gave to the other. In many cases the government scientists and engineers acted as consultants to private firms like Mitchell’s — "We never forgot who the customer was," said Alex Crawley, who ran the DOE’s fossil innovation program for many years.

    As environmentalists, we were taught to be suspicious of such cozy relationships between industry and government workers, and to believe that government could not simultaneously promote an industry and regulate it. But when it comes to technology innovation, those cozy relationships, and the revolving door between government agencies, whether DoD or DoE, and private companies like Mitchell Energy, are absolutely essential to allowing knowledge to spill over rapidly and flow throughout the sector.

    And yet, there is also an important role for regulation, not only to protect the public from accidents and environmental degradation, but also to improve technologies and promote better practices throughout the industry. Wise regulation in the long run promotes, rather than hinders, the spread of new technologies and new industries, and this has never been more true than in the case of fracking. While US gas production has taken off, many European nations banned fracking for fear of the local environmental impacts and have started to return to burning coal.

    Last August, George Mitchell and New York Mayor Michael Bloomberg announced they would fund a large effort by the states to establish better fracking practices. They called for stronger control of methane leaks and other air pollution, the disclosure of chemicals used in fracking, optimizing rules for well construction, minimizing water use and properly disposing of waste water, and reducing the impact of gas on communities, roads, and the environment.

    You would be hard pressed to find very many Americans who would call those reforms unreasonable. They are the kinds of things that die-hard anti-fracking activists and much of the natural gas industry could agree to. And indeed, states like Colorado, and environmental groups like the Environmental Defense Fund, deserve credit for bringing regulators and the gas industry together to improve practices. By squarely addressing the methane leakage problem, and reducing the local environmental impacts, the government and the industry can make natural gas an even more obviously better alternative to coal.

    And the good news is that reducing methane leakage is something the industry already knows how to do. Little innovation is required to make sure that old pipelines are not leaking, and that new cement jobs are done properly. Similarly, responsible disposal of fracking fluids is not rocket science, it is something that the oil and gas industry does routinely in other contexts. Promising efforts are also underway to develop more environmentally sound fracking fluids and to further minimize water usage.

    There are costs, of course, associated with all of these efforts. But if the history of fracking proves anything, it is that costs will come down quickly. Indeed, if history is any guide, we will see great improvements to fracking technologies and techniques over the next 30 years that will be mutually beneficial to the industry, the public, and the environment, for the history of the shale gas revolution has been a history of incremental improvements to the technology. The water intensity of fracking, for instance, was originally not an environmental problem for drillers but an economic one. Only once Mitchell and others developed methods that required vastly less water to crack the shale did fracking become economically viable.

    For all of these reasons, we should both regulate fracking fairly and effectively, and also continue to support innovation to improve unconventional gas technologies. Doing so will help assure a future for gas beyond the precincts in which it is already well established. We also need to support innovation in new gas technologies well beyond fracking practices to include carbon capture and storage, which is more viable economically and technologically for gas than for coal, because gas plants are more efficient, and the emissions stream much purer. In a world in which there may remain significant obstacles to moving entirely away from fossil fuels, gas CCS looks much more viable than coal CCS. As such, we need government and the gas industry to work together to demonstrate carbon capture technologies at sites around the country, similar to how we conducted the Eastern Gas Shales Project.

    And the gas industry should support innovation beyond natural gas to include support for innovation in renewables, nuclear and other environmentally important technologies. Championing energy innovation more broadly would do more for the industry than the millions it is currently spending on slick 30-second TV ads and will remind Americans that supporting gas as well as renewables is not a zero sum proposition. Getting our energy from a diversity of sources is in the national interest and gas will thrive for a long time regardless of the energy mix. Moreover, until we have cheap utility scale storage, renewables need cheap gas for backup.

    For all of this to happen, the gas industry and environmentalists alike must change their posture toward regulation. While it is the goal of a small number of us to rid the world of particular practices, whether shale-fracking or atom-splitting, most of the rest of us want to improve them.

    Over the last 10 years, our message to the environmental movement has been that it must change its attitude toward technological innovation. Technologies are not essentially good or bad but rather in a process of continuous improvement. But there is another side to that story that industry must remember. Regulations that are often bitterly opposed sometimes end up being a boon for industry, paving the way for the broad acceptance of new technologies and pushing firms to improve those technologies in ways that make them more economical as well as more environmental.

    In closing we’d like to invoke the title essay of our last e-book, “Love Your Monsters,” which was written by one of our Senior Fellows, a well-known French anthropologist named Bruno Latour. In the essay, Latour monkey-wrenches the Frankenstein fable. The sin of Dr. Frankenstein, according to Latour, was not creating the monster, but rather abandoning him when he turned out to be flawed. We must learn to love our technologies as we do our children, he concluded, constantly helping and improving them. In so doing, we too become all the wiser.

    As we consider the implications of the gas revolution for the future of both our energy economy and our environment, we should commit ourselves to the larger effort of improving our technological creations. In the process, the gas industry and the environmental movement might together update the concept of sustainability for the 21st century. We should seek not to put limits on the aspirations of the 1.5 billion people who still lack access to electricity, nor on the billions more yearning for enough power to run washing machines and refrigerators. Nor should we seek to sustain today’s energy technologies in perpetuity. Rather, we should embrace technological innovation as the key to creating cleaner and better substitutes for today’s energy and non-energy resources alike, so that we might sustain human civilization far into the future.

  • The Rise of the Great Plains: Regional Opportunity in the 21st Century

    This is the introduction to a new report on the future of the American Great Plains released today by Texas Tech University (TTU). The report was authored by Joel Kotkin; Delore Zimmerman, Mark Schill, and Matthew Leiphon of Praxis Strategy Group; and Kevin Mulligan of TTU. Visit TTU’s page to download the full report, read the online version, or to check out the interactive online atlas of the region containing economic, demographic, and geographic data.

    For much of the past century, the vast expanse known as the Great Plains has been largely written off as a bit player on the American stage. As the nation has urbanized and turned increasingly toward a service- and technology-based economy, the semi-arid region between the Mississippi Valley and the Rockies has been described as little more than a misadventure best left undone.

    Much of the media portray the Great Plains as a desiccated, lost world of emptying towns, meth labs, and Native Americans about to reclaim a place best left to the forces of nature. “Much of North Dakota has a ghostly feel to it,” wrote Tim Egan in the New York Times in 2006. This picture of the region has been a consistent theme in media coverage for much of the past few decades.

    In a call to reverse national policy that had for two centuries promoted growth, two New Jersey academics, Frank J. Popper and Deborah Popper, proposed that Washington accelerate the depopulation of the Plains and create “the ultimate national park.” They suggested the government return the land and communities to a “buffalo commons,” claiming that development of the Plains constitutes “the largest, longest-running agricultural and environmental miscalculation in American history.” They predicted the region would “become almost totally depopulated.”

    Our research shows that the Great Plains, far from dying, is in the midst of a historic recovery. While the area we have studied encompasses portions of thirteen states, our focus here is on ten core states: North Dakota, South Dakota, Nebraska, Kansas, Oklahoma, Texas, New Mexico, Colorado, Wyoming, and Montana.

    Rather than declining, over the past decade the area has surpassed national norms in everything from population increase to income and job growth. After generations of net out-migration, the entire region now enjoys net in-migration from other states, as well as increased immigration from around the world. Remarkably, for an area that has long suffered from aging, the bulk of this new migration consists of younger families and their children.

    No less striking has been the rapid improvement in the region’s economy. Paced by strong growth in agriculture, manufacturing and energy — as well as a growing tech sector — the Great Plains now boasts the lowest unemployment rate of any region. North Dakota, South Dakota and Nebraska are the only states with jobless rates of around 4 percent, while Kansas, Montana, Oklahoma and Texas all have unemployment rates below the national average.

    A map of the areas with the most rapid job growth over the past decade, including through the Great Recession, would show a swath of prosperity extending from the high plains of Texas to the North Dakota/Canada border. Gains in wage income over the past ten years follow a similar pattern. The Plains now boasts some of the healthiest economies on the North American continent in terms of job growth and unemployment.

    Of course, this tide of prosperity has not lifted all boats. Large areas have been left behind — rural small towns, deserted mining settlements, Native American reservations — and continue to suffer widespread poverty, low wages and, in many cases, demographic decline.

    In addition, the region faces formidable environmental and infrastructural challenges. Most prominent is the continuing issue of adequate water supplies, particularly in the southern Plains. The large-scale expansion of both farming and fossil fuel production, particularly where hydraulic fracturing is used, could exacerbate this situation in the not-so-distant future if not approached carefully.

    Inadequate infrastructure, particularly air connections, still leaves much of the area distressingly cut off from the larger urban economy. The region’s industrial economy and rich resources are hampered by insufficient road, rail and port connections to markets around the world. Yet despite these challenges, we believe that three critical factors will propel the region’s future.

    First, with its vast resources, the Great Plains is in an excellent position to take advantage of worldwide increases in demand for food, fiber and fuel. This growth is driven primarily by overseas markets, particularly in the developing countries of East and South Asia and in Latin America.

    As these countries have added hundreds of millions of middle-class consumers, the price and value of commodities have continued to rise and seem likely to remain strong over time, despite some short-term market corrections.

    Second, the rapid evolution and adoption of new technologies has enhanced the development of resources, notably oil and gas previously considered impractical to tap. At the same time, the internet and advanced communications have reduced many of the traditional barriers — economic, cultural and social — that have cut off rural regions from the rest of the country and the world.

    Third, and perhaps most important, are demographic changes. The late Soichiro Honda once noted that “more important than gold or diamonds are people.” The reversal of outmigration in the region suggests that it is once again becoming attractive to people with ambition and talent. This is particularly true of the region’s leading cities — Omaha, Oklahoma City, Tulsa, Kansas City, Sioux Falls, Greeley, Wichita, Lubbock, and Dallas-Fort Worth — many of which now enjoy positive net migration not only from their own hinterlands, but from leading metropolitan areas such as Los Angeles, the San Francisco Bay Area, New York and Chicago. Of the 40 metropolitan areas in the region, 32 show positive average net domestic migration since 2008.

    Together these factors — resources, information technology and changing demographics — augur well for the future of the Great Plains. Once forlorn and seemingly soon-to-be abandoned, the Great Plains enters the 21st century with a prairie wind at its back.


    Praxis Strategy Group is an economic research, analysis, and strategic planning firm. Joel Kotkin is executive editor of NewGeography.com and author of The Next Hundred Million: America in 2050. Kevin Mulligan is Associate Professor of Geography at Texas Tech University and Director of TTU’s Center for Geospatial Technology.

  • A Geographer Who Navigated the Globe

    Many people ask, “What do geographers do?” I would suggest that Marvin Creamer’s life story is all you need to know about the practical application of geography, even though most of us will never be stuck in a horizonless Indian Ocean on a “sea of glass” or try to navigate the ferocious Drake Passage. Ancient mariners may have been able to sail long distances without instruments, but doing so is tricky and can be extremely hazardous. Only one person, New Jersey’s Marvin Creamer, has ever attempted to circumnavigate the globe this way. I had the distinct honor of interviewing Creamer for the thirtieth anniversary of his historic achievement. The 96-year-old sailor and geographer also founded Rowan University’s geography department and taught there for over three decades.

    In 1982, at age 66 and five years after his retirement, Creamer set out to become the only person ever to sail around the world without any navigational aids, not even a watch. He had spent three years thinking about the voyage, two years making practice runs and researching its feasibility, and 18 months accomplishing it. Instead of a compass and sextant, his only navigational tools were his extensive knowledge of geography, an hourglass, and a lot of optimism.

    Creamer and his crewmates braved all manner of conditions over an 18-month period to accomplish the truly historic feat. He became an expert at celestial sailing, designating a “North Star” and triangulating his position from it. Using this method he could keep within 1 degree of latitude and longitude, though it was of no help during the day or in cloudy conditions.
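    To put that precision in perspective, here is some rough arithmetic of our own (these figures are illustrative and are not drawn from Creamer’s account): a degree of latitude corresponds to about 60 nautical miles anywhere on the globe, while a degree of longitude shrinks with the cosine of the latitude.

    \[ 1^\circ \text{ of latitude} \approx 60~\text{nmi} \approx 111~\text{km} \]
    \[ 1^\circ \text{ of longitude} \approx 60\cos(\varphi)~\text{nmi} \approx 47~\text{nmi at } \varphi \approx 39^\circ\,\text{N, roughly Cape May’s latitude} \]

    In other words, staying within a degree meant Creamer could place himself to within roughly 70 miles after days or weeks at sea, using nothing but the sky, the water and an hourglass.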

    For navigation during daytime or overcast conditions, he needed other ways to determine latitude, so he studied ocean currents, marine life, water color and water temperature. These skills would be critical not only in stormy conditions but also in calms under slate skies. He used his encyclopedic geographic knowledge to glean vital information from unlikely sources, for example working out where the desiccating wind that had set a hinge squeaking must have originated. Sailors can become disoriented in the same way pilots become confused when they have no visual cues to reference. More than once Creamer found himself fogged in on a shipping lane, shaken by the blaring fog horns of massive tankers nearby.

    The biggest planning problem was how to get around Cape Horn. Scientists who had worked in Antarctica advised him that clear skies might appear over the Cape perhaps once a month, so he would have to navigate the world’s most treacherous waters blind. Not only were the currents savage, but the winds were often gale force, with high waves and icebergs. It’s no surprise that the Drake Passage is known as a sailor’s graveyard.

    The southern sky is also much cloudier on average than the northern hemisphere’s, which would add to the difficulty of navigating there without instruments. But Boy Scouts in New Zealand taught him how to use the Southern Cross and only a thin sliver of sky to find the southern polar point.

    Marvin Creamer departed Cape May, NJ, on December 21, 1982, under overcast skies with temperatures in the teens and an advancing cold front. He arrived in Cape Town, South Africa, on March 31, 1983, and spent 8 weeks there fixing the boat and resting. The next leg was crossing the Indian Ocean in wintertime to reach Tasmania. Upon his arrival in Hobart, the fishermen there were so impressed that he could make landfall in such harsh conditions on that dark and desolate coast that they threw him 36 parties in the 6 weeks he stayed. As he headed back down the Derwent River, 90 mph winds tossed the Globe Star upside down. The steel-hulled boat, built for this trip with a shortened mast, sustained only minor damage, but an ill crew member had to be dropped off in Sydney.

    On the way, in one of the journey’s most terrifying episodes, he was trapped for two days by bomboras — dangerous waves breaking over hidden reefs of rock — which he described as ‘going off like geysers’ all around him. Finally, after navigating through a minefield of rocky outcrops, he made it through to the East Australian Current and on to Sydney. After stopping in New Zealand, it was off to Cape Horn and the Drake Passage.

    Sailing through the Drake was a wild adventure, with winds and currents so strong that the boat could never be turned more than 15 degrees without feeling and sounding as if it were being hit by mortar fire. During the 600-mile passage his tiller broke and his shoulder was dislocated. Creamer worked furiously to cut loose his camera mount and build a makeshift steering shaft.

    After the near catastrophe he turned north toward the Falklands, which are notoriously difficult to sail given their remoteness and constantly changing conditions. In addition to the geographic challenges, he had entered a sensitive area that had seen war only months before, and the British were still on high alert. When Creamer saw British fighters overhead (and they spotted him as well), he looked for a place to make port and unknowingly chose a top-secret British base, where he immediately found himself under house arrest. But after a little dialogue his captors treated him royally and provisioned him for the final leg of his incredible journey. Creamer arrived safely back in Cape May, NJ, on May 17, 1984.

    Photos by Ralph Harvey

    Chuck McGlynn is an assistant professor at Rowan University in Glassboro, New Jersey. The university is planning a series of major events next spring to commemorate Creamer’s achievement, including a planetarium experience in which attendees will be able to “travel with Marvin and the Globe Star” around the world. An interactive map will allow users to select any point along the journey and see the date, time, longitude, average air and water temperatures, prevailing winds and sea currents at that stage of the Globe Star’s voyage.

  • As Partisan Rancor Rises, States That Back a Loser Will Be Punished

    Never mind the big-tent debate talk from both Barack Obama and Mitt Romney about how their respective policies will benefit all Americans. There’s a broader, ugly truth: as the last traces of purple fade from the electoral map, whoever wins will have little reason to take care of the large parts of the country that rejected him.

    At least since the dissolution of the “solid South” in the late ’50s and early ’60s, both parties have competed to extend their reach into virtually every region. As recently as 1996, Democrat Bill Clinton could compete in the South, winning several states in the mid-South and even in the heart of Dixie, including Louisiana, Arkansas, Kentucky, and Tennessee. President Obama has about as much chance of winning these states this year as Abraham Lincoln did in 1860—giving him little reason to consider them in a second term.

    In the Clinton years, powerful Democrats hailed from what we now call red states not only in the South but also in the Great Plains. South Dakota’s Tom Daschle served as both Senate majority and minority leader, and Louisiana’s John Breaux and North Dakota’s Kent Conrad and Byron Dorgan were also players.

    After his 2008 win, Obama dismissed Republican objections to his stimulus with a two-word rejoinder: “I won.” But it has become clear since then that neither party is willing to accept the other’s claim of a popular mandate for its agenda. And the logjam probably won’t be broken in November—especially if, as seems the most likely outcome, Obama wins a second term while Republicans hold the House and edge closer to retaking the Senate.

    The 2010 Republican landslide was the rare election that radicalized both parties. The new GOP House majority was attained by adding Tea Partiers who have pushed the House—and to a lesser extent the Senate—rightward. At the same time, Democrats lost many of their remaining members who’d held on in Republican-leaning districts, leaving the party with a smaller but more ideologically pure cast of true believers in office.

    The right-leaning Blue Dog Democrats who once dominated the party’s ranks in the Plains and the Southeast are virtually extinct (as are Northeastern Republicans). In 2008 there were more than 50 Blue Dogs; the 2010 election sliced their ranks by half. After November there could be fewer than a dozen remaining. More and more Democrats, as Michael Barone has noted, come from overwhelmingly Democratic districts.

    A reelected President Obama may well find himself with almost no Plains or Southern Democrats in Congress outside of a few House members in Dixie’s handful of overwhelmingly African-American districts. With little reason to compromise or make common cause with solidly red-state Republicans, the administration could leave the denizens of these states to bitterly cling to their guns and religion, while the president expands on his first-term practice of bypassing Congress to legislate by decree on everything from environmental policy to immigration and the implementation of health-care reform.

    Already, notes National Journal’s Ron Brownstein, Democrats hold congressional majorities in only three noncoastal states—Iowa, New Mexico, and Vermont. Much of the country between the coasts may find itself with little sympathy from, or access to, a president whose reelection it will have rejected, often by lopsided double-digit margins.

    This could affect energy policy in particular, since American fossil-fuel production is increasingly concentrated on the Plains, in the rural Intermountain West and along the Texas-Louisiana coast. Virtually all of the mineral-rich economies, excepting green-dominated California, now lie well outside the electoral base of the president and his party. In a second Obama term, these states could well propel the national economy yet have little say over energy policy. Farming and ranching concerns would also have little political leverage with the White House. And traditional social concerns, most deeply felt in the Southern and more rural states, would lose all currency in a second-term administration whose worldview is rooted in the big-city-dominated, deep-blue coastal states.

    The dissenting states with large fossil-fuel-driven economies—West Virginia, Texas, Oklahoma and North Dakota—would likely go to court to battle regulatory steps that they see as threatening large parts of their economies. In the Great Plains, expect a reprise of the 1970s Sagebrush Rebellion that bedeviled Jimmy Carter, as states fight back against green-oriented Washington regulators cracking down on users of federal land and water.

    Of course, if Romney finds a way to win, the coastal states would likely come in for similarly rough treatment. The former Massachusetts governor has saved his harshest remarks for closed-door events with big backers, dismissing 47 percent of the electorate as spongers at one such event, and telling backers at another that the Department of Education would become a “heck of a lot smaller” under his presidency and that the Department of Housing and Urban Development, which his father led during Richard Nixon’s first term, would face substantial cuts and “might not be around later.” The most devastating policy move he shared behind closed doors, though, was telling donors that he might eliminate the deductibility of state and local income and property taxes on federal returns—a move that would amount to a significant tax hike for many people living in high-tax, high-cost-of-living deep-blue states like New York and California.

    But since those states are solidly Democratic, Romney has little to lose politically by punishing or alienating their citizens.

    Deep-blue business interests could also lose their influence in a Romney administration, particularly if Republicans hold on to their strong majority in the House. The green-energy tax and subsidy farmers that have staked their future on the continued favor of the Democratic Party could find themselves cut off, and transit developers would also take a hit as the vast majority of train and bus riders come from a handful of dense and Democratic states (almost 40 percent of all national riders are in the New York area alone).

    But with Romney, the blue states would at least have a kind of patrician insurance, much as Clinton brought Southern sensibilities to the Democrats. The former Massachusetts governor is tied by a cultural and financial umbilical cord to his old comrades in the financial world of New York and Boston, making him less of a threat to the coastal ruling structures than Obama is to those of the interior states or the South.

    Whoever takes the White House, the nation’s best hope may be the regional mavericks who defy the trend toward geographical polarization. Democrats such as Sen. Jon Tester in Montana and Senate candidate Heidi Heitkamp in North Dakota are running hard against the anti-Obama tide in their states. Should they win, the need to protect their seats would help press the White House to temper the party’s drift toward an increasingly leftish social and environmental agenda.

    On the Republican side, the need to protect a middle-of-the-road politician like Massachusetts Sen. Scott Brown would push other party members into moderating their more extreme positions on social issues and regulation. Republican victories by Tommy Thompson in Wisconsin and Linda McMahon in Connecticut might also help moderate the party by adding to the numbers of “blue states” in the GOP caucus.

    For the federal union to work effectively, there has to be a sense that we are all, in different ways, linked to each other and share common interests that make us willing to compromise in order to live together. It’s time to bridge our partisan regional divides and avoid an ever nastier and more divisive war between the states.

    Joel Kotkin is executive editor of NewGeography.com, a distinguished presidential fellow in urban futures at Chapman University, and a contributing editor to City Journal in New York. He is the author of The City: A Global History. His newest book is The Next Hundred Million: America in 2050, released in February 2010.

    This piece originally appeared in The Daily Beast.

    State text map by Bigstock.