  • Millennials: Key to Post-ethnic America?

    One of the most widely observed, yet least understood, attributes of the emerging Millennial generation is its ethnic and cultural heterogeneity. Millennials represent the most ethnically varied cohort in American history, yet few social commentators agree on what this remarkable demographic fact really portends. Will Millennials usher in a new post-ethnic America—or simply reconfigure some different version of identity politics? Will they carry on the mantle of the civil rights movement—or eliminate antiquated racial-ethnic categories altogether? Are they even cohesive enough as a group to assert any meaningful, broad-based cultural agenda?

    Whatever paths they pave, one thing is certain: Millennials are poised to fundamentally reshape the way America has historically thought about race—and, as a result, will likely reconceive our nation’s own ethnic and cultural self-identity in the process.


    By their sheer numbers, Millennials are already reshaping the nation’s ethnic makeup. Not only do they represent a “baby boomlet” in terms of population size, but according to recent figures from the 2008 Current Population Survey, 44 percent of those born since the beginning of the 1980s belong to some racial or ethnic category other than “non-Hispanic white.” Millennials are the demographic precursor to Census Bureau projections showing whites becoming a minority by 2050. Slightly more than half of Millennials—56 percent—are non-Hispanic white, and age is inversely correlated with diversity: the younger the age bracket, the higher its proportion of “ethnic” populations. Contrast these figures with the 28 percent of current Baby Boomers who are non-white, and one begins to see a profoundly different look and hue for the generations of Americans to come, led by Millennials.


    Undeniably, Hispanics are at the forefront of this Millennial diversity. Slightly more than 20 percent of Millennials are Hispanic—twice the share found among their Baby Boomer counterparts. Millennials also include a significantly larger share of Black and mixed-race people than previous generations, but Hispanics are the driving force behind the Millennial-led ethnic demographic makeover. Accelerated Hispanic population growth over the past several decades has provoked dire warnings about the perils of Hispanic immigration—threatening to “divide the United States into two peoples, two cultures, and two languages,” in the words of Harvard political scientist Samuel Huntington.

    Yet nothing could be further from the truth—particularly when it comes to Hispanic Millennials. Approximately 86 percent of Hispanics under the age of 18 were in fact born in the U.S. (as a whole, 95 percent of Millennials are U.S. born). Many are the offspring of immigrants, but their birthright is firmly rooted in the United States. Unlike their immigrant parents, this group exhibits a strong preference for English as its primary mode of communication. According to the Pew Hispanic Center, 88 percent of second-generation Hispanics and 94 percent of third-generation Hispanics are highly fluent in English (speaking it “very well”). Many second-generation Hispanics are bilingual, but English dominates by the third generation.


    Broadly speaking, a distinguishing characteristic of multi-ethnic Millennials is their heavily “second generation” orientation: nearly 30 percent are children of immigrants. Because they are more likely to be the children of immigrants than immigrants themselves, the proportion of foreign-born Millennials is relatively small compared with their immediate generational forebears, Generation X and the Baby Boomers. Foreign-born persons make up 13 percent of all Millennials (everyone born since the 1980s), but 22 percent of the Generation X cohort (born between 1965 and 1979) and 16 percent of Baby Boomers (born between 1946 and 1964).

    Given this more varied makeup, it should hardly be surprising that Millennials are blurring the color lines that have long marked previous American generations. According to market research firm Teen Research Unlimited, 60 percent of American teens say they have friends of different ethnic backgrounds. More telling, however, is a 2006 Gallup Poll showing that 95 percent of young people (ages 18 to 29) approved of interracial dating—compared with only 45 percent of respondents over the age of 64. Likewise, a USA Today/Gallup Poll of teens conducted last year showed that 57 percent have dated someone of another race or ethnic group—up roughly 40 percentage points from when Gallup last polled teens on the question in 1980.

    Perhaps more astounding are the casual mix-and-match cultural sensibilities of Millennials. Not content to cleave to any single ethnic or cultural influence, they sample freely from the whole variety. One example is the “mashup”—an entire composition reconfigured from samples drawn from disparate musical genres—now ubiquitous on MP3 players. Millennial choices in popular culture are drawn from a broad pool of influences, and anything can be customized to one’s personal preferences—just as easily as an iPod playlist. Likewise, the aesthetics of Millennial fashion, movies, and video games increasingly reflect a broad range of influences, from Japanese anime to East L.A. graffiti art.

    In my own marketing research and consulting practice, I’ve been able to witness firsthand the eclectic, dynamic nature of Millennials, usually from behind a focus-group window (our firm studies ethnic consumers for a range of Fortune 500 companies). Increasingly, today’s young consumers shun direct overtures aimed at their ethnic background. Similarly, they tend to discard traditional cultural labels in favor of self-created monikers like “Mexipino,” “Blaxican,” or “China Latina.”

    As a market segment, Millennials are an elusive consumer. In the marketing world, they are shaking the foundations of advertising and media. Enabled by technology, they are contributing to a fragmented media landscape that grows ever more disparate and porous. Forced to keep up, advertisers question whether they can ever again rely on traditional media to broadcast messages to consumers whose lifestyle is defined by instant text messaging, mobile media, and virtual social networking.

    But beyond the business challenges posed by this growing crop of emerging consumers, the most lasting social contribution of Millennials will likely not be the next media or pop-culture trend, but how they—by simple virtue of who they are—will redefine race and ethnicity for the rest of America.

    Thomas Tseng is a principal at New American Dimensions, a multi-cultural marketing firm based in Los Angeles.

  • Response to A Return to ‘Avalon’

    It’s interesting that the authors of an article about the youngest generation (Generation Y or Millennials) title their piece “A Return to ‘Avalon,’” a cultural reference that people born between 1982 and 2003 surely know nothing about. “Avalon” is a movie from 1990 directed by Barry Levinson (born in 1942) which takes place at the turn of the last century. I’m not sure whom the authors are writing for, but I’ve never seen “Avalon” and had to look up the plot on IMDB — and I’m almost 40 years old!

    Okay, this is picking nits. Nonetheless, writing about generations is pretty tricky stuff. To make sweeping generalizations is perilous at best, and forecasting the preferences of a group whose oldest members are only 26 years of age seems to me of marginal utility. And this comes from the author of a recently released generational book, “Slackonomics: Generation X in the Age of Creative Destruction.” So I know whereof I speak. Taking stock of things as they are is one thing, but as the saying goes, prediction is very hard, especially about the future.

    The authors write, “Millennials, born between 1982 and 2003, members of the largest, most diverse … generation in American history are becoming adults, entering the workforce, getting married and settling down.” Really? I’m guessing nearly 80 percent of the group they’re talking about isn’t even out of school yet, and some are only recently out of diapers! Even the definition of “Millennials” is likely to change over time, as it did for Generation X (which started out as Boomerangers or Baby-Busters until Douglas Coupland published his demographically defining novel). Are today’s five-year-olds going to have the same preferences for things such as housing as people in their early to mid-twenties in 2008?

    Just for fun, let’s take a look at a few of the predictions about Generation X.

    Circa 1985, before Generation X was known as such, we were going to have it pretty cushy in almost every way. As Baby-Boomers aged their way through society, vast opportunities would open up for the next, smaller generation — from colleges competing for applicants, to magnificent career opportunities as companies needed labor, to an abundance of affordable housing as Boomers traded up. But by the time Gen X was entering college, not only had admissions become increasingly competitive, but the student loan explosion had begun as costs escalated. Moving up the corporate ladder has not been so easy either, as the world of work has changed radically since the 1980s and Boomers continue to work into their 60s. Abundant affordable housing has hardly been the case, even after the housing bubble began to deflate. No matter how cheap housing gets, if you can’t get a mortgage, it’s not affordable.

    So Alex P. Keaton of the TV show “Family Ties” — a garden-variety suburban kid from Ohio who rebelled against his hippy-dippy parents with his “conservative” politics (which look pretty moderate by today’s standards) — was supposed to be a millionaire by the age of 30. But things didn’t quite work out that way, not even on TV. Nonetheless, predictions were made about the “preferences” of this generation based on circumstances at the time.

    This was, of course, before Generation X morphed from Reagan Youth-wannabe yuppies in the 1980s to politically apathetic and cynical Slackers in the 1990s — as if being under-employed were a personal choice and not a consequence of the economic conditions brought about by globalization and technological efficiencies that eliminated jobs and put downward pressure on entry-level wages. But then came the dot-com bubble, and Xers were back to the future of “greed is good,” albeit this time in Silicon Valley instead of on Wall Street. And on it goes as we continue to be whipsawed by the economy.

    So for the authors to dismiss out of hand changing economic circumstances, as they do with the following statement, is to skate on some very thin ice, indeed:

    “Despite the problems posed by high gas prices and the mortgage crisis, suburban growth is still outpacing that of both urban and rural areas, as not only homeowners but also businesses continue to locate in the suburbs. The desire of Americans for their own plot of land likely will continue well into the 21st century as well. The community- and family-orientation of the Millennial Generation will only reinforce the continued growth of America’s suburbs.”

    High gas prices and the mortgage crisis have been an issue for only about a year now — hardly enough time to reverse suburbanization, a decades-long pattern of development. By its very nature, the real estate market is slow to adapt to changing circumstances. So Americans might “prefer” to own their own plot of land in suburbia, but fewer people are going to be able to, and that might be a good thing. I might prefer to eat a big juicy steak every night for dinner — but it’s not necessarily good for me or for the environment. I’m not predicting the demise of suburbia, but people are going to change their “preferences” as external circumstances warrant, from taking mass transit to living in more densely populated, walkable neighborhoods — whether in cities or suburbs.

    Lastly, to say that people who are “community and family-oriented” prefer the suburbs strikes me as a notion from another era — like Jefferson arguing in favor of an agrarian society because it’s more community- and family-oriented. “Community” and “family-oriented” are pretty nebulous terms to begin with, but assuming we agree on what they mean and that they are good things, clearly “community” and “family” can be created and found in all kinds of environments — and found lacking in all kinds of environments.

    So one thing I think we can safely say to people under the age of 30: don’t trust people over 30 who are trying to make predictions about your future. They will be wrong. Heck, given the food crisis, we may be headed back to a Jeffersonian agrarian society!

    Lisa Chamberlain is the author of “Slackonomics: Generation X in the Age of Creative Destruction.” She lives in New York City.

  • Wind Power: A Composite View

    It is believed that Canada has enough wind potential to produce at least 20 percent of the country’s current power needs. According to Toronto Hydro Energy Services and the Independent Power Producers Society of Ontario, the province could develop its potential and generate between 3,000 and 7,000 megawatts of wind energy in wind farms. Although useful and scientific, the Canadian wind energy atlas is nevertheless a mathematical simulation and requires further investigation, and the data available do not yet amount to a national wind atlas (figure 2).


    Europe and a number of other countries have developed tools and instruments to assess their wind potential, which happen to serve meteorological purposes as well. Although wind regimes are known for most geographic regions of the world, finer and more reliable data are essential for wind resource assessment and sound investment.

    In Canada, there are three sources of information on wind resources: Environment Canada’s Meteorological Service of Canada (2004), the Ontario Ministry of Natural Resources (2005), and the PEI Energy Corporation, Government of Prince Edward Island (2006). In Ontario, two corridors benefit from quality wind resources: Thunder Bay–Sudbury and Windsor–Toronto. In Prince Edward Island, it is the Summerside–Charlottetown stretch.


    A survey of wind capacity in Canada has produced encouraging figures, which translate into promising development prospects for the global wind power industry. For example, Alberta has the largest installed wind capacity of any province, although no data are available on the costs of these facilities. The figures collected by the Conference Board of Canada (Howatson & Churchill 2006) provide a reliable baseline for measuring future construction projects against the goal of supplying 20 percent of Canada’s current demand for electrical power, provided industry succeeds in securing the following prerequisites: (i) supportive public policies, (ii) public subsidies, (iii) environmental authorizations, and (iv) community approval of the projects.

    According to the Conference Board of Canada, adequate wind resources must be a definite characteristic of a site to enable ‘wind turbines to consistently produce a high output of power’ (Howatson & Churchill 2006, p.i). Of course, this underlines the fact that the issue addressed here is the massive production of power: output that adds to overall regional generation, rather than megawatts intended to supply local needs. This will become important when safety and reliability are factored in. A site must also be within short range of transmission lines.

    The combined experience of Denmark and Germany has indicated “that as the share of power generation from wind approaches 20 percent in a region, both transmission and backup requirements become more costly” (Howatson & Churchill, p.i). Indeed, such a substantial build-out of backup capacity is an inefficient use of scarce resources unless export is considered or, as is the case for Prince Edward Island (PEI), unless the objective is to drop combustion-based generation entirely (targeted for 2015). The latest figures compiled by CANWEA (the Canadian Wind Energy Association, a non-profit trade association) indicate that Canada’s installed wind power capacity will nearly double by 2012, from 1,450.26 MW in 2006 to 2,636.40 MW (figure 7). Wind-generated electricity is expected to roughly double in five Canadian provinces: Nova Scotia, Ontario, Prince Edward Island, Québec, and New Brunswick (the last starting from nil).
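
    As a quick check of the CANWEA figures just cited, here is a minimal arithmetic sketch in Python; the variable names are mine, and the two capacity values are simply the numbers from the paragraph above, written with decimal points.

```python
# CANWEA figures quoted above: installed wind capacity in Canada.
installed_2006_mw = 1450.26    # installed capacity in 2006
projected_2012_mw = 2636.40    # projected capacity by 2012

growth_factor = projected_2012_mw / installed_2006_mw
print(f"Growth factor, 2006 to 2012: {growth_factor:.2f}x")  # ~1.82x, i.e. nearly double
```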

    In the process of establishing adequate wind maps in the Canadian Maritimes, provincial governments, researchers at the Université de Moncton, and the PEI population in general have agreed to “a long term goal to supply 100 percent of its electrical capacity through renewable energy sources by 2015” (MacDonald 2005). In PEI, the energy supplying the needs of the island’s 138,000 residents (56 percent of whom live in rural settings) breaks down as follows: 80 percent fossil fuels, 13 percent imported from New Brunswick via underwater cable, 6.5 percent other combustive sources (biomass, including wood, solid waste, and sawmill residue), and 0.5 percent locally generated wind power. That is roughly 87 percent from combustion processes. The overall demand in PEI is approximately 130 MW; the 30 MW East Point Wind Plant would therefore supply almost a quarter of present needs. Ironically, a brand new 50 MW diesel combustion turbine generator was commissioned early in 2006 at the Maritime Electric Charlottetown plant in PEI.
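
    A minimal back-of-the-envelope sketch (in Python) of the PEI figures above; all inputs come from the text, and comparing the plant’s nameplate rating directly against overall demand follows the article’s own simplification.

```python
# Back-of-the-envelope check of the PEI figures quoted above. All inputs
# come from the text; comparing the plant's nameplate rating (MW) directly
# against overall demand (MW) follows the article's own simplification.

supply_shares_percent = {
    "fossil fuels (local generation)": 80.0,
    "imports from New Brunswick (underwater cable)": 13.0,
    "other combustive sources (biomass, waste, sawmill residue)": 6.5,
    "locally generated wind power": 0.5,
}

combustion_share = (supply_shares_percent["fossil fuels (local generation)"]
                    + supply_shares_percent["other combustive sources (biomass, waste, sawmill residue)"])
print(f"Share from combustion processes: {combustion_share:.1f}%")  # 86.5%, the "roughly 87 percent" above

overall_demand_mw = 130.0  # approximate overall demand in PEI
east_point_mw = 30.0       # East Point Wind Plant rating
print(f"East Point share of present needs: {east_point_mw / overall_demand_mw:.0%}")  # ~23%, almost a quarter
```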



    Wind maps are hardly new, and the recent interest in them has benefited from developments in computer programs and methods. The 50-year data and analysis below (figure 6) indicate the general direction of winds in the Canadian Maritimes and therefore the dispersion of atmospheric pollutants generated by combustion-based power plants (figures 8 and 9).

    CONCLUSION
    It would seem that taxpayers, voters, and consumers would be pressing governments to subsidize industry toward the development of clean energy sources and technologies. Yet this remains to be seen, because new, more complex energy sources cannot foreseeably replace oil or combustion energy in transportation and industry (the transportation and industrial sectors consume 67.1 percent of all energy used in Canada; residential, commercial/institutional, and agricultural users buy the rest).

    Although this discussion concerns electric power, the surrounding debate remains focused on oil and combustion processes, including nuclear (which exploits the classic force of steam). We included the rather unpalatable technical notes at the outset of this paper because energy is first and foremost a scientific concept, not a commodity, and certainly not the locus of miracles it was in prescientific ages; much can be hidden or concealed in measurement units and conversion factors, including the conversion between energy forms.

    Wind power and the technologies that support it are rather primitive, yet they offer sound opportunities for the continued development of life on earth, unlike oil and its supporting attribute, internal combustion (along with nuclear power), with their foul emissions and deadly waste. Geographically, however, exploiting wind power on a global scale would bring about a redeployment of human settlements that appears detrimental to developing countries only because most countries, including the affluent, are growth-driven. Indeed, populations and cultural features are displaced when huge hydroelectric projects are implemented, but wind farms may require that people live under spinning and humming skies.

    In this paper, we have identified the trend-setting Canadian province of Prince Edward Island (PEI) as an example. PEI intends to satisfy most of its run-of-the-mill local power needs using wind and thus curb both energy imports and GHG emissions. Nevertheless, on the island, transportation, waste management (biomass, as it is now called, and/or cogeneration) and energy backup will continue to burn diesel and depend on combustion processes generally. The PEI experiment is being established under the supervision of Canadian R&D experts at the Université de Moncton (K.C. Irving Chair in Sustainable Development). In this arrangement, the usual pattern of base-load power production is reversed.

    Canada will continue, it seems, to afford itself the best available local sources of energy and to export some surpluses to its neighbor, but in our view the current pressure of energy demand does not fall on the residential, commercial, agricultural, or institutional markets: it falls on transportation and industry, and electric power is not a solution to those problems.

    References
    Bregha, F. (2006) Energy Policy, in The Canadian Encyclopedia. © 2006 Historica Foundation of Canada. Online: http://www.thecanadianencyclopedia.com/PrinterFriendly.cfm?Params=A1ARTA0002613
    Gasset, N. (2005) Atlas éolien du Nouveau-Brunswick. Thèse de maîtrise en études de l’environnement: conférence publique. Faculté des études supérieures et de la recherche, Centre de génie éolien, Université de Moncton. Online: http://www.umoncton.ca/cge/atlas_eolien/pdf/Presentation_annonce.pdf
    Hepperle, M. (2005) Timetable: Development of the Propeller. Online: http://www.mh-aerotools.de/airfoils/prophist.htm
    Hoogwijk, M.M. (2004) On the global and regional potential of renewable energy sources. Universiteit Utrecht. Online: http://igitur-archive.library.uu.nl/dissertations/2004-0309-123617/full.pdf
    Howatson, A., and Churchill, J.L. (2006) International Experience With Implementing Wind Energy. Ottawa: Conference Board of Canada. Online: http://www.conferenceboard.ca/documents.asp?rnext=1537
    MacDonald, K. (2005) East Point Wind Plant (30 MW) Project Description. King’s County, Prince Edward Island. WPPI Registration #: 5902-P7-1.
    Mota, W.S., and Alvarado, F.L. (2001) Dynamic coupling between power markets and power systems. Revista Controle & Automação, Vol. 12, No. 1, p. 36-41. Online: http://www.fee.unicamp.br/revista_sba/vol12/v12a277.pdf
    Mudge, T. (2000) Power: A First Class Design Constraint for Future Architectures. High Performance Computer Conference, Bangalore, India, Dec. 2000. Online: http://www.eecs.umich.edu/~panalyzer/pdfs/Power__A_First_Class_Design_Constraint_for_Future_Architectures.pdf
    NEB (National Energy Board of Canada) (2006) Emerging Technologies in Electricity Generation. An Energy Market Assessment. Ottawa: Her Majesty the Queen in Right of Canada as represented by the National Energy Board. Online: http://www.neb.gc.ca/energy/EnergyReports/EMAEmergingTechnologiesElectricity2006_e.pdf
    NEB (National Energy Board of Canada) (2004) A Compendium of Electric Reliability Frameworks Across Canada. Online: http://www.neb.gc.ca/energy/EnergyReports/CompendiumElectricReliabilityCanada2004_e.pdf
    NEB (National Energy Board of Canada) (2003) Canadian Electricity Exports and Imports. Online: http://www.neb.gc.ca/energy/EnergyReports/EMAElectricityExportsImportsCanada2003_e.pdf
    Price, M., and Bennett, J. (2002) America’s Gas Tank: The High Cost of Canada’s Oil and Gas Export Strategy. Ottawa: Natural Resources Defense Council and Sierra Club of Canada. Online: http://www.nrdc.org/land/use/gastank/gastank.pdf
    Wildi, T. (2002) Electrical Machines, Drives, and Power Systems. 5th Ed. Upper Saddle River: Prentice Hall.
    Wilson, K.G. (1999) Du monopole à la compétition: la déréglementation des télécommunications au Canada et aux Etats-Unis. Québec: Télé-université.

  • The Cost of Chicago Jobs

    In Chicago’s recent history, when you think of beer, Jesse Jackson and his sons Yusef and Jonathan come to mind. Yusef and Jonathan Jackson were fortunate enough to receive a coveted Anheuser-Busch distributorship on the north side of Chicago. Just the other day, MillerCoors announced it would move its corporate headquarters to downtown Chicago by the summer or fall of 2009. The cost was high: the City of Chicago and the State of Illinois will pay $20 million to help bring 300 to 400 jobs to Chicago.

    Chicago was in competition with Dallas for MillerCoors. Even though Dallas lost the MillerCoors battle to Chicago, Texas has lately been the big winner in landing corporate headquarters. In April of 2008, Texas became the number one state for headquarters among the 500 largest corporations, as compiled by Fortune magazine. As the Houston Chronicle reported, “Texas now boasts 58 headquarters, three more than New York, the previous No. 1, and California, with 52. The Houston area has 26 of the companies.”

    In the same week as the MillerCoors announcement, some rather grim news for Chicago and Illinois was released. The Illinois Department of Employment Security reported that the “Illinois unemployment rate for June was 6.8 percent, climbing 0.4 percentage points from May. The number of unemployed increased for the second month in a row, rising by 26,900 to 463,900 unemployed individuals, and reaching its highest level since June 1993.” To put things in perspective, while the Illinois unemployment rate was 6.8 percent last month, the national unemployment rate was 5.5 percent.

    While Illinois and Chicago give MillerCoors the $20 million welcome, America’s largest retailer is an object of derision in Chicago. Wal-Mart was allowed to open its first store within Chicago’s city limits only after a protracted fight in the City Council, and the pro-union Chicago aldermen have prevented any more Wal-Marts from opening in the city. The thousands of jobs Wal-Mart could have provided Chicago’s poor and working class will not materialize.

    Taxpayers are allowed to subsidize MillerCoors with $20 million (for 400 jobs) in Chicago, but several Wal-Marts employing thousands of job seekers are not to be. Instead of challenging Chicago’s City Council to open up the city to an aggressively anti-union company, Mayor Daley wants peace with organized labor. That labor calm is necessary to bring the Olympics to Chicago in 2016. Chicago didn’t have much domestic competition from other U.S. cities bidding on the Olympics because hosting is a money loser for taxpayers. Mayor Daley, the unions, and businesses with heavy clout view the Olympics as a great heist, with high-tax-tolerant Chicago taxpayers left with the tab.

    For the last several decades, Illinois has been a substandard performer in job and population growth. In December 2007, Crain’s Chicago Business described the Illinois job situation:

    “Financial pressures on Illinois residents are deepening, as the state continues to lose economic ground compared to the nation and its own past.

    That’s the gloomy bottom line on a comprehensive study of the state’s economy being released this morning by the Chicago-based Center for Tax and Budget Accountability and the two research units of Northern Illinois University at DeKalb.

    The study finds that, though the rate of decline has somewhat slowed, Illinois continues to lose good-paying manufacturing jobs to service-industry posts that tend to pay less.

    As a result, most Illinois workers actually earned less in 2007 than they did in 2000, adjusted for inflation, with median household income dropping from $54,900 in 1999-2000 to $49,328 today.”

    An important part of the erosion of jobs-based earnings in the state is due to the loss of manufacturing jobs. Howard Wial and Alec Friedhoff did a study for the Brookings Institution on manufacturing jobs lost in the Great Lakes Region from 1995 to 2005. The greater Chicago area was one of the leaders in manufacturing jobs lost. Wial and Friedhoff report:

    “Total employment in metropolitan Chicago grew moderately before the 2001 recession, declined from 2000 through 2003 and rose again in 2004 and 2005. The region gained 346,000 jobs (an 8.2 percent increase) from 1995 through 2000. Despite recent gains, total employment fell by 109,900 (2.4 percent) from 2000 through 2005. Over the entire period 1995-2005, the region gained 236,100 jobs (5.6 percent), well below the national growth rate.

    Manufacturing employment declined almost continuously since 1995, with the largest annual losses occurring in 2001 and 2002. The region lost 35,700 manufacturing jobs (a decline of 5.3 percent) from 1995 through 2000 and another 141,300 (22.2 percent) from 2000 through 2005. The result was a loss of 177,000 manufacturing jobs (a 26.3 percent decline) over the entire decade, the largest total loss of all regions in this analysis.”
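
    The percentage filled in above can be recovered from the other figures Wial and Friedhoff give; here is a minimal arithmetic sketch in Python (the variable names are my own).

```python
# Recovering the percentage that was missing from the quote above, using
# only the other Wial & Friedhoff figures: a 177,000-job loss over
# 1995-2005 equal to 26.3 percent implies the 1995 employment base.

total_loss_jobs = 177_000
total_loss_share = 0.263
loss_1995_2000 = 35_700
loss_2000_2005 = 141_300

base_1995 = total_loss_jobs / total_loss_share   # implied 1995 manufacturing employment (~673,000)
base_2000 = base_1995 - loss_1995_2000           # employment remaining in 2000

print(f"1995-2000 decline: {loss_1995_2000 / base_1995:.1%}")  # ~5.3%, the figure filled in above
print(f"2000-2005 decline: {loss_2000_2005 / base_2000:.1%}")  # ~22.2%, matching the quote
```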

    The Chicago Tribune noted an interesting fact in its article about MillerCoors coming to Cook County: the county’s growth in “management of companies” jobs is nowhere near that of adjacent counties. Citing the Bureau of Labor Statistics, the Tribune reports that “between 2001 and 2006, they grew 7 percent in Cook County, 33 percent in DuPage County and 83 percent in Lake County.”

    In conclusion, Chicago needs all the jobs it can get. Cutting regulations and eliminating union mandates would be a lot cheaper and more effective for attracting jobs than subsidizing a major corporation with $20 million from taxpayers. The MillerCoors deal is indicative of the costs Chicago taxpayers endure. It’s ironic, though: MillerCoors will be located in downtown Chicago, a special taxing district with an 11.25 percent retail sales tax, the highest in America. The proceeds from that tax are meant to subsidize economic development in downtown Chicago. So Chicago may lose retail jobs downtown.

    Steve Bartin is a Cook County native and resident who blogs regularly about urban affairs at http://nalert.blogspot.com. He works in Internet sales.

  • Guzzling BTUs: Problems with Public Transit in an Age of Expensive Gas

    As gas prices inch up toward $5 per gallon, many environmentalists and elected officials are looking to public transit as a solution to higher transportation costs and rising fuel consumption. A closer look at the numbers, however, warrants more than a little skepticism that public transit can fulfill the nation’s energy conservation goals.

    The US transportation sector is a voracious consumer of fuel, accounting for 28 percent of all energy use in 2006 according to the US Department of Transportation. Petroleum products account for 95 percent of this consumption. Naturally, those interested in conserving natural resources, fossil fuels in particular, would want to focus on reducing oil use. Moving people out of cars and onto public transit seems to make intuitive sense.

    It turns out, however, that moving people to transit may not be the best strategy after all. According to the US Bureau of Transportation Statistics, a typical transit bus uses 4,235 BTU per passenger mile, 20 percent more energy per passenger mile than a passenger car. More interestingly, the amount of energy used by cars fell to 3,512 BTU per passenger mile by 2006, an 18 percent drop since 1980. In contrast, the amount of energy used by transit buses increased by 50 percent over the same period, rising to 4,235 BTU per passenger mile. Light trucks remain less energy friendly, even though their energy use also fell by nearly one third, to 6,904 BTU per passenger mile (and their energy efficiency has remained stable since the mid-1990s).
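
    A quick sketch (in Python) of the energy-intensity comparison, using only the Bureau of Transportation Statistics figures quoted above; the variable names are mine.

```python
# Energy intensity in 2006 (BTU per passenger mile), as quoted above from
# the US Bureau of Transportation Statistics.
btu_per_passenger_mile = {
    "transit bus": 4_235,
    "passenger car": 3_512,
    "light truck": 6_904,
}

car = btu_per_passenger_mile["passenger car"]
for mode, btu in btu_per_passenger_mile.items():
    print(f"{mode:13s}: {btu:5,d} BTU/passenger-mile ({btu / car - 1:+.0%} vs. passenger car)")
# The transit bus works out to roughly 21 percent more energy per
# passenger mile than the car -- the "20 percent more" cited in the text.
```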

    The long-term trend toward more fuel-efficient private vehicles is likely to continue as energy-frugal cars such as gasoline-electric hybrids and plug-in electric vehicles become more popular. The Toyota Prius sold its one millionth vehicle in 2008 as it achieved mass-production status. Sixty-five hybrid models are expected to be on the US market by the 2010 model year, nearly tripling the number currently available. Moreover, all-electric vehicles such as the Tesla sports car are expected to become more popular as consumers become more accepting of personal vehicles powered by non-traditional technologies. (Notably, Toyota is experimenting with solar panels on new generations of the Prius.)

    These trends, of course, don’t imply that fuel consumption has declined overall. On the contrary, US motorists are consuming 75.4 billion gallons of fuel each year, up 7.8 percent from 1980. Yet this is a remarkably stable trend given that vehicle miles traveled have increased by 49 percent since then. Travel demand has more than doubled for light trucks and similar vehicles, while fuel consumption by these vehicles increased by just 59 percent. Efficiency gains, then, have effectively compensated for large shares of the increase in travel demand, dramatically reducing the amount of energy used for each mile driven.

    Unfortunately, the same can’t be said for public transit. While transit ridership has increased significantly over the past year, climbing to an annual rate of roughly 10.3 billion trips as of the first quarter of 2008 according to the American Public Transportation Association, the overall effect on the travel market has been modest. Long-term, transit’s market share for all travel fell from 1.5 percent in 1980 to 1 percent in 2005, and its market share for work trips has fallen to 5 percent overall. Meanwhile, the public transit infrastructure – buses, route miles, etc. – has remained largely intact. That means more buses are transporting fewer people, significantly curtailing public transit’s energy efficiency. Not surprisingly, the energy intensity of public transit increased on average by 1.5 percent per year from 1970 to 2006.

    The story is a little different for passenger rail, which carries about half of the nation’s public transit riders (although national data are dominated by ridership in New York City). Transportation consultant Wendell Cox has calculated the energy intensity for other modes of transit in 2005 and found that commuter, heavy, and light rail used significantly less energy per passenger mile (roughly 40 percent less) than public buses or passenger cars.

    Yet the prospect of reducing energy use significantly by improving rail transit’s share of overall travel is slim. Despite double-digit increases, light rail ridership accounts for just 3.4 percent of transit passenger miles nationally. In contrast, commuter and heavy rail ridership growth was just 5.7 percent and 4.4 percent, respectively. Moreover, increased ridership on rail services depends on the availability of other transit services, most notably feeder bus routes, as well as urban densities that are difficult to sustain outside a few major cities such as New York, Chicago, or Boston.

    Thus, as a practical matter, public transit is unlikely to provide a meaningful way to reduce energy use in transportation. This becomes clear after looking at travel behavior in the wake of the increase in gas prices over the past year. Overall, public transit ridership increased just 3.3 percent. If we convert ridership into passenger miles traveled – a distance-based rather than trip-based measure – a 3.3 percent increase translates into 1.6 billion passenger miles over the course of a year. That may seem like a big number, until it’s compared to overall US travel.

    As gas prices went up, US automobile travelers eliminated 112 billion passenger miles from our roadways as vehicle miles traveled fell by 2.3 percent. Even if we assume all the increased transit ridership was accounted for by the migration of automobile travelers to public transit, buses and trains captured fewer than 2 percent of the reduction in automobile-based travel demand.
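
    A minimal sketch (in Python) of the back-of-the-envelope comparison in the last two paragraphs; both passenger-mile figures come directly from the text.

```python
# Back-of-the-envelope comparison from the two paragraphs above: the extra
# transit passenger miles versus the passenger miles that left the roads.
added_transit_passenger_miles = 1.6e9   # from the ~3.3% ridership increase
forgone_auto_passenger_miles = 112e9    # from the 2.3% drop in vehicle miles traveled

capture_share = added_transit_passenger_miles / forgone_auto_passenger_miles
print(f"Share of forgone auto travel captured by transit: {capture_share:.1%}")  # ~1.4%, "fewer than 2 percent"
```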

    Thus, in the end, those seeking ways to promote energy conservation are still relying on market forces to affect behavior and resource use. Higher-income consumers value mobility, and automobiles provide the flexibility and adaptability they demand. As energy prices rise, the incentives to provide resource-stingy alternatives such as hybrid and electric-only vehicles increase, stimulating further innovation that brings down costs over the long run. Meanwhile, contrary to public perception, as fewer segments of the population rely on fixed-route transit systems, the relative energy efficiency of public transit declines.

    Samuel R. Staley, Ph.D., is director of urban and land use policy at Reason Foundation and co-author of “The Road More Traveled: Why the Congestion Crisis Matters More Than You Think and What We Can Do About It” (Rowman & Littlefield, 2006). He can be contacted at sam.staley@reason.org.

  • Sprinting Blindfolded to a New Equilibrium

    Everyone except the fabulously wealthy and the truly disconnected knows energy has become much more expensive in recent years, but it’s worth taking a step back and examining just how much it has jumped and what we should (and should not) conclude about the impact on nearly all aspects of modern life.

    The raw data provides a startling enough starting place. Since 2004, the U.S. retail price for gasoline has leaped from $1.53 per gallon to $4.10, and oil has skyrocketed from $28 per barrel to over $140. The retail price of natural gas, largely ignored by U.S. consumers during the summer, has increased from $9.71 per thousand cubic feet to $14.30 (its highest price ever for the month of April), and even coal has risen from $19.93 per short ton to $25.40. Electricity, closely tied to the price of natural gas and coal, has increased from 7.61 cents per kilowatt hour to 9.14.
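
    For a sense of scale, here is a quick sketch (in Python) tabulating the percentage increases implied by the prices just quoted; the figures are exactly those in the paragraph above, with “over $140” oil taken at its $140 floor.

```python
# Percentage increases implied by the prices quoted above (2004 vs. 2008).
prices_then_now = {
    "gasoline ($/gallon)":      (1.53, 4.10),
    "crude oil ($/barrel)":     (28.00, 140.00),   # "over $140" taken at its floor
    "natural gas ($/1,000 cf)": (9.71, 14.30),
    "coal ($/short ton)":       (19.93, 25.40),
    "electricity (cents/kWh)":  (7.61, 9.14),
}

for item, (then, now) in prices_then_now.items():
    print(f"{item:25s}: {then:7.2f} -> {now:7.2f}  ({now / then - 1:+.0%})")
# Gasoline comes out to roughly +168%, oil +400%, natural gas +47%,
# coal +27%, and electricity +20%.
```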

    Whether this rise in oil prices was triggered by the fundamentals of supply and demand, politics, collusion among oil producers, or a lack of investment in developing oil fields around the world (I strongly favor fundamentals), and whether you think we’re headed toward a peak in world oil production very soon or decades from now (I’m firmly in the “very soon” camp), the bottom line is that the economy doesn’t care about causes or rationalizations, just prices.

    Economics, the study of the allocation of scarce resources, tells us that economies constantly adjust themselves in response to price changes. One company raises the price of the tennis rackets it manufactures, so some customers buy a competitor’s product, forgo buying a new racket this year, or take up another sport. Many such events constantly send overlapping ripples throughout an economy, causing the quantities demanded and supplied of many goods and services to rise and fall slightly. In econo-speak, the entire system seeks a new equilibrium in response to a change in the price of one good or service relative to substitutes. Such minor price changes for non-critical items are the normal background noise of a healthy economy. But energy, especially oil, presents an entirely different scenario.

    Virtually everything we buy depends on the price of oil to some degree. Raw materials, finished goods, customers, and the people who perform services all need to be transported. Oil and natural gas are also critical components in making plastics and many chemicals and fertilizers, and fossil fuels provide 70 percent of the energy consumed to generate electricity in the U.S. In most applications, finding a substitute for fossil fuels at the scale and speed we’d prefer is impossible without paying a very high price. (As the old line goes, I can do a job for you quickly, cheaply, or well — pick any two.)

    When the price of such a significant and unique resource rises so much and so quickly, we’ve pole-vaulted over a mild jostle to the system and gone straight to a deep, pervasive, game-changing shock.

    The sheer magnitude of this shock explains why there is so much talk in the financial press lately about which of the Big Three car companies could file for bankruptcy within a year. Ford, GM, and Chrysler are in a desperate race against time. Can they radically overhaul their product lines to meet the rapidly shifting demands of their customers before they run out of cash?

    Given the high cost and long development time of creating a new car model, this is a daunting task, to say the least. By comparison, the commercial airline industry is in far worse shape, as it doesn’t have the option of saving itself by converting to electric vehicles or plug-in hybrids. Even using biofuels to run jets is more wishful thinking than a real-world solution, and is at least a decade away from widespread application. Unless the price of jet fuel drops dramatically and fairly soon, the downsizing and bankruptcies we’ve already seen among airlines will be only the beginning of their “adjustment.”

    So, all is gloom and doom, right? Well, no, and this is the point that’s so easy to overlook. Even though energy prices have been rising for years, we’re still in the very early stages of the U.S. economy’s reaction; what we’ve seen to date is much more the initial impact than our individual and collective response.

    Still, it’s not hard to find media stories about the increased use of public transportation, people driving less overall, and more workers telecommuting or converting to a four-day workweek. Plus, we’re awash in reports of how 2010 will be a sea change in the car business, with several companies offering plug-in hybrid and full electric vehicles in the U.S. A further complication is the virtual certainty that soon the U.S. will overtly begin to rein in CO2 emissions, whether via a carbon tax or a cap-and-trade system.

    Beyond that we have the growing challenge of generating enough electricity to meet traditional demands plus the additional burden of recharging electric vehicles, all the while reducing CO2 emissions and finding enough water to cool thermoelectric plants (nuclear, coal, oil, and natural gas) in a world where climate change is creating drought in inconvenient places. With so many large and powerful forces suddenly in motion at once, trying to make firm predictions about the quantity demanded or the price of any fossil fuel is a revelation of one’s hubris or insanity.

    The last thing we should do is fall into the trap of making simplistic, linear extrapolations. You can barely Google any energy-related topic without finding references to the impending “death of the suburbs,” a topic Joel Kotkin addressed recently and I commented on both here and on my own site. Similarly, you can find numerous other opinions that grossly underestimate the inherent flexibility (and therefore unpredictability) of entire economies. Many of them are based on a fallacy that roughly says: “We use a lot of oil to do X. Oil will get more expensive, so therefore we won’t be able to do X.” These comments almost never mention the possibility that we’ll find ways to do X with far less oil, or that we’ll fill the same need by doing Y, or that savings in oil consumption in other, less critical parts of the economy will buy us time to change how we do X.

    In more concrete terms, how many people really think that more expensive oil will stop us from making medical supplies or fueling trucks that deliver food? Isn’t it more sensible to assume that we’ll cut back on far less critical uses, like pleasure boating or flying for vacations, and keep the increasingly scarce oil flowing to life-and-death applications?

    Humanity has just begun an unprecedented, expensive, painful, and, above all else, unpredictable journey. The end of the age of cheap energy will no doubt reshape almost everything we do, from decisions about personal consumption to our governments and public policies, to our institutions and businesses, to our cities. Our collective future will be a lot of things, but “dull” isn’t on the list.

    Lou Grinzo runs the web site The Cost of Energy, and is an economist by training, a programmer, technical editor, and writer by profession, and an energy geek by genetic predisposition.