Blog

  • Crime Down in Urban Cores and Suburbs

    The latest data (2011) from the Federal Bureau of Investigation (FBI) Uniform Crime Reports (UCR) indicates that violent crime continued to decline in both the suburbs and historical cores of major metropolitan areas (over 1,000,000 residents). Since 2001, the rates of decline have been similar, but contrary to media reports, the decline has been slightly greater in the suburbs than in the historical cores. Moreover, despite the preliminary report of a slight increase in the violent crime rate at the national level in 2012, substantial progress has been made in making the nation safer over the past 20 years.

    Major Metropolitan Area Trends

    The FBI website includes complete data on 48 of the 51 major metropolitan areas for 2011 (2012 data are not yet available for metropolitan areas). The FBI notes that the data collection methodology for the city of Chicago and the suburbs of Minneapolis-St. Paul is inconsistent with UCR guidelines and, as a result, does not include information for these jurisdictions. No data is reported for Providence.

    Among these 48 major metropolitan areas, the violent crime rate was 433 (offenses per 100,000 population known to the police), approximately 10% above the national rate of 392 in 2011. The violent crime rate in the historical core municipalities, or urban core (see Suburbanized Core Cities), was 911 offenses per 100,000 population. In the suburbs, which consist of all municipalities not comprising the historical cores, the violent crime rate was 272 offenses per 100,000 population. Thus, the urban core violent crime rate was 3.3 times the suburban violent crime rate (Figure 1).
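
    For readers who want to reproduce the arithmetic, here is a minimal sketch; the offense and population counts in the first example are hypothetical, while the 911 and 272 figures are the published 2011 rates cited above:

    ```python
    # Minimal sketch of the rate arithmetic used in this article.
    # A "rate" is offenses known to the police per 100,000 residents.

    def rate_per_100k(offenses: float, population: float) -> float:
        return offenses / population * 100_000

    # Hypothetical jurisdiction: 5,000 offenses among 1.1 million residents.
    print(round(rate_per_100k(5_000, 1_100_000)))  # about 455 per 100,000

    # Published 2011 rates for the 48 major metropolitan areas:
    urban_core_rate, suburban_rate = 911.0, 272.0
    print(round(urban_core_rate / suburban_rate, 1))  # 3.3 -- the multiple cited above
    ```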

    A comparison of the urban core and suburban crime rates by historical core municipality classification further illustrates the lower crime rates generally associated with more suburban areas. The violent crime rates in the more suburban urban cores are generally lower (Table 1). 

    • Among metropolitan areas with “Post-War & Suburban Core Cities,” the urban core violent crime rate in 2011 was 2.2 times that of the suburbs. This would include core cities such as Phoenix, San Jose, Austin and others that became large metropolitan areas only after World War II and the broad expansion of automobile ownership and detached, low density housing.
    • In the metropolitan areas with “Pre-War & Suburban Core Cities,” the urban core violent crime rate was 3.1 times that of the suburbs. These would include core cities such as Los Angeles, Seattle, and Milwaukee, which combine a denser pre-war inner city with large swaths of post-World War II suburban development within their borders.
    • The greatest difference was in the metropolitan areas with “Pre-War & Non Suburban Core Cities,” where the urban core violent crime rate was 4.3 times that of the suburbs. These would include such core cities as New York, Philadelphia, Boston and others, which had large areas of high density and significant central business districts before World War II, and which, even today, have little post-World War II suburban development within their borders.
    Table 1
    VIOLENT CRIME RATES: HISTORICAL CORE MUNICIPALITIES AND SUBURBS: 2011
    Violent Crimes Reported per 100,000 Population in Major Metropolitan Areas

    | Historical Core Municipality Classification | Metropolitan Area | Urban Core | Suburbs | Urban Core ÷ Suburbs |
    |---------------------------------------------|-------------------|------------|---------|----------------------|
    | Pre-War Core & Non-Suburban                 | 436               | 1,181      | 273     | 4.3                  |
    | Pre-War Core & Suburban                     | 443               | 821        | 265     | 3.1                  |
    | Post-War Suburban Core                      | 398               | 642        | 294     | 2.2                  |
    | 48 Major Metropolitan Areas                 | 433               | 911        | 272     | 3.3                  |

    No data for Chicago, Minneapolis-St. Paul and Providence.

     

    Suburban and Urban Core Trends: 10 Years

    Over the past decade, violent crime fell both in the suburbs and the urban cores. Among the 36 major metropolitan areas for which complete and comparable data is provided on the FBI website, the violent crime rate fell an average of 25.8 percent between 2001 and 2011. Urban core violent crime rates were down 22.7 percent, while suburban violent crime rates fell a slightly greater 26.7 percent (Figure 2).

    Reconciling Differences with Other Analyses

    Other analyses have noted that urban core crime rates are declining faster than in the suburbs. The differences between this and other analyses are due to the use of different time periods, different metropolitan area sets, and most importantly, profoundly more limited definitions of the suburbs.

    An article in The Wall Street Journal raising concerns about suburban crime rates was based on an FBI analysis of all metropolitan areas, not just major metropolitan areas, and covered 2001 to 2010. Crucially, the FBI classifies much of suburbia as not being suburbs. The FBI defines suburbs generally as any municipality in a metropolitan area with fewer than 50,000 residents, as well as areas patrolled by county law enforcement agencies. Non-core municipalities with their own law enforcement that have 50,000 or more residents are not considered suburbs, regardless of their location in the metropolitan area. This would mean, for example, that Pomona would not be considered a suburb, despite its location 30 miles from Los Angeles City Hall, on the very edge of the metropolitan area, simply because it has more than 50,000 residents. As a result, the crime rates in “cities” versus suburbs cannot be determined by simply comparing FBI geographical classifications.

    A Brookings Institution report found suburban violent crime rates to be dropping more slowly than those in “primary cities,” which are a subset of the “principal cities” defined by the Office of Management and Budget (OMB). Many of these primary cities are virtually all post-World War II suburban in form. They include, for example, Mesa, Arizona; Arlington, Texas; and Aurora, Colorado, each of which had fewer than 10,000 residents in 1950 and consists almost exclusively of the low-density, automobile-oriented suburban development forms found in nearby Tempe, Grand Prairie, and Centennial, which the Brookings classification defines as “suburban.” The Brookings report looked at major metropolitan areas as well as smaller metropolitan areas and covered a longer period (1990 to 2008).

    OMB, which defines metropolitan areas, does not designate any geography as suburban. OMB specifically excluded “suburban” terminology from its 2000 metropolitan area criteria. Instead, in recognition of the increasing polycentricity of metropolitan areas, OMB began designating “principal cities.” Except for the largest city in a metropolitan area, principal cities are defined by the strength of their employment markets, and are generally suburban employment centers, not urban cores. In defining its metropolitan area criteria for the 2000 census, OMB recognized that the monocentric city had given way to an urban form with multiple employment centers located throughout the metropolitan area.

    OMB’s principal cities may be located anywhere in the metropolitan area, without any relationship to the urban core. Rather than designating a single core city, OMB has identified as many as 25 principal cities in a single metropolitan area.

    The National Trend

    The metropolitan area crime reductions are consistent with a now two-decade trend of substantially improving crime rates. This is despite preliminary data released by the FBI in June indicating a reversal of the trend for 2012. The FBI reported that violent crime increased 1.2 percent. With a 0.7 percent population increase from 2011 to 2012, the US violent crime rate would rise to 394 per 100,000 residents, from 392 in 2011. Metropolitan area data for 2012 is not yet available.
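
    A quick back-of-the-envelope check of that 394 figure, assuming the preliminary 1.2 percent offense increase and 0.7 percent population increase are applied to the 2011 rate:

    ```python
    # Back-of-the-envelope check of the preliminary 2012 national rate.
    rate_2011 = 392.0          # violent offenses per 100,000 residents in 2011
    offense_growth = 0.012     # +1.2% offenses (FBI preliminary figure)
    population_growth = 0.007  # +0.7% population, 2011 to 2012

    rate_2012 = rate_2011 * (1 + offense_growth) / (1 + population_growth)
    print(round(rate_2012))    # 394 -- matching the figure above
    ```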

    This increase in crime rates should be a matter of concern. The 2012 violent crime rate increase is, hopefully, only a blip in a decline that will soon resume. The violent crime rate has declined in 18 of the last 21 years. Since 1991, the violent crime rate has dropped by nearly half (48.3%).

    This is in stark contrast with the previous 30 years, during which the violent crime rate increased in all but five years. By 1991, the violent crime rate had risen to 3.7 times its 1961 level. By 2012, the national violent crime rate had fallen to the lowest level since 1970 (Figure 3).

    Why Has the Crime Rate Declined?

    There are multiple theories about the causes of the crime rate reduction. The late James Q. Wilson, who with George Kelling advanced the “broken windows” theory of crime prevention, offered in a Wall Street Journal commentary a number of additional reasons why crime rates remained much lower, even during the Great Recession. The earliest and best publicized improvements in crime rates occurred under New York Mayor Rudolph Giuliani in the 1990s. Kelling and others (such as Hope Corman of Rider University and Naci Mocan of Louisiana State University) attribute much of the crime rate improvement in New York City to the “broken windows” deterrence strategies.

    The substantial decline in violent crime rates, in the nation, metropolitan areas, suburbs and urban cores, is an important success story. Yet, crime rates can never be too low. It can only be hoped that future years will see even greater reductions.

    Wendell Cox is a Visiting Professor, Conservatoire National des Arts et Metiers, Paris and the author of “War on the Dream: How Anti-Sprawl Policy Threatens the Quality of Life.”

    Crime scene photo by Alan Cleaver.

  • Beware the Herbivore Effect

    In the 1980s, American commentators and best-selling authors repeatedly sought to convince companies and workers to be more "Japanese." After all, for two generations, the men of Japan, supported by their wives, constituted a fearsome force – first, in the run up to the Second World War, then during the economic "miracle" that drove that small island nation toward the pinnacle of global economic power.

    Yet, today, Japan’s latest generation of men appears to lack the fierce ambition that drove their fathers, much less their grandfathers. The term commonly used for this new generation of Japanese is "herbivores," a play on the word for plant-eating animals generally known for their docility. And, instead of embracing what the new generation is doing in Japan, we should look at our young people and think: God forbid.

    Growing up in a period of tepid economic growth, a declining labor market and a loss of overall competitiveness, Japan’s "herbivores" are more interested in comics, computer games and socializing through the Internet than building a career or even seeking out the opposite sex. Among males ages 16-19, 36 percent in one survey expressed no interest in sex, and some even despised it.

    Not that women are waiting breathlessly for male stirrings: Disinterest is even higher – 59 percent – for females in the same age category. The percentage of sexually active female university students, according to the Japanese Association for Sex Education, has fallen from 60 percent in 2005 to 47 percent last year. There’s a bigger issue here than overly tame libidos, suggests sociologist Mika Toyota. Once-critical interpersonal familial ties are being replaced by more ad hoc relationships based on common interests.

    One indication of this breakdown in family ties has been a gradual loss of interest in marriage, among men but at least as much so among women. By 2010, a third of Japanese women entering their 30s were single, as were roughly one in five of those entering their 40s. That’s roughly eight times the percentage in 1960, and twice that in 2000. By 2030, according to sociologist Toyota, almost one in three Japanese males may be unmarried by age 50.

    Such attitudes, one Osaka blogger observed, indicate that many young people, particularly women, sense "an unwillingness to throw away the freedoms of single life to comply with the strict societal demands accompanying co-habitation or marriage."

    Herbivores, it appears, are less likely to marry. Prime Minister Shinzo Abe can do his best imitation of President Obama’s loose money policies, pumping trillions of yen into his economy, but bigger civilizational forces appear to be at play. Demographics are more intractable than short-term markets. The herbivorization of Japan can’t be good news in a country that suffers from a plunging marriage rate, a declining workforce and a fertility rate so low that adult diapers outsell those for babies.

    Could the same process occur here? Are young American males following the path to herbivore pastures? There are some disturbing parallel trends. The onset of the Great Recession has slowed fertility in the United States, the one large high-income country with fertility rates historically above replacement levels, down to the lowest levels in a quarter century. Despite a rise in population of 27 million Americans, there were actually fewer births in 2010 than there were 10 years earlier.

    The herbivore effect can be seen in the postponing among younger Americans of both marriage and having children, according to a recent Pew Foundation study. As in Japan, a weak economy plays a role. The recession, and the weak recovery, has had a disproportionate impact on young people: Almost two in five unemployed workers are ages 20-34.

    There are other disturbing parallels. Young Americans are increasingly embracing what European scholar Angelique Janssens described as "the deinstitutionalization of marriage" and "the emancipation of individual members from the family." Although more than 70 percent of U.S. millennials want to get married, nearly half believe the institution is becoming "obsolete." No surprise, then, that a growing proportion of American children born today – and a majority born to women ages 20-24 – are to unwed mothers.

    Another apparent casualty here is entrepreneurship, the very thing that characterized boomers and the successor Generation X. Boomers, now in their 50s and 60s, are still starting new ventures, but start-up rates among young people are getting weaker. No longer can we take as a given that entrepreneurial activity is associated with the young.

    "Millennials have been raised in ways that make them feel very pressured by the need to succeed," observe generational chroniclers Morley Winograd and Mike Hais. "They see life as a series of hoops to be jumped through, starting with getting into the right preschool, all the way to graduating from the right college. Such a view of the world makes them very afraid to fail on their own and, therefore, very risk averse."

    Some see this age of unambition as a positive. There is a devout "progressive" picture of millennials who don’t buy houses or cars and don’t fret much about getting on with their lives. Some of this may be attributable to cascading student debt, but some see the emergence of a higher generational consciousness.

    The environmental magazine Grist sees "a hero generation" that will avoid the pain and suffering that comes with trying to overcome a tough economy. They will transcend the material trap of suburban living and work that engulfed their parents. "We know the financial odds are stacked against us and, instead of trying to beat them, we’d rather give the finger to the whole rigged system," the millennial author concludes.

    Anyone over age 40 will tell you how that strategy likely will work out.

    Yet, I, for one, have not given up. As the new generation begins to face the realities of growing up, Winograd and Hais suggest, they will begin to move away from the "herbivore" model. After all, despite claims like those in Grist, most millennials, particularly those entering their 30s, want to own a home, with more than three times as many eyeing the suburbs, rather than the big city, as their ultimate destination.

    Fortunately, our millennials are not stuck in a narrow, expensive homogeneous country, like Japan. If our native-born young people lack sufficient moxie, newcomers from Mumbai, Mexico City, Seoul or Shanghai will show them the way – or the way to the unemployment office. And when millennials get around to buying homes, there are many places – perhaps not always Southern California – that can accommodate them.

    Just as it did for their boomer parents – who endured the malaise of the Jimmy Carter years – reality has a way of reawakening the carnivorous spirit of young Americans. This had better happen, anyway, if America is to remain competitive.

    Joel Kotkin is executive editor of NewGeography.com and a distinguished presidential fellow in urban futures at Chapman University, and a member of the editorial board of the Orange County Register. He is author of The City: A Global History and The Next Hundred Million: America in 2050. His most recent study, The Rise of Postfamilialism, has been widely discussed and distributed internationally. He lives in Los Angeles, CA.

    This piece originally appeared in the Orange County Register.

    Illustration by Timothy Takemoto.

  • Angry Young Men

    “’Angry young men’ lack optimism.” This was the title of a BBC News story earlier this year, exploring the deeply pessimistic views that some young working-class Britons hold about their own future. Two-thirds of the young men from families of skilled or semi-skilled workers, for example, never expect to own their own home. Angry young men, this time of immigrant origin, were also recently identified as the group causing riots in Swedish suburbs such as Husby. As Swedish Prime Minister Fredrik Reinfeldt noted, the riots were started by a core of “angry young men who think they can change society with violence”.

    The social unrest occurring in Western Europe is often ascribed to the lack of integration of immigrants into society. It is true that dependency on public handouts rather than self-reliance has become endemic in Europe’s well‑entrenched and extensive welfare states. In Norway, for example, the employment rate of immigrants from Asia is only 55 percent, compared to 70 percent for the non-immigrant population. Amongst African immigrants the figure is merely 43 percent. In neighboring Sweden, a recent government report noted that the employment rate of Somali immigrants was merely 21 percent. This can be compared to 46 percent in Canada and 54 percent in the US for the same group. The low incentives for transitioning from welfare to work in Sweden and Norway compared to Canada and the US explain at least part of this difference.

    But a failure of integration is hardly the sole explanation for the social unrest which extends well beyond immigrant youth. Why not add another relevant perspective to the puzzle, namely the increasing marginalization that some young men feel across the continent? This frustration is hardly an excuse for violence, but relates to important social phenomena which deserve to be explored, and targeted with the right policies.

    Youthful exclusion from the labor market constitutes a major challenge to European economies. In many countries, youth unemployment is more than twice the level for adult workers. According to the International Labour Organization, youth unemployment in advanced economies averages an estimated 18 percent. Some countries, such as Switzerland, Austria and Germany, fare relatively well, with rates below ten percent. In others, such as the UK, France and Sweden, around one in five young people is unemployed. In Spain and Greece the share recently peaked at one in two.

    It is hardly news that youth who face unemployment have a tendency to become angry, and to translate this anger into violence. What has become increasingly evident is how much this situation pertains particularly to men.

    To begin with, a number of societal trends particularly favor women’s career opportunities. Girls tend to perform better in school, regardless of class, place of residence or ethnicity. Young women, not only in developed countries but globally, now constitute the majority of students in higher education. Another important change that particularly benefits women’s career opportunities is urbanization. Large cities attract talented young people like magnets. The attraction tends to be greatest for young women, who find employment and opportunities for entrepreneurship in the sprawling service sectors. Men who remain behind in less densely populated areas sometimes struggle to find both work and a spouse.

    As a whole, we have little reason to feel sorry for men in the labor market. Since women still take the primary responsibility for children and family, men can on average invest much more time in their careers and thus more often reach the top. But while some men succeed, others fall behind. Men end up dominating not only the top of society but also the bottom. After failing in school, many men face rejection in both the labor market and the marriage market. They are left with little social capital, whether valuable know-how or established social networks.

    One reason frustration grows is that, for men, the link between success at work and success in finding a partner is very strong. Men without higher education, for example, face a higher chance of never becoming a parent, whilst men with higher degrees face the lowest (for women the relationship is reversed: those with higher education face the highest risk of remaining childless). Extreme opinions, racism and violence are not uncommon among young men who feel they have little chance of making their way in society.

    We should of course stress individual responsibility. But the alienation felt by some young men is in danger of morphing into a considerable long-term problem, even in wealthy European nations. In previous generations, a considerable number of “simple jobs” existed in manufacturing, forestry, agriculture and the like, suited to young individuals with limited education. Today, such jobs are far less available.

    Part of the explanation is that technological change and increasing global competition are pushing the labor market toward a higher degree of specialization. Another is that policies in many modern countries, due in part to bureaucratic regulation, work to slow industrial development. Although industrial job growth is clearly possible and very promising in developed nations, many politicians wrongly believe that new industry has no future in rich Europe.

    The lack of interest in opening up for growth in manufacturing is compounded by the fact that education systems in countries such as the UK and Sweden are not good at encouraging students with limited academic interest to ready themselves for manufacturing and other technical jobs – the situation is much different in, for example, Germany and Switzerland, with their promising apprenticeship systems. In addition, a strong social stigma has become attached to not having a higher degree. This prompts individuals to choose university courses that help them little, if at all, in the labor market, rather than take available simple jobs and climb the career ladder by developing practical knowledge.

    Frustrated young men should never be excused for acts of violence. But we must take their lack of hope seriously. Both policies and the education system should be reformed so that the simple entry-level jobs suited to young men who lack academic skills or interest are opened up. Such policies would, as an added bonus, boost growth and employment, and particularly benefit smaller cities and rural regions. We surely need ample policies to boost women’s career opportunities and entrepreneurship, but we should also recognize the challenges tied to the increasing marginalization of men who feel little hope of progressing in society by following the rules.

    Dr. Nima Sanandaji is a Swedish author of Kurdish-Iranian origin. He has written two books about women’s career opportunities in Sweden, and has a forthcoming report, “The Equality Dilemma,” for the Finnish think-tank Libera.

    Husby riot photo by Wiki Commons user Telefonkiosk.

  • How the Left Came to Reject Cheap Energy for the Poor

    Eighty years ago, the Tennessee Valley region was like many poor rural communities in tropical regions today. The best forests had been cut down to use as fuel for wood stoves. Soils were being rapidly depleted of nutrients, resulting in falling yields and a desperate search for new croplands. Poor farmers were plagued by malaria and had inadequate medical care. Few had indoor plumbing and even fewer had electricity.

    Hope came in the form of World War I. Congress authorized the construction of the Wilson dam on the Tennessee River to power an ammunition factory. But the war ended shortly after the project was completed.

    Henry Ford declared he would invest millions of dollars, employ one million men, and build a city 75 miles long in the region if the government would only give him the whole complex for $5 million. Though taxpayers had already sunk more than $40 million into the project, President Harding and Congress, believing the government should not be in the business of economic development, were inclined to accept.

    George Norris, a progressive senator, attacked the deal and proposed instead that it become a public power utility. Though he was from Nebraska, he was on the agriculture committee and regularly visited the Tennessee Valley. Staying in the unlit shacks of its poor residents, he became sympathetic to their situation. Knowing that Ford was looking to produce electricity and fertilizer that were profitable, not cheap, Norris believed Ford would behave as a monopolist. If approved, Norris warned, the project would be the worst real estate deal “since Adam and Eve lost title to the Garden of Eden.” Three years later Norris had defeated Ford in the realms of public opinion and in Congress.

    Over the next 10 years, Norris mobilized the progressive movement to support his sweeping vision of agricultural modernization by the federal government. In 1933 Congress and President Roosevelt authorized the creation of the Tennessee Valley Authority. It mobilized thousands of unemployed men to build hydroelectric dams, produce fertilizer, and lay down irrigation systems. Sensitive to local knowledge, government workers acted as community organizers, empowering local farmers to lead the efforts to improve agricultural techniques and plant trees.

    The TVA produced cheap energy and restored the natural environment. Electricity from the dams allowed poor residents to stop burning wood for fuel. It facilitated the cheap production of fertilizer and powered the water pumps for irrigation, allowing farmers to grow more food on less land. These changes lifted incomes and allowed forests to grow back. Although dams displaced thousands of people, they provided electricity for millions.

    By the 50s, the TVA was the crown jewel of the New Deal and one of the greatest triumphs of centralized planning in the West. It was viewed around the world as a model for how governments could use modern energy, infrastructure and agricultural assistance to lift up small farmers, grow the economy, and save the environment. Recent research suggests that the TVA accelerated economic development in the region much more than in surrounding and similar regions and proved a boon to the national economy as well.

    Perhaps most important, the TVA established the progressive principle that cheap energy for all was a public good, not a private enterprise. When an effort was made in the mid-’50s to privatize part of the TVA, it was beaten back by Senator Al Gore Sr. The TVA implicitly established modern energy as a fundamental human right that should not be denied out of deference to private property and free markets.

    The Rejection of the State and Cheap Energy

    Just a decade later, as Vietnam descended into quagmire, left-leaning intellectuals started denouncing TVA-type projects as part of the American neocolonial war machine. The TVA’s fertilizer factories had previously produced ammunition; its nuclear power stations came from bomb making. The TVA wasn’t ploughshares from swords; it was a sword in a new scabbard. In her 1962 book Silent Spring, Rachel Carson described modern agriculture as a war on nature. The World Bank, USAID, and even the Peace Corps with its TVA-type efforts were, in the writings of Noam Chomsky, mere fig leaves for an imperialistic resource grab.

    Where Marx and Marxists had long viewed industrial capitalism, however terrible, as an improvement over agrarian feudalism, the New Left embraced a more romantic view. Before the arrival of “progress” and “development,” they argued, small farmers lived in harmony with their surroundings. In his 1973 book, Small is Beautiful, economist E.F. Schumacher dismissed the soil erosion caused by peasant farmers as “trifling in comparison with the devastations caused by gigantic groups motivated by greed, envy, and the lust for power.” Anthropologists like Yale University’s James Scott narrated irrigation, road-building, and electrification efforts as sinister, Foucauldian impositions of modernity on local innocents. 

    With most rivers in the West already dammed, US and European environmental groups like Friends of the Earth and the International Rivers Network tried to stop, with some success, the expansion of hydroelectricity in India, Brazil and elsewhere. It wasn’t long before environmental groups came to oppose nearly all forms of grid electricity in poor countries, whether from dams, coal or nuclear. “Giving society cheap, abundant energy,” Paul Ehrlich wrote in 1975, “would be the equivalent of giving an idiot child a machine gun.” 

    Elaborate justifications were offered as to why poor people in other countries wouldn’t benefit from cheap electricity, fertilizer and roads in the same way the good people of the Tennessee Valley had. Biomass (e.g., wood burning), solar and efficiency, it was argued, “do not carry with them inappropriate cultural patterns or values.” In a 1977 interview, Amory Lovins added: “The whole point of thinking along soft path lines is to do whatever it is you want to do using as little energy — and other resources — as possible.”

    By the time of the United Nations Rio environment conference in 1992, the model for “sustainable development” was of small co-ops in the Amazon forest where peasant farmers and Indians would pick nuts and berries to sell to Ben and Jerry’s for their “Rainforest Crunch” flavor. A year later, in Earth in the Balance, Al Gore wrote, “Power grids themselves are no longer necessarily desirable.” Citing Schumacher, he suggested they might even be “inappropriate” for the Third World.

    Over the next 20 years environmental groups constructed economic analyses and models purporting to show that expensive intermittent renewables like solar panels and biomass-burners were in fact cheaper than grid electricity. The catch, of course, was that they were cheaper because they didn’t actually deliver much electricity. Greenpeace and WWF hired educated and upper-middle class professionals in Rio de Janeiro and Johannesburg to explain why their countrymen did not need new power plants but could just be more efficient instead.

    When challenged as to why poor nations should not have what we have, green leaders respond that we should become more like poor nations. In The End of Nature, Bill McKibben argued that developed economies should adopt “appropriate technology” like those used in poor countries and return to small-scale agriculture. One “bonus” that comes with climate change, Naomi Klein says, is that it will require in the rich world a “type of farming [that] is much more labor intensive than industrial agriculture.” 

    And so the Left went from viewing cheap energy as a fundamental human right and key to environmental restoration to a threat to the planet and harmful to the poor. In the name of “appropriate technology” the revamped Left rejected cheap fertilizers and energy. In the name of democracy it now offers the global poor not what they want — cheap electricity — but more of what they don’t want, namely intermittent and expensive power. 

    From Anti-Statism to Neo-Liberalism

    At the heart of this reversal was the Left’s growing suspicion of both centralized energy and centralized government. Libertarian conservatives have long concocted elaborate counterfactuals to suggest that the TVA and other public electrification efforts actually slowed the expansion of access to electricity. By the early 1980s, progressives were making the same claim. In 1984, William Chandler of the Worldwatch Institute published “The Myth of the TVA,” which claimed that 50 years of public investment had never provided any development benefit whatsoever. In fact, a new analysis by economists Patrick Kline and Enrico Moretti finds that the “TVA boosted national manufacturing productivity by roughly 0.3 percent and that the dollar value of these productivity gains exceeded the program’s cost.”

    Even so, today’s progressives signal their sophistication by dismissing statist solutions. Environmentalists demand that we make carbon-based energy more expensive, in order to “harness market forces” to cut greenhouse gas emissions. Global development agencies increasingly reject state-sponsored projects to build dams and large power plants in favor of offering financing to private firms promising to bring solar panels and low-power “microgrids” to the global poor — solutions that might help run a few light bulbs and power cell phones but offer the poor no path to the kinds of high-energy lifestyles Western environmentalists take for granted.

    Where senators Norris and Gore Sr. understood that only the government could guarantee cheap energy and fertilizers for poor farmers, environmental leaders today seek policy solutions that give an outsized role to investment banks and private utilities. If the great leap backward was from statist progressivism to anarcho-primitivism, it was but a short step sideways to green neoliberalism.

    But if developed-world progressives, comfortably ensconced in their own modernity, today reject the old progressive vision of cheap, abundant, grid electricity for everyone, progressive modernizers in the developing world are under no such illusion. Whether socialists, state capitalists, or, mostly, some combination of the two, developing world leaders like Brazil’s Lula da Silva understand that cheap grid electricity is good for people and good for the environment. That modern energy and fertilizers increase crop yields and allow forests to grow back. That energy poverty causes more harm to the poor than global warming. They view cheap energy as a public good and a human right, and they are well on their way to providing electricity to every one of their citizens. 

    The TVA and all modernization efforts bring side effects along with progress. Building dams requires evicting people from their land and putting ecosystems underwater. Burning coal saves trees but causes air pollution and global warming. Fracking for gas prevents coal burning but it can pollute the water. Nuclear energy produces not emissions but toxic waste and can result in major industrial accidents. Nevertheless, these are problems that must be dealt with through more modernization and progress, not less.

    Viewed through this lens, climate change is a reason to accelerate rather than slow energy transitions. The 1.3 billion who lack electricity should get it. It will dramatically improve their lives, reduce deforestation, and make them more resilient to climate impacts. The rest of us should move to cleaner sources of energy — from coal to natural gas, from natural gas to nuclear and renewables, and from gasoline to electric cars — as quickly as we can. This is not a low-energy program, it is a high-energy one. Any effort worthy of being called progressive, liberal, or environmental, must embrace a high-energy planet.

    Shellenberger and Nordhaus are co-founders of the Breakthrough Institute, a leading environmental think tank in the United States. They are authors of Break Through: From the Death of Environmentalism to the Politics of Possibility.

    This piece originally appeared at TheBreakthrough.org.

  • The Associate’s Degree Payoff: Community College Grads Can Get High-Paying Jobs, and Here Are Some Examples

    For some students, the decision to enroll at a community college is simple. A two-year school offers the credential they need at a much lower cost than a university, and the earnings post-degree are on par with — or better than — what they would make after going to a four-year school.

    Less debt, similar salary — the math adds up.

    But outside fields that require specific certificates or degrees, it’s not always clear to students which higher education path they should take. And as Jeffrey Selingo wrote in a recent Wall Street Journal weekend essay, a number of websites are cropping up that allow students and parents to compare the return on investment from college to college.

    Based on first-year salaries of graduates (one of the metrics included at CollegeMeasures.org via state unemployment insurance programs), Selingo points out that some community college degrees have been shown to have a stronger early return than bachelor’s degrees.

    Think a community-college degree is worth less than a credential from a four-year college? In Tennessee, the average first-year salaries of graduates with a two-year degree are $1,000 higher than those with a bachelor’s degree. Technical degree holders from the state’s community colleges often earn more their first year out than those who studied the same field at a four-year university.

    Take graduates in health professions from Dyersburg State Community College. They not only finish two years earlier than their counterparts at the University of Tennessee at Knoxville, but they also earn $5,300 more, on average, in their first year after graduation.

    This isn’t new information by any means. In 2011, the Georgetown Center on Education and the Workforce, an EMSI client, released its well-publicized “College Payoff” report. Anthony Carnevale and his colleagues looked at median lifetime earnings — a key distinction from the sources that Selingo cites — for all education levels by occupation to show that 28.2% of associate’s degree graduates out-earn bachelor’s degree holders. This is just one example of what Georgetown referred to as “earnings overlap” (see the following chart).

    Georgetown’s report provides clear evidence that degree level matters when it comes to lifetime earnings. But another critical element is the actual job that a person chooses.

    There are many fields — in healthcare, engineering, technology, manufacturing, etc. — in which associate’s degree graduates can make just as much or more than bachelor’s degree holders. But what specific careers are we talking about? Let’s take a look using the Georgetown study and EMSI data.

    Well-Paying Jobs That (Often) Take an Associate’s Degree to Get

    To get a sense of the top-earning jobs in which the majority of workers have an associate’s degree, we looked at the educational attainment breakdown by detailed occupation from the U.S. Census Bureau’s American Community Survey, via EMSI’s Analyst. This data is only available at the national level; the most recent numbers are from 2009 (see here).

    The following occupations are ones in which associate’s degree holders (or associate’s degree plus some college) comprise the largest percentage of workers. Note that the educational attainment varies for most occupations (e.g., most CEOs have a bachelor’s, some have a master’s, a few have less than a high school diploma). Also, the educational requirements for some occupations change over time. For registered nurses, the typical education needed for entry, as assigned by the BLS, is an associate’s degree — even though 43% of all nurses hold a four-year degree. For this reason, we excluded RNs from our analysis. (We also excluded air traffic controllers because only 14% have an associate’s degree).
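
    As a rough illustration of that screening step (a sketch only, not EMSI’s actual workflow; the file name, column names, and category labels are hypothetical), the filter might look like this:

    ```python
    # Illustrative sketch of selecting occupations where associate's-level
    # attainment is the largest educational group. Not EMSI's actual workflow;
    # "acs_attainment_by_occupation.csv" and its columns are hypothetical.
    import pandas as pd

    attainment = pd.read_csv("acs_attainment_by_occupation.csv")
    # Expected (hypothetical) columns: occupation, education, share

    # For each occupation, keep the educational group with the largest share.
    top_group = (attainment.sort_values("share", ascending=False)
                           .groupby("occupation")
                           .first())

    # Occupations led by associate's degree (or associate's plus some college).
    associate_led = top_group[top_group["education"] == "associate's or some college"]

    # Manual exclusions noted in the text: RNs (43% hold bachelor's degrees)
    # and air traffic controllers (only 14% hold an associate's degree).
    associate_led = associate_led.drop(
        index=["Registered Nurses", "Air Traffic Controllers"], errors="ignore")

    print(associate_led.sort_values("share", ascending=False).head(10))
    ```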

    1. Radiation Therapists ($37.36 median hourly earnings)

    Associate’s degree holders make up 42% of this healthcare occupation, slightly higher than bachelor’s degree grads (38%). For both degree levels, workers in this field earn $2.1 million in their lifetimes, per Georgetown. And the job outlook is strong, too. Radiation therapist jobs have increased 14% nationally since 2001, and the female-dominated occupation is projected to grow another 6% from 2012-2015.

    2. Dental Hygienists ($34.77)

    The bulk of hygienists (57%) have associate’s degrees, followed next by bachelor’s degrees (30%). Georgetown lumped these workers in with other healthcare practitioners and technical occupations, but still the lifetime earnings are similar — $2.1 million for two-year degree holders; $2.2 million for four-year grads.

    This lucrative, female-dominated occupation is projected to grow 8% from 2012-2015.


    3. Nuclear Medicine Technologists ($33.96)

    Far and away the largest chunk of workers in this field have associate’s degrees (45%). Although nuclear medicine technologists are not included in the Georgetown report, associate’s degree holders among a larger subset of workers, diagnostic related technologists and technicians, earn $2.2 million in their lifetimes, compared to $2.4 million among bachelor’s degree grads.

    4. Nuclear Technicians ($32.85)

    The first non-healthcare field on our list, these workers are not to be confused with nuclear medicine technologists. Nearly 45% of these workers have an associate’s degree or some college, compared to 24% who have bachelor’s degrees and 23% who have a high school diploma or equivalent. (Note: Georgetown does not report lifetime earnings at the two-year level for nuclear technicians).

    More than a third of the fewer than 9,000 nuclear technicians in the U.S. work in two specific industries — electric power distribution and fossil fuel electric power generation.

    5. Diagnostic Medical Sonographers ($31.83)

    Similar to No. 3 on our list, nuclear medicine technologists, 45% of workers in this field have an associate’s degree.

    This field has seen a 63% increase in jobs since 2001, from 34,752 to an estimated 56,514. And it’s projected to grow another 12% from 2012-2015.

    6. Aerospace Engineering and Operations Technicians ($29.48)

    Only 23% of these workers have an associate’s degree, but another 33% have some college/no degree, which is why the typical education needed to enter this occupation (as assigned by the BLS) is an associate’s degree.

    Unlike the previous occupations on this list, the job market for aerospace techs isn’t so rosy. Employment in this field declined 16% from 2001-2012 (with the bulk of the job losses from 2001-2003 and 2008-2010). It’s projected to decline by 2% from 2012-2015.

    7. Engineering Technicians, Except Drafters, All Other ($28.54)

    Like aerospace technicians, more than half of these workers (56%) have either an associate’s degree or some college/no degree. But unlike the above occupation, this field is growing: employment increased 5% from 2001-2012 and is projected to go up 4% from 2012-2015.

    8. Respiratory Therapists ($27.04)

    A whopping 56% of respiratory therapists hold an associate’s degree, followed by 24% with a bachelor’s degree. The lifetime earnings, as reported by Georgetown, are the same as for radiation therapists: $2.1 million for both degree levels.

    This is one of the strongest-performing associate’s degree occupations. The U.S. had 28% more respiratory therapists in 2012 than in 2001, and the field is projected to grow 8% through 2015.

    Note: This list doesn’t include the many high-paying jobs available through vocational technical education. Plumbers, electricians, welders — and an array of other skilled trades — often offer better wages than bachelor’s degree-required fields. See our piece on the aging skilled trades workforce here.

    Joshua Wright is an editor at EMSI, an Idaho-based economics firm that provides data and analysis to workforce boards, economic development agencies, higher education institutions, and the private sector. He manages the EMSI blog and is a freelance journalist. Contact him here.

  • The Culture War That Social Conservatives Could Win

    For the better part of a half century, social conservatives have been waging a desperate war to defend “family values.” However well-intentioned, this effort has to be written off as something of a failure. To continue it would cause even more damage to many of the things that social conservatives say they care most about.

    It’s not that we don’t need some sort of culture war — a conflict over values is the ultimate liberal value — but it makes no sense to keep waging a losing one. This includes, first and foremost, attempts to oppose gay marriage, something that almost half of Americans accept, according to Pew. Gay marriage wins even more support among millennials, who will over time come to shape our politics. Other social conservative efforts, like prayer in school or efforts to establish Christianity as a state religion, as recently was proposed in North Carolina’s legislature, make even less political sense.

    Obscured by such divisive approaches are larger issues, such as the durability of the family unit, that should be of concern to both liberals and conservatives. The number of children born to single mothers continues to soar. In 1970, 11% of births were to unmarried mothers; by 1990, that number had risen to 28%. Today, 41% of all births are to unmarried women. Most frightening of all, for mothers under 30, the rate is 53%.

    And Americans are increasingly eschewing not only marriage, but having children, although not yet to the extent of their counterparts in East Asia and Europe. This is particularly evident among the young.

    Not coincidentally, this is taking place as church affiliation, if not in free fall, is clearly on a downward trend. Secularism and the promotion of singleness and childlessness have gained cachet. Contemporary social thinking, as epitomized by “creative class” theorist Richard Florida, essentially links “advanced” society to the absence of religious values. Indeed virtually the entire span of modern urbanism — which has become entangled with modern progressivism — not only disdains religiosity but gives remarkably short shrift to issues involving families.

    These trends represent a threat to values that many, if not most, Americans still adhere to, such as the primacy of the family, the importance of faith and the centrality of children. You don’t have to be an absolute believer in the revealed veracity of the Bible to see the danger posed by a national shift away from family and toward a hyper-individualist ethos.

    The question is not whether there should be a debate, or, if you will, a “war” over culture, but on what terms this struggle should be waged. This can’t be done, as one conservative writer suggested to me last year, “by marching back to the 1950s.” History does not move backward, and trying to inspire the next generations to live or think like their parents or grandparents simply lacks any serious appeal. There is truth to the Democratic claim that conservative Republicans suffer a “modernity deficit” that could assure them permanent minority status.

    But for all the failings of social conservatives, we should not ignore the reality that the decline of the family and of child-bearing must be addressed if this society is going to have any dynamism in the decades ahead. The largely native-born population is demonstrating all the essential weaknesses of its counterparts in Europe and East Asia; last year, more whites died than were born. Despite a total rise in population of 27 million from 2000 to 2010, there were actually fewer births in 2010 than 10 years earlier.

    Immigrants may bail us out in the short run — migrants and their offspring have accounted for one-third of the nation’s population growth over the past three decades — but the longer they stay, the more marriage and child-bearing decline. Even more seriously, 44% of all millennials think marriage is “obsolete”; among their baby boomer parents, the number is 35%. And fewer young people think childbearing is even important in a marriage.

    This could have disastrous social consequences. Conservative analysts such as Charles Murray point to the deterioration of family life among working-class whites, as measured by illegitimacy and low marriage rates. Among white American women with only a high school education, 44% of births are out of wedlock, up from 6% in 1970. With incomes dropping and higher unemployment, Murray predicts the emergence of a growing “white underclass” in the coming decade.

    Sadly, neither of the rising political tendencies — what might be seen as “clerical” liberalism and its libertarian counterpoint — is focused on the fundamental social deficit. Libertarianism, rapidly becoming the most legitimate form of conservatism, is almost psychologically incapable of addressing social issues. “The libertarian priority is meeting market needs,” noted Ben Domenech in Real Clear Politics recently.

    Markets are wonderful things, but what if, as they evolve, they can also tilt against families and communities? If everything boils down to what Marx called “the cash nexus” or simple individual “empowerment,” then having children, or committing to marriage, becomes far less palatable. It’s easy for well-heeled tech entrepreneurs, or inheritors of vast wealth, to speak about principles of classical liberalism, but if free markets fail to serve society’s needs, then support for competitive capitalism will necessarily fade.

    Libertarians tend to detest class warfare, but seem incapable of identifying with anyone other than those they consider “talented.” They seem unconcerned about market manipulations (inevitably aided and abetted by government) that might force more people out of homes and into congested, overpriced apartments. Or how technology is destroying whole classes of jobs while programs to train people for needed skills remain poorly funded.

    Ironically such an approach plays into the hands of the sworn enemies of libertarians, what I call the clerical progressives, who inhabit certain cosseted institutions: universities, the media and foundations. This is where the new theology of planning the lives of the masses has been cooked up; it is a dogma of both power and belief, one that sees little role for the family as the central institution in society.

    This represents a very dangerous break point from the kind of progressivism embraced by Harry Truman, Pat Brown and traditional liberalism. Rather than see government as something that can help families achieve greater autonomy, and spark voluntary association, the clerical progressives prefer an approach that embraces government in place of parenting, and elevates planning from above over grassroots community.

    If you want to glimpse the world view of the progressive clerisy, watch the inane “Life of Julia” presented last year by the Obama campaign. In “Julia,” virtually every step in life is predicated on some government service. She does “decide” to have a child although a man is never mentioned (one can’t assume that progressive clerics accept the notion of immaculate conception), and the child, once sent off to government-funded pre-school, never reappears. So much for the permanence of family ties.

    Julia did not upset modern progressives because it reflected their worldview — Ms. magazine even carried a piece hailing Julia as “a future standard for women” who are increasingly told that they don’t need men either as long-term partners in child-raising or even as spouses.

    This divergence from familialism represents the real basis for a new culture war. This means moving away from a focus on divisive and peripheral issues such as gay marriage, which at least speaks to the desire for long-lasting bonds between people. The new cultural warrior might instead seek to combine some elements of traditional social democracy — in terms of a commitment to upward mobility — with the assumption that family represents the essential institution in our society.

    Nowhere will this battle be more intense than in the field of urban planning. The current generation of progressives subscribes, almost universally, to the notion that people should be cajoled, by price or by edict, away from owning homes large enough to raise modern families, particularly those with more than one child. Today’s progressives, echoing an old tradition among urban aesthetes, find our century-long movement to suburbia — which has slowed but hardly stopped — an abomination worthy of contempt and eradication.

    In the end what is needed is a new political movement that embraces family as critical to the health of society. This approach may not fit the conventional preferences of many conservatives, and most progressives, but it is a necessary counterpoint to a process that threatens the future trajectory of our society.

    Joel Kotkin is executive editor of NewGeography.com and a distinguished presidential fellow in urban futures at Chapman University, and a member of the editorial board of the Orange County Register. He is author of The City: A Global History and The Next Hundred Million: America in 2050. His most recent study, The Rise of Postfamilialism, has been widely discussed and distributed internationally. He lives in Los Angeles, CA.

    This piece originally appeared at Forbes.com.

    Photo by John Perkins.

  • Kid-Friendly Neighborhoods: Takin’ It To The Streets

    Planners and parents have been concerned about two widely reported, and most likely related, trends: the increasing percentage of overweight children, and the growing number of hours that kids spend looking at a screen, be it a television or a laptop. These two activities take up most of the free time kids have after school. Add to this the tendency for kids to be driven or bussed to school, and the result is what has been called a “nature deficit” — a disconnect from natural surroundings. Over the long run, the outcome could be a generation of physically unfit and socially maladjusted young adults. The warning statistics are all around us. Is there a way out of this unhealthy cycle? One answer may rest with our planning decisions. Can neighbourhoods be laid out so as to avoid these unwelcome results?

    Evidence from research pronounces an unequivocal ‘yes’. Many pieces shape the puzzle that forms the complete answer. The first element of a community friendly to outdoor childhood activity is its ability to draw people — adults as well as kids — out of their houses and prompt them to socialize with neighbours. Since 1980, several studies have shown that the great inhibitor to socializing on a street is traffic. The heavier the traffic, the less the socializing. When there’s not much socializing, adults and kids make fewer friends, and the motivation to get out of the house goes down. A 2008 study showed that people who lived on cul-de-sacs had four times as many friends and twice as many acquaintances as residents on through streets with heavy traffic. It seems intuitive, and research confirms it.

    A second clue can be found by looking at the kinds of streets young kids play on most often. You may have guessed that research shows it’s the cul-de-sac. Kids on cul-de-sacs spent 50 percent more time playing actively than kids on other streets. Importantly, the benefits to kids who play on the street continue. Other studies have shown that play and exercise in the early years build an affinity for activity that can last a lifetime, and that, through friendships, these kids also develop the spirit of a beehive at work.

    The third puzzle piece needed to create a kid-friendly place is the presence of magnets in the surroundings. These are factors that pull kids out of their homes and send them walking to school, the corner store and other destinations. And one study found that of all the elements that would attract kids of all ages, the strongest common force was the presence of open space.

    How parents feel about letting kids play on the street, walk to school, or ride their bicycles plays into the result, too. Justified or not, parental fear and unease limit the range of activities that kids engage in, and build unhealthy habits.

    This knowledge from the field provides a sketch of the essential elements of a kid-friendly neighbourhood and, beyond that, a child-friendly district. Which elements are most essential?

    There shouldn’t be any through streets in an area roughly the size of ten city blocks. That feature gives kids plenty of room to move around in a low-traffic, low-speed environment. Parents socialize and kids play; parental insecurity fades. The easiest way to create this is with connected cul-de-sacs and crescents.

    Every kid-friendly neighbourhood should have at least one open space, whatever its size. That provides a safe haven for play: a magnet. Its land value will be recovered through the higher values of the homes around it; real estate research shows that homes near cul-de-sacs and open spaces command higher prices. And where bike and foot paths are separated from the road, with few road crossings, parents are more likely to let their kids walk or ride their bikes.

    Can all this be achieved with a layout? Yes, by selectively fusing well-known elements of available community plans. A number of examples of this fusion exist, and plenty of advice is accessible; check out, for example, Taking the Guesswork Out of Designing for Walkability.

    These techniques are not just for planning new neighbourhoods. Existing places can also be transformed to create child-friendly environments. Initiatives in many cities have changed neighbourhoods with positive results.

    How can you know when a neighbourhood has succeeded at incorporating these creative elements? One telltale sign is chalk hopscotch marks left on the pavement! They signal that the kids have taken possession of a street and are having fun. Every new family that moves into the neighbourhood will be heir to its physical and social benefits.

    Fanis Grammenos is the founder of Urban Pattern Associates (UPA) and was a Senior Researcher at Canada Mortgage and Housing Corporation for over 20 years, focused on housing affordability, building adaptability, municipal regulations and sustainable planning. His research on street network patterns produced the innovative Fused Grid. He holds a degree in Architecture from the University of Waterloo. For additional references on the studies mentioned here, please e-mail the author at fanis.grammenos@gmail.com.

    Flickr Photo by Joe Duty, Little Kid Down the Road chalking the sidewalk.

  • New York and California: The Need for a “Great Reset”

    Despite panning Texas Governor Rick Perry’s initiative to draw businesses from New York, Slate’s business and economics correspondent, Matt Yglesias, offers sobering thoughts to growth-starved states on the West Coast and in the Northeast.

    “…the Texas gestalt is growth-friendly because, quite literally, it welcomes growth while coastal cities have become exceptionally small-c conservative and change averse. But if New York and New Jersey and California and Maryland and Massachusetts don’t want to allow the construction of lots of housing units, then it won’t matter that Brooklyn, N.Y.; and Palo Alto, Calif.; and Somerville, Mass.; are great places to live—people are going to live in Texas, where there are also great places to live, great places that actually welcome new residents and new building.”

    The entire country would benefit if states like California, New York, Massachusetts and New Jersey were to enact policies to compete with Texas, as Yglesias suggests.

  • Cities Still Being Squeezed

    Recent announcements of state budget surpluses have led to the popping of corks across the deepest-blue parts of America, particularly here in California. In some cases, the purported fiscal recovery has been enshrined by an emerging hagiography about Jerry Brown’s steadfastness in the face of budget debacles. One prominent piece even argued that the “smart, bold progressive movement” actually “saved” the Golden State, in part, by forcing up income tax rates.

    Yet, as Walter Russell Mead, among others, has argued, the states’ fiscal meltdown has not been averted, but simply delayed, by the current asset-driven economic recovery. Taken together, the states owe $1 trillion in unfunded pension obligations alone. These costs are eating up much of the projected surpluses, even in prosperous and relatively frugal states such as Texas.

    But the first place the fiscal blowout may hit is the local level. This is, in part, because one way states try to improve their balance sheets is by cutting aid to localities while imposing new mandates dealing with everything from housing to green policies. This has occurred in such places as Pennsylvania, Massachusetts, New Jersey and New York.

    “Quietly and without fanfare, governors and state legislators approved overly generous pension packages, let stand costly, antiquated laws and continued to shift costs from Albany to our front doors,” noted one upstate New York newspaper.

    So, even as state budgets improve somewhat, municipal budgets remain very vulnerable to cutbacks. Pew reports that both state aid and property-tax collections have continued to drop, something that perhaps will be slowed by a developing bubble in real estate values.

    Similarly, according to a report by the National League of Cities, city finance officers at the end of 2012 projected a sixth consecutive year of year-over-year revenue declines. The ability of localities – particularly the most distressed – to endure this pattern much longer is doubtful.

    In fact, the run-up to a wave of municipal bankruptcies has begun. Seven major municipalities have already filed for bankruptcy, the largest being the city of San Bernardino. To a large extent, these bankruptcies are being driven by unfunded obligations for employee pensions. A new study by the Brookings Institution “estimated that the aggregate unfunded liabilities of locally administered pension plans top $574 billion. … On average, pensions consume nearly 20 percent of municipal budgets.” But the worst is yet to come: “[I]f trends continue, over half of every dollar in tax revenue would go to pensions, and, by some estimates, in some cases, would suck up 75 percent of all tax revenue.”

    This has locked many localities across the country into a classic vicious cycle as they try to dig their way back to growth. Unless radically reformed, health care and retirement obligations to employees seem certain to outweigh the ability to fund necessary government functions, the very things – infrastructure, public safety and other economic development components – necessary to nurse a region and its governments back to health.

    By far the most vulnerable will be cities with high unemployment, rising crime and tepid recoveries. These include many of America’s most violent cities – Detroit, Cleveland, St. Louis, Chicago and Memphis, Tenn. – as well as those with generally dysfunctional schools and decaying infrastructure. Cutting police services in such places would invite even greater deterioration of public order.

    This process of small-scale deterioration is already well advanced in California. When it comes to buck-passing to the local level, no one can outdo the Golden State. Gov. Brown’s “realignment” strategy put the responsibility for state justice programs largely on local governments (though this came with promises of increased state aid). Brown also oversaw the dissolution of the state’s 400-plus redevelopment agencies, some of which may now be forced into bankruptcy. Many cities considered these agencies, which provided tax relief to businesses, among their most effective economic development tools. So, while state debt is expected to decline by $1.7 billion next year, local-government debt in California is actually set to increase by $600 million.

    Many counties and localities also risk losing their health care benefits under Gov. Brown’s revised fiscal year 2013-14 budget. These changes will be hardest on the localities with the biggest problems, notably some smaller cities already teetering on the edge of bankruptcy. As many as 10 others, including Oakland and San Jose, could join them. Many others are simply cutting back; Sacramento, for example, is now asking newly recruited police officers to pay into their pension plans before joining the force.

    These problems are also deeply entrenched in the state’s largest city. Los Angeles, more than any other of the country’s top 10 cities, with the possible exception of Chicago, still suffers from quasi-recessionary conditions. Not surprisingly, L.A.’s budget situation, driven in large part by pension and other employee-related costs, remains perilous. A former mayor, businessman Richard Riordan, has predicted that, unless pensions and compensation are reformed dramatically, the city will slide, inexorably, toward bankruptcy.

    The primary culprit in this slide, notes Riordan, has been the political domination of Los Angeles, and other cities, by public employee unions and the lack of true political competition. The original poster child for this is San Bernardino, where labor costs consumed 80 percent of the city budget, in large part due to public-sector unions’ investment in local political races.

    Bigger and somewhat economically stronger, Los Angeles may not soon go the way of San Bernardino, but its fiscal problems remain severe, with a projected $800 million deficit over the next four years and pensions that are underfunded by at least $15 billion. Clearly, these shortfalls will continue to undermine the city’s ability to keep its streets safe, roads paved and parks operating – until City Hall is willing to stand up to the public-sector lobby.

    The most recent citywide election was not too comforting in this regard, since both candidates for mayor were reliable allies of the city worker unions. At least it should be noted that the loser, Wendy Greuel, was defeated, in part, by revelations of her massive financial backing from those unions. That backing almost certainly hurt her standing with what should have been her base of more conservative, quasi-suburban voters in her “hood” – the San Fernando Valley.

    Yet, even if incoming mayor Eric Garcetti can right the ship, residents of Los Angeles are likely to face a combination of rising taxes and fees for years to come to address soaring pension costs. Given the financial drag of pension and other employee benefit obligations, even traditional city services, such as street repair, will likely need to be funded by additional debt or fees on property owners.

    Short of major reform, this self-defeating pattern of higher local taxes and fewer local services is likely to continue even if state economies and budgets climb out of their recent distress. Yet, at the same time, this presents an opportunity to rebalance the relationship between private- and public-sector interests.

    If good habits are learned first in the home, perhaps the road to fiscal health will have to begin at the local level. Sacramento and other state capitals have demonstrated skill at kicking the can down the road while shirking their responsibilities to local governments. Instead, it may fall upon the localities to come up with ways to overcome decades of poisonously irresponsible decision making and concoct the proper fiscal antidote.

    Joel Kotkin is executive editor of NewGeography.com and a distinguished presidential fellow in urban futures at Chapman University, and a member of the editorial board of the Orange County Register. He is author of The City: A Global History and The Next Hundred Million: America in 2050. His most recent study, The Rise of Postfamilialism, has been widely discussed and distributed internationally. He lives in Los Angeles, CA.

    This piece originally appeared in the Orange County Register.