Wednesday, July 27, 2011

Salvaging Learning

The other day, courtesy of the public library system here in Cumberland, I had the chance to curl up on the couch with a copy of Canadian journalist Jonathan Kay’s survey of American conspiracy theorists, Among the Truthers. I’m sorry to say it was a disappointing read. Kay’s an engaging writer and the book has some good moments, but on the whole it was a depressing reminder of the reasons that the word “journalistic” has become a synonym for “facile and tendentious.”

This is doubly unfortunate, because the issues Kay was trying to raise are worth discussing. Over the last couple of decades, ordinary political discourse in America has been drowned out by a torrent of hatred and rage—there is really no other way to describe it—directed by people on nearly every part of the political spectrum against their perceived opponents. These days it’s become a commonplace of American political culture to insist that the world’s problems result from the deliberate malevolence of some group of people or other, whose intentions and historical role are portrayed with a degree of moral extremism that would make a third-century Gnostic gulp in disbelief. How many times, dear reader, have you seen blog posts and newspaper articles—we don’t even need to bring in the gutterslop that sloshes through talk radio these days—that flatten out the complex ambiguities of industrial civilization’s spiralling crises into an insistence that somebody or other is wrecking the world out of pure personal wickedness? This is the sort of thing I mean.

The bipartisan rise of this sort of hate politics in America, in turn, provides an exact parallel to the rise of the conspiracy theory movement that Kay tried to examine. Insist that George W. Bush was the puppet of a cabal of fascists plotting to conquer the world, or that Barack Obama is a socialist out to reduce Americans to slavery under a global dictatorship, and then it’s hardly a leap to go on to argue that Bush’s handlers masterminded the 9/11 attacks or that Obama is a Muslim illegal immigrant with a forged birth certificate. Argue for either of these latter, in turn, and you can use it to bolster your case for the limitless wickedness of your bête du jour.

It would take a book considerably more substantial than Kay’s to sort out the tangled roots of this twin pandemic of hatred and paranoia. For the moment, I want to focus on just one of the many factors involved, both because it’s not usually discussed in this context and because it’s deeply relevant to the project of this blog.

Every June, across America, a couple of million high school seniors go through graduation ceremonies in our nation’s public schools and receive the diploma that, once upon a time, certified that they had completed the general course of education proper to the citizen of a democracy. Nowadays, a sizable fraction of those graduates are functionally illiterate. More than half of them have no real notion how their government works and what the rest of the world is like, and have never had more than a passing glimpse of the major works of art, literature, and music that define America’s cultural heritage. All but a tiny fraction of them have never learned how to reason from premises to a conclusion or check the credentials of a fact.

I’m not at all sure how many of my readers outside the United States have any idea just how bad the state of education has gotten here. A pervasive philosophy of education that reduces its role to that of job training, cultural squabbles that have stripped curriculums of most of their meaningful content, reforms that made the careers of teachers and the finances of districts depend on standardized test scores and thus guaranteed that teaching students how to score high on those tests would be given priority over everything else, budget cuts that always get taken out of classroom expenses rather than administrative costs—well, you can do the math yourself. There are exceptions, but on the whole the public schools in America do a miserably poor job of teaching anything but contempt for learning.

Higher education is a more complex situation but, in some ways, an even more toxic one. Where the public schools trudge implacably onwards under the iron law of bureaucracy, colleges and universities have become an industry, governed by ethics no better than any other American business. It’s possible to get a good education from an American university if you’re lucky, smart, and ruthless, but there are significant downsides to the experience. The most important are, first, that the university system is more or less designed to leave you a quarter million dollars or so in debt by the time you finish your degree program, without the option of bankruptcy—federal law bars the courts from discharging student loans—and, second, that while the academic industry presents itself as a ticket to high-paying careers, the great majority of college degree programs don’t do anything of the kind. It’s been shown repeatedly that the vast majority of high school seniors who enter university now will never recover financially from the economic burden of paying off their student loans.

No doubt a case could be made, and no doubt it will be made, that the exposure to learning that comes from a college education is worth a lifetime of financial impoverishment. The difficulty with such claims is that the philosophy of education as job training that helped gut America’s public schools has done much the same thing to higher education, even in fields—such as the humanities—that sometimes claim to be exempt from the trend. In most of today’s American universities, despite a certain amount of lip service, humanities programs no longer fulfill their historic role of giving students a broad introduction to humanity’s cultural and intellectual heritage. Their focus instead is on the job training needed by future professors in one or another narrow subspecialty. Departments have to justify their existence in today’s academic industry by maximizing enrollment, however, and this means that degree programs in the humanities not only admit, but actively recruit, far more students every year than are needed to meet the demand for new professors of film studies, postcolonial literature, comparative history of ideas, and the like. That’s the reason why, as the joke goes, the first thing a liberal arts major says when he or she goes to work after graduation is “Would you like fries with that?”

Now factor in the multiple economic impacts of peak oil on a sprawling, dysfunctional collection of government bureaucracies, on the one hand, and a corrupt and rapacious industry totally dependent on abundant credit and government loan guarantees, on the other. At the least, it’s a recipe for the end of American education as it’s currently practiced, and it’s not implausible that unless something else gets patched together in a hurry, it could mean the end of American education, period.

Like the rest of America’s bureaucracies and industries, education in this country got onto its current trajectory of metastatic growth in the aftermath of the Second World War, when oceans of cheap fossil fuel energy and the considerable benefits of global hegemony made no price tag look too big. When the wave of homegrown fossil fuel production crested in the early 1970s, in turn, Americans—who even then were willing to blame almost anything for their troubles, other than the irritating unwillingness of the laws of physics to give them a limitless supply of energy—decided to double down and bluff, for all the world as though acting out the fantasy that we’d have plenty of energy and resources in the future would force the bluff to turn into reality.

The realization most Americans are frantically trying to stave off just now is that nature has called our bluff. That limitless new supply of energy most of us were waiting for hasn’t appeared, and there are good reasons, founded in the laws of physics, to think that it never will. In the meantime, our decision to double down has left us burdened with, among other things, a public school system and a collection of colleges and universities even more gargantuan and unaffordable than the ones we had before we doubled down, and a psychology of previous investment that all but guarantees that our society will keep on throwing good money after bad until there’s nothing left to throw. Politicians and ordinary people alike have taken to insisting, along these lines, that the solution to joblessness is to send people to college to get job training, on the assumption that this will somehow make jobs appear for them. To call this magical thinking is an insult to honest sorcerers, but it’s likely to be increasingly common in the years to come—at least until the bottom drops out completely.

Well before that happens, a system that’s already largely irrelevant to the needs of the present shows every sign of making itself completely irrelevant to the even more pressing challenges of the future. If anything is going to be salvaged from the wreckage, it’s going to have to be done by individuals who commit themselves to the task on their own time. To make sense of such a project, though, it’s going to be necessary to face a far more basic question: what, exactly, is the point of education?

That’s a far more complex question than it seems, because American culture has spent the last few decades at the far end of a pendulum swing between two sharply different understandings of education—and indeed of human knowledge itself. Call them abstraction and reflection. Abstraction is the view that holds that behind the hubbub and confusion of everyday life lies a fundamental order that can be known by the human mind, and accurately expressed by means of abstract generalizations—E = mc², the law of supply and demand, the theory of evolution, or what have you. In an age dominated by abstraction, knowledge tends to be equated with these abstract generalizations, and education becomes a matter of teaching students to memorize them, apply them, and maybe add to the sum total of known generalizations.

Abstraction tends to predominate when civilizations are expanding. It’s a confident viewpoint, both in its faith that the human mind is capable of knowing the inner workings of the cosmos, and in its claims that its method for generating abstractions is applicable to all subjects and that its particular set of abstract generalizations equate to objective truth. Of course the faith and the claims run into trouble sooner or later; whatever method the civilization uses to determine truth—classical logic in ancient Greece, Christian theology in medieval Europe, experimental science in modern America—eventually ends up in paradox and self-contradiction, and the irreducible cussedness of nature turns the first elegant generalizations into clanking, overburdened theoretical machinery that eventually falls apart of its own weight. Meanwhile Utopian hopes of a society of reason and enlightenment, which partisans of abstraction always seem to cherish, run headlong into the hard realities of human nature: after Athens’ golden age, the Peloponnesian War and the self-destruction of Greek democracy; after the Gothic cathedrals and the great medieval summae, the Black Death and the Hundred Years War; after the brilliant trajectory of science from Galileo to Einstein—well, we’ll be around to see the opening stages of that.

That’s when reflection comes into play. Reflection is the view that recognizes that human ideas of the order of the cosmos are, in the final analysis, just another set of human ideas, and that the hubbub and confusion of everyday life is the only reality we can be sure of. In an age dominated by reflection, Giambattista Vico’s great maxim—“we can truly know only what we make”—takes center stage, and humanity rather than the cosmos becomes the core subject of knowledge. It’s not a knowledge that can be extracted in the form of abstract generalizations, either; it’s a personal, tacit knowledge, a knowledge woven of examples, intuitions, and things felt rather than things defined. From the standpoint of abstraction, of course, this isn’t knowledge at all, but in practical application it works surprisingly well; a sensitivity to circumstances and a memory well stocked with good examples and concrete maxims tend, if anything, to be more useful in the real world than an uncritical reliance on the constructions of current theory.

This is why Greek intellectual culture, with its focus on logic, mathematics, physics, and speculative philosophy, gave way to Roman intellectual culture, which focused instead on literature, history, jurisprudence, and ethical philosophy. It’s also why the culture of the high Middle Ages, with its soaring ambition to understand the cosmos by interpreting religious revelation in the light of reason, gave way to the humaniores litterae—literally, the more human writings—of the Renaissance, which focused attention back on humanity, not as an object under the not-yet-invented microscope, but as a subject capable of knowing and acting in a complex, unpredictable world. It’s by way of reference to those “more human writings” that we still call the characteristic interests of Renaissance culture “the humanities.”

Next week’s post will follow the most recent swing of the pendulum over to the side of abstraction, since that has to be understood in order to sort out what can be saved from contemporary science. Here, though, I want to spare a few moments for the almost completely neglected issue of the value of the humanities in an age of collapse. Modern American culture is so deeply invested in abstraction that the very suggestion that reflection, as I’ve defined it, could have pragmatic value as a way of knowledge seems ludicrous to most people. Still, given that we’ve landed ourselves in the usual trap that comes with overcommitment to abstraction—we can determine beyond a shadow of a doubt what has to be done, and prove that it has to be done, but we’ve become completely incapable of motivating people to do it—a strong case could be made that we need to pay more attention to that aspect of knowledge and culture that deals directly with human existence in its actual context of complexity and rootedness, an aspect that offers no general laws but many practical insights.

There’s another reason why it may be worthwhile to refocus on reflection rather than abstraction in the years ahead of us. As already mentioned, the partisans of abstraction have a hard time finding any value at all in reflection; Plato’s insistence that poets ought to be chucked out the gates of his Republic, John Scotus Erigena’s dismissal of core elements of the humanities because “they do not appear to have to do with the nature of things,” Descartes’ irritable condemnation of literary studies, and the fulminations of today’s scientific pundits against any dimension of human experience that can’t be measured on some kind of dial, all come from this habit of thought. Curiously, though, the reverse is rarely the case. In ages when reflection predominates, the sciences tend to be preserved and transmitted to the future along with the humanities, because the sciences are also products of human thought and culture; they can be studied as much for what they reveal about humanity as for what they reveal about nature. That shift has already been taking place; when the late Carl Sagan spun his compelling “We are star-stuff” myth for the viewers of Cosmos, for example, he was engaging in reflection rather than abstraction. His goal was not to communicate an abstract rule but to weave a narrative of meaning that provided a context within which human life can be lived.

The modern American educational system is by and large the last place on earth to try to pursue or communicate any such vision, whether undergirded by Saganism or some more traditional religion. Equally, though, as I’ve already pointed out, the modern American educational system is very poorly positioned indeed to deal with the impacts of peak oil, and the rest of the smorgasbord of crises the bad decisions of the last few decades have set out for us. The question that remains is what might replace it. What will come after the public schools is already taking shape, in the form of a lively and successful homeschooling movement that routinely turns out the sort of educated young people public schools once did; the replacement for what’s left of America’s once thriving trade schools is less well organized and defined as yet, but is emerging as craftspeople take on apprentices and people interested in a dizzying array of crafts form networks of mutual support. What we don’t yet have is a replacement for what the universities used to offer—some form of organized activity, however decentralized, informal, and inexpensive, that will see to the preservation and transmission of the intellectual heritage of our age.

What form such a thing might take is a challenging question, and one for which I don’t have any immediate answers. Still, it’s an issue that needs to be addressed. The pervasive spread of paranoiac conspiracy theories in contemporary American culture, which I mentioned toward the beginning of this post, is only one of several signs that too many people in this country have never learned how to double-check the validity of their own thinking, either against the principles of logic—a core element of the cultural heritage of abstraction—or against that attentiveness to circumstances and human motives that comes from “more human writings”—a core element of the cultural heritage of reflection. The people who chant “Drill, baby, drill,” as though it’s an incantation that will summon barrels of oil from thin air, are doing just as poor a job of reasoning about the world and reflecting on their motivations as the people who use the unprepossessing individuals teetering on the upper end of our political class as inkblots on which to project their need for scapegoats and their fantasies of absolute evil.

Working out the first rough sketch of a replacement for the American academic industry won’t stop the incantations or the scapegoating any time soon, and arguably won’t stop it at all. Many other forces, as I suggested earlier, drive the contemporary flight from the muddled complexities of civil society into a comic-book world of supervillains whose alleged malignity is so clearly a product of the believer’s need to find someone to blame. Yet the tasks facing those of us who are trying to get ready for the unraveling of industrial America, and the comparable tasks our grandchildren and our grandchildren’s grandchildren will face, will demand plenty of clear, perceptive, well-informed thinking, guided both by abstraction’s useful generalizations and by reflection’s sharpened sensitivities. Doing something to salvage learning, while there’s still a chance to do so, is one potentially crucial way to help that happen.

Wednesday, July 20, 2011

Salvaging Resilience

Regular readers of this blog will know by this point that my efforts to make sense of the shape of the emerging deindustrial future involve the occasional odd detour, and one of those is central to this week’s post. Mind you, those same regular readers may be wondering if the detour in question has to do with Ben Bernanke’s secret name as a Sith Lord, a point which occupied some space in comments on a recent Archdruid Report. (The best proposal so far, in case you’re wondering, was Darth Flation – think (in)Vader, (in)Sidious, etc.)

Still, that tempting topic will have to be left for another week. Instead, I’m going to have to clear up the confusions surrounding a bit of jargon popular in the current peak oil blogosphere. That process is more than a little reminiscent of fishing scrap metal out of a swamp; in the present case, the word that needs to be hauled from the muck, hosed off, and restored to its former usefulness, is “resilience.”

The rise of this term to its present popularity in green circles has a history worth noting. A year or two ago, the word “sustainability” began to lose its privileged place in the jargon of the time, as it began to sink in that no matter how much manhandling was applied to that much-abused term, it couldn’t be combined with the phrase “modern middle-class lifestyle” without resulting in total absurdity. Enter “resilience,” as another way to talk about what too many people nowadays want to talk about, generally to the exclusion of more useful conversations: the pretense that a set of lifestyles, social habits, and technologies that were born in an age of unparalleled extravagance can be maintained as the material basis for that extravagance trickles away.

The word “sustainability,” it bears remembering, has a perfectly clear meaning. It means, as the word itself suggests, the ability of something to be sustained, either for a set period of time – “sustainable over a twenty year period,” for example – or indefinitely. That was its problem as a green buzzword, because next to nobody wanted to talk about just how long the current crop of “sustainable” tech was actually likely to stay viable (hint: not very long), and even fewer were willing to grapple with the immense challenges facing any attempt to sustain any of today’s technologies into the indefinite future.

The problem with “resilience,” though, is that it also has a perfectly clear meaning. Once people figure out what that is, it’s a safe bet that they’ll be hunting for another buzzword in short order, because resilience can be defined very precisely: it’s the opposite of efficiency.

Okay, now that you’ve stopped spluttering, let me explain.

We can define efficiency informally as doing the most with the least. An efficient use of resources is thus one that puts as few resources as possible into places where they sit around doing nothing. The just-in-time ordering process that’s now standard in manufacturing and retail, for example, was hailed as a huge increase in efficiency when it was introduced; instead of having stockpiles sitting around in warehouses, items could be ordered electronically from a database so that they would be made and shipped just in time to go onto the assembly line or the store shelf. What nobody asked, and very few people have asked even yet, is what happens when something goes wrong.

The great Tohoku tsunami a few months back provided a wakeup call in that direction, as factories across Japan and around the world suddenly discovered that the shipment of parts they needed just in time for next month’s production runs had been delivered instead to the bottom of the Pacific Ocean. In the inefficient old days, when parts jobbers scattered all over the industrial world had warehouses full of parts being produced by an equally dispersed array of small factories, that would have given nobody sleepless nights, since the stock of spares on hand would be enough to tide things over until factories could run some extra shifts and make up the demand. Since production had been efficiently centralized in very few factories, or in some cases only one, and the warehouses full of parts had been rendered obsolete by efficient new ordering systems, knock-on costs that would have been negligible in 1970 are proving to be very substantial today.

Efficiency, in other words, is not resilient. What makes a system resilient is the presence of unused resources, and these are inefficient by definition. A bridge is resilient, for example, if it contains a good deal more steel and concrete than is actually needed to support its normal maximum load; that way, when some outside factor such as a hurricane puts unexpected stresses on the bridge, the previously unnecessary structural strength of all that extra steel and concrete comes into play, and keeps the bridge from falling down. Most bridges are designed and built with that sort of inefficiency in place, because the downside of too little efficiency (the bridge costs more to build) is a good deal less troubling than the downside of too little resiliency (the bridge collapses in a storm). Like every project worth doing, a good bridge has to strike a balance between many conflicting factors, no one of which can be maximized except at the expense of others of equal importance.

This is something that one of the iconic figures of the Seventies, Buckminster Fuller, never quite grasped. For me, Fuller is what another iconic Seventies figure called a worthy opponent; his writings constantly force me to reexamine my own ideas, because they grate on my nerves so reliably. Partly that’s a function of Fuller’s insouciant assurance that technology inevitably one-ups everything else in the cosmos – Theodore Roszak’s apt gibe, “I would not be surprised to hear (Fuller) announce someday that he had invented a better tree,” comes to mind – and partly it’s his insistence that the universe had to make the kind of sense he wanted it to make – this is a man, remember, who spent much of his life insisting that pi couldn’t really be an irrational number – but the issue that comes to mind right now is his consistent preference for efficiency at the cost of resilience.

That’s not to say that Fuller didn’t score some major successes. If my house was in a good location for a wind turbine, I’d almost certainly use Fuller’s octet truss design for the tower, and a lot of very sturdy geodesic domes have been built using his patents. Still, it’s worth noting that not even Fuller was able to live for long in a dome house made to his own designs; if it had been perfectly caulked, it would have provided a comfortable home with very efficient use of materials, but since caulking is never perfect in the real world, it leaked like a sieve whenever it rained. That’s one of the reasons why Lloyd Kahn, the compiler of Domebooks I and II and a major proponent of geodesic domes back in the day, backpedaled in his 1973 compilation Shelter. That very worthwhile piece of Green Wizard literature talked at length about the problems with geodesic dome construction, and put most of its space into vernacular building from cultures around the world, from yurts and tipis to good sturdy old-fashioned carpentry that holds off the rain.

Most of the troubles that saddled Fuller with the label “failure-prone” were, like the vast number of leaky geodesic dome houses that sprang up in the Sixties, the product of too much efficiency and too little resilience. The Dymaxion car of 1933 is a case in point. In most respects it was a brilliant design, maneuverable and ultraefficient, but its career came to a sudden halt when one of the three prototypes got bumped by another car on Lake Shore Drive in Chicago, flipped, and rolled, killing the driver and seriously injuring everybody else on board. Fuller designed the car with a narrow wheelbase relative to its length for the sake of maneuverability, and a high center of gravity to provide a smoother ride on rough roads. Both those choices made the Dymaxion car more efficient but less stable, and at highway speeds that’s not a safe tradeoff to make.

Thus efficiency is not resilient, and resilience is not efficient. Just-in-time ordering is conceptually the same as the Dymaxion car’s narrow wheelbase and high center of gravity: a great idea, as long as nothing goes wrong. Since it may have occurred to you, dear reader, that today’s industrial civilization seems to have a lot in common just now with these examples of high efficiency and low resilience, you may be thinking that it might turn out to be necessary to accept a lower degree of efficiency, in order to provide our civilization with the backlog of unused resources that will give it resilience.

Ah, but here’s where things get difficult.

There’s a reason why contemporary industrial culture is obsessed with efficiency, and it’s not because we’re smarter than our grandparents. Every civilization, as it nears the limits of its resource base, has to deal with the mismatch between habits evolved during times of relative abundance and the onset of shortages driven by too much exploitation of that abundance. Nearly always, the outcome is a shift in the direction of greater efficiency. Local governments give way to centralized ones; economies move as far toward mass production as the underlying technology will permit; precise management becomes the order of the day; waste gets cut and so, inevitably, do corners. All this leads to increased efficiency and thus decreased resilience, and sets things up for the statistically inevitable accident that will push things just past the limits of the civilization’s remaining resilience, and launch the downward spiral that ends with sheep grazing among ruins.

Trying to build resilience into a system that’s already gotten itself into this bind is a difficult project at best. The point of these efficiency drives, after all, is to free up resources to support the standards of living of the privileged classes. Since these same privileged classes are the ones who have to sign off on any project to redirect resources toward resilience, the difficulties in convincing them to act against their immediate self-interest are not hard to imagine. Since efficiency tends to take on an aura of sanctity in such cases – privileged classes, after all, are as prone as anyone else to convince themselves that what’s good for them is good for everyone – proponents of resilience face an uphill fight against deeply rooted assumptions. After all, who wants to go on record in support of inefficiency?

And of course that’s exactly what we’ve seen in recent decades in industrial society. The Glass-Steagall Act, which imposed resilience on the US banking system at the cost of a fair amount of inefficiency, is a good example; it was gutted by an enthusiastically bipartisan majority, giving us the highly efficient but hopelessly brittle financial system we have today. Many other measures that put resilience into the system were also scrapped in the name of “competitiveness,” though it’s worth noticing that America’s ability to compete in any arena that doesn’t involve blowing large chunks of a Third World country to kingdom come has gone down steadily while these allegedly competitive measures have been at work. All of it, slogans aside, served to free up resources to maintain living standards for America’s privileged classes – a category that extends well down into the middle class, please note, and includes a great many people who like to denounce the existing order of American society in heated terms.

That’s our version of the trap that closes around every society that overshoots its resource base. The struggle to sustain the unsustainable – to maintain levels of consumption the remaining resource base won’t support indefinitely – always seems to drive the sort of short-term expedients that make for long-term disasters. I’ve come to think that a great many of the recent improvements in efficiency in the industrial world have their roots in this process. Loudly ballyhooed as great leaps forward, they may well actually be signs of the tightening noose of resource constraints that, in the long run, will choke the life out of our civilization.

Thus it’s a great idea in the abstract to demand a society-wide push for resilience, but in practice, that would involve loading a great many inefficiencies onto the economy. Things would cost more, and fewer people would be able to afford them, since the costs of resilience have to be paid, and the short term benefits of excessive efficiency have to be foregone. That’s not a recipe for winning an election or outcompeting a foreign rival, and the fact that it might just get us through the waning years of the industrial age pays nobody’s salary today. It may well turn out that burning through the available resources, and then crashing into ruin, is simply the most efficient way for a civilization to go.

Where does that leave those of us who would like to find a way through the crisis of our time and hand down some part of the legacy of our civilization to the future? The same principles apply, though it’s fortunately true that individuals, families, and local communities often have an easier time looking past the conventional wisdom of their era and doing something sensible even when it’s not popular. The first thing that has to be grasped, it seems to me, is that trying to maintain the comfortable lifestyles of the recent past is a fool’s errand. It’s only by making steep cuts in our personal demand for resources that it’s possible to make room for inefficiency, and therefore resilience.

Most of the steps proposed in these essays, in turn, are inefficient – indeed, deliberately so. It’s unquestionably inefficient in terms of your personal time and resources to dig up your back yard and turn it into a garden; that inefficiency, however, means that if anything happens to the hypercomplex system that provides you with your food – a process that reaches beyond growers, shippers and stores to the worlds of high finance, petroleum production, resource politics, and much more – you still get to eat. It’s inefficient to generate your own electricity, to retrofit your home for conservation, to do all the other things we’ve discussed. Those inefficiencies, in turn, are measures of resilience; they define your fallback options, the extra strength you build into the bridge to your future, so that it can hope to stand up to the approaching tempests.

The emerging patterns of the salvage economy that have been discussed here over the last few weeks feed into this same quest for resilience. Many older technologies, of the sort that might readily be salvaged and put to use, are a good deal less efficient than their modern replacements, and therefore much more resilient.

Here’s an example. There’s been plenty of talk in recent years about the risk of an electromagnetic pulse (EMP) attack against the United States. It’s been the subject of Congressional hearings, a popular novel, and a great deal of hoopla in the media. There’s some reason for all this concern, as a single modest nuclear warhead detonated up in the ionosphere above the northern Midwest would generate a pulse that would fry electronic equipment over most of the continental United States, and it’s been argued that any of several non-nuclear technologies could do the same thing on a more local scale. There’s been a great deal of backing and forthing about how to shield national infrastructure against such an attack, but it’s only occasionally been noted that electronic technologies that are very nearly invulnerable to EMP already exist, and can be found in antique malls across the country.

The secret to those technologies? The old-fashioned vacuum tube. Vacuum tubes use plenty of power and convert most of it into heat, and the sturdy structure made necessary by that inefficiency makes tubes shrug off sudden transient pulses of the sort an EMP generates. Modern integrated circuits are many orders of magnitude more efficient, and so those same transient pulses go right into the heart of an IC chip and destroy it. If you plan on using a tube-based radio for communication in the event of an EMP attack, mind you, you need to be sure that it doesn’t have first-generation solid state components such as selenium rectifiers, or replace those with diode tubes, and you’d probably better do the sensible thing and get your amateur radio license, too, so you can get in some practice with your rig in advance. Still, it’s a viable approach, and a good deal cheaper than the alternatives – and it would be just as viable, and just as cheap, if the US government were to do the smart thing and arrange for a couple of midsized domestic electronics firms to start manufacturing reliable tube-based electronics as backups for critical infrastructure across the country.

There are countless other examples. By and large, older technologies are less efficient, because they were made in an age when efficiency wasn’t as overvalued as it is today. That means, in turn, that older technologies are by and large more resilient, and those who are concerned about resilience will often find that older, simpler, sturdier technologies are a better bet than the current state of the art. By and large, in turn, making use of those technologies means accepting downscaled expectations; a tube-based radio is easy, a tube-based television is challenging, and a tube-based video game would be around the size of a double-wide mobile home and use as much power as a five-story office building. This is why, sixty years ago, radios were common and cheap, televisions were less common and pricey, and games were played on brightly colored boards on the kitchen table or the family room floor without any electronics at all.

Still, downscaled expectations will be among the most common themes of the decades ahead of us, and those who have the uncommon sense to figure this out in advance and start getting ready for a less efficient future will very likely benefit from the increased resilience this will provide. Over the weeks to come, as I finish up the discussion of salvage and prepare to wrap up the entire series of posts on green wizardry that have been central to this blog’s project for more than a year now, I hope to be able to suggest a few more options for resilience along these same lines.

Wednesday, July 13, 2011

Salvaging Quality

It’s been a busy week for those of us who keep watch over the industrial world’s deepening tailspin, as politicians in the United States and Europe play a game of chicken using sovereign debt in the role traditionally filled by fast cars. The issue in the United States is simple enough: the most that either side is able to offer, given its political commitments, is less than the least either side can afford to accept, and the occasional turns toward demagoguery on both sides haven’t exactly helped. It’s still possible that some last minute compromise may be hammered out, but the odds against that are starting to lengthen, and if that doesn’t happen, the financial end of the federal government will start seizing up in about three weeks. It should be an interesting spectacle.

Europe is a more complex situation. Greece, the current poster child for sovereign debt dysfunction, did what poor countries so often do, borrowed in foreign markets far beyond its ability to repay, and now can’t meet its bills. Unfortunately the normal way to resolve such problems – defaulting on the debt – would bankrupt quite a few large banks in other EU nations, and these latter have put pressure on their national governments to stave off a Greek default. The problem here is that Greece is going to have to default sooner or later; the question is purely a matter of when. The Greek government is in hock far beyond its ability to repay, and the austerity measures pushed on it by the cluelessly doctrinaire economists at the IMF have worsened the matter considerably by putting the Greek economy into a tailspin. So it’s simply a matter of waiting for the inevitable to happen, and the credit markets to go into spasm accordingly.

Mind you, the horrified utterances currently being splashed around the global media, claiming that default is unthinkable and unprecedented, are nonsense of the most blatant sort. Nations default on their debts all the time. Russia did it in 1998, Argentina did it in 2002, and both nations survived; most European nations, for that matter, have defaulted on their debts more than once over the course of their history, and bankrupted plenty of banks in the process – that’s where we get the word bankrupt, you know. Defaults have always been one of the inescapable risks of lending to governments. EU governments could get realistic about this, let Greece do what countries with too much debt normally do, and spend their time more usefully writing letters of condolence to the bank executives who will be out of a job shortly thereafter.

Come to think of it, it’s just possible that this is what EU governments are actually doing. The current flurry of handwaving and emergency meetings may be no more than a source of plausible deniability – we’re sorry, we did all we could, it was the fault of (fill in the blank to conform to local prejudices) that Greece crashed and took half of Europe’s banks with it, blah blah blah. For that matter, it’s not completely beyond the bounds of possibility that politicians on this side of the Atlantic are playing a similar game. The US is up to its eyeballs in unpayable debts, loaded down with entitlements and international commitments that it can’t afford but that no elected official dares to touch, and lurching toward a default as inevitable as Greece’s but on an almost unimaginably vaster scale. Nearly the only way to get out of the resulting trap with some chance of national survival would be to trigger a run on Treasury bills now, one that will force a default on the national debt in the near future, when both sides can conveniently blame it on the intransigence of the other party and the perfidiousness of foreign lenders. It does seem unlikely that this level of public-spiritedness is at work in Congress and the White House, but I’d like to believe that it’s possible.

These latest consternations, in turn, provide all the more relevance to the theme I’ve been discussing in the last couple of posts here, the possibility of shifting over here and now to the salvage economy that’s already beginning to emerge outside the narrowing circle of scarcity industrialism. This week I’d like to bring up another dimension of that shift, and talk about one of the unspoken and unspeakable realities of life in a declining industrial society: the pervasive phenomenon of stealth inflation.

By this I don’t mean inflation in the sense in which economists use the word, the decrease in the value of money driven by the expansion of the money supply relative to the supply of goods and services. That kind of inflation deserves much better press than it gets; though it’s denounced by all right-thinking people these days, it’s one of the safety valves by which a capitalist economy’s tendency to produce excess paper wealth gets brought back into step with the actual wealth in circulation, the nonfiscal goods and nonfinancial services that meet actual human needs. It thus serves exactly the same role, in a much more subtle and flexible way, as the negative-interest currencies being proposed by would-be financial reformers these days.

Stealth inflation is a good deal less laudable. It’s the process by which the price of goods and services remains the same, while the value of what’s provided for that price diminishes. It’s sometimes done by decreasing quantities – most Americans over forty, for example, will remember the days when cans of soup and candy bars were a good deal larger than they are now – but far more often done by cutting quality. Sometimes this is a minor, even a subtle, factor; in other cases, it’s neither, and can quite easily become lethal in its effects.

A good example of the first kind came my way a while back when a friend, knowing I like to cook with cast iron pans, found an elderly example in a secondhand store for some absurdly small price and gave it to me. Because my wife has celiac disease – a severe enough case that relatively small traces of gluten can have unwelcome effects – I had to strip off the natural coating that cast iron cookware gets when it’s well treated, and reseason the pan, just as though it had been bought new. Even with this rough treatment, though, the old pan proved to be a much better piece of cookware than any of the more recently manufactured cast iron pans I’d been using for a decade or so previously. Its inner surface has a much smoother finish and its metal conducts heat more evenly; this evening’s fried zucchini (fresh from the garden) was cooked in it, because no other pan I have does as good a job.

This isn’t simply a matter of chance or a personal quirk. Ask any cast iron aficionado and dollars will get you doughnuts – perhaps these days I should say “credit swaps will get you crullers” or something like that – you’ll hear a similar story; the cast iron cookware you can buy in your local hardware store simply isn’t as good as the same products made a quarter century ago, and the difference is no small thing. I’ve heard the same thing in the very different context of craftspeople who work with old tools; the quality of the metal, they say, as well as the workmanship tends to be dramatically better in tools that are at least a quarter century old.

In some cases the differences are enough to kill. One of the nasty little secrets behind the rising toll from food poisoning in the United States and elsewhere is that a great deal of it could easily have been prevented by common sense sanitary procedures that used to be standard, but have been cut for the sake of lower per-unit costs and higher quarterly profits. What makes this all the more embarrassing is that this is America’s second encounter with what happens to the safety and quality of processed food in a capitalist system under economic stress; Upton Sinclair’s The Jungle probably ought to be required reading for the pundits, and there are many of them just now, who fatuously insist that government regulation is always and everywhere a bad idea.

The same purblind mania for gutting sensible regulation that freed the banking industry from the Glass-Steagall Act and its equivalents in other industrial nations, and at a stroke brought back the devastating bubble-and-bust economics that dominated the industrial world before the Glass-Steagall Act was originally passed, has had equivalent effects in many other sectors of economic life. An acceleration in stealth inflation through declining quality is among the results. Still, there’s a deeper force pressing in the same direction, and it comes from the relentless mathematics of fossil fuel depletion and its impact on an economy founded on the expectation of constant growth.

There has been a great deal of talk recently on the leftward side of the economic spectrum about the need to “decouple” economic growth from increases in the supply of energy. Still, as Zen masters are wont to say, talk does not cook the rice; insisting that economic growth can continue while energy supplies are stuck in a bumpy plateau does not make it so. The production of real, nonfiscal goods and services requires inputs of energy, as well as raw materials (which must be extracted by using energy) and labor (which in America, at least, usually uses a fair amount of energy, too). The only goods and services that can grow unchecked as energy supplies flatline are financial goods and services – that is, “goods” that consist of the essentially arbitrary tokens our society uses to allocate real wealth, on the one hand, and “services” that consist of shuffling and exchanging these tokens in more or less intricate ways, on the other.

As the cheap abundant energy that provided the basis for three centuries of industrial civilization stops being cheap and abundant, then, one of the consequences is a widening disconnection between the production of nonfiscal goods and services and the production of money in all its various forms. Left to itself, the natural result would be a rising spiral of inflation in which the value of money declined steadily, to stay more or less in step with the amount of real goods and services available to buy. This natural result, though, is utterly unacceptable to the political classes – the people who take an active role in the political process – anywhere in the industrial world.

This has imposed any number of distortions on the global economy, but one of them is a constant push to keep the nominal rate of inflation as low as possible, thus sparing politicians the hard task of explaining to their constituents a reality that neither the politicians nor the constituents have yet begun to understand. That push drives the widespread juggling of economic statistics across the industrial world, but I’ve come to believe that it also provides an important motive force behind stealth inflation. Large corporations have plenty of interfaces with governments, and governments have plenty of levers by which to influence corporate behavior for political ends; if the politicians in Washington DC, let’s say, decided that it would be really helpful if businesses increased their profit margins relative to their costs by some means other than raising prices, it doesn’t seem at all unlikely that this preference would be heard in corporate boardrooms, and play at least some role in shaping their decisions.

What this means for the individual green wizard, in turn, is that there’s every reason to think that a good many of the goods and services sold to consumers are going to continue to decrease in quality in the years ahead. That in turn implies at least two things. The first is that the strategy of salvaging energy discussed in last week’s post has an additional advantage, because what’s being salvaged in a good many cases is not simply an equivalent of what’s on the market today, but a better product, one that tolerably often will work better and last longer than a new product of the same type. As we approach an age in which many goods may stop being available at all for extended periods, this is not an opportunity to ignore.

The second implication is that those who learn the skills needed to take older products that are no longer working, or no longer working well, and recondition them so that they can return to usefulness, may find themselves with a job skill of no small importance in the emerging salvage economy. It’s not too hard, for example, to find old handsaws for sale very cheaply at flea markets and estate sales. Fairly often, after being handed down through a couple of generations, these have rusted blades, teeth that are dull and bent out of their proper set, cracked and damaged handles, and the like. The steel of the blade, however, is very often of much higher quality than the equivalent new product in a hardware store today, and it doesn’t actually take that much in the way of skills and tools to remove the rust, polish the blade, reset and file the teeth, make a new handle out of hardwood and attach it to the old blade, and so on. The result is a saw that can be handed down for several more generations, and do a great deal of useful work in the meantime; it’s also a product that can be sold or bartered to craftspeople at a premium price.

In at least a few cases, it’s also possible to go one step further and figure out how to manufacture products on a small scale to old specifications. I don’t have anything like the metallurgical knowledge to figure out what makes the difference between my old cast iron pan and my newer pans, but the information’s surely out there, and could be tracked down by someone with the necessary background. Whether or not there would be enough of a market to make this a paying proposition anytime soon is another matter; there are odd little niche markets that might at least pay the bills.

Myself, having more facility with words than with metals, I’m contemplating tracking down a basic letterpress and exploring the honorable profession of Benjamin Franklin. The printing press with movable type was invented in the Middle Ages, after all, and very likely can remain a viable technology no matter how far down the slope of decline we end up sliding. Under current conditions, it can help pay its own bills via handprinted wedding invitations and the like; as conditions change and the complex supply chains that keep computer printers and copiers functioning become more problematic, a printing press powered by human muscle and capable of running on supplies no more complex than paper and homebrewed ink may turn into a serious asset.

Your mileage will unquestionably vary, and a second income refurbishing old items or using some outdated but sustainable technology will be the right choice for some people and the wrong choice for others. I mention it here partly because a good many readers of these posts have asked about potential businesses and income sources in a deindustrializing world, and partly because a fair number of people out there in the peak oil blogosphere don’t yet seem to have thought through the fact that they’ll need to earn a living in one way or another during the long slow unraveling of the industrial economy.

That unraveling may have its sudden jolts, to be sure. If the politicians in Washington DC and an assortment of European capitals fumble the current situation spectacularly enough, this autumn could see an economic crisis on the grand scale, with markets seizing up, banks shutting down, and governments facing abrupt replacement by legal means or otherwise. Still, we’ll come out the other side of it, no doubt poorer but still faced with the ordinary challenges of the human condition; if learning how to recondition old tools allows someone to barter for necessities during the years ahead, that’s a positive step, and such positive steps on the individual scale are the raw materials from which the deindustrial future will gradually emerge.

*********
On a related note, those of my readers who have been following The Archdruid Report for a while now will have seen me more than once pointing out that, challenging as the decline and fall of industrial civilization will doubtless be, it’s not the end of the world. With the help of an enthusiastic publisher, I’ve had the chance to make that point a little more clearly. This September, San Francisco’s Cleis Press will be bringing out my next book, Apocalypse Not, a wry survey of the apocalypse meme – the notion that sometime very soon, history as we know it will suddenly be replaced by a new cosmos that just happens to bear a close resemblance to our favorite daydreams – and its role in inspiring the last three thousand years or so of End Times that weren’t. The Rapture, the Singularity, the prophecies of Nostradamus, and of course the upcoming 2012 brouhaha, among other things, all come in for discussion. Though the subject’s a serious one, the book is a good deal shorter than my three peak oil books, and a lighter read as well. On the off chance that you’re interested, dear reader, it’s now available for preorder.

Wednesday, July 06, 2011

Salvaging Energy

The peak oil blogosphere this year seems to have decided that the Fourth of July needed a Grinch of its own to compete with Christmas, and a fair number of blogs duly went up denouncing the day and its entertainments. I can’t say that these added much to the peak oil dialogue or, really, to much of anything. It’s hardly a secret, for instance, that intellectuals on the two coasts like to belittle working class people who live in between, nor that it’s still quite fashionable on both ends of the political spectrum to characterize our system of government in terms that would get those who do so dragged away by a death squad if their rhetoric had any relationship to reality.

Me, I enjoyed the Fourth; I usually do. My wife and I spent a quiet day gardening, dined on chicken fried tofu – hey, don’t knock it if you haven’t tried it – and walked down to the city park that runs along the old Chesapeake and Ohio Canal, where we met friends, munched watermelon, and watched the annual fireworks display. The Fourth of July is one of the high points of any small town American summer, and I’m also sufficiently old-fashioned to celebrate the ideals that sent this country along its historical trajectory all those years ago. Since the United States is a country inhabited and governed by people rather than abstract ideological mannequins, those ideals got put into practice no more consistently than any other country’s, but they’ve lost none of their relevance, and I’d be a good deal less worried about the future of my country if I saw more people paying attention to them and fewer people waving them aside as obstacles to the pursuit of some allegedly glorious future.

The theme of this week’s post isn’t primarily about the future, glorious or otherwise. Strictly speaking, it’s a response to circumstances that will almost certainly go away within the lifetimes of people now living, and only exist today in those few nations that can afford the overblown consumer economy that budded, bloomed, and went to seed in the second half of the twentieth century. That response, curiously enough, has more than a little to do with the theme of independence central to the holiday just past, but the easiest way to make sense of it is to start with the nearly complete state of dependence expressed by another fashionable topic under discussion in the peak oil scene.

That topic is the return of electric cars to American roads. A dozen large and small automakers are in the process of bringing out battery-powered cars of various kinds, ranging from generic compacts like the Ford Focus and Nissan Leaf to more exotic items like the Aptera and the GEM. Most of them are pricey, and all of them have their share of drawbacks, mostly in terms of range and reliability, but a significant fraction of people on the green end of things are hailing their appearance as a great step forward. As things stand, that’s a bit of an oversimplification, since almost all the electricity these vehicles use will be generated by burning coal and natural gas, and the blithe insistence that the grid can easily be switched over to solar and wind power has already been critiqued at some length in this blog. Still, there are a couple of other points that would be well worth making here.

First of all, of course, the best way to reduce your ecological footprint isn’t to replace a petroleum-powered car with an electric car, it’s to replace it with a bicycle, a public transit ticket, or a good pair of walking shoes. This isn’t the first time I’ve mentioned that option, and I know I can expect to be belabored by commenters who are bursting with reasons why they can’t possibly do without a car, or even leave the car parked in the driveway most of the time. Granted, the built geography of much of rural and suburban North America makes it a little challenging to do without a car, but something close to a hundred million people in the United States live in places where a car is a luxury most or all of the time, and a significant fraction of the others choose to live in places where that’s not the case. Still, let’s set aside for the moment the fact that the one energy-related bumper sticker that might actually make a difference these days would belong on the back of a bicycle, and would say MY OTHER CAR IS A PAIR OF SHOES. For those Americans who actually do find themselves in need of a car, how about the new electric vehicles? Will they really decrease your carbon footprint and your fossil fuel use, as so much current verbiage claims?

The answer is unfortunately no. First of all, as already mentioned, the vast majority of electricity in America and elsewhere comes from coal and natural gas, and so choosing an electric car simply means that the carbon dioxide you generate comes out of a smokestack at a power plant rather than the tailpipe of your car. The internal combustion engine is an inefficient way of turning fuel into motion – around 3/4 of the energy in a gallon of gas becomes low-grade heat dumped into the atmosphere via the radiator, leaving only a quarter to keep you rolling down the road – but the process of turning fossil fuel into heat and heat into electricity, storing the electricity in a battery and extracting it again, and then turning the electricity into motion is less efficient still, so you’re getting less of the original fossil fuel energy turned into distance traveled than you would in an ordinary car. This means that you’d be burning more fossil fuel to power your car even if the power plant was burning petroleum, and since it isn’t – and coal and natural gas contain much less energy per unit of volume than petroleum distillates do – you’re burning quite a bit more fossil fuel, and dumping quite a bit more carbon in the atmosphere, than a petroleum-powered car would do.
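The chain of conversions described above is easy to sketch with round numbers. Every stage efficiency in the little calculation below is an assumption chosen for the sake of the arithmetic, not a measured figure; the point is only to show how losses multiply along the chain:

```python
# Illustrative fuel-to-wheels comparison. All stage efficiencies are
# assumed round numbers for the sake of the example, not measured data.

ice_tank_to_wheel = 0.25   # ~3/4 of the gasoline's energy lost as heat

ev_stages = {
    "power plant (fuel -> electricity)": 0.33,  # assumed thermal efficiency
    "grid transmission": 0.93,                  # assumed
    "battery charge/discharge": 0.85,           # assumed round trip
    "motor and drivetrain": 0.90,               # assumed
}

# Efficiencies in series multiply: each stage passes on only its
# fraction of the energy handed to it by the stage before.
ev_fuel_to_wheel = 1.0
for stage, eff in ev_stages.items():
    ev_fuel_to_wheel *= eff

print(f"gasoline car, fuel to wheel:  {ice_tank_to_wheel:.0%}")
print(f"electric chain, fuel to wheel: {ev_fuel_to_wheel:.0%}")
```

With these particular assumed numbers the electric chain comes out somewhat below the gasoline figure; different assumptions would shift the comparison, which is exactly why the stage-by-stage accounting matters.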

This isn’t something you’ll see discussed very often in e-car websites and sales flyers. It’s even less likely that you’ll find any mention there of the second factor that needs to be discussed, which is the energy cost of manufacture. An automobile, petroleum-powered or electric, is a very complicated piece of hardware, and every part of it comes into being through a process of manufacture that starts at an assortment of mines, oil wells, and the like, and proceeds through refineries, factories, warehouses, and assembly plants, linked together by long supply chains via train, truck or ship. All this costs energy. Working out the exact energy cost per car would be a huge project, since it would involve tracking the energy used to produce and distribute every last screw, drop of solvent, etc., but it’s probably safe to say that a large fraction of the total energy used in a car’s lifespan is used up before the car reaches the dealer. Electric cars are as subject to this rule as petroleum-powered ones.

The energy cost of manufacture has generally been downplayed in discussions of energy issues, where it hasn’t been banished altogether to whichever corner of the outer darkness it is that provides a home for unwanted facts. (I’ve long suspected that this is not too far from “Away,” the place where pollution goes in the parallel universe that cornucopians apparently inhabit.) Promoters of the more grandiose end of alternative-energy projects – the solar power satellites and Nevada-sized algae farms that crop up so regularly when people are trying to ignore the reality of ecological limits – are particularly prone to brush aside the energy cost of manufacture with high-grade handwaving, but the same sort of evasion pervades nearly all thinking about energy these days. I’ve mentioned before that three centuries of cheap abundant fossil fuel energy have imposed lasting distortions on the modern mind; this is an example.

Still, factor in the energy cost of manufacture, and there actually is an answer to the question we’ve just been considering. If you really feel you have to have a car, what kind involves the smallest carbon footprint and the least overall energy use? A used one.

I suppose it’s just possible that one or two of the readers of this blog will remember a strange and politically edgy comic strip from the Sixties named Odd Bodkins. The rest of you will just have to forgive a bit of relevant reminiscence here. Somewhere between an encounter with the dreaded Were-Chicken of Petaluma and a journey to Mars with Five Dollar Bill, I think it was, the Norton-riding main character, Hugh, and his sidekick Fred the Bird had a run-in with General Injuns – the resemblance to the name of a certain large American automotive corporation was not accidental. I forget what it was that inspired Fred the Bird to shout “Buy a used car!” but the General’s response – “BLASPHEMY!!!” – was memorably rendered, and will probably be duplicated in a good many of the responses to this week’s blog. Most people in the industrial world nowadays are so used to thinking of the best option as new and shiny by definition, that the homely option of picking up a cheap used car as a way of saving energy is likely to offend them at a cellular level.

Still, the energy cost of manufacture needs to be taken into account. If you buy a used car – let’s say, for the sake of argument, a ten-year-old compact with decent gas mileage – instead of a new electric car, you’ve just salvaged the energy cost of manufacture that went into the used car, most of which would otherwise have been wasted, and saved all the energy that would have been spent to produce, ship, and assemble every part of the new car. Since it’s a ten-year-old compact rather than a brand new e-car, furthermore, you’re not going to be tempted to drive it all over the place to show everyone how ecologically conscious you are; in fact, you may just be embarrassed enough to leave it in your driveway when you don’t actually need it, thus saving another good-sized chunk of energy. Finally, of course, the price difference between a brand new Nissan Leaf and a ten-year-old compact will buy you a solar water heating system, installation included, with enough left over to completely weatherize an average American home. It’s a win-win situation for everything but your ego.
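The trade-off just described can be put in back-of-the-envelope form. Every number in the sketch below is an assumption made up for illustration – the embodied energy of a car, the miles driven, and the per-mile energy figures are all placeholders, not data – but the structure of the comparison holds: the used car’s manufacturing energy is already sunk, while the new car’s has yet to be spent.

```python
# Back-of-the-envelope sketch of the used-versus-new comparison.
# Every figure below is an assumed placeholder, not measured data.

embodied_new_car_kwh = 30_000     # assumed energy to manufacture a new car
miles_per_year = 8_000            # assumed annual driving
years_kept = 10                   # assumed period of ownership

used_compact_kwh_per_mile = 1.2   # assumed, for an ordinary gasoline compact
new_car_kwh_per_mile = 1.0        # assumed, counting upstream losses

miles = miles_per_year * years_kept

# The used car's manufacturing energy was spent long ago; buying it
# salvages that energy, so only operating energy counts against it.
used_total = used_compact_kwh_per_mile * miles
new_total = embodied_new_car_kwh + new_car_kwh_per_mile * miles

print(f"used compact over {years_kept} years: {used_total:,.0f} kWh")
print(f"new car over {years_kept} years:      {new_total:,.0f} kWh")
```

Under these assumptions the used car comes out ahead even though it uses more energy per mile, because the new car has to pay off its manufacturing energy before its per-mile advantage means anything.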

The same principle can be applied much more broadly. Very few people, for example, actually need a new computer. I’ve never owned one; I need a computer to make my living – publishers these days require book manuscripts to be submitted electronically – but I get my computers used, free or at a tiny fraction of their original price, and run them until they drop. One consequence is that I’ve salvaged the energy used in manufacturing the old computer, rather than burdening the planet with the energy cost of manufacturing a new one; another is that I’m keeping a small but measurable amount of toxic e-waste out of the waste stream; still another, of course, is that I save quite a bit of money that can then be directed to other purposes, such as insulation and garden tools.

Most Americans buy most of the things they use new, and dump a great many perfectly useful items into the trash; the more conscientious package them up and donate them to thrift stores, which is at least a step in the right direction. As a society, we have been able to afford this fixation and its attendant costs – new houses, new cars, new computers, new everything – because we’ve been surfing a tidal wave of cheap abundant fossil fuel energy. As we get further into the territory on the far side of peak oil, and as peak coal and peak natural gas come within sight, that state of affairs is rapidly coming to an end. One option, as I suggested in last week’s post, is to plunge into the emerging reality of scarcity industrialism, which centers on an increasingly savage competition for access to a shrinking pool of new and shiny things produced by what’s left of the world’s fossil fuel stocks.

A saner alternative, though, is to move directly into the stage that will follow scarcity industrialism – the stage of salvage economics. That’s what I’ve been discussing here, under a less threatening label. Right now, while the tidal wave of cheap energy has not yet receded very far, the beachscape of industrial society is still littered with the kind of high-quality salvage our descendants will dream of finding, and the only thing that has to be overcome in order to access most of it is the bit of currently fashionable arrogance that relegates used goods to the poor.

Now of course that’s not a small thing. One of the reasons that Thoreau’s concept of voluntary poverty got rebranded “voluntary simplicity,” and repackaged as a set of fashionable lifestyle choices that imitate authentic simplicity at a premium price, is the stark panic felt by so many middle class Americans at the thought of being mistaken for someone who’s actually poor. Those of my readers who decide that the advantages of voluntary poverty are worth pursuing are going to have to confront that panic, if they haven’t done so already. Like all supposedly classless societies, America makes up for its lack of formal caste barriers by raising caste prejudice to a fine art; the cheap shots at small town America mentioned toward the beginning of this blog are an expression of that, of course, and so is the peer pressure that keeps most Americans from doing the sensible thing, and buying cheap and sturdy used products in place of increasingly overpriced and slipshod new ones.

We are all going to be poor in the decades and centuries to come. Yes, I’m including today’s rich in that; the stark folly that leads today’s privileged classes to think they can prosper while gutting the society that alone guarantees them their wealth and status is nothing new, and will bring about the usual consequences in due time. Voluntarily embracing poverty in advance may seem like a strange tactic to take, at a time when a great many people will be clinging to every scrap of claim to the fading wealth of the industrial age, but it has certain important advantages. First, it offers a chance to get competent at getting by on less before sheer necessity forces the issue; second, it sidesteps the rising spiral of struggle that’s waiting for all those who commit themselves to holding on to an industrial-age standard of living; third, as I’ve already pointed out, buying cheap used items frees up money that can then be applied to something more useful.

It’s probably going to be necessary here to insert a response to what used to be the standard objection to the piece of advice I’ve just offered. No, buying used goods instead of new ones isn’t going to put any significant number of Americans out of work. Very little is actually manufactured in America these days, and most of what is, is produced and sold by conglomerates that pump money out of American communities and into the black hole of the financial economy. Nearly all used-goods stores, by contrast, are locally owned and circulate their earnings back into the community, where they generate jobs by way of the multiplier effect. The calculations would be fiendishly difficult, and you won’t find a mainstream economist willing to touch the project with a ten-foot pole, but I suspect that when the differences just listed are taken into account, buying used goods actually yields a larger number of jobs than buying new ones – and while thrift store clerks don’t make as much as corporate office fauna, to be sure, I have to admit to a suspicion that the former contribute a good deal more to the world as a whole than the latter.
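The multiplier effect mentioned above can at least be illustrated with a toy model, even if the real calculation would be fiendishly difficult: each dollar spent at a local business is partly re-spent locally, and that fraction is re-spent again, forming a geometric series. The recirculation rates here are invented for illustration only:

```python
# Toy model of the local economic multiplier. A dollar spent at a business
# that recirculates a fraction r of its revenue locally generates
# 1 + r + r^2 + ... = 1 / (1 - r) dollars of total local activity.
# The recirculation rates below are illustrative assumptions, not data.

def local_multiplier(recirculation_rate: float) -> float:
    """Total local spending generated per dollar, summed as a geometric series."""
    if not 0.0 <= recirculation_rate < 1.0:
        raise ValueError("recirculation rate must be in [0, 1)")
    return 1.0 / (1.0 - recirculation_rate)

# Hypothetical comparison: a locally owned thrift store that re-spends
# 45 cents of each dollar locally, versus a national chain that
# re-spends only 15 cents before the rest leaves the community.
print(round(local_multiplier(0.45), 2))  # ≈ 1.82
print(round(local_multiplier(0.15), 2))  # ≈ 1.18
```

On these made-up figures, a dollar at the locally owned store supports roughly half again as much local economic activity as a dollar at the chain – which is the shape, if not the scale, of the argument in the paragraph above.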

For the time being, at least, the office fauna and their corporate sponsors are likely to continue to thrive after a fashion, lumbering through the jungles of deindustrializing America like so many dinosaurs, and the thrift store clerks and their customers will play the part of smart little mammals scurrying around in the underbrush. Still, like the mammals, those who opt out of scarcity industrialism to embrace the first stirrings of the salvage economies of the future will have certain advantages not available to their brontosaurian neighbors. One of them, as already suggested, will be a certain amount of spare room in the household budget, which can then be turned to other projects, or used to free up a family member to work in the household economy, or both.

Another will be the chance to learn skills that could well become income sources in the not too distant future; as I’ve suggested more than once here, salvage trades – that is, anything that involves taking the leftovers of industrial civilization and turning them into something that people need or want – will likely be among the major growth industries of the next century or two, and the ground floor is open for business right now. Still, the advantage that comes to mind just at the moment is the one suggested by the holiday fireworks I mentioned toward the beginning of this post. Not uncommonly in history, people face a choice between being comfortable and dependent, on the one hand, and poor and free on the other. It’s been a particularly important theme in American history, driving phenomena as different as the settling of the Appalachians and the counterculture of the Sixties, and I’ve come to think that it’s going to become a live issue again in the decades ahead of us.

In time to come, those who cling to the narrowing circle of scarcity industrialism will likely discover that most of the freedoms that remain to them are going to have to be handed over as part of the cost of admission; those who choose otherwise – and there will be a range of other options, though you won’t learn that from the mainstream media – will have to give up a great many expectations and privileges that are standard issue in the industrial world just now in order to preserve some degree of autonomy and individual choice. That’s the way the future looks to me, at least; if I’m right, the simple act of salvaging energy by buying used goods instead of new ones – a step that Ben Franklin would have appreciated, interestingly enough – might just turn out to be a useful step in the direction of the ideals that some of us, at least, were celebrating a few nights ago. We’ll talk more about this in the weeks ahead.