Thanks to Bud Nye, for his permission to publish this original essay at Wit's End. It follows a brief autobiography he composed:
Bud Nye, with an undergraduate degree in ecology and a master's degree in counseling, worked as a psychiatric nurse for 15 years and taught high school chemistry and physics for eight years before retiring. He has had a passionate, lifelong interest in ecological collapse issues and helped start an emotion-focused extinction support group (EF ESG) in Tacoma, Washington three years ago, which meets monthly. More recently, he started an Attachment Support Group for Singles and Couples (http://www.meetup.com/Attachment-Support-Group-For-Singles-and-Couples/). To date, Guy McPherson has posted ten of his essays at the Nature Bats Last website. Readers may feel free to contact Bud at email@example.com with any questions regarding these groups or other issues.
(I adapted the first parts of this essay from Foragers, Farmers, and Fossil Fuels, How Human Values Evolve by Ian Morris, 2015, mainly pp. 142-151. The section related to hope comes from reading Emotion-Focused Therapy for Depression by Les Greenberg and Jeanne Watson, 2006. For readers new to the subject, NTHE refers to Near Term Human Extinction.)
NTHE: How We Got Here, Love, and Hope
~ Bud Nye
In his 2015 book Foragers, Farmers, and Fossil Fuels, How Human Values Evolve, Ian Morris suggests that our values change in ways similar to the ways our genes change: through back-and-forth interactions between moral systems and the social, intellectual, and physical environment, combined with external shocks. Thus, just as the race between foxes and rabbits plays out through millions of little biological competitions of sex, chasing, and eating, with small statistical shifts in the odds producing massive changes in the animals themselves across thousands of generations, so too the race between values and environments plays out through billions of little cultural competitions as individuals decide what they consider the best thing to do in various social situations. Over time, small statistical shifts in the odds produce massive changes in cultures, but in recent history massive changes occur in mere decades, rather than over millennia. I agree with Morris on this.
The transition from foraging to farming
Morris also suggests that the most important consequences of the domestication of plants and animals involved the increase in energy available to humans, even though at the cost of requiring humans to work much harder. Just as most species usually do, humans then turned a lot of this extra energy into more humans. Mounting population pressure increasingly rewarded more stratified organizations leading stratified societies to outcompete and replace less stratified ones. In these new kinds of societies, people who interpreted fairness, justice, and so on to mean that political, economic, and gender hierarchies had good effects, and that settling disputes by violence (as had occurred so often in earlier foraging societies) had bad effects unless a god-like ruler said otherwise, flourished more than those who did not.
At this point it makes sense to provide some important comparisons. Foraging societies had a violence-related death rate of at least 10%, and often as high as 25%. Arguments of all kinds, especially arguments over women, drove foraging men to violence more often than they drive farmers or fossil-fuel users. Foragers lived on as little as 4,000 kilocalories (kcal) per day in the tropics to cover a person’s basic requirements for food, tools, cooking, fuel, a little clothing, and simple shelter. Nearer the poles they may have consumed twice as much energy for their heating, housing, and thicker clothing. “On the whole, foragers keep their groups small enough to survive on wild resources [in general fewer than about 50] not by wisely maintaining stable populations below the carrying capacity of their land, but by going through boom-and-bust cycles of rapid population growth and starvation.”
What about farmers? “A sixfold increase [over foragers] in energy capture per person, to about 30,000 kcal/person/day, appears to represent the outer limits of what could be done in a purely organic economy.” Though a number of foraging societies developed hierarchical social structures (Japan’s Jomon culture and the Pacific Northwest’s Kwakiutl illustrate some of the richest), farming societies prospered much more than foragers and also became much more hierarchical. While foraging societies can have quite an unequal wealth distribution, what happens in farming societies dwarfs it. In comparison with foragers, farming societies seem to have shifted toward forced labor because they had to: neither kinship nor the market could generate the labor needed to build the ships, harbors, roads, temples, and monuments without which their huge populations could not have fed themselves or maintained their societies. “Forced labor, like patriarchy [and gender inequality], became functionally necessary to farming societies that generated more than 10,000 kcal/person/day.” Meanwhile, farming could only work if rates of violence fell. With governments powerful enough to intimidate their unruly subjects into living peacefully, farmers had violent death rates closer to five percent, and sometimes much lower, in comparison with foragers’ 10 to 25 percent.
“Energy capture per capita in the most industrialized Western economies grew sevenfold, from roughly 38,000 kilocalories per person per day around 1800 to 230,000 by the 1970s. The age of energy abundance had begun.” When we add up all the casualties in wars, genocides, state-induced famines, and murders, the 100 to 200 million people who died violently between 1900 and 2000 represent just 1 to 2 percent of the 10 billion people who lived. The fossil-fuel twentieth century proved ten times safer than the world of foragers, and two or three times safer than that of farmers. Since 1989 the global rate of violent death has sunk to just 0.7 percent.
Marshall Sahlins, in his classic essay “The Original Affluent Society,” posed the obvious question: Why did people choose to exchange the freedom and leisure of foraging for the bondage and drudgery of farming? The biologist and geographer Jared Diamond once labeled this “the worst mistake in the history of the human race.” The historian Yuval Noah Harari has recently gone further still, calling the agricultural revolution “history’s biggest fraud.” The evolutionists Peter Richerson, Robert Boyd, and Robert Bettinger, however, suggest that we might do better to frame the question in an entirely different way. They suggest that we should ask, “Was agriculture impossible during the Pleistocene but mandatory during the Holocene?” This, Morris suspects, serves as the most instructive way to pose the question: the shift from foraging to farming did not occur in an inevitable way—nothing involving humans ever does—but because of the energy issues involved, the probabilities stacked so heavily in farming’s favor that the likelihood of it not happening became vanishingly small. (We can make exactly the same argument concerning global warming; economic, ecological, and nuclear collapse; and NTHE, all based on various aspects of energy capture and availability to humans and other species. The probabilities now stack so heavily in NTHE’s favor that the likelihood of it not happening has become vanishingly small.)
One of the main reasons that people began farming involved a massive external shock in the shape of climate change. Earth’s path around the sun shifts constantly, and after 14,000 BC temperatures began rising, although inconsistently, as small tilts and wobbles in the planet’s orbit produced abrupt bursts of warming or cooling. By 12,700 BC, temperatures got close to modern levels, and by some calculations, the mercury rose by 5°F in the space of a single thirty-year generation. Glaciers melted, and great low-lying plains—including what we now call the Persian Gulf and the Black Sea—became submerged. Every few centuries of warm, wet weather, however, got followed by several more of cold and ice, and around 10,800 BC, a genuine mini-ice age (known to specialists as the Younger Dryas) set in, plunging the world back into glaciation for twelve centuries. When it ended, though, the world quickly (by geological standards—the process took another two thousand years) became even hotter than we have become used to today. Plenty of climate fluctuations have occurred since 9600 BC, but none has even remotely resembled the Younger Dryas. For nearly twelve thousand years, we have lived in what the archaeologist Brian Fagan calls a “long summer.”
This long summer occurred as a necessary condition for the invention of farming, but not as a sufficient condition. For sufficiency, a second condition had to occur: us. Warm, wet interruptions of the Ice Age had occurred around 135,000 years ago, 240,000 years ago, and 320,000 years ago (“interstadials”), long before fully modern Homo sapiens came on the scene, but none had led to farming. Instead, each had set off much the same boom-and-bust pattern: as the world warmed up, plants reacted to the increase in solar energy by multiplying madly; animals then reacted to the abundance of plants by eating them, and multiplying too; and pre-modern species of humans reacted to having so many plants and other animals surrounding them by eating everything—with predictable results in population growth, which inevitably occurs with increased food availability. But when—as always happened—the soaring numbers of each species of plant or animal ran out of the resources they fed on, populations crashed. (This will almost certainly soon happen yet again for humans, now on a global scale unlike anything that has occurred in our history as a species.)
In the long summer, this did not happen. At the coldest point in the last ice age, twenty thousand years ago, Earth had only about half a million people; ten thousand years later (in 8000 BC), six million; and now, another ten thousand years on, over seven billion. The combination of the long summer and modern humans, which made farming as close to inevitable as anything could occur in history, broke the boom-bust demographic cycle—temporarily.
Then as now it worked this way: while global warming affected every part of the planet, it affected some parts more than others. In a zone running from China to the Mediterranean in the Old World and from Peru to Mexico in the New—which, in an earlier book, Morris labeled the “Lucky Latitudes”—climate and ecology had conspired to favor the evolution of large-grained grasses (such as wild wheat, barley, and rice) and big, meaty animals (such as wild sheep, cows, and pigs). Hunting and gathering paid off better here than anywhere else on Earth, and population boomed.
In some parts of the Lucky Latitudes (particularly the Jordan Valley), the pickings got so good that foraging bands could settle in semi-permanent villages, feeding almost (or sometimes completely) year-round from the wild foods that lay within reach of a single favored spot. Modern humans do not stand alone in having the ability to change their mobility patterns in response to the abundance or scarcity of food, but what happened next could have come about only once animals as brainy as us had evolved. As people increasingly stayed put, exploiting the plants and animals around their villages more intensively, and cultivating and tending them selectively, humans unconsciously (and very slowly) exerted selective pressures that modified their food sources’ genetic structures.
This process of domestication happened first in the Lucky Latitudes, not because people there were cleverer or more energetic than people in (say) Siberia or the Sahara, but because the Lucky Latitudes had by far the densest concentrations of potentially domesticable plants and animals on Earth. Human beings lived much the same everywhere on the planet, and so, as we might expect, domestication happened first in the places where it could occur most easily—an energy issue.
Jared Diamond makes the point powerfully in his outstanding book Guns, Germs, and Steel. The world, Diamond observes, contains roughly 200,000 species of plants, but humans can eat only about two thousand of these, and only about two hundred have much genetic potential for domestication. Of the fifty-six domesticable plants with seeds weighing at least 10 milligrams, the wild ancestors of fifty originally grew in the Lucky Latitudes, and just six in the whole of the rest of the planet. Of the fourteen species of mammals weighing over a hundred pounds that humans domesticated before twentieth-century science kicked in, nine occurred as natives of the Lucky Latitudes.
No surprise exists, then, that domestication began in the Lucky Latitudes, nor that within the Lucky Latitudes, it appeared first in the region of Southwest Asia that archaeologists call the Hilly Flanks, which had the densest concentrations of potential domesticates of all. The wild ancestors of cattle, sheep, goats, wheat, barley, and rye all evolved here. The first signs of this process (the evolution of unnaturally large seeds and animals, which archaeologists usually call cultivation) show up in the Hilly Flanks between 9500 and 9000 BC, and we see evidence of full-blown domestication by 7500 BC.
What we now call China had high concentrations of domesticable plants and animals too, but not as high as those in the Hilly Flanks. Between the Yellow and Yangzi Rivers, humans cultivated rice by 7500 BC and domesticated it by 5500 BC. Millet and pigs followed over the next millennium. In Pakistan, we cultivated barley, wheat, sheep, and goats and then domesticated them on roughly the same schedule. We cultivated squash, peanuts, and teosinte in Mexico by 6500 BC and domesticated them by 3250 BC, and cultivated quinoa, llamas, and alpacas in Peru by 6500 BC and domesticated them by 2750 BC. The fit between the density of potential domesticates and the date at which domestication began remains almost perfect.
This fit makes it highly likely that Sahlins’ question misses the point of why people chose the bondage and drudgery of farming over the freedom and leisure of foraging, while Boyd et al.’s rephrasing—asking whether farming remained impossible before 9600 BC but mandatory after—hits it squarely. As the first farmers’ families grew, their landscapes filled up. As their primitive affluent societies got hungrier and hungrier, they could have looked their children in the eye and told them to starve rather than to work harder at cultivating plants and animals. For all we know, some foragers in the Jordan Valley ten thousand years ago did just this. The problem, though, involved the fact that they did not make a one-time choice. Tens of thousands of other people asked the same question, and each family had to revisit the decision of whether to intensify or go hungry multiple times every year. Most important of all, each time one family chose to work harder and intensify its management of plants and animals, the payoffs from sticking with the old ways declined a little further for everyone else. Every time cultivators started thinking of the plants and animals on which they lavished care and attention as their personal gardens and flocks, not part of a common stock, hunting and gathering would become that much more difficult for those who stuck to it. Foragers who clung stubbornly and/or heroically to the old ways became doomed because the energy-related odds kept tilting against them.
In reality, people would rarely, if ever, have confronted the choice quite so starkly as Sahlins imagined. A farmer who left his plow in the Jordan Valley around 6000 BC and started walking would not cross a sharp line into foragers’ territory. Rather, he would start to encounter people who farmed a little less intensively than he did (maybe hoeing their fields instead of plowing and manuring), and then people who farmed less intensively still (maybe burning patches of forest, cultivating them till the weeds grew back, then moving on), and eventually people who relied entirely on hunting and gathering. Ideas and people drifted back and forth across broad, fuzzy contact zones.
When people realized that neighbors with more intensive practices killed the wild plants and chased off the animals that their own foraging lifestyles depended on, they could fight these vandals, run away, or join the crowd and intensify their own cultivation. (See my essay “A Tragic View of Human Destiny” here: http://guymcpherson.com/2014/09/a-tragic-view-of-human-destiny/) Instead of picking farming over foraging, people really only decided to spend a little less time gathering and hunting and a little more time gardening and herding. Later they might have to decide whether to start weeding, then plowing, and then manuring, but this occurred as a series of baby steps rather than a once-and-for-all great leap from the original affluent society to backbreaking toil and chronic illness.
On the whole, across hundreds of years and thousands of miles, for energy-related reasons those who intensified also multiplied; those who clung to their old ways dwindled. In the process, the agricultural “frontier” inevitably crawled forward. No one “chose” hierarchy and working longer hours; these things crept up on them very slowly over hundreds of years in many different places—driven ultimately by fundamental energy capture principles. We have no one to blame.
The great prehistoric exceptions to this pattern—the affluent foragers of Jomon Japan and the Baltic shores—seem to prove the rule. Farming advanced swiftly across the plains of Central Europe and Northeast Asia until its frontier came within fifty miles of the Baltic coast (around 4200 BC) and to the shores of Japan (around 2600 BC); but at both these points, it stopped in its tracks for more than a thousand years. Japan and the Baltic boasted wild resources of such richness that foragers had little to gain from working harder and cultivating plants and animals, and if horticulturalists tried to force their way into these hunter-gatherer paradises, disrupting the abundance with farms and fences, they found themselves outnumbered by natives who knew how to fight. Even in these extraordinary locations, however, the wave of agricultural advance did eventually resume, until farmers had taken over every place on Earth where agriculture could profitably occur. Hence Morris’ conclusion: the shift from capturing energy by foraging to capturing it by farming did not occur inevitably, but once the world had warmed up and modern humans had evolved, it became as close to inevitable as anything in history could—just as our demise has now become, driven by fundamentally the same, energy-related principles based on fossil fuel use.
NTHE, love, and hope
Guy McPherson frequently concludes his talks regarding the high probability of NTHE with the idea that “Only love remains.” Of course we can, and many of us surely will, treat each other in many horrifically hurtful ways as economic, social, ecological, global warming and nuclear collapse continue to unfold. Even so, insofar as possible I strongly prefer the idea of people supporting each other emotionally and socially as we die—and working seriously to help myself and others to do that.
Related to all of this, with its depression-inducing tendencies for most people, some words regarding hope—not hope of avoiding our self-constructed, self-annihilation trap!—seem important. Demoralization plays a major role in depression, and the absence of hope occurs as a symptom of depression; this hopelessness works as a major obstacle to engaging in treatment. Cultivating hope then occurs as a major issue in treating depression, especially for some people. Helpers therefore must both evoke and offer hope, as well as remind people that sometimes, when things seem bleakest, people with depression do get better. For example, at 70 I have survived three episodes of major depression, during one of which I came very close to killing myself. I have now remained depression-free for about the past 20 years despite many major disappointments, very stressful life changes, and deeply accepting the processes related to, and implications of, the near certainty of NTHE. One’s ability to lend hope regarding depression comes first from confidence that, with effort, depression can reverse. This confidence can come from the experience of seeing other people or oneself go through that process. We probably best communicate our confidence not in the content of what we say, but rather in the emotional tone and the conviction with which we speak.
Hope serves as an antidote to despair. Meanwhile, especially related to the high probability of NTHE, we need to use care in what, exactly, we hope for. Hope in an emotionally supportive context comes most centrally from developing trust and belief in the potency of caring others to help, especially early on in a supporting relationship. If people, early on, can experience concretely their often vague hopelessness and despair and share it with caring others, this generates healthy emotion-focused, relationship-oriented hope—not hope in changing our tragic destiny, whether our own, inevitable, personal death, or the inevitable extinction of our entire species in either the short or long term. Caring, supportive others’ ability to acknowledge the pain under one’s hopelessness activates the person’s unfulfilled yearnings for connection. This yearning then opens them to human contact and breaks their sense of isolation. Hope then emerges when others empathically understand one’s hopelessness. (For much more on empathy, see my essay “Answering Questions About Empathy For ASGs, ESGs, And Life” here: http://guymcpherson.com/2015/08/answering-questions-about-empathy-for-asgs-esgs-and-life/.) Hope for reconnecting to humanity, for others’ understanding, and most of all for having others validate one’s experience then emerges, as does a will to live.
Experiencing hope means allowing desire. Many people with depression avoid experiencing desire. If someone comes from a background of abuse, chronic disappointment, exploitation, or traumatic loss, they may experience desire as a frightening vulnerability or weakness that others will exploit or that will leave them open to others wounding them. If people have experiences in which their own difficulty controlling emotions or impulses has brought them pain, they may experience desire as an overwhelming force to avoid at all costs. Experiencing desire can feel dangerous to the core of the self. Emotion-focused hope involves allowing one again to feel and to wish.
Emotion-focused hope involves a yearning for a desired outcome related to human attachment and connection. Some writers view hope not as a positive emotion, but rather as a type of wishful thinking for relief from a negative experience. Camus captured this by suggesting that “hope is despair,” rather than an antidote to it. Sayings such as “Hope is what dreams are made of” and “false hope” capture the idea that hope can lead people astray. Many, however, including me, view emotion-focused hope as a virtue. This hope probably does not occur as a singular experience because it varies with the conditions that initiate it. Emotion-focused hope seems to sustain people in times of difficulty, and it helps greatly in the psychotherapy enterprise in creating motivation to work on overcoming a problem. In a supportive community, such as in emotion-focused extinction support groups, encouraging emotion-focused hope plays an important role in forming a working alliance. How so? This hope activates people’s sense of agency, motivates them to work toward agreed-on goals, and helps open up pathways to attain these goals. Research has shown that people characterized as having a large amount of hope, when confronted with blockages, come up with alternate pathways to goals. This indicates that hope promotes solutions to problems. We can see emotion-focused hope as a vital and life-giving principle, as captured in the saying “Where there is life, there is hope.” Studies have suggested that hope may contribute to the maintenance of healthy physiological states, including increased immune system functioning under the stress of significant loss, and we will soon all experience far more stress and loss than most of us have ever imagined, much less experienced.