An Introduction to Terror Management and an Ode to Thumbs
For the previous post in this series click here.
To briefly recap my last post in this series, I began to make the argument that during the Pleistocene (i.e., cave-person times), humans and protohumans evolved in an environment surrounded by death. As a result, death was likely viewed as more normative and as a part of, albeit the end of, life. I ended my post with something of a cliffhanger—one that may have sounded a bit like an indictment: that civilization broadly and religion more specifically helped alienate the concepts of life and death so that today, we view them as opposites.
To be clear, this alienation is not the “fault” of religion. Although it is easy to view religion as a causal factor in human behavior (and I suppose it can be when we consider things such as suicide bombings, crusades, and marriage before sex), it is not where the story begins. Religion, like all other human constructs, evolved for a reason. And that reason was to help us make sense of death.
But before we can even get to the point where we decide if it is fair to blame religion, we need to take a step back again. Humans are the only species with religion, and in this way we are special. But not special in the sense that most religions would have us believe. What makes humans special is that we are self-aware. More to the point, we are self-aware of death. And this awareness is at the root of everything.
This isn’t my idea. In The Denial of Death (winner of the 1974 Pulitzer Prize), cultural anthropologist Ernest Becker laid out an incredibly well-reasoned and diversely sourced argument for something that would eventually be known as Terror Management Theory (TMT). In my field of behavioral science, the importance of TMT depends on who you ask. On the one hand, it is an idea that exists alongside many others within the social sciences to help explain why humans and groups behave as they do. On the other hand, TMT could be considered the closest thing the social sciences have to a unifying theory: what physicists might refer to as a ‘theory of everything’. The idea is simple: Awareness of one’s own mortality evokes a terror so strong that it can only be assuaged by the creation of meaning. The effects of this idea are both broad and profound: Once humans have a meaning, they will create and alter their beliefs and behaviors to be consistent with this meaning, even (very much ironically) engaging in behaviors that make death more likely.
I was not aware of TMT until I was well into my graduate program at the University of Texas. And to be honest, I was less than impressed. To be fair, I had not yet read The Denial of Death, and so most of what I knew about TMT came from bits and pieces of journal articles and textbooks devoted to something else. Nevertheless, one weakness of humans to which I am not immune is the ability to have a fully formed opinion based on only fragments of information. And my opinion was that TMT was kooky. Not kooky bad, or kooky wrong, but just kooky kooky. As in silly. I thought it was silly to think that everyday human behavior was motivated by the terror of death. Sure, swerving away from oncoming traffic or that prickly feeling of walking alone at night might be related to fears of death, but TMT advocates seemed to be arguing that fear of death motivated nearly everything.
As someone whose expertise was in perceptions of human facial attractiveness, I believe my first encounter with TMT was through a paper about changing perceptions of beauty over the last 50 years. For example, people with naturally lighter skin (descendants of Europeans) were considered more attractive in the 1970s when they appeared to have darker skin (e.g., a “healthy California tan”) than without. By the mid-2000s, the reverse was true: a lack of a tan denoted greater attractiveness. Without going into much detail in this post (see the section on beauty), I would have explained that result as being caused by changes in common perceptual experience. For instance, the it-girls of the 1970s included Charlie’s Angels, whereas the 2000s gave us countless movies of pale-looking vampires and wizards. Because we tend to find what is familiar attractive, the finding regarding changing perceptions about tans made sense.
From a TMT perspective, however, tans were all about an unconscious association with death. In the 1970s, few people understood the connection between tanning and poor health. Although tanning was considered healthy then, today we understand that it comes with a significant risk of skin cancer, especially among those of us who are fair-skinned. As such, TMT suggests that we were not so much attracted to the goth children of the night portrayed in Twilight as we were repelled by beachgoers and tanning salon patrons. Likewise, in the 1970s, people were not so much attracted to tanned celebrities as they were repelled by pale zombie-like walking corpses.
It may sound like I am exaggerating, but in fact, the word “corpse” shows up a surprising amount in this literature. And this is why I thought it was kooky. Although there are certainly exceptions to Ockham’s Razor (that the simplest explanation is usually true) in science, it is a policy that I try to apply to almost everything I study. Nature is complex, but causes are often parsimonious. For me, it was far simpler to imagine that white people used to be judged as more attractive when tanned than when pasty because there were far more tanned people around 50 years ago than there are today. The idea that people are less attracted to tanned people today because a part of our brains says “CANCER!” simply wasn’t palatable.
But just because something isn’t palatable doesn’t make it wrong. (Except for quinoa. Quinoa is wrong.)
Terror Management Theory started to make more sense for me when my own research kept leading me back to the idea that death was in fact at the core of people’s responses to beauty (and perhaps a whole lot more). Let’s go back to cave-person times again.
Recall that the Pleistocene began roughly 2.5 million years ago. Not long after that (well… 500,000 years is a long time, but not on a geological level), our primate relatives developed what might be the most important physiological adaptation in our species’ history. Give it up for the opposable thumb.
The Thumb
The thumb is so much more than the most important finger. More than just about any other appendage, the thumb enables interaction with the modern human world. Although I could probably type this paragraph at 80% effectiveness without my thumbs on the laptop I am using, once I was done, I wouldn’t even be able to pick up my phone, let alone use it. I wouldn’t be able to open jars (not that I often have that ability even with thumbs), nor would I be able to use even the simplest of tools.
Now think about what that implies for us as a species. It might initially seem as though we’ve created a world that centers around the use of the thumb, but in fact the opposite is true: The appearance of the thumb enabled the creation of the modern world. This might seem silly on some level (perhaps especially if you are like me and you are now picturing human-sized thumbs building the pyramids, painting the Mona Lisa, or discovering penicillin). But it is also necessarily true based on the timing of events. The thumb predates civilization by approximately 2 million years.
Why should it be the thumb that gets the credit? Because of what it must have demanded of our future evolution. The appearance of the thumb allowed greater interaction with the environment, which necessitated greater developments to the parts of our brain that engage not only fine-motor movement but also decision making (e.g., “Okay, I can grab this stick, now what am I going to do with it?”). Both of these neural activities take place in the frontal cortex, the portion of the brain that you think of—and think with—when you think of the brain. It is also comparatively overdeveloped in humans. Other animals, including those who are most closely related to us such as chimpanzees and bonobos, have smaller frontal cortices with less computing power.
A Video Game Analogy
My oldest son is a high school student with a deep love of dinosaurs. Whereas most children move from a dinosaur phase to a horse phase or a Pokemon phase or a K-pop phase, he stuck with his first obsession. Because he’s a high school student, he’s old enough to have extended his love of dinosaurs to the story of the dinosaurs. In other words, he’s interested in evolutionary biology. I mention him here because he’s taught me a new and simple way of thinking about evolutionary changes. (Although I’m an evolutionary scientist, my interest in humans is limited to, mostly, the last 2.5 million years. As a dinosaur guy, his epoch of interest is 100 times longer and includes many more dramatic changes.) In short, he has taught me to use the metaphor of a role-playing video game.
In these games, players are given tasks to complete (slaying dragons, making potions, etc.) and rewarded for completing those tasks with points of some kind. These points can then be spent on “leveling up” one’s character. That is, players can make their characters stronger or faster or tougher to prepare them for more challenging parts of the game. But the key is the word “or”. Unless a player puts in thousands of hours of gameplay (and in fairness, some do, so perhaps this is where the metaphor breaks down), they can only advance a few skills while others remain stagnant.
This is essentially how evolution works. No species is on its way to being the smartest AND the strongest AND the fastest. Within a varied species, natural selection favors the individuals that best fit their environment, allowing them to survive and thrive. The choices that our ancestors made, with their thumbs, forever changed us.
Now think about what having thumbs cost us. Although our thumbs allowed us to constantly level up our intelligence (spear created +1; fire controlled +3; wheel invented +5), we did so at the cost of our physical size and strength. Our ancestors who invested in strength became orangutans; those who invested in dexterity became lemurs; those who invested in crypto became douchebags.
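The leveling-up analogy can be made concrete with a toy model. The sketch below is purely illustrative (the trait names, point values, and budget are made up for this post, not drawn from any evolutionary data): a fixed budget of points must be split among traits, so pouring everything into intelligence necessarily leaves strength and speed stagnant.

```python
# Toy model of the role-playing-game analogy: a fixed budget of
# evolutionary "points" must be divided among traits, so boosting one
# trait necessarily leaves less for the others. All names and numbers
# are illustrative only.

BUDGET = 10  # total points available per "level up"


def level_up(traits, spend):
    """Return a new trait dict with the spending plan applied,
    enforcing the zero-sum budget."""
    if sum(spend.values()) > BUDGET:
        raise ValueError("cannot spend more points than the budget allows")
    leveled = dict(traits)
    for trait, points in spend.items():
        leveled[trait] = leveled.get(trait, 0) + points
    return leveled


baseline = {"strength": 5, "speed": 5, "intelligence": 5}

# A lineage that pours everything into intelligence...
protohuman = level_up(baseline, {"intelligence": 10})

# ...versus one that invests the same budget in raw strength.
great_ape = level_up(baseline, {"strength": 10})

print(protohuman)  # intelligence climbs; strength and speed stagnate
print(great_ape)   # the mirror image: stronger, but no smarter
```

The point of the sketch is simply the “or” from the analogy: because the budget is shared, every point spent on one trait is a point unavailable to the rest.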
Those early protohumans were smart, sure. But they were also weak and slow and hairless compared with our cousins. As time passed, our defenselessness would become a real threat to our existence. But thanks to our intelligence (which was thanks to our thumbs), we solved the problem: we would live in relatively large groups, where saber-toothed tigers, Tasmanian lions, and dire wolves would be more reluctant to attack.
Big deal. There has always been safety in numbers. Antelopes know that. But the protohumans, with their opposable thumbs and increasingly larger frontal cortices, were a fair shade smarter than antelopes. That intelligence would eventually lead to a stratified social structure and the development of self-awareness: two developments that would take us to the top of what we imagine as the evolutionary ladder, as well as to what might ultimately be our mutually assured destruction. But that’s for a future post. The story continues.
-The Plague Doctor
Have a question or comment about this essay? Contact the website here.
Discover Past Articles by Theme
Death
The one thing that unites all living things is also the thing that all life seeks to avoid.
Beauty
The illusion that allows humans to avoid the terror of their own mortality.
The Uncanny
The discomfort of ambiguity, especially in the context of human life.