Shame and Dr. Strangelove

One of my favorite-ever descriptions in science is a chapter in Steven Pinker’s How The Mind Works about sadness.  He likens it to the Doomsday Device from Stanley Kubrick’s classic Dr. Strangelove.  I’m not going to re-type the whole thing chapter and verse, but to summarize: in Dr. Strangelove, the Doomsday Device is a completely automated system which, in the event of a nuclear attack on Russian soil, will trigger an unstoppable chain of nuclear explosions, blanketing the planet in fallout and destroying all life on its surface.

The goal is to bypass the calculus of nuclear brinkmanship.  Ordinarily, the idea goes, the U.S. might, in an era of high tension, launch a single nuclear missile, on the theory that the Russians aren’t willing to initiate total mutually assured destruction, ending the world, just to avenge the loss of one city.  Once the Doomsday Device is activated, however, there are no such calculations; a computer will launch the moment a single nuclear explosion occurs on Russian soil.  As such, the U.S. cannot afford a first strike, since it knows that the best it can hope for is the destruction of all life on planet Earth.

But the Doomsday Device is also vulnerable to accidents, as in the movie, when a U.S. bomber’s recall device is damaged, so that it cannot be stopped from completing its mission, dropping one errant bomb which will, in effect, end the world.  The Doomsday Device’s programmers would, if they could, choose to short-circuit the device and save the planet, especially since the Device’s existence hadn’t yet been made public, but the Device is designed to prevent exactly that.

Pinker analogizes sadness to the Doomsday Device.  As an example, if a close loved one dies, you are overcome with horrible feelings, which render you unable to function normally, sometimes for an extended period of time.  Somehow, you know that this will be an outcome of this sort of event; thus, you take actions to try to prevent your loved ones from dying.  These actions run from the mundane (don’t stick a fork in the electrical socket!) to the heroic (throwing yourself in front of an oncoming car to push a child out of the way).

However, like the Doomsday Device, there might be times when it would be convenient to shortcut the damaging aftereffects.  Indeed, in the event of traumatizing sadness, it serves a great function, evolutionarily speaking, to act ahead of time as if you were going to be terribly sad if you didn’t keep the lion from eating your sister.  However, once it happens, you would be best served by being able to get on with your life.  Indeed, if we could design an optimal computer program, it might do this – act as if it had emotions to prevent bad things from happening, but once they happen, do not get bogged down grieving over them; instead, learn a lesson and move on.

However, we aren’t designed nearly so rationally, and so our game of nuclear brinkmanship with our emotions has to be played to its natural endpoint, which, in this case, is the intense grief and sorrow we suffer when bad things happen.  We cannot be programmed to ignore past experiences, and if we knew from past experience, either directly or through seeing what happened to others, that grief really wasn’t so bad after all, then we wouldn’t go to such extraordinary measures to protect our loved ones, and our evolutionary fitness would suffer as a result.

All of this is a very long prologue to saying that Ross Douthat could probably learn a little something from this analysis in his thinking about “How Shame Works.”  For background, you might read Ta-Nehisi’s post, which kicked off the conversation.

To wit, Ross writes:

Having your mother kick you out of the house if you get pregnant out of wedlock probably isn’t going to improve your life chances, but the fear that your mother might kick you out stands a good chance of deterring you from making a bad decision in the first place.

Which is, pretty much exactly, the definition of emotion as a Doomsday Device that I sketched out above.  It’s not logical: if we were programming a human being from scratch, we would program (for instance) your mother to act as if she were going to kick you out of the house if you got pregnant out of wedlock, but then, if it happened, to forgive and forget.  And you, the wayward child, would act for all the world as if you knew you would get kicked out if you did something so foolish, yet you would not take the knowledge that it didn’t happen and play it forward into other, future decisions.

But, we’re not programmed like that; instead, we’re left with the messy, often-irrational programs that Mother Nature granted us.


3 Responses to Shame and Dr. Strangelove

  1. alanogilvy says:

    Your opening phrase (“one of my favorite-ever descriptions in science”) has me thinking about what mine might be. What are some of your others?

    Dr. Strangelove is a fantastic movie — thanks for the reminder.

  2. […] on a slightly more serious note (although, really – what’s more serious than S’Mores?) alanogilvy writes “Your opening phrase (‘one of my favorite-ever descriptions in science’) has me thinking […]

  3. Solveig says:

    If I were going to re-program humans, I’d bypass shame as a motivator and go straight to the classic emotional intelligence outlined by Goleman and Skinner and others. Which is of course essentially just considering and valuing long-term benefits (really don’t want to work low-paying service jobs for the rest of my life) over short-term gratification (mmm, Mark is sooo cute, and he’s a senior!).
