What Makes Humans Human?

Little, today, is as it was.

Anatomically modern humans have existed for about 200,000 years, but only since the end of the eighteenth century has artificial lighting been widely used. Gas lamps were introduced in European cities about that time, and electric lights came into use only in the twentieth century.

In other words, for most of human history, when night fell, it fell hard. Things got really, really dark,

and people gathered under the stars, which they could actually see, in those days before nighttime light pollution,

and under those stars, they told stories.

In EVERY culture around the globe, storytelling, in the form of narrative poetry, existed LONG before the invention of writing. We know this because the earliest manuscripts we have from every culture record stories that were already ancient when they were finally written down. One of the earliest texts in English is the poem Beowulf, which reworks and retells, in much distorted form, far older stories—ones that predate the emergence of English as a distinct language. Stith Thompson, the great folklorist, did the literary world an enormous favor by compiling a massive index of the motifs of ancient folktales worldwide—his Motif-Index of Folk-Literature, companion to the tale-type catalogue known today as the Aarne-Thompson Index. Name a story motif—three wishes, talking animals, the grateful dead, cruel stepsisters, golden apples, dragons, the fairy or demon lover, the instrument that plays itself—and you will find that the motif has an ancient pedigree and was already spread about the world long before historical times.

English is a Germanic language. All ancient Germanic societies had official storytellers whose job it was to entertain people in those days before modern entertainments like television and movies and the Internet and drones with laser-guided Hellfire missiles. In ancient Denmark, the storyteller was called a skald. In Anglo-Saxon England, the storyteller was a scop (pronounced like Modern English “shop”). The scop accompanied his stories on the Anglo-Saxon harp, a kind of lyre.

Of course, the telling of stories wasn’t the only entertainment around campfires. In most cultures, people danced and chanted and sang as well, and sometimes stories were told by the dancers or singers or chanters. All this was part of acting out the stories. (Want to know where the Christian devil, with his red body and horns, comes from? Well, in ancient Europe, people worshiped an Earth Mother and her consort, a Lord of the Forest, and they told stories of the hunt. When they acted these out around campfires, they held up to their heads animal horns, or branches in the shape of horns, and that’s how they pictured their Lord of the Forest, as a therianthrope, red from the campfire, with horns. When the Christians spread north across Europe, they made the god of the Old Religion into The Adversary. Grendel’s mother, the monster from the bog in Beowulf, is a demonized version, in a Christian story, of the ancient Anglo-Saxon fertility goddess Nerthus, to whom sacrifices were made by binding people, cutting their throats, and throwing them into a bog. You can see an ancient bas-relief of the Lord of the Forest, btw, on the Gundestrup cauldron, dating from 150 to 1 BCE. See the accompanying illustration.)

But where does this storytelling urge among humans come from, and why is it universal? Storytelling takes energy. And it doesn’t produce tangible results. It doesn’t mend bones or build houses or plant crops. So, why would it survive and be found among every people on Earth from the earliest times onward?

Contemporary cognitive scientists have learned that storytelling is an essential, built-in part of the human psyche, involved in every aspect of our lives, including our dreams, memories, and beliefs about ourselves and the world. Storytelling turns out to be one of the fundamental ways in which our brains are organized to make sense of our experience. Only in very recent years have we come to understand this. We are ESSENTIALLY storytelling creatures, in the Aristotelian sense of essentially. That is, it’s our storytelling that defines us. If that sounds like an overstatement, attend to what I am about to tell you. It’s amazing, and it may make you rethink a LOT of what you think you know.

At the back of each of your eyes are retinas containing rods and cones. These take in visual information from your environment. In each retina, there is a place where the optic nerve breaks through it. This is the nerve that carries visual signals to your brain. Because of this interruption of the retinas, there is a blind spot in each where NO INFORMATION AT ALL IS AVAILABLE. If what you saw was based on what signals actually hit your retina at a given moment, you would have two big black spots in your field of vision. Instead, you see a continuous visual field. Why? Because your brain automatically fills in the missing information for you, based on what was there when your eye saccaded over it a bit earlier. In other words, your brain makes up a story about what’s there. Spend some time studying optical illusions, and you will learn that this is only one example of many ways in which you don’t see the world as it is but, rather, as the story concocted by your brain says it is.

This sort of filling in of missing pieces also happens with our memories. Scientists have discovered that at any given moment, people attend to at most about seven bits of information from their immediate environment. There’s a well-known limitation of short-term memory to about seven items, give or take two, and that’s why telephone numbers are seven digits long. So, at any given moment, you are attending to only about seven items from, potentially, billions in your environment. When you remember an event, your brain FILLS IN WHAT YOU WERE NOT ATTENDING TO AT THE TIME based on general information you’ve gathered, on its predispositions, and on general beliefs that you have about the world. In short, based on very partial information, your brain makes up and tells you a STORY about that past time, and that is what you “see” in memory in your “mind’s eye.”

So, people tend to have a LOT of false memories because the brain CONFABULATES—it makes up a complete, whole story about what was PROBABLY the case and presents that whole memory to you, with the gaps filled in, for your conscious inspection. In short, memory is very, very faulty and is based upon the storytelling functions of the brain! (And what are we except our memories? I am that boy in the Dr. Dentons, in my memory, sitting before the TV with the rabbit ears; I am that teenager in the car at the drive-in with the girl whom I never thought in a million years would actually go out with me. But I’m getting ahead of myself.)

You can also see this storytelling function of the brain at work in dreaming. Years ago, I had a dream that I was flying into the island of Cuba on a little prop plane. Through the window, I could see the island below the plane. It looked like a big, white sheet cake, floating in an emerald sea. Next to me on the airplane sat a big, red orangutan smoking a cigar.

Weird, huh? So why did I have that dream? Well, in the days preceding the dream I had read a newspaper story about Fidel Castro, the leader of Cuba, being ill; I had flown on a small prop plane; I had attended a wedding where there was a big, white sheet cake; I had been to the zoo with my grandson, where we saw an orangutan; and I had played golf with some friends, and we had smoked cigars.

The neural circuits in my brain that had recorded these bits and pieces were firing randomly in my sleeping brain, and the part of the brain that does storytelling was working hard, trying to piece these random fragments together into a coherent, unified story. That’s the most plausible current explanation of why most dreams occur. The storytelling parts of the brain are responding to random inputs and tying them together—making sense of that random input by making a plausible story out of it. This is akin to pareidolia, the process that leads people to see angels in cloud formations and pictures of Jesus on their toast.

So, those are three important reasons why the brain is set up as a storytelling device. Storytelling allows us to see a complete visual field; creates for us, from incomplete data, coherent memories; and ties together random neural firings in our brains into the wholes that we call dreams.
But that’s not all that storytelling does for us. Storytelling about the future allows us to look ahead—for example, to determine what another creature is going to do. We often play scenarios in our minds that involve possible futures. What will she say if I ask her to the prom? What will the boss say if I ask for a raise? How will that go down? In other words, storytelling provides us with a THEORY OF MIND for predicting others’ behavior.

Stories also help people to connect to one another. When we tell others a story, we literally attune to them. We actually get “on the same wavelength.” Uri Hasson, a neuroscientist at Princeton, recorded the brain activity of people during rest and while listening to a story. During rest, their responses were all over the place. While listening to the same story, even at different times and places, those people had brain responses that were in sync.

Storytelling also provides a mechanism for exploring and attempting to understand others generally. Our basic situation in life is that your mind is over there and mine is over here. We’re different, and we have to try to figure each other out—to have a theory of other people’s minds. By telling myself a story about you, I can attempt to bridge that ontological gap. Unfortunately, the stories we tell ourselves about others tend to be fairly unidimensional. You are simply this or that. I, on the other hand, am an international man of mystery. This is a tendency we need to guard against.

We also tell stories in order to influence others’ behavior—to get them to adopt the story we’re telling as their own. This is how advertising works, for example. The advertiser gets you to believe a story about how you will be sexier or smarter or prettier or more successful or of higher status if you just buy the product with the new, fresh lemony scent. And it’s not just advertisers who do this. Donald Trump sold working-class Americans a fiction about how he could strike deals that would make America great again because he was such a great businessman, one who started with nothing and made billions. The coach tells a story in which her team envisions itself as the winners of the Big Game. The woo-er tells the woo-ee the story of the great life they will have together (“Come live with me and be my love/And we shall all the pleasures prove”). And so on. Successful cult leaders, coaches, lovers, entrepreneurs, attorneys, politicians, religious leaders, marketers, etc., all share this in common: they know that persuasion is storytelling. The best of them also understand that the most successful stories, in the long run, are ones that are true, even if they are fictional.

When we tell stories, we spin possible futures—we try things on, hypothetically. And that helps us to develop ideas about who we want to be and what we want to do. Gee, if I travel down that road, I may end up in this better place.

And that observation leads to one final, supremely important function of storytelling: Who you are—your very SELF—is a story that you tell yourself about yourself and your history and your relations to others—a story with you as the main character. The stories you tell yourself about yourself become the person you are. The word person, by the way, comes from the Latin persona, for a mask worn by an actor in the Roman theatre.

So, our very idea of ourselves, of our own personal identity, is dependent upon this storytelling capacity of the human brain, which takes place, for the most part, automatically. There is even a new form of psychotherapy called cognitive narrative therapy that is all about teaching people to tell themselves more life-enhancing, affirmative stories about themselves, about who they are.

Telling yourself the right kinds of stories about yourself and others can unlock your creative potential, improve your relationships, and help you to self-create—to be the person you want to be.

So, to recapitulate, storytelling . . .

helps us to fill in the gaps so that we have coherent memories,

ties together random firings in the brain into coherent dreams,

enables us to sort and make sense of past experience,

gives us theories of what others think and how they will behave,

enables us to influence others’ behavior,

enables us to try on various futures, and

helps us to form a personal identity, a sense of who we are.

Kinda important, all that!

Storytelling, in fact, is key to being human. It’s our defining characteristic. It’s deeply embedded in our brains. It runs through every aspect of our lives. It makes us who we are.

It’s no wonder then, that people throughout history have told stories. People are made to construct stories—plausible and engaging accounts of things—the way a stapler is made to staple and a hammer is made to hammer. We are Homo relator, man the storyteller.

(BTW, the root *man, meaning “human being” in general, without a specific gender reference, is ancient. It goes all the way back to Proto-Indo-European, but there’s still good reason, today, to seek out gender-neutral alternatives, when possible, of course.)

Copyright 2015. Robert D. Shepherd. All rights reserved.

Art: Detail from the Gundestrup Cauldron. Nationalmuseet [CC BY-SA 3.0 (https://creativecommons.org/licenses/by-sa/3.0) or CC BY-SA 2.5 (https://creativecommons.org/licenses/by-sa/2.5)]

For more pieces by Bob Shepherd on the topic of Education “Reform,” go here: https://bobshepherdonline.wordpress.com/category/ed-reform/

For more pieces on the teaching of literature and writing, go here: https://bobshepherdonline.wordpress.com/category/teaching-literature-and-writing/


It’s about Time (a Catena)


A brief tour of fascinating (and lunatic) notions that philosophers (and a few poets) have had about time. 

The Mystery of Time

“What then is time? If no one asks me, I know; if I wish to explain it to one who asks, I know not.”

–St. Augustine (354–430 CE), Confessions

PART 1: What Is Time? Types of Time

Absolute or Scientific Newtonian Time

“Absolute, true and mathematical time, of itself, and from its own nature flows equably without regard to anything external, and by another name is called duration.”

–Sir Isaac Newton (1643–1727), Philosophiae naturalis principia mathematica (Mathematical Principles of Natural Philosophy)

The Specious (Nonexistent) Present

“The relation of experience to time has not been profoundly studied. Its objects are given as being of the present, but the part of time referred to by the datum is a very different thing from the conterminous of the past and future which philosophy denotes by the name Present. The present to which the datum refers is really a part of the past — a recent past — delusively given as being a time that intervenes between the past and the future. Let it be named the specious present, and let the past, that is given as being the past, be known as the obvious past. [Each of] all the notes of a bar of a song seem to the listener to be contained in the [specious] present. [Each of] all the changes of place of a meteor seem to the beholder to be contained in the [specious] present. At the instant of the termination of [each element in] such series, no part of the time measured by them seems to be [an obvious] past. Time, then, considered relatively to human apprehension, consists of four parts, viz., the obvious past, the specious present, the real present, and the future. Omitting the specious present, it consists of three . . . nonentities — the [obvious] past, which does not [really] exist, the future, which does not [yet] exist, and their conterminous, the [specious] present; the faculty from which it proceeds lies to us in the fiction of the specious present.”

–E. Robert Kelley, from The Alternative, a Study in Psychology (1882). Kelley’s concept of the specious present has been extremely influential in both Continental and Anglo-American philosophy despite the fact that Kelley was not a professional philosopher.

Subjective Time

“Oh, yeah. Hegel’s Phenomenology of Spirit. I never finished it, though I did spend about a year with it one evening.”

Experienced Time: The “Wide” Present

“In short, the practically cognized present is no knife-edge, but a saddle-back, with a certain breadth of its own on which we sit perched, and from which we look in two directions into time. The unit of composition of our perception of time is a duration, with a bow and a stern, as it were—a rearward- and a forward-looking end. It is only as parts of this duration-block that the relation or succession of one end to the other is perceived. We do not first feel one end and then feel the other after it, and forming the perception of the succession infer an interval of time between, but we seem to feel the interval of time as a whole, with its two ends embedded in it.”

–William James, “The Perception of Time,” from The Principles of Psychology, Book I

A, B, and C Series Time (Three Ways of Looking at Time)

  • The A Series: Time as Past, Present, and Future
  • The B Series: Time as Earlier, Simultaneous, and Later
  • The C Series: Time as an Ordered Relation of Events (with the direction being irrelevant)

Influential distinctions made by John Ellis McTaggart in “The Unreality of Time.” Mind 17 (1908): 456-476. The three types are much discussed by philosophers in the Anglo-American analytic tradition.

See also The Unreality of Time 2: Block Time, below

PART 2: Does Time Exist?

No, It Doesn’t: Change Is a Self-Contradictory Idea

“For this view can never predominate, that that which IS NOT exists. You must debar your thought from this way of search. . . .There is only one other description of the way remaining, namely, that what IS, is. To this way there are very many signposts: that Being has no coming-into-being . . . . Nor shall I allow you to speak or think of it as springing from not-being; for it is neither expressive nor thinkable that what-is-not is. . . . How could Being perish? How could it come into being? If it came into being, it is not; and so too if it is about-to-be at some future time. . . .For nothing else either is or shall be except Being, since Fate has tied it down to be a whole and motionless; therefore all things that mortals have established, believing in their truth, are just a name: Becoming and Perishing, Being and Not-Being, and Change of position, and alteration of bright color.”

–Parmenides of Elea (c. 475 BCE), fragment from The Way of Truth, in Ancilla to the Pre-Socratic Philosophers, ed. Kathleen Freeman

“Does the arrow move when the archer shoots it at the target? If there is a reality of space, the arrow must at all times occupy a particular position in space on its way to the target. But for an arrow to occupy a position in space that is equal to its length is precisely what is meant when one says that the arrow is at rest. Since the arrow must always occupy such a position on its trajectory which is equal to its length, the arrow must be always at rest. Therefore, motion is an illusion.”

–Zeno of Elea (c. 450 BCE), fragment from Epicheiremata (Attacks), in Ancilla to the Pre-Socratic Philosophers, ed. Kathleen Freeman

“One part of time has been [the past] and is not, while the other is going to be and is not yet [the future]. Yet time, both infinite time and any time you care to take, is made up of these. One would naturally suppose that what is made up of things which do not exist could have no share in reality.”

–Aristotle (384–322 BCE), Physics, IV, 10–14, 217b–224a

Yes, It Does: Change Is the Fundamental Reality of Our Lives

“It is not possible to step twice into the same river.”

–Heraclitus (c. 475 BCE), fragment from unnamed book, in Ancilla to the Pre-Socratic Philosophers, ed. Kathleen Freeman

[Heraclitus seems to have held this fact to be one of many indications of the essential unworthiness/irredeemability of this life; the other fragments of his writings that have survived suggest that Heraclitus was a kind of 5th-century fundamentalist preacher, upset about the moral decay around him, who viewed the world as synonymous with decay, and who wanted to point his readers, instead, toward the eternal Logos. Plato inherited this view; the Christian church inherited Plato’s. Such contemptus mundi (contempt for the world) is often, in that tradition, expressed as contempt for that which exists “in time” and is not eternal.]

“Time is nature’s way of keeping everything from happening at once.”

–Woody Allen (1935– )


No, It Doesn’t: Time is an Illusion Due to Vantage Point in an Eternal Space Time (the “Block Time” Hypothesis):

“Now Besso has departed from this strange world a little ahead of me. That means nothing, for we physicists believe the separation between past, present, and future is only an illusion, although a convincing one.”

–Albert Einstein (1879–1955), in a letter written to the family of Michele Besso, on Besso’s death

“All time is all time. It does not change. It does not lend itself to warnings or explanations. It simply is. Take it moment by moment, and you will find that we are all, as I’ve said before, bugs in amber.”

–Kurt Vonnegut, Jr. (1922–2007), who is in heaven now, Slaughterhouse-Five

Time present and time past
Are both perhaps present in time future,
And time future contained in time past.
If all time is eternally present
All time is unredeemable.

–T.S. Eliot (1888–1965), “Burnt Norton,” from Four Quartets

No, It Doesn’t: The Now as Consequence of the Blindness of the Brain to Its Own Processing of Temporal Data (the “Blind Brain” Hypothesis)

“Nothing, I think, illustrates this forced magic quite like the experiential present, the Now. Recall what we discussed earlier regarding the visual field. Although it’s true that you can never explicitly ‘see the limits of seeing’–no matter how fast you move your head–those limits are nonetheless a central structural feature of seeing. The way your visual field simply ‘runs out’ without edge or demarcation is implicit in all seeing–and, I suspect, without the benefit of any ‘visual run off’ circuits. Your field of vision simply hangs in a kind of blindness you cannot see.

“This, the Blind Brain Hypothesis suggests, is what the now is: a temporal analogue to the edgelessness of vision, an implicit structural artifact of the way our ‘temporal field’–what James called the ‘specious present’–hangs in a kind [of] temporal hyper-blindness. Time passes in experience, sure, but thanks to the information horizon of the thalamocortical system, experience itself stands still, and with nary a neural circuit to send a Christmas card to. There is time in experience, but no time of experience. The same way seeing relies on secondary systems to stitch our keyhole glimpses into a visual world, timing relies on things like narrative and long term memory to situate our present within a greater temporal context.

“Given the Blind Brain Hypothesis, you would expect the thalamocortical system to track time against a background of temporal oblivion. You would expect something like the Now. Perhaps this is why, no matter where we find ourselves on the line of history, we always stand at the beginning. Thus the paradoxical structure of sayings like, “Today is the first day of the rest of your life.” We’re not simply running on hamster wheels, we are hamster wheels, traveling lifetimes without moving at all.

“Which is to say that the Blind Brain Hypothesis offers possible theoretical purchase on the apparent absurdity of conscious existence, the way a life of differences can be crammed into a singular moment.”

–Scott Bakker, “The End of the World As We Knew It: Neuroscience and the Semantic Apocalypse”

PART 3: What Contemplation of Time Teaches Us about Living

Carpe Diem

“Such,” he said, “O King, seems to me the present life of men on Earth, in comparison with that time which to us is uncertain, as if when on a winter’s night, you sit feasting . . . and a simple sparrow should fly into the hall, and coming in at one door, instantly fly out through another. In that time in which it is indoors it is indeed not touched by the fury of winter; but yet, this smallest space of calmness being passed almost in a flash, from winter going into winter again, it is lost to our eyes.

“Something like this appears the life of man, but of what follows or what went before, we are utterly ignorant.”

–The Venerable Bede (c. 672–735), Ecclesiastical History of the English People, Book II


“Seize the day, trusting as little as possible in the future.”

–Horace (65–8 BCE), Odes 1.11

Oh, come with old Khayyam, and leave the Wise
To talk; one thing is certain, that Life flies;
One thing is certain, and the Rest is Lies;
The Flower that once has blown for ever dies.

–Omar Khayyám (1048–1131), “Rubáiyát,” trans. Edward FitzGerald

Gather ye rosebuds while ye may
Old time is still a-flying:
And this same flower that smiles to-day
To-morrow will be dying.

–Robert Herrick (1591–1674), “To the Virgins, to Make Much of Time”

But at my back I alwaies hear
Times winged Charriot hurrying near:
And yonder all before us lye
Desarts of vast Eternity.
Thy Beauty shall no more be found;
Nor, in thy marble Vault, shall sound
My ecchoing Song: then Worms shall try
That long preserv’d Virginity:
And your quaint Honour turn to dust;
And into ashes all my Lust.
The Grave’s a fine and private place,
But none I think do there embrace.
Now therefore, while the youthful hew
Sits on thy skin like morning glew,
And while thy willing Soul transpires
At every pore with instant Fires,
Now let us sport us while we may;
And now, like am’rous birds of prey,
Rather at once our Time devour,
Than languish in his slow-chapt pow’r.
Let us roll all our Strength, and all
Our sweetness, up into one Ball:
And tear our Pleasures with rough strife,
Thorough the Iron gates of Life.
Thus, though we cannot make our Sun
Stand still, yet we will make him run.

–Andrew Marvell (1621–1678), “To His Coy Mistress”

“Get it while you can.
Don’t you turn your back on love.”

–The American philosopher Janis Joplin (1943–1970)

Give Up/It’s All Futile Anyway

“A man finds himself, to his great astonishment, suddenly existing, after thousands of years of nonexistence: he lives for a little while; and then, again, comes an equally long period when he must exist no more. The heart rebels against this, and feels that it cannot be true.

“Of every event in our life we can say only for one moment that it is; for ever after, that it was. Every evening we are poorer by a day. It might, perhaps, make us mad to see how rapidly our short span of time ebbs away; if it were not that in the furthest depths of our being we are secretly conscious of our share in the inexhaustible spring of eternity, so that we can always hope to find life in it again.

“Consideration of the kind, touched on above, might, indeed, lead us to embrace the belief that the greatest wisdom is to make the enjoyment of the present the supreme object of life; because that is the only reality, all else being merely the play of thought. On the other hand, such a course might just as well be called the greatest folly: for that which in the next moment exists no more, and vanishes utterly, like a dream, can never be worth a serious effort.”

–The ever-cheerful Arthur Schopenhauer (1788–1860), “The Vanity of Existence,” from Studies in Pessimism

Three Phenomenologist/Existentialist Views of Time

NB: the following are NOT quotations. I’ve summarized material that appears in much longer works. You’re welcome. I have included Husserl in this section, even though his work is just an attempted explanation of time, because the other two philosophers treated here are reacting to Husserl’s ideas.

Husserl (very bright dude, this one): All our ideas about time spring from our conscious experience of the present. That experience is characterized by being intentional, by being toward something. We typically recognize three kinds of time: 1. scientific, objective, Newtonian time, which we think of as being independent of ourselves and as independently verifiable; 2. subjective time, in which events seem to move slower or faster; and 3. phenomenological or intentional time, which is the fundamental experience on which the other concepts of time are based, from which the other concepts derive, because the phenomenological present includes not only awareness of present phenomena (the present) but also retention (awareness of that which is not present because it no longer is—the past) and protention (awareness of that which is not present because it is about to be—the future). The present is intentionality toward phenomena before us here, now. The past is present intentionality toward phenomena that are not present but are still with us and so must be past (that’s where the definition of past comes from). The future is present intentionality toward phenomena that are likewise not present but, unlike the past, are not yet with us and so must be the future, which will be (that’s where the definition of future comes from). Therefore, in their origins in our phenomenological experiences, the future and the past are parts of the present—conceptual phenomena held in the present, alongside actual phenomena, as phenomena no longer present and not yet present.

Heidegger: Husserl had it all wrong. It’s the future, not the present, that is fundamental. We are future-oriented temporalities by nature, essentially so. Our particular type of being, Dasein, or being-there, is characterized by having care (about its projects, its current conditions, about other beings)—about matters as they relate to those projects. Our being is characterized by understanding, thrownness, and fallenness. Understanding is the most fundamental of the three. It is projection toward the future, comportment toward the possibilities that present themselves, potentiality for being. Our understanding seizes upon projects, projecting itself on various possibilities. In its thrownness, Dasein always finds itself in a certain spiritual and material, historically conditioned environment that limits the space of those possibilities. As fallenness, Dasein finds itself among other beings, some of which are also Dasein and some of which (e.g., rocks) are not Dasein, and it has, generally respectively, “being-with” them or “being alongside” them, and these help to define what possibilities there are. “Our sort of being (Dasein) is being for which being is an issue.” Why is it an issue? Well, we are finite. We know that we are going to die. This is the undercurrent that informs our essential being, which is care, concern. We are projections toward the future because undertaking these projects is an attempt, however quixotic, to distract ourselves from or even to cheat death. We care about our projects because, at some level, we care about not dying, having this projection toward the future for which we are living.

Sartre: The world is divided into two kinds of being: being-for-itself (the kind of being that you and I have) and being-in-itself (the kind of being that a rock or a refrigerator has). Let’s think a bit about our kind of being. Take away your perceptions, your body, your thoughts. Strip everything away, and you still have pure being, the being of the being-for-itself, but it is a being that is also nothing. (The Buddha thought this, too). Being-for-itself has intentional objects, but itself is no object (there’s no there there) and so is nothing, a nothingness. Time is like being in that respect. It consists entirely of the past (which doesn’t exist) and the future (which doesn’t exist) and the present (which is infinitesimally small and so doesn’t exist). So time, like being, is a nothingness. This being-for-itself is not just nothingness, however; it has some other bizarre, contradictory characteristics: Its being, though nothing, allows a world to be manifest (how this is so is unclear), a world that includes all this stuff, including others, for example, who want to objectify the being-for-itself, to make it into a something, a thing, a being-in-itself, like a rock. (“Oh, I know you. I’m wise to you. You’re . . . .” whatever.) The being-for-itself also has a present past (in Husserl’s sense) and is subject to certain conditions of material construction (the body) and material conditions (in an environment of things), and all these givens—the body, the environment, one’s own past, and other people seen from the outside in their thinginess—make up the being-for-itself’s facticity. The being-for-itself wants to be SOMETHING, and so lies to itself. It acts in bad faith, playing various roles (playing at being a waiter, for example) and creating for itself an ego (via self-deceptive, magical thinking). But in fact, being in reality nothing, being-for-itself (each of us) knows that that’s all a lie.
We transcend our facticity and can be anything whatsoever, act in any way whatsoever. In other words, we are absolutely free and therefore absolutely responsible. This responsibility is absurd, because there is no reason for being/doing any particular thing. “Man is a useless passion.” But the absolute freedom that derives from our essential nothingness also allows for action to be truly authentic (as opposed to the play-acting) in addition to being responsible. Only in death does the being-for-itself succeed in becoming a being-in-itself, a completed thing, and then only if and in the manner in which he or she is remembered by others. A person who is not remembered never existed. Death is a time stamp or, if we are not remembered, an expiration date.

The Eternal Return and the Weight of Being

“341. The Greatest Weight. What, if some day or night a demon were to steal after you into your loneliest loneliness and say to you: ‘This life as you now live it and have lived it, you will have to live once more and innumerable times more; and there will be nothing new in it, but every pain and every joy and every thought and sigh and everything unutterably small or great in your life will have to return to you, all in the same succession and sequence—even this spider and this moonlight between the trees, and even this moment and I myself. The eternal hourglass of existence is turned upside down again and again, and you with it, speck of dust!’

“Would you not throw yourself down and gnash your teeth and curse the demon who spoke thus? Or have you once experienced a tremendous moment when you would have answered him: ‘You are a god and never have I heard anything more divine.’ If this thought gained possession of you, it would change you as you are or perhaps crush you. The question in each and every thing, ‘Do you desire this once more and innumerable times more?’ would lie upon your actions as the greatest weight. Or how well disposed would you have to become to yourself and to life to crave nothing more fervently than this ultimate eternal confirmation and seal?”

–Friedrich Nietzsche (1844–1900), The Gay Science

The Fleeting One-Offness of Everything and the Resulting Unbearable Lightness of Being

“But Nietzsche’s demon is, of course, wrong. There is no eternal return. Where does that leave us? Isn’t life ALWAYS a matter of I should have’s and I would have’s and if I had only knowns? “[W]hat happens but once, might as well not have happened at all. If we have only one life to live, we might as well not have lived at all. . . .

“The heaviest of burdens crushes us, we sink beneath it, it pins us to the ground. But in love poetry of every age, the woman longs to be weighed down by the man’s body. The heaviest of burdens is therefore simultaneously an image of life’s most intense fulfillment. The heavier the burden, the closer our lives come to the earth, the more real and truthful they become. Conversely, the absolute absence of burden causes man to be lighter than air, to soar into heights, take leave of the earth and his earthly being, and become only half real, his movements as free as they are insignificant. What then shall we choose? Weight or lightness?”

–Milan Kundera (1929– ), contra Nietzsche, from The Unbearable Lightness of Being

Copyright 2010, Robert D. Shepherd. All rights reserved.


Selected Poems | Bob Shepherd


The Unintended Consequences of Education “Reform”

Years ago, I was a member of a loose affiliation of business consultants concerned about new technologies, The Sociotechnical Systems Design Group. The driving force behind the work of this group of brilliant consultants was recognition that new technologies often bring with them a lot of unforeseen, unintended, negative consequences. Consider, as an example, Customer Relationship Management (CRM) programs. These are software packages that keep tabs on salespeople’s interactions with and knowledge gathered about customers. The software makes this information available to senior managers. Here’s an unintended problem with that: for a long, long time, salespeople have kept a little black book containing their notes about customers—names, titles, number of children, children’s names and ages, wife’s name, hobbies, interests, pet peeves, hot buttons, comments about products, wishes, ambitions, and so on. Over time, this annotated list of contacts would become the salesperson’s most valuable possession—the distillation of his or her life’s work—and, importantly, WHAT MADE THE SALESPERSON VALUABLE. And the last thing a salesperson wanted to do was TO GIVE THAT AWAY. And so, salespeople resisted CRM, often withheld information from these systems, and often put in bad information, which led to garbage in, garbage out.

Here’s another example: years ago, I was managing a large team of editors, designers, and freelancers in an educational publishing house. My projects, and I typically had several of them going at once, were complex, with lots of parts. I learned about Gantt and PERT charts—project management software—which promised to keep minute track of these parts, to help me identify and address slippages and bottlenecks, and to keep track of project component costs and, importantly, budget overruns. And these programs WORKED BEAUTIFULLY.
I soon found, however, that I was spending so much time working on my charts that I was missing important stuff going on with my staff—and a business is, of course, its people. So, I put the project management software away and went back to what folks in business call “Management by Walking Around.” Unintended consequences, aka, the unknown unknowns.

The Common [sic] Core [sic] State [sic] Standards [sic] (CC$$) were first announced in June of 2009. They were common only in the sense of being vulgar. They did not spell out the core of English language arts, which is knowledge, but rather were an almost completely content-free skills list. They were not developed by the states but by a group of individuals whose work was funded by Bill Gates, who saw the Core as a single set of national standards to which educational software products could be correlated and then “sold at scale.” At scale. You know, monopolistically. Because in ELA the CC$$ are simply a skills list and in math they contain a lot of developmentally inappropriate stuff in the early grades and are no improvement on the preceding standards from the National Council of Teachers of Mathematics, one cannot with a straight face, I think, call these “standards,” a term that implies high goals one might seek to achieve.

The new “standards” were announced with much fanfare (and a lot of financial pressure to adopt them), and part of the initial messaging, incorporated into notes attached to the “standards” and into a series of speeches given by self-styled Common Core “architect” David Coleman, whom Gates had appointed the decider for the rest of us, was a call for a “return” to reading classic, substantive works of literature from the canons of American and world literatures, including plays by Shakespeare, excerpts from the Bible, and foundational works in American history like the Declaration of Independence and the Federalist Papers. Coleman was PROFOUNDLY IGNORANT of K-12 education in the United States. He didn’t seem to know that ALMOST EVERY SCHOOL IN THE COUNTRY, with very, very few exceptions, was using a hardbound literature anthology at each grade level, such as McDougal, Littell Literature or Prentice Hall Literature or The EMC Masterpiece Series, that contained—guess what?—classic, substantive works of literature from the canons of American and world literatures, including plays by Shakespeare, excerpts from the Bible, and foundational works in American history like the Declaration of Independence and the Federalist Papers. What an incompetent idiot! (For this idiocy and incompetence, Coleman was awarded the presidency of the College Board.) Furthermore, Coleman had no clue that throughout the U.S., in almost every high school, foundational works of American literature were taught in an 11th-grade American Literature survey course that ran concurrently and was often taught in conjunction with an 11th-grade American History course. The 12th grade was usually reserved for a British literature survey course (though some schools chose to do a world literature course at this level, instead).
No, Coleman, ignorant of the domain he was placed in charge of by Billy Gates, put into his “standards” a call for foundational works of American literature AT BOTH GRADES 11 and 12, which resulted in a weird phenomenon—publishers adding a smattering of American literature to their British literature anthologies. Weird.

Here’s an important thing about the Gates/Coleman “standards” bullet list: though it has tacked onto it this call for reading substantive literary works, the bullet list itself consists ALMOST ENTIRELY of very broad, very vague skills items—e.g., the student will make inferences based on texts. In other words, the list is almost entirely knowledge- and content-free, even though mastery in the English language arts involves acquiring an enormous amount of domain knowledge!

How did that happen? Well, Coleman and crew, being incompetent, mostly just copied existing egregious state “standards” that had the same problem and tweaked those. HOWEVER, and here’s the HUGE ISSUE that these “standards” created: now that they were national, they had exactly the effect that Bill Gates was aiming for—products correlated to the standards and sold “at scale,” including a lot of online instructional products so bad that students would prefer having all the hair on their bodies plucked out with tweezers to having to do the next lesson in these. And because students in every state had to take high-stakes tests based on these national “standards,” being able to answer skills questions on the standards became all-important.

And that’s where the unintended consequences come in: these almost entirely content- and knowledge-free “standards” have led, in ELA, to a vast devolution of ELA curricula and pedagogy into random exercises on random “standards” from the Gates/Coleman bullet list. Where before a publisher would put together coherent units of study of substantive work—units on the Elements of the Short Story or on Transcendentalist Literature, for example—now, even if such unit names are retained, the actual content becomes random exercises on random skills from the list. I call this the Monty Python “And Now for Something Completely Different” Approach to ELA Instruction. The actual consequence of the Gates/Coleman bullet list has been the end of coherent, cumulative, substantive ELA curricula and pedagogy.

And so an entire generation of students has been ROBBED of coherent, cumulative, substantive knowledge-based instruction in ELA.

Unintended consequences.

Copyright 2024. Robert D. Shepherd. All rights reserved. This essay may be reprinted and shared freely as long as this copyright notice is retained. Please do reprint and share it. Thanks.

———-

NB: The opening call by Coleman for reading “substantive works of literature from the canon” led E.D. Hirsch, Jr., author of Cultural Literacy and founder of the Core Knowledge Foundation, to embrace the Core INITIALLY. He soon, however, repented of this when he recognized that the Common Core was basically a list of vague, untestable skills like the egregious state standards that preceded it. This was confirmed to me in private correspondence with Hirsch, and he then wrote about his disenchantment with the Core in various places. So, if you read somewhere that Hirsch was a fan of the Coring of American education, bear this in mind.

For more on Education “Reform,” go here: https://bobshepherdonline.wordpress.com/category/ed-reform/


Whence Witches: a Brief Backgrounder

I have a friend whose father is an earnest, conservative Catholic. She tells me that if her father saw a Wicca artifact or symbol in her house, he would disown her because Wicca is Satanism. It’s not. His opinion is based on—nothing. On airy nothing. But like many opinions with no warrant whatsoever, this one is held strongly. LOL.

So, if I were to try to explain the contemporary witchcraft phenomenon to him, I would have to back up quite a bit. But even then, even if I could hold his attention, explaining this to him would be almost impossible because the explanations would, for him, need explanation, and he would be predisposed not to accept much of what I would say. Nonetheless, here is the outline of a response to him:

People who live in a time of DoorDash and supermarkets and electric lights typically don’t grok that for almost all of human history, we were intimately tied to the land and to the cycles of the seasons because we fed ourselves first via foraging and hunting and then via hunting, cultivation of crops, and animal husbandry (herding or the keeping of domesticated stock). Hunting wasn’t, for most people, a primary source of sustenance. Modern studies of hunter-gatherers have shown that typically not much of their calorie intake comes from that source, and hunting consumes almost as many calories as it produces. So, foraging, crop cultivation, and animal husbandry were the biggies. All are dependent upon seasonal cycles.

I recently started growing tomatoes and peppers in containers (grow bags) on my patio. As I am writing this, I have about fifty seedlings in cups in trays. Here’s how weather works in Florida in the summer: the prevailing winds blow from east to west. Ocean water, heated by the sun, evaporates, and the wind drives the moisture west, where it cools and then drops as torrential but brief late-afternoon rain. I have to make sure that I get those seedlings inside before one of those brief-but-heavy storms, or they will be destroyed. So, I’ve started having to pay close attention to the weather. For most of human history, this “having to pay close attention” to nature was literally a matter of life or death. If you planted before the last killing frost or failed to give your field proper drainage before the rainy season started, then you and those you cared about would starve to death. The prospect of dying is highly motivating. LOL.

At the same time, if you were a farmer during those thousands and thousands of years before the modern era (almost everyone was), you were pretty ignorant about how the world worked. And so you made up explanatory stories (pourquoi stories, they are called). Almost every ancient culture worldwide had a central mythology that involved an Earth goddess (often associated with the moon because of the roughly 28-day lunar cycle) and a fertility god (often associated with the Sun). And in this mythology, the life cycles of these gods were associated with the changing seasons, from their union at the Spring Equinox to the death and rebirth of the goddess’s consort at the Winter Solstice. And almost every culture had festivals related to that seasonal cycle. A great introduction to these ancient pagan fertility religions can be found in Sir James George Frazer’s monumental The Golden Bough: A Study in Magic and Religion. Highly, highly recommended. Endlessly fascinating.

In 313, the newly converted Emperor Constantine legalized Christianity in the Roman Empire (while continuing to honor Sol Invictus), and in 325, because he was tired of the warring Christian factions in the streets, he called a council to establish an official doctrine for his new Church. In the centuries that followed, this early Church ruthlessly hunted down heretics (followers of competing brands of Christianity, and there were literally hundreds of these) and pagans (followers of those almost universal religions based on seasonal cycles). And where they found them, they killed them or forced them to convert. And, as part of its campaign against other religions, the Church portrayed those pagan seasonal fertility-cycle religions as devil worship.

In Europe, the male fertility god of the pagans (Cernunnos would be an example) was commonly pictured with antlers. This portrayal one sees, for example, on the ancient Danish Gundestrup Cauldron. These antlers became, in the Christian retelling, the horns of the Devil, and the male fertility god became the Devil himself. The ancient Scandinavian Earth goddess Nerthus became, in the Christian retelling, the beastly mother of the beastly Grendel, descended from Cain.

On May 4th, 1493, Pope Alexander VI issued a bull, Inter caetera, giving Christian conquerors the right to seize land belonging to pagans, who were styled “savages.” The Dominican friar Bartolome de las Casas, in his book A Short Account of the Destruction of the Indies, tells how the Spanish entirely wiped out the Arawak people of the West Indies, including an account of how, to celebrate Easter, they hanged Indians alive and upside down by their feet and then tortured them by flaying—thirteen of them in all on this occasion, one for Christ and one for each of the twelve disciples.
The Massachusetts Bay Colony clergyman Cotton Mather, in his Wonders of the Invisible World, describes the New World as populated by savage Satan worshippers and the plague of witchcraft in the colonies as due to Satan being furious at the colony for trying to establish Christianity in his territories.

Long before the invention of Christianity, ancient peoples had medicine men or women, aka shamans, who communed with and called upon spirits and practiced herbal and spiritual cures. This was witchcraft in its original sense and as it is sometimes practiced in remote parts of the world yet today. In its war against paganism, the official Christian Church, established by Constantine, REDEFINED these traditional practices, traditional witchcraft, as Satanism. In other words, traditional seasonal-religion-based shamanistic practice was renamed “witchcraft” and redefined as activity in service of the Evil One. And ironically, as part of its propaganda against pagan practices, the Church adopted in its legal charges against practitioners of pagan religion THE EXACT LANGUAGE once used, in ancient Roman legal documents, to persecute Christians—accusing them of holding naked rituals in the forest, of eating babies, of fornicating with evil spirits, of riding through the air, and so on. All this is documented in a fascinating book by historian Norman Cohn entitled Europe’s Inner Demons: An Enquiry Inspired by the Great Witch-Hunt. Highly recommended.

The late nineteenth and early twentieth centuries were a great time for experiments in New Religions in both Great Britain and the United States. Many, many were the fascinating religious experiments from the period. I’ll do a post on these one day. Among these, there were several folks (among them Gerald Gardner, Doreen Valiente, and Aleister Crowley) who rediscovered (but barely understood) ancient pagan, pre-Christian religious practices and implemented these in New Religions claiming to be the Old (pagan/pre-Christian) Religion—Wicca and Thelema. Wicca, like ancient European pagan seasonal-fertility-based religion, celebrates the relations of an Earth mother and her consort over a cycle of seasonal festivals. It’s all about stewardship of the Earth and gratitude for her bounty. It has NOTHING WHATSOEVER to do with Satanism.

Respect your mother. JAI MA!

Now, here’s the thing: If you don’t read, you won’t know the backstory of anything. And by read, I DO NOT MEAN surfing random articles on the Internet. I mean reading substantive books about substantive topics ROUTINELY, throughout your life. Only by doing that will you know the backstories and so understand what and how things are and how they got to be that way. In other words, if you don’t read, you will have opinions, but they will be based on NOTHING MUCH. They will be profoundly ignorant, uninformed opinions.

But here’s the thing about such opinions, as Donald Trump so clearly exemplifies: you don’t know what you don’t know; so, profoundly ignorant people tend to be extremely certain about their uninformed, ignorant opinions. And, it’s very difficult to persuade them otherwise, because the only way to do that is to try to give them, in a short compass, the backstories that they would have gotten had they been readers, and usually, being nonreaders, they haven’t the patience for that, just as they haven’t the patience (or self-discipline) for reading.

This saddens me greatly. And it has been a source of sadness for me for most of my life. I’m a reader, and so much of what I say goes—whoosh—over the heads of those who are not themselves readers. And often they think that they know more than I do about x because they once saw a comment about this topic by some random person on Quora or Discord or TikTok or whatever.

I have been making some dents in my ignorance, but it is still, alas, vast.

Readings this post was based on:

Pope Alexander VI. Inter caetera.

de las Casas, Bartolome. A Short Account of the Destruction of the Indies.

Cohn, Norman. Europe’s Inner Demons: An Enquiry Inspired by the Great Witch-Hunt.

Frazer, James George. The Golden Bough.

Mather, Cotton. Wonders of the Invisible World.

For further reading: https://bobshepherdonline.wordpress.com/2019/03/17/charge-of-the-goddess-for-beltane/


The Best Tea for Cold Brews

I generally try to avoid superlatives, but . . .

Baihao Yinzhen, pronounced roughly BYE-how YIN-jen, aka White Silver Needle, is a white tea grown in China’s Fujian Province. This highly sought-after tea is the costliest of the white teas because only the top buds of the tea plant (Camellia sinensis) are used to produce it.

To my taste, it makes the best cold brew iced tea of the many hundreds of varieties of tea that I have tried. I try to keep a bottle of cold brew White Silver Needle in the fridge at all times. It is quite delicate and immensely flavorful and delicious, without any of the astringency or bitterness that one sometimes gets with tea, and it has this beautiful green color, as you can see in the photo.

The name of the tea comes from the fact that it is plucked after the first growth of tea buds, while the tea leaves still have the tiny, tiny, tiny white hairs that give them their silvery color and the tea its name. The tea looks needle-shaped because the leaves are rolled by hand (between the palms) and then dried. A superb fermented pu-erh is also made from this tea, which I order from China in tea cakes and drink after having prepared it with less than boiling water, a short infusion time, and multiple infusions of the same leaves.


Tools: Part 1 | Bob Shepherd

My daughter says that my apartment looks like a hoarder’s. It is, emphatically, not as bad as that. There are no piles of garbage. LOL. But there is a lot of stuff in a relatively small space. I think of it as my tiny home. Available space is maximized, and everything is in its proper place.

This is a secret I learned long ago. If you put your keys down in the same place every time you enter your house, then they will never be lost.

It’s shocking, really, how much stuff I have managed to arrange neatly, comfortably, and accessibly in a small space. My female companion at any given time throughout my life would look at the trunk of the car that she was helping pack for a weekend trip and say, “No more will go in.” And I would rearrange things, and a lot more would go in. I was always good at this.

So, why so much stuff in my place? Well, I moved from a full-scale house to an apartment, and I took a lot of stuff with me. And for good reason. First, many of the things I have are precious to me—the artwork and books I have collected over the years, for example, and personal and family mementos. Second, I keep things that I use. My books, for reference and study. My tools, for crafting things.

Young people today often don’t have a lot of kitchen gadgets and other tools because they don’t make things from scratch. They don’t make guitars and doll houses for the grandchildren, so they don’t have a micrometer and a Japanese saw and a dead-blow hammer. They don’t make sauerkraut from scratch, so they don’t have fermentation jars and a muddler. This is a great pity. Some things that are important are almost lost because of people’s tendency, now, to go store-boughten instead of DIY. Quality, for example, and a lot of personal satisfaction at having done a job well. Almost lost, I said, but only almost. I am pleased to say that whatever your crafts (if you don’t have some, get some), there are thriving communities devoted to them online, and some of those communards are young. May all the gods bless them and their endeavors.

I continually bug my younger friends about this. “Don’t freaking spend eight dollars for a quart of mediocre yogurt,” I say. “For that price, make a gallon and a half of far more delicious yogurt yourself. ALL IT TAKES IS TO BOTHER LEARNING, and once you do, that learning IS YOURS TO KEEP FOREVER.” This is an important life lesson—one of the most important I know. And these days, there’s no excuse. You want to learn how to make phyllo dough for spanakopita? How to make pâte à choux and turn this into eclairs? There is some Greek or French grandmother or grandfather on YouTube who will show you exactly how it’s done. How it was traditionally done. Sometimes, traditions are best, having been honed for centuries. My spanakopita and eclairs are to die for, and here’s why: they are made fresh by me from good ingredients and using the right tools: a Danish dough whisk, parchment paper, a piping bag.

So, I have a lot of tools and gadgets that I have collected over the years, and my daughter is wrong about my having “too much” stuff. These are not the things of a hoarder. These are the tools of an artist and artisan. Let me share an example from recent days. I recently prepared tomato and pepper seeds for germination and then planting in containers on my porch. Both like hot weather, and I live in Southern Florida. So, what did I use to do this job?

Well, first I set out some disposable cups. I used an awl to poke holes in the bottoms of half of them. Then I put decorative marbles I happened to have around in the bottom of the other half of the cups and placed the cups with the holes inside those cups. This arrangement would provide proper drainage and allow me to keep the seedlings watered but not water-logged. Then, I filled a bucket with warm water and placed a brick of coco coir into it. The coco coir was soon loose and hydrated and would provide my seed-starting medium. Then, I used a canning funnel and a canning ladle to fill the cups with the coco coir, and I tamped this down lightly with a wooden muddler. Next, I emptied my seed packets one at a time into a dilute hydrogen peroxide solution to kill any contaminants on them, such as fungi or molds, and to soften the hulls to make germination easier; drained them using a small strainer; and dumped the seeds into spring water to wash off the peroxide. Then, I used a different strainer to strain off the water and used wooden tea tweezers to place the seeds onto a bamboo tea shovel, which was the perfect tool for holding and moving the tiny seeds. Next, using a chopstick, I poked two holes about an eighth of an inch deep in the top of the soil in each pair of cups and used the tweezers again to move two seeds to the holes. Finally, I filled the holes with my fingers, covered them with a thin layer of growing medium, and watered them with a little gooseneck watering vessel I have.

So, disposable cups, an awl, marbles, a bucket, a canning funnel, a canning ladle, a wooden muddler, two strainers, wooden tea tweezers, a bamboo tea shovel, a chopstick, and a gooseneck watering vessel. And all these tools I had, and here’s the thing: each was perfect for its job. The awl made perfect little round holes in the bottoms of the cups without cutting huge fissures. The tea tweezers allowed me to move the seeds without harming them as metal tweezers might. The bamboo tea shovel made it easy to move them without losing any. The chopstick made the perfect-sized hole for the seeds. The gooseneck watering vessel allowed me to control precisely the amount of watering.

Some advice: over time, spend the money to purchase the right tools for the jobs you do (and for those you have not yet anticipated). You will end up with quite a nice little collection, and these will make you happy. Again, I learned this lesson long ago: It was Christmastime, and I had bought a big, red toy fire engine for my son at his request. The thing made sounds and had flashing lights and needed batteries. The battery compartment was closed with four Phillips-head screws. I didn’t have a Phillips-head screwdriver handy, so I stood there trying to turn those little screws with a butter knife from the kitchen. Dumb.

Over the years I have accumulated a lot of “the right tools” for a lot of jobs, and that stuff makes doing tasks a breeze. One can save a lot of time and effort and produce higher-quality work if one has the right tools. If you want to make your own sauerkraut or kimchee, I highly recommend getting some Mason jars, a wooden muddler, a kitchen scale (for measuring water and salt for a brine), and some pickle pipe fermentation lids. Having these proper tools will save you a LOT of grief. I rarely kayak anymore, and my daughter tried to get me to throw out my paddles. But I have been doing a lot of container gardening on my porch, and I fill a large trashcan with tap water, mix in a small amount of Vitamin C powder, and this removes the chloramine from the water, which might harm my plants. And the kayak paddle is the perfect tool for mixing the powder into the water.
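A note on the arithmetic that kitchen scale supports. The post doesn’t give a recipe, but most home-fermentation guides call for a brine somewhere around 2–3% salt by weight of the water (that percentage, and the function name below, are my assumptions for illustration, not the author’s method). A minimal sketch in Python:

```python
def brine_salt_grams(water_grams, salt_percent=2.0):
    """Grams of salt for a brine of the given strength.

    The percentage is salt weight relative to water weight,
    the convention most home-fermentation guides use. The
    default of 2% is an assumption, not a fixed rule.
    """
    return water_grams * salt_percent / 100.0

# One liter of water weighs about 1000 g, so a 2% brine needs 20 g of salt.
print(brine_salt_grams(1000))        # → 20.0
print(brine_salt_grams(1500, 2.5))   # → 37.5
```

Weighing the water on the scale rather than measuring it by volume is what makes the percentage reliable.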

Why do the proper tools work so well? Well, they have the right affordances. In other words, their design is perfect to effect the desired result. I learned the term from the great expert on human interface design Donald Norman. Norman—longtime professor of cognitive science at the University of California, San Diego—hated, hated, hated bad design in everyday life: the infamous “blue screen of death” on Windows computers; doors that scream by their design “push me” when they have to be pulled, or vice versa. Engineers tend to like very neat, very orderly designs—everything all lined up in a row. Perhaps it’s a mild autism-spectrum thing that one finds in engineers. In his superb book The Design of Everyday Things, Norman gives the example of a bunch of identical levers on a control panel in a nuclear power facility. This one means raise the fuel rods. This one means lower them. They look precisely identical, but misreading the tiny labels in an emergency and choosing the wrong lever could mean a freaking nuclear meltdown. The workers at the nuke plant had solved this issue by procuring some beer taps and placing them over the levers. Grolsch means lower the rods. Bud means raise them. LOL.

Recently, I ordered a teapot from a fancy modern design company called Kinto, and in their fanciness, these folks had designed away the knob traditionally at the top of the pot in Asian designs. Such arrogance is typically its own punishment. In this case, the lack of the knob makes it impossible to add a braided lid keeper, which makes keeping the lid on when pouring multiple infusions difficult. When pouring, you have to place a finger on the hot top of the teapot. Ouch. If, in the crucible of the ages, a design has emerged unscathed, THERE IS TYPICALLY A REASON. The design serves a function. (For more on stuff for tea and its uses, see this: https://bobshepherdonline.wordpress.com/2024/01/23/how-to-drink-tea-a-brief-guide-bob-shepherd/. For instructions on making a beautiful traditional braided lid keeper for a teapot, see this: https://www.youtube.com/watch?v=81yA_zvbq-4).

When a tool has the right affordances, it will work smoothly. Consider, for example, a hammer. Novices hold a hammer tightly and force it down against the nail, aiming for the nail’s head. But that’s not how the hammer is supposed to work. A skilled user holds the hammer loosely and swings it in an arc that allows the weight of the hammer head to fall of its own accord. Let me repeat that: of its own accord. That’s what having the right affordances means. The design accomplishes the task almost by itself, with hardly any conscious effort. People have to learn this same truth when chopping wood. Don’t shove the axe down. Arc the thing and let the head fall naturally and do the job for you. And don’t aim at the surface of the wood. Allow the hammer or axe head to fall as though it were going to strike THROUGH the nail or the wood. Let the proper tool do its job.

When you let the tool do the job, after a short while, the conscious use of the tool completely disappears. You don’t think about what you are doing. You simply let the tool work.

The philosopher Martin Heidegger called tools like this “the ready-to-hand.” They are there for the hand to take up and use with almost no conscious thought at all. To a skilled user, the tool becomes an object of consciousness ONLY WHEN IT STOPS WORKING, WHEN IT IS BROKEN—if, for example, the head of the hammer or the axe has become loose at the end of the handle.

All that is prelude to Part 2 of this essay, in which I will take up three astonishing topics:

types of consciousness,

the coming internalization of common tools on the part of the few, and

the vastly different creatures, for good and ill, we shall be when that happens.


Putin and Trump: A Backgrounder

First, Tsar Vladimir the Defenestrator:

Vladimir Putin’s father worked for the KGB. The family was fairly poor and lived in a one-room apartment, but because of the father’s job, they had a telephone. Putin was a runt and a loner. The other boys beat him up. For entertainment, he used to chase and kill rats in his tenement. He loves telling a story about a time when he cornered a rat there, and the rat jumped at him. He had a propaganda film made about himself that tells how, at the age of 16, he tried to volunteer for the KGB.

Putin took a law degree, joined the KGB, and was posted to East Germany. When the Soviet Union fell, he frantically shredded documents and then went to St. Petersburg. In an infamous speech that he gave decades later (his 2005 state-of-the-nation address), Putin described the fall of the Soviet Union as “the greatest geopolitical catastrophe of the century.”

The mayor of Saint Petersburg hired Putin to oversee foreign contracts. Any business wanting to open an office in Saint Petersburg had to go through Putin. Putin took a lot of kickbacks. The citizens were extremely poor, and the grocery stores were empty. Putin was put in charge of a program whereby the city would receive raw materials, such as petroleum and wood, and exchange these for food from Europe—butter, milk, eggs, wheat, and so on. Instead, Putin created companies to trade the materials for cash, he and the mayor pocketed the money, and none of the food ever reached the people of Saint Petersburg. Investigators wanted to charge Putin with theft. The mayor quashed the investigation. Putin then got a job in Moscow working for Yeltsin. Yeltsin had started off as a brave, revolutionary reformer but became corrupt (and a notorious drunk). He sold off Russia’s state-owned businesses to his buddies and family members and became wealthy. These beneficiaries of Yeltsin’s largess became the billionaire oligarchs. Yeltsin appointed Putin head of the FSB, the state security service and successor to the KGB.

Meanwhile, back in Saint Petersburg, the corrupt mayor had been voted out and was being charged with fraud. Putin arranged to have the guy flown out of Russia, to Paris, in the middle of the night. This loyalty to a corrupt former boss did not go unnoticed by Yeltsin, who had a problem. He was in extremely ill health and needed to retire, but the moment he did, he would himself be investigated by the new president for his crimes in selling off state assets and profiting from those sales. So, Yeltsin and Putin came up with a plan. Putin would become president and quash any investigation into Yeltsin. (Just as Trump today wants to become president again so that he can appoint people who will make the charges against him disappear.) However, Putin was relatively unknown and probably wouldn’t win an election.

Then a series of apartment bombings started taking place, in Moscow and elsewhere, in the middle of the night. Putin went on national television, said that these were the work of Chechen terrorists, and promised to hunt them down and “kill them while they were sitting on the pot in their outhouses.” This was Russia’s 9/11. For a time, ordinary citizens in Russia didn’t know whether, as they slept, they and their loved ones would be blown up. Putin’s promise to hunt down those responsible made him a national hero.

Then one of the apartment bombs, in the city of Ryazan, failed to detonate. Investigators defused the bomb, located in a basement. It was made with explosives and a detonator available only to the Russian military and to Putin’s FSB. The local police arrested the FSB (state security) guys who had planted the bomb. Putin put out the transparently false story that this was just a training exercise. Once a Chekist, always a Chekist.

Investigators looking into the apartment bombings started turning up dead. Murdered on the street. This was an MO to be repeated by Putin throughout his career–killing inconvenient persons. Putin became the government’s voice, in the media, of a war against Chechnya in retaliation for the terrorist bombings. This RUSE worked. Putin was overwhelmingly elected president. Thousands had died in the apartment bombings and in the Chechen War. Among his first actions as president, Putin called a meeting of all the Yeltsin-era oligarchs and let them know, in a subtle but certain manner, that henceforth, if they wanted to hold onto what they had, Putin would get his vig on every transaction. If an oligarch didn’t play ball, Putin would cook up an excuse to jail him and nationalize the business, effectively taking it over himself. He then illustrated his point by having the richest man in Russia, the head of the oil giant Yukos, stopped by police. One of the police threw a bag into the man’s car. Then the man was arrested for transporting an illegal handgun and sent to prison for ten years.

And so it went. Every bit of business in Russia had to pay its Putin tithe, and Putin became the richest person in the world, far richer than folks like Elon Musk and Bill Gates and Warren Buffett. He became the murderous, criminal leader of a kleptocracy, the boss of all bosses in a Mafia state.

Since then, Putin has ruled Russia ruthlessly. He has looted it of its wealth. He has murdered his opponents, principally by having them poisoned with nerve agents or thrown out of windows or down stairways, but by other means as well. He arrests and imprisons and murders his political opponents. He has made it illegal in Russia to be openly gay or lesbian. He has made it illegal even to hold up a blank piece of paper in protest. In 2021, Putin published an imperialist screed called “On the Historical Unity of Russians and Ukrainians” in which he announced his historical duty and intention to create a “Greater Russia” by absorbing his neighbors. He has conducted an illegal war in Ukraine characterized by the wholesale slaughter of civilians, rape of elderly women and children, forced deportation of children, destruction of civilian infrastructure, and destruction of schools, housing complexes, old folks’ homes, and cultural institutions such as theatres and libraries. He has made Russia officially a theocratic state. He has replaced the Constitution with a new one that basically makes him president for life. He holds completely sham elections by getting rid of any real opponents. He is, as of this writing, an indicted international war criminal.

Second, Putin’s Lap Dog, Glorious Leader Who Shines More Orange Than Does the Sun

In 1987, Trump was just another racist real estate developer. He and his father had owned and operated apartments and had had to settle with the government over their practice of marking applications from black people with a “C” for “colored” and then not renting to those applicants. In that year (’87), Trump flew to Moscow, at the invitation of the Russian ambassador, on a KGB plane, for an all-expenses-paid trip. You know, the sort of thing that happens to U.S. businessmen all the time. LOL. Not.

Don the Con, aka Vladimir’s Agent Orange, aka Moscow’s Agent Governing America (MAGA), has his portfolio: it’s disruption. Disruption of U.S. political norms, of the U.S. environment, of U.S. alliances. He’s there to drive wedges, throw monkey wrenches, and generally screw up the works. Here’s the history:

Trump married not one but two Soviet-bloc-born women. Honey traps?

A profoundly ignorant, incompetent playboy, Trump ran through the more than half a billion he inherited from his father, trying to build and operate casinos to bilk people out of their money. Trump described this half a billion as starting with “a small loan from [his] father.” These casinos failed, and Trump faced bankruptcy. No American banks would lend to him. His loans to construct them were coming due. He couldn’t pay them. He was going to go under. But then he went to Moscow again. Suddenly, oligarchs connected to Putin started showing up all over the world with suitcases full of cash to buy Trump properties. And Deutsche Bank, which had substantial deposits from Putin cronies, ponied up a half-billion-dollar loan to save Trump. Sure, let’s loan half a billion dollars to the bankrupt guy. That’s how banks roll. NOT. One of the Trump sons bragged to reporters and the other to a friend about how the Trump organization was “rolling in Russian money.” So, both confirmed that Russia was bankrolling the Trump Organization, or RICO.

Meanwhile, Steve Bannon, Jefferson Beauregard Sessions, and Stephen “Goebbels” Miller were looking around for a potential politician to carry their racist, anti-immigrant agenda forward. They settled on Trump because of his racist history of not renting to black people and of calling for the death penalty for the Central Park Five (who were exonerated by DNA tests). All this is documented in the superb Frontline documentary “Zero Tolerance.” Trump announced a bid for the presidency. The Russian foreign intelligence services committed enormous sums and personnel to a social media disinformation campaign to ensure that Trump was elected. A REPUBLICAN-LED Senate committee issued a report detailing the Russian social media disinformation campaign on Trump’s behalf. Because, of course, Russia would do this for any politician. LOL. They’re just nice that way, those Russians.

A career British intelligence officer wrote a report saying that Trump has deep ties to the Russians and that the Russians have kompromat on Trump in the form of a videotape involving hookers and golden showers in a Moscow hotel.

During the election, Trump repeatedly denied that he had any business in Moscow. At the very time he was saying this, he was negotiating to build a Trump Tower in Moscow. Various reports suggest that Trump did not intend to win, in fact, but believed that the PR from the Presidential bid would secure the Moscow deal.

During the election, Trump actually publicly called upon the Russian government to hack his opponent’s email. They obliged. This was sedition in plain sight.

After the election, Trump delivered whatever Putin wanted. He met with Putin and held a press conference in which, contradicting American intelligence, he said that Putin had told him he had nothing to do with the social media disinformation campaign and that he believed him. LOL. Trump told staffers on multiple occasions that the U.S. should withdraw from NATO. Trump alienated all of our allies. He threatened to withdraw U.S. forces around the world unless other countries started footing the bill. He unilaterally, over the objections of his Joint Chiefs, reduced our troops in Germany, on the NATO border with Russia. He unilaterally chose to abandon our allies, the Kurds, in Syria, leaving Syria to the Russians, a crime so egregious that Trump’s Secretary of Defense, James Mattis, resigned over it. At a time when Russia had announced that it had developed hypersonic nuclear missiles, Trump withdrew the United States from the INF Treaty, which limited intermediate-range nuclear missiles, and from the Open Skies Treaty, which allowed the U.S. and Russia to fly over one another’s territories to inspect compliance with nuclear and chemical weapons treaties. Several of these actions were treasonous. They were, literally, treason. Vladimir got a great return on his investment in his lapdog: Sit up, roll over, good boy, Donnie Boy. Yes you are.

Trump disrupted everything. He fomented racial division. He trumpeted Putin-style nationalism and autocracy. He appointed to head up every department and agency of the U.S. government a person dedicated to undermining the mission of that agency or department. In other words, he rendered the U.S. federal government largely dysfunctional. He created what former Bush, Jr. speechwriter David Frum called “the most dysfunctional White House in history.” His own former high-level staff referred to Trump as “a ***ing moron” (Tillerson) with “the understanding of a fifth- or sixth-grader” (Mattis). This worked very much to the favor of enemies of the United States, of course–having someone in charge of the United States who doesn’t know what happened at Pearl Harbor or that Alabama isn’t in the path of hurricanes on the East Coast or that stealth airplanes are not actually invisible or that Frederick Douglass is not alive and doing a great job or that the Continental Army did not capture the British airports or that we shouldn’t look into injecting disinfectants or that India does have a long border with China or why NATO was formed or why we have bases around the world–all of which Trump has demonstrated. Someone who tweeted major unilateral changes in defense policy at 2:00 in the morning. At the age of 71, Trump took the Montreal Cognitive Assessment, a brief dementia screening. He is so ignorant that he thought it was an IQ test and keeps repeating that falsehood. Oh, and the Washington Post tallied that Trump made 30,573 false or misleading claims over his four-year term in office. He is a pathological malignant narcissist and a pathological liar.

Mueller wrote a report saying a) that if the evidence exonerated Trump of obstructing the Russia probe, he would so state; b) that he was NOT so stating; and c) that under U.S. law it would be up to Congress, not the Justice Department, to take action on this. Trump’s Attorney General stated that the report exonerated Trump, in direct contradiction of the report. One assumes that the Attorney General can read, so this is a tad odd, huh? The House impeached Trump TWICE, once for a quid pro quo with the Ukrainian president and once for incitement of insurrection in his attempt to overturn the results of a fair election.

Russia is conducting a war with Ukraine. Trump threatened to withhold military aid from Ukraine unless that country announced an investigation of Trump’s political opponent–an illegal quid pro quo.

Trump’s current and erstwhile allies in the Senate–McConnell, Graham, and Rubio, for example–just happened to have gotten enormous contributions to their Political Action Committees from oligarchs close to Putin.

Trump and his people denied knowing Lev Parnas or having anything to do with him, even though there are tons of pictures of them together at events.

Just part of the story.

And this all adds up to . . . it’s all fake news. LOL. All this history. Just coincidence and libtard kookiness. Lord help us. Are we really that dumb?

If there is anyone in the U.S. intelligence services who still thinks that Trump is not owned by Russia, then the word intelligence should not be used of him or her.

What Russia pulled off using its asset Trump is doubtless the most disturbing and consequential intelligence coup in history.


Scene from a play that really needs to be written:

DON THE CON: So, what should I call the new social network?

VLADIMIR: Well, Donald, in the old days, we decided to call the propaganda organ Pravda.

DON THE CON: Pravda? OK. You’re the boss. Pravda it is.

VLADIMIR: No, no, Donald. It means “truth” in Russian.

DON THE CON: You want me to call it “Truth in Russian”? Uh, OK.

VLADIMIR: No, no. (to Igor Kostyukov, GRU head) Take Donald away and explain this to him.


Where Did Frank Herbert Get the Idea for the Spice Worms of Dune? | Bob Shepherd

Recently I was reading an obscure text called 3 Baruch, a piece of Christian pseudepigrapha from perhaps the second century CE that describes a vision of five heavens on the part of the titular character. In this book I discovered where Frank Herbert got his idea for giant worms that excrete spice, which he describes as looking and tasting like cinnamon but, unlike cinnamon, having powerful, psyche-expanding, psychotropic, entheogenic, psychedelic properties. (NB: Psyche is a Greek word meaning both “mind” and “soul.”) Here’s the relevant text, from 3 Baruch 6:3–12:

And I said to the angel, “What is this bird?”

And he said to me, “This is the guardian of the earth.”

And I said, “Lord, how is he the guardian of the earth? Teach me.”

And the angel said to me, “This bird flies alongside of the sun, and expanding his wings receives its fiery rays. For if he were not receiving them, the human race would not be preserved, nor any other living creature. But God appointed this bird thereto.”

And he expanded his wings, and I saw on his right wing very large letters, as large as the space of a threshing-floor, the size of about four thousand modii; and the letters were of gold. And the angel said to me, “Read them.” And I read and they ran thus: “Neither earth nor heaven bring me forth, but wings of fire bring me forth.”

And I said, “Lord, what is this bird, and what is his name?”

And the angel said to me, “His name is called Phoenix.”

(And I said), “And what does he eat?”

And he said to me, “The manna of heaven and the dew of earth.”

And I said, “Does the bird excrete?”

And he said to me, “He excretes a worm, and the excrement of the worm is cinnamon, which kings and princes use.”

Herbert was a great student of ancient religions, as readers of the Dune series (and watchers of the recent Dune films) will know. So, my contribution to Herbert scholarship. You’re welcome.

NB: Copyright 2014, Robert D. Shepherd. This post may be freely distributed IF this copyright notice is retained.


Getting Clear about the Difference between Sex and Gender

Much of current debate about sex and gender is utterly confused, and the confusion comes from not recognizing the crucial distinction between sex and gender. A lot of unnecessary problems could be avoided by keeping this straight.

–from the article

The brilliant French novelist and philosopher Simone de Beauvoir gave Jean-Paul Sartre, her long-term lover, most of his best ideas, the ones that became the philosophical system known as Existentialism. However, this isn’t her only claim to fame. She also, in her seminal 1949 work Le Deuxième Sexe (The Second Sex), introduced into general circulation the crucial distinction between sex and gender. It’s astonishing how many people, 75 years later, still don’t understand this distinction, so let me try to clarify it. First, what Beauvoir wrote:

On ne naît pas femme, on le devient [One is not born a woman; one becomes one].

She was not, of course, saying that one isn’t (typically) born with either male or female genitals. That is true for all but a small percentage of kids (about 1.7 percent are born intersex, with partially male and partially female sexual organs). What Beauvoir meant was that the characteristics associated with womanhood—what roles one plays, how one dresses, what accessories one wears, who one’s friends are, how one sits and walks, and so on—are culturally, not biologically, occasioned and acquired. They are a matter of gender.

In English, we are fortunate enough to have two distinct words that can be appropriated for the following distinct purposes:

We can (and should) use female and male to refer to the biological inheritance—to the biological sexual characteristics that we are born with and that we develop over time based on our genetic programming. These characteristics comprise our sex.

We can (and should) use woman and man to refer to the acquired, acculturated characteristics traditionally ascribed to and associated with particular sexes (to ones taught us by our culture)—to matters like roles, dress, and learned sex-specific behaviors. These characteristics comprise our gender.

So, to update Beauvoir, one is born female or male (i.e., someone with a particular sex) and becomes a woman or a man or some combination thereof (i.e., someone with certain non-necessary, acquired gender characteristics).

Sex is given (in most cases). Gender is not. The genders that people typically acquire vary depending on time and place. Among the Masai, for example, MEN wear elaborate jewelry and brightly colored clothing; engage in small handicrafts; and spend a lot of time in groups, gossiping—precisely the characteristics widely considered, in the 1950s, appropriate to American WOMEN. No one teaches a person to have a penis or a clitoris, a scrotum or labia majora. These are simply givens (in most cases). Of course, no one typically sits American boys down and tells them not to wear dresses, either. This behavioral propensity is acquired, rather than explicitly taught, on the basis of behavioral models in the ambient environment. Gender is acquired. Sex is not.

By default, gender comes about by what the French Marxist critic Louis Althusser called “interpellation”–unconscious acquisition of cultural norms. But this is not necessarily so. Because it is acquired, gender is open to being modified with some ease. I used to teach the kids in my acting classes how to walk and sit like people of the opposite gender. This was eye-opening for them. In today’s repressive era of the Moms for the Liberty to Constrain Your Liberty, aka the Minivan Taliban and the Ku Klux Karens, I would probably be fired for these exercises, which my students found fascinating and illuminating.

Much of current debate about sex and gender is utterly confused, and the confusion comes from not recognizing the crucial distinction between sex and gender. A lot of problems could easily be avoided by keeping this straight.

For example, it’s important for young people to recognize that they can experiment with gender change or (even better, to my mind) fluidity WITHOUT THIS HAVING ANYTHING TO DO WITH THEIR BIOLOGICAL SEX. Consider, for example, this fact: Studies have shown that people speak much more nicely to female clerks than to male ones. They use slower and sweeter voices. Well, wouldn’t it be a good idea to speak nicely to male clerks, too? And if boys want to wear makeup, why the hell not? Why is this exclusively for girls? Certainly, male movie stars and politicians do so all the time. My mother took a lot of grief back in the 1960s for wearing pants. Why should young men take grief for wearing skirts or dresses? See, for example, the Islamic thobe, the African dashiki suit, the Sumerian kaunakes, the ancient Greek chiton, the Christian priestly cassock or soutane, the Greek fustanella, the ancient Roman and Medieval European tunic, the Sikh baana or chola, the Samoan lavalava, the Japanese hakama, the Palestinian qumbaz, the Southeast Asian sarong, the Indian dhoti or veshti or lungi, the Scottish kilt, and many others.

Recognizing the distinction between sex and gender can lead to a NEW BIRTH OF FREEDOM—to people being able to explore freely gender-related options formerly closed to them—roles, ways of acting and speaking, choice of adornments and activities and partners and friends, and so on. And recognizing that gender and sex ARE DIFFERENT THINGS can lead people not to make decisions about medical treatments and changes to their bodies that they might later regret. People can have the freedom to explore alternate gender expressions without going to such extremes until they are old enough and certain enough to do so. They can also explore various sexual orientations without regard to sex OR gender, of course, and have a right to do so.


Gayle Greene on How to Build a Human

How do you build a world-class human? Well, you give him or her the benefits of a broad, humane liberal arts education that confers judgment, wisdom, vision, and generosity. In her new book, Immeasurable Outcomes: Teaching Shakespeare in the Age of the Algorithm, Gayle Greene, a renowned Shakespeare scholar and Professor Emerita at Scripps College, shows us exactly how that is done, drawing on three decades of her classes. And she doesn’t do this at some high level of abstraction. Rather, she backs up her profound general observations with concrete, vivid, fascinating, moving, funny, honest, delightful examples from those classes.

She also shows us how, under the “standards”-and-testing occupation of our schools, that development of well-rounded, liberally educated young people is being lost.

This engaging book is a full-throated defense of the Liberal Arts and of traditional, humane, in-person, discussion-based education in a time when Liberal Arts schools and programs are being more than decimated, are being damned-near destroyed by bean counters and champions of ed tech. Here’s the beauty and value of the book: contra the “Reformers,” Greene details the extraordinary benefits of the broad, liberal educations that built in the United States the people who created the most powerful, vibrant, and diverse economy in history. She makes the case (I know. It’s bizarre that one would have to) for not taking a wrecking ball to what has worked. 

Some background: As was much of Europe between 1939 and 1945, education in the United States, at every level, is now under occupation. The occupation is led by Bill Gates and the Gates Foundation and abetted by countless collaborators like those paid by Gates to create the puerile and failed Common Core (which was not core—that is, central, key, or foundational—and was common only in the sense of being vulgar). The bean counting under the occupation via its demonstrably invalid, pseudoscientific testing regime has made of schooling in the U.S. a diminished thing, with debased and devolved test-preppy curricula (teaching materials) and pedagogy (teaching methods).

In the midst of this, Greene engages in some delightful bomb throwing for the Resistance.

OK.  Let’s try another metaphor. If Gates’s test-and-punish movement, ludicrously called “Education Reform,” is a metastasizing cancer on our educational system, and it is, then Professor Greene’s book is a prescription for how to reverse course and then practice prevention to end the stultification of education and keep it from coming back. 

Years ago, I knew a fellow who retired after a lucrative, successful career. But a couple months later, he was back at his old job. I asked him why he had decided not simply to enjoy his retirement. He certainly had the money to do so.

“Well, Bob,” he said, “there’s only so much playing solitaire one can do.”

I found this answer depressing. I wondered if it were the case that over the years, the fellow had given so much time to work that when he no longer had that to occupy him, he was bored to tears. Had he not built up the internal resources he needed to keep himself happy and engaged ON HIS OWN? Greene quotes, in her book, Judith Shapiro, former president of Barnard College, saying, “You want the inside of your head to be an interesting place to spend the rest of your life.” The French novelist Honoré de Balzac put it this way: “The cultured man is never bored.” Humane learning leads to engagement with ideas and with the world, and as Happiness Studies have shown repeatedly, outward-directed engagement, as opposed to self-obsession, leads to fulfillment, to flourishing over a lifetime, to what the ancient Greeks called eudaimonia, or wellness of spirit. Kinda important, that.

In a time when Gates and his minions, including his impressive collection of political and bureaucratic action figures and bobble-head dolls, are arguing that colleges should become worker factories and do away with programs and requirements not directly related to particular jobs, it turns out that the people happiest in their jobs are ones with well-rounded liberal arts educations, and these are the ones who are best at what they do. And it turns out that people taught how to read and think and communicate and be creative and flexible, people who gain a broad base of knowledge of sciences, history, mathematics, arts, literature, and philosophy, are self-directed learners who can figure out what they need to know in a particular situation and acquire that knowledge. Philosophy students turn out to be great lawyers, doctors, politicians, and political operatives. Traditional liberal arts instruction creates intrinsically motivated people—just the sort of people that employers in their right minds want and certainly the sort that most employers need.

All this and more about the value of liberal arts education Professor Greene makes abundantly clear, and she does so in prose that is sometimes witty, sometimes hilarious, sometimes annoyed, sometimes incredulous (as in, “I can’t believe I even have to protest this shit”); always engaging, human and humane, compassionate, wise, authentic/real; and often profound. As much memoir as polemic, the book is a delight to read in addition to being important politically and culturally.

Gates and his ilk, little men with big money to throw around, look at the liberal arts and don’t see any immediate application to, say, writing code in Python or figuring out how many pallets per hour a warehouse can move. What could possibly be the value of reading Gilgamesh and Lear? Well, what one encounters in these is the familiar in the unfamiliar. All real learning is unlearning. You have to step through the wardrobe or fall down the rabbit hole or pass through the portal in the space/time continuum to a place beyond your interpellations, beyond the collective fantasies that go by the name of common sense. Real learning requires a period of estrangement from the familiar. You return to find the ordinary transmuted and wondrous and replete with possibility. You become a flexible, creative thinker. You see the world anew, as on the first day of creation, as though for the first time. Vietnam Veterans would often say, “You wouldn’t know because you weren’t there, man.” Well, people who haven’t had those experiences via liberal arts educations don’t know this because they haven’t been there, man.

Gayle Greene has spent a lifetime, Maria Sabina-like, guiding young people through such experiences. Her classroom trip reports alone are worth your time and the modest price of this book. At one point, Professor Greene riffs on the meaning of the word bounty. This is a book by a bounteous mind/spirit about the bountifulness of her beloved liberal arts. Go ahead. Buy it. Treat yourself.

Copyright 2024, Robert D. Shepherd. All rights reserved. This review may be copied and distributed freely as long as this notice is included.

For more by Bob Shepherd about teaching literature and writing, go here: https://bobshepherdonline.wordpress.com/category/teaching-literature-and-writing/

For short stories by Bob Shepherd, go here: https://bobshepherdonline.wordpress.com/category/short-stories/

For poetry by Bob Shepherd and essays on poetry, go here: https://bobshepherdonline.wordpress.com/category/poetry/.


Memory and the Construction of Self

Copyright 2010 Robert D. Shepherd. All rights reserved. NB: I wrote this back in 2010. Just getting around to posting it. The material I cover still stands, I think, though I might do some slight revision at some point.

Think of a time when you were swimming—at the beach, in a swimming pool, in a river or lake. Take a moment to close your eyes and picture the event. (What follows will be better if you actually do this.)

About half the time, when people recall such a memory, they picture themselves from the point of view of a third-person observer. I think of a time when I went to St. Martin and, immediately after checking into my hotel, went for a swim. I see myself running down the exterior hotel stairway, crossing the sand, plunging into the sea, and taking long, crawling strokes in the turquoise water. This memory is doubtless influenced by a photograph I have from the time, one taken from the hotel’s second-floor balcony by my then-wife.

First- and Third-Person Memory

Being a third-person memory, my recollection is not, of course, precisely what I experienced. I could not, obviously, have been a third-person observer of myself!  In the present, we see the world not from the third-person perspective but in the first person, from the inside looking out. As I sit writing these words, I see the monitor in front of me, a cup of coffee, and if I glance down, the tip of my nose and my fingers at the keyboard. But if I remember this experience later, I’m likely to remember it not from the inside but from the outside: I’m likely to see in my mind’s eye the whole of me, sitting at a computer, writing. First-person memories tend to be phenomenally rich, whereas third-person memories tend to be of more extended duration, to be narrative (which is a clue, perhaps, to their construction).[1] 

Third-person autobiographical memories should give us pause because they are, to some extent, confabulations, or reconstructions, part real and part imaginary. They are not simple, retrieved events, pure sensory experience as taken in at the time (what psychologists call field memories) but, rather, stories that we tell ourselves, just-so stories about how things must have been. The philosopher Eric Schwitzgebel has theorized that our tendency to create third-person memories is related to our heavy consumption of movies and television, that we have learned to play movies in our heads, just as we now report dreaming in color, whereas in the 1950s and earlier, people most commonly reported dreaming in black and white.[2] Whether or not that is so, third-person memory is certainly quite common now. The naïve view of memory is that it is like a video recording in the head: this is what happened to me. I know. I was there. But as third-person memories demonstrate, we humans are quite capable of deceiving ourselves about what we remember.

Suggestibility and “Recovered” Memory: Deference to Authoritative Accounts

Sometimes our self deception can be about entire events. In a famous experiment, psychologist Elizabeth Loftus and her research associate Jacqueline Pickrell gave subjects booklets containing three stories from their pasts. The stories were supposedly gleaned from interviews with the subjects’ relatives. In each case, however, one of the stories, about the subject’s getting lost in the mall at the age of five, was false. In follow-up interviews, Loftus and Pickrell asked their subjects how much detail they could remember from each event. Unsurprisingly, subjects recalled with some detail about 68 percent of the true events. What was surprising was that about 29 percent of the subjects reported recollections of the false event, often providing elaborate detail about it.[3] It’s an experiment that has been repeated numerous times by various researchers with similar results.

There’s little consequence, of course, to having a false memory of being temporarily lost in the mall, but not all false memories are so benign. Years ago, when I was living in Chicago, a lawyer friend told me of a case she had taken on earlier in her career. Her client was an elderly African-American man who worked as a janitor in a largely white preschool in a largely white church of which the African-American man was a deacon. The man stood accused of molesting some of the students in the preschool, and a story about the molestation appeared on the front page of one of the Chicago papers. As the case developed, the children’s stories became more and more bizarre. They told of being burned in ovens and having large objects inserted in their orifices, but there was no physical evidence of this supposed abuse. In the end, psychologists ascertained that the children had confabulated. They had been visited by a social worker who gave them a demonstration about abuse using anatomically correct dolls. The mostly white children, scared anyway by a person who was older and not of the same race and who often appeared mysteriously from around corners, made it all up in their discussions with one another. What started as just-so stories, like the ones that kids tell about abandoned houses and dark closets and other objects of their fears, became magnified and reified, or given actuality. All charges were dropped against the elderly man, but his life was ruined. Abuse of children is unfortunately common, but in this case, no actual abuse had occurred.

Human suggestibility with regard to memory can have devastating consequences. Lawrence Wright’s disturbing and gripping book Remembering Satan[4] tells the story of Paul Ingram, a sheriff’s department deputy and Republican Party county chairman in Washington state who fell victim to false accusations that he had molested his daughters when they were young and had later subjected them to Satanic ritual abuse. The daughters had fallen under the influence of a pair of psychologists who coached them through the process of “recovering” supposedly forgotten memories of abuse, and as a result, Ingram actually came to believe that there must be some truth to what the daughters were saying, was falsely convicted of molestation, and spent years in prison for crimes he did not commit. As in Salem during the witch trials, the daughters’ imagined experiences grew in complexity until they took in a great many townspeople supposedly involved in an abusive Satanic cult. Eventually, other psychologists were called in by the courts, and the whole edifice of the daughters’ fabrications, under the influence of their psychologist Rasputins, fell apart.

Supposedly repressed and recovered memories have played a key role in many such cases in the United States and elsewhere, so many, in fact, that the False Memory Syndrome Foundation was established to assist victims of false memories planted during therapy, though the work of this foundation and the validity of recovered memories remain contentious. A large-scale study by Elke Geraerts and colleagues at Harvard and Maastricht looked at three types of memories of child abuse: ones continuously remembered, ones spontaneously recovered in adulthood, and ones recovered in therapy. Spontaneously recovered memories were corroborated about as often as continuously remembered ones (37 percent of the time and 45 percent of the time, respectively), but memories recovered in therapy were not corroborated at all. The study by Geraerts and her colleagues suggests both that memories of traumatic events are extremely faulty and that people are extremely susceptible to manipulation of their memories.[5] Though recovered memories are questionable, there is no question that child abuse itself is a common problem, and the difficulties that people have with their memories work both ways. It can simultaneously be the case that recovered memories are suspect AND that memories of real abuse are often buried or whitewashed.

Memories of Misinterpreted Experiences

A number of years ago, I was living in Massachusetts and was single and dating. Having met in my dating life a few young women who were dealing with significant psychological issues, including bulimia and depression, I thought I might benefit from a class in the psychology of women offered by the Harvard Extension Program. So, I took the class, taught by a renowned feminist psychologist, and there I met a young woman who was convinced that she had been abducted by aliens. As it happened, around the same time, John Edward Mack of Harvard Medical School had studied sixty people who claimed to have experienced alien abduction. Dr. Mack spoke, once, with the woman I met, but she was never one of his patients or major research subjects. Interestingly, Dr. Mack reported that “The majority of abductees do not appear to be deluded, confabulating, lying, self-dramatizing, or suffering from a clear mental illness.”[6] The woman I met fit Dr. Mack’s description. She was bright, thoughtful, normal in every way, but she seriously believed, was in fact certain, that she had been abducted numerous times. Her stories of these abductions followed the classic plot line: She would awaken to find herself paralyzed, with creatures standing around her bed. She didn’t use the term, but her description fit that of the Grays, as UFO buffs call them, small aliens with big heads, large eyes, childlike bodies, and four long, large, ET-like fingers on each hand. The Grays would mill about the bed a bit. Then, the woman would feel herself lifted up in a beam of light and mist. The beam of light would carry her aboard an alien spaceship, where the aliens would perform various experiments on her. All the while, she would be immobile but perfectly conscious and completely, abjectly terrified. Eventually, the Grays would render her unconscious, and she would awaken in her own bed.
I shall never forget what this woman told me about these experiences: “Don’t tell me I imagined these things. I know they happened. I was there, just as I am here with you right now.”

Various explanations have been offered for the alien abduction experience. One is that the pineal gland produces small amounts of the psychotropic compound dimethyltryptamine, or DMT, which is known to cause self-appointed psychonauts to experience alien presences. The late Terence McKenna, an enthusiastic advocate of the use of hallucinogens, wrote and spoke often of the alien “machine elves” whom he met and spoke with while under the influence of DMT. The most widely accepted explanation of the alien abduction phenomenon, however, is that during REM sleep, our brains protect us from acting out our dreams, and so possibly hurting ourselves, by inhibiting, post-synaptically, the operation of motor neurons, thus preventing the stimulation of our muscles. By this account, abductees awaken to a hypnopompic, or dreamlike, state and find themselves paralyzed. In their susceptible, liminal condition, somewhere between waking and sleeping, their brains confabulate, making up a story to explain why they are in this pickle, and that story, that waking dream or nightmare, is what they remember. Because the “abductees” live in a time in which big-eyed, big-headed Gray aliens are as close as the local video store, their waking nightmares sometimes take on a form that they have borrowed from the popular culture. In medieval times, such dreams took the form of succubi or witches. In fact, the story of a witch waking someone and riding him through the skies is what gives us our very word nightmare. (Of course, another possible explanation of alien abduction is that people are sometimes abducted by aliens, but that’s not a terribly parsimonious explanation, is it?)[7]

The Internal Story-Teller

Dreams can be quite bizarre. For a time, I kept a dream journal. In one of the more unusual dreams in that journal, I was in an airplane, a small prop plane that was flying into the island of Cuba, but in the dream, the island was a large, white-frosted sheet cake floating in a cliché of an emerald sea. Later, it was easy enough for me to piece together the sources of this dream. I had just returned from a trip, one leg of which was in a small prop plane. The day of the dream, there had been a news story about the illness of Fidel Castro. I had recently been to a wedding where there was a large cake (though not of the sheet variety—that must have been an adaptation of the cake idea to the topography of an island).

A widely held theory of dreaming is that it occurs as the mind sorts out and catalogues recent events.[8] Recently used neural pathways fire, and our pattern-making brains attempt to make sense of these random firings, putting them together into a coherent narrative. If the pathways that are firing are wildly divergent, we get these surreal dreams—islands that are wedding cakes. In another dream, I was again on an airplane and a large and, of course, red orangutan sitting next to me offered me a cigar. Come to think of it, that’s not so bizarre. I often find myself on airplanes sitting next to someone who is distinctly simian.

Dreams, alien abduction narratives, and confabulations great and small are revealing because they remind us of something very important about how people work: We are storytelling creatures. It’s not just when we are sleeping or in hypnagogic states that our brains are busy making sense of the world by telling us stories. It’s all the time. And when we get new information—we see a picture of our former selves or a relative tells about our getting lost in the mall at the age of five—our brains work to integrate that information into the narrative of our lives that we carry around with us. We take in sensory experiences and other information, and then we dream weave it into a narrative. That narrative, as much as the actual sensory experiences themselves, becomes memory, and our memories are, to a large extent, who we are. I believe myself to be a particular person with a particular history. I am the boy of five padding in his Dr. Denton’s across the floor of his grandparents’ upstairs bedroom at night to get a glimpse of a Ferris wheel, far across the darkened cornfields, turning red and green and golden in a dreamlike distance. I am the sixteen-year-old in the car at the drive-in movie trying to get up the courage (I never did) to kiss the amazing girl who I never in my wildest dreams thought would go out with me. I am the hopeful applicant for his first editorial job, sitting across the desk from the renowned Editor-in-Chief staring at me over his broken reading glasses, which he has cobbled together with a bit of Scotch tape. A self, an identity, is the summation of a great many such stories.

Suggestibility and False Memory 2: Deference to Social Sanction

But how true are the stories? Memory is notoriously faulty. Consider the following experience from another psychology class, one that I took in my freshman year in college: I was sitting in a large auditorium with some two hundred or so other students, listening to the professor, when costumed people burst in through the back door of the lecture hall, shouting and making a disturbance. They ran down one aisle (as I remember it), yelled a few things, leapt onto the stage, scattered the professor’s notes into the air, and then disappeared off the stage and through a side doorway. The event, of course, had been staged. The professor had us all write down what had just happened, and then we compared notes. My fellow students in the auditorium didn’t agree on much of anything at first—on how many people there were, on what they were wearing, on what they said, on what they did. Among other things, this event was a dramatic demonstration of the inaccuracy and inconsistency of eyewitness accounts. We humans have difficulty with the accurate recollection of experience. We’re not very good at it. Furthermore, we all have a tendency to confabulate, especially in social settings, where we frequently fall into groupthink and start believing that we remember what other people confirm (we shall return to this subject later in this book). In the disruption demonstration/experiment, as the discussion continued, students began separating into groups—the ones who were certain that there were three intruders and those who were equally sure that there were four, for example. For many people, their certainty about what had happened increased over time, as they rehearsed it, and during this process, there was a lot of “Hey, yeah, I remember that too” going on.

Filling in the Gaps

Perhaps you think that you are not a confabulator, not someone who adds details to fill out the story and certainly not someone who will remember something differently because of someone else’s suggestion. Lest you fall into that trap, let me remind you that confabulation is a central part of sensory experiences themselves. Notoriously, we all have the feeling that we see the entire visual field before our eyes, but in fact, we all have blind spots in our visual fields caused by the fact that our retinas are interrupted in an area called the optic disc, where ganglion cell axons and blood vessels pass through our retinas to form our optic nerves. We view the world as continuous because our brains confabulate, filling in the missing details, telling us just-so stories.

But it’s worse than that. It’s not just that our perceptual systems regularly and systematically fool us. Memory is slippery. It’s susceptible to error because of drowsiness, illness, inebriation, inattention, stress or other strong emotion, and weakening or disappearance over time. A couple of other problems with memory are particularly interesting. First, what goes into memory is severely limited. For a long-term memory to be formed, it first has to go through the narrow funnel of working memory. In a famous paper called “The Magical Number Seven, Plus or Minus Two,” the psychologist George Miller pointed out that only seven or so distinct items can be held in working memory at any given moment. That’s why, for example, telephone numbers are seven digits long. We can increase this “working space” in memory by chunking, by putting items together into groups. So, it’s much easier to remember the string

S E T R K T A I M A R F A N

if we rearrange the letters and break them up into IM A STAR TREK FAN. But the point remains that of the innumerable things happening at any given moment, only a precious few gain admittance to working memory and thus have any hope at all of being transferred into long-term memory.[9] The rest we assume, or fabricate, to put it less euphemistically, in later recall. Well, I was in my living room, so I must have been seeing this, that, or the other, our brains might as well be saying, though, of course, the brain does this unconsciously. In short, we actually attend to very few items at any given moment, but our brains are so constructed as to integrate what we were actually attending to with what we know, or think we know, about the world to prepare a long-term memory that is whole and consistent and present THAT confabulated memory to consciousness. If the long-term memory is of the third-person type, that confabulation is obvious, but we also confabulate first-person, field memories. It’s how we are made.

Inattentional Blindness

An important but often unremarked consequence of the limitations on working memory is inattentional blindness. As we have just seen, we can, at any time, attend only to a few things. So, the rest we are blind to. In another famous experiment, Daniel Simons of the University of Illinois and Christopher Chabris of Harvard showed subjects films of people passing a basketball around and asked them to count the number of passes. In the course of the films, a woman walked into the scene, sometimes carrying an umbrella and sometimes wearing a gorilla suit. Dutifully attending to their task, most subjects didn’t see these oddities—the umbrella or gorilla in the midst of the basketball game!  Memories typically have a wholeness about them, but most of that wholeness is imagined. When our brains do their work, telling us our stories, they make use of the material that we actually stored, and they fill in the rest. And sometimes they miss really interesting or important stuff, like the 800-pound gorilla in the room!

History as Confabulation: Narrativizing as Interpretation, or “Making Sense”

But it’s even worse than that.  Not only does memory fail us, and not only do our brains commonly and automatically fill in the gaps to make up for those failures, but we also, because of our story-telling natures, impose upon what we remember, or think we remember, narrative frames that serve to interpret and thus make sense of the events of our lives. Over forty years ago, the historiographer Hayden White wrote an influential essay, “The Historical Text as Literary Artifact,” in which he argued that when we discuss an historical event, we inevitably select some aspects of that event and not others, for time and scholarship are both limited, and every event might as well be infinitely complex. That much of White’s thesis is uncontroversial. What is controversial, and of enormous consequence, is White’s contention that we claim to have understood an historical event only after we have imposed upon it a narrative frame—an archetypal story, typically with a protagonist and antagonist, heroes and villains, a central conflict, an inciting incident, rising action, a climax or turning point, a resolution, and a denouement. The narrative frame exists not in the events themselves, but in our minds, as part of our collective cultural inheritance. Joseph Campbell famously proposed in The Hero with a Thousand Faces that a great many stories from folklore and mythology have a common form: a young and inexperienced person, not yet aware that he is a chosen one, sets out on a journey. He encounters a being who gives him a gift that will prove extremely important. He undergoes a trial or series of trials, succeeds as a result of his character and the gift, and emerges with some boon that he is able to share with others on his return. Joseph Campbell’s monomyth is one example of the kind of archetypal, interpretive narrative frame that gets imposed on events.

Returning to Hayden White’s thesis, to one person, the founding of Israel  is the story of an astonishing people, dispossessed, scattered to the winds (the setting out on the journey), subject to pogroms and persecutions (trials), who astonishingly, and against incredible odds, maintain their cultural identity, keep the flame of their nationhood alive by teaching every male child to read (the gift), suffer a horrific holocaust (more trials) and then, vowing never to allow such a thing to happen again, reclaim their historical birthright and carve out a nation in the midst of enemies, even going to the extent, unparalleled in human history, of reviving a scholarly, “dead” language, Hebrew, and making it once again the living tongue of everyday social interaction (the boon). I, myself, find this story, thus told, quite compelling and moving. In a very different version of these events, an international movement (radical anti-Semites would say “conspiracy”) leads to an influx of Jews into Palestine after the Second World War, and these Jews, taking advantage of a vote in the newly established United Nations, declare themselves a state and forcibly expel over 700,000 native Arabs from their homes. Both stories are true. The telling depends, critically, upon which events one chooses to emphasize. Overemphasis on one set of facts confirms some people in an obstinate unwillingness to make concessions necessary to secure a lasting peace. Overemphasis on the other set of facts leads other people to horrific acts of terrorism.

Consider, to take another example, this quotation from a white, American man of the nineteenth century:

“I will say then that I am not, nor ever have been in favor of bringing about in anyway the social and political equality of the white and black races—that I am not nor ever have been in favor of making voters or jurors of negroes, nor of qualifying them to hold office, nor to intermarry with white people; and I will say in addition to this that there is a physical difference between the white and black races which I believe will forever forbid the two races living together on terms of social and political equality.”[10]

Now consider this quotation, also from a nineteenth-century white, American male:

“On the question of liberty, as a principle, we are not what we have been. When we were the political slaves of King George, and wanted to be free, we called the maxim that ‘all men are created equal’ a self evident truth; but now when we have grown fat, and have lost all dread of being slaves ourselves, we have become so greedy to be masters that we call the same maxim ‘a self evident lie.’”[11]

The first of these statements, from a contemporary perspective, seems outlandish and shocking, the second reasonable and evident. You may have guessed, however, or may know, if you are a student of history or have glanced at the endnotes, that these two men are the same person: Abraham Lincoln, whom we remember as the “great emancipator.” Our view of Lincoln’s view depends, critically, on which of his statements and actions we attend to and from what parts of his life, and our view also depends on what narrative we tell based upon our selections and how nuanced that narrative is. Lincoln was, indeed, an early opponent of slavery and considered it an evil from an early age, but his views on the subject were far from monolithic, and they evolved over time. In short, they were complicated. And that’s always true of history. Whenever we look closely at some past event, we find that it is a lot more complex than are the simplistic accounts (one might call them myths) typically presented in high-school “social studies” textbooks.[12]

The Punch and Judy Show: Making Sense in Relationships

What Hayden White says of history is true of our personal histories too. What makes the stories that we tell ourselves about our lives into stories and not just collections of facts is that we selectively recall facts and we impose narrative frames upon them. The central character is a given. Each of us is the protagonist in his or her own tragicomedy. But we also identify antagonists and conflicts and moments of crisis and resolution. We create causal maps to explain why things happened as they did, often involving imputed motivations. So, in the stories that we tell ourselves from our own lives, we do not simply recall events; we interpret them. Two people, let’s call them Punch and Judy, are in a relationship. They both recall an evening when they went to the theatre. Punch has decided that Judy is stubborn, that she must have her way about everything. So, when his brain reconstructs a memory, it automatically constructs it from bits and pieces, using that guiding principle. He conflates several actual times, over thirty years, in which Judy acted in a stubborn way and puts them all into the memory of that one evening. She refused to go to the show Punch had bought tickets for until her friend talked about how great it was. She insisted on changing seats to sit on the outside. She refused to let him out until intermission, even though he needed to go to the bathroom. She insisted afterward that the leading lady was wearing a yellow dress at the beginning of Act II instead of a green one. One of these actually happened on that evening. One never happened. Two happened, but at different events over the years. Judy, on the other hand, has decided that Punch often makes a spectacle of himself in public—that he has no decorum or tact. So, she has her own list of “things that happened on that evening at the theatre”—he interrupted the show and caused a scene by getting up to go to the bathroom ten minutes into Act I.
He insisted on wearing, that evening, that ridiculous-looking jacket with the tux-like lapels. He told the waiter at dinner afterward how terrible the leading lady was, and that waiter was the leading lady’s good friend. And so on. But again, some of the things she remembers from that evening happened at other times or didn’t happen at all. They are confabulations that fit a general view that she has come to.

And this is common when relationships are in the process of failing. The day comes when one in the couple decides that the other is X—whatever X is—and everything that happens after that is confirmation. The “evidence” grows that the situation is intolerable, and the person decides that the relationship is over, even though much of this “evidence” is confabulation.

We all are the central characters in our own stories, and we have a tendency to tell those stories to ourselves in a self-serving way, to remember our moments of glory and to forget or downplay those times that weren’t our most shining hour. And sometimes the stories that we tell impute personality traits or motivations to others that are absent or barely there. Often, those imputations fuel resentment that festers and makes us cynical or mean-spirited when we would really be much better off to let it go, to move on, or, if we can’t, to consider (at least) the possibility that our interpretations are interpretations, not verbatim transcripts of reality.

Writing Our Stories v. Having Them Write Us

The stories from our lives are not created equal. Rather, we all tend to run a few critical, defining stories in our heads. Sometimes, these stories have only a tenuous connection to third-person, objectively verifiable reality, and sometimes, they can be terribly, terribly damaging, as when people tell themselves, over and over, a story of their victimization and in so doing become perpetual victims. A number of clinical psychologists have recognized this and have created something called cognitive narrative therapy. The idea is to assist people in altering the stories that they tell themselves in crucial, life-enhancing ways. So, the victim of childhood molestation learns to think: No, I was not responsible for the liberties that my relative took with me when I was a child, and no, I was not at fault when he was found out. I was a child, and he was a pedophile, a person with a deep and terrible sickness. It was not my fault, and the story that I’ve been telling myself about that is deeply flawed.

As a senior in college, having finished the requirements for a degree in English, I experienced a crisis of faith. I had noticed in my reading of books, essays, and journal articles in my field that literary critics and theorists typically devoted about a third of their energies to their topics, a third to displaying their erudition, and a third to protecting their intellectual turf. Did I really want to go to graduate school and become an English professor and spend my life writing journal articles with titles like “Tiresias among the Daffodils: The Hermeneutics of Sexual Identity in Jacobean Pastorale”? Such articles were typically read by ten other scholars whose main motivation for doing so was to gather ammunition to refute what was said in the infinitely more brilliant articles that they were going to write. This didn’t seem a worthy use of a life. Literary critics, take note: Many of the tools in your workshop, developed for the purpose of literary analysis, are extremely valuable for making sense of our life stories and for subjecting those stories to criticism. So, if you are looking for a way to make what you do even more relevant, that’s an idea. Many literary types already know this, of course.

We’ve seen that we are (To how large an extent? Try this for homework.) the stories that we tell ourselves about our lives. Some of those stories are even partially true! We’ve seen that inevitably our stories are based upon fragmentary evidence and are at least partially confabulated as a result of our storytelling gifts, our ability to “fill in the gaps.” We’ve also seen that sometimes we can benefit enormously from critical analysis of our own collection of life stories, and particularly of those stories that we replay a lot. If we are our stories, then we are, all of us, at least partially fabrications. That’s an unsettling idea, but it’s also liberating, for we can learn to take our own life stories with a grain of salt and so gain nuance in our understandings of ourselves and of others. And, instead of engaging in another Punch and Judy show with a partner or friend when we have differing memories of some event, perhaps we can have some understanding of how these differences arise and less certainty about the superiority of our own narratives. I’ve begun this work on uncertainty with an examination of what we know of ourselves because surely, of all that we know, we know ourselves best. But even there, as we have seen, there is reason for skepticism, for significant uncertainty, and that skepticism, that uncertainty, can be extremely healthy.


[1] Georgia Nigro and Ulric Neisser, “Point of View in Personal Memories.” Cognitive Psychology 15 (1983), 467-82.

[2] “Remembering from the Third-Person Perspective?” The Splintered Mind: Reflections in Philosophy of Psychology, Broadly Construed. Blog entry. June 6, 2007. http://schwitzsplinters.blogspot.com/2007/06/remembering-from-third-person.html

[3] Add footnote to Loftus.

[4] Lawrence Wright, Remembering Satan: A Tragic Case of Recovered Memory. New York: Vintage, 1995.

[5] Elke Geraerts et al., “The Reality of Recovered Memories: Corroborating Continuous and Discontinuous Memories of Childhood Sexual Abuse.” Psychological Science 18, no. 7 (July 2007), 564-68.

[6] Harvard University Gazette, July 24, 1992.

[7] Parsimony as a criterion for judging potential explanations is generally attributed to the medieval scholar William of Ockham, to whom is often credited the statement Entia non sunt multiplicanda praeter necessitatem, or “Entities should not be multiplied unnecessarily.” As is often the case with famous quotations and their attributions, this one does not come from Ockham, though it is likely that Ockham would have approved of it. The principle of parsimony, often referred to as Ockham’s razor, is that one should look for the simplest explanation that fits the facts. There’s no reason, of course, why explanations have to be simple. Events, for example, often have multiple causes. But there is good reason for not making explanations too complicated, for one could make up an infinite number of complicated but false explanations to fit any set of facts. Similarly, Einstein is often credited with having said, “Make things as simple as possible, but not simpler,” which I have not been able to verify, though he did say, in a lecture given in 1933, that “It can scarcely be denied that the supreme goal of all theory is to make the irreducible basic elements as simple and as few as possible without having to surrender the adequate representation of a single datum of experience.” (“On the Method of Theoretical Physics.” Herbert Spencer Lecture, Oxford, June 1933, in Philosophy of Science, vol. 1, no. 2 (April 1934), pp. 163-69.)

[8] See, for example, Gabrielle Girardeau et al., “Selective suppression of hippocampal ripples impairs spatial memory.” Nature Neuroscience, 2009; http://www.nature.com/neuro/journal/vaop/ncurrent/abs/nn.2384.html

[9] All this is made much more complicated by the fact that we are continually taking in information on some level and processing but not attending to it. By working memory, here, I am referring to the new information that we are capable of consciously attending to.

[10] Abraham Lincoln, Debate with Stephen A. Douglas at Charleston, Illinois, 1858

[11] Abraham Lincoln, Letter to George Robertson, 1855

[12] See, for example, James W. Loewen, Lies My Teacher Told Me: Everything Your American History Textbook Got Wrong. New York: The New Press, 2005. The title is, of course, an exaggeration. Oops. And Loewen is himself perfectly capable of getting some things wrong. The book remains, however, an interesting, amusing, occasionally enlightening, and sometimes disturbing read. I myself got a lesson, years ago, in how difficult the work of an historian is when I created a series of books called Doing History. The idea behind the books was to coach kids through examining primary source materials—maps, letters, ships’ logs, oral histories, that sort of thing. My colleagues and I decided that we wanted to be very serious about getting our facts right. We didn’t want to produce books like the American history book that said that Sputnik was a nuclear device or the popular biology text that said that blood returning to the heart was blue! (One could go on and on multiplying these examples.) We soon found, though, when we set to work verifying our facts, that as often as not, the facts we assumed to be true were disputed or questionable or flat-out wrong, and at some point, often, we just had to give up and use other material!
