December 24, 2009

Fate

Dear Friends:

Merry Xmas to you all. Here's something to think about in the New Year.


One film I keep returning to–I must have seen it at least five or six times now–is Damage, by Louis Malle. The story is a kind of Greek tragedy. Jeremy Irons plays a successful British civil servant whose inner life is empty; Juliette Binoche is his son’s fiancée, with whom he gets involved immediately after they meet. By chance, his son comes to the flat where they are having a tryst, and catches them in bed. Thunderstruck, the young man backs out of the room and falls backward over the bannister, plunging several stories down the center of the apartment building to his death.

His life thus destroyed, the man retires from the world. He takes up residence in a small town in an unidentified country, perhaps Greece or Italy. Life consists of shopping, cooking, and washing up, along with spending entire days sitting in front of a blown-up photograph of the fateful triangle–himself, his son, and the girl–which is mounted on the wall. He has, in effect, constructed some sort of shrine; but to what?

The man ponders what happened to him–events that were (or seem to have been) completely out of his control. He never knew who the woman really was, and yet the attachment went way beyond sex. As a high-level bureaucrat, he experienced his life as totally meaningless. He gave TV interviews and public speeches that were perfectly turned out–he said all the right things–but in reality, he was a shell. The girl, for some unknown reason, promised to fill that void (or so he believed, on an unconscious level), and so the chemistry was instantaneous, ferocious. Now, in the aftermath of it all, the man spends his time staring at the photograph, trying to decipher what it all meant.

What makes us, he says to himself, is beyond knowing. We surrender to love because it gives us some sense of what is unknowable. Nothing else matters, not in the end. I saw her only one more time, at an airport while changing planes. She didn’t see me. She was with a man, and carrying a child in her arms. She was no different from anyone else.*

She was no different from anyone else. This realization–perhaps only momentary–means that the “shrine” was not dedicated to the woman, nor even to the love that they shared, but to love itself. But perhaps much more than that. The purpose of the shrine, the need for it, is to worship that thing that is beyond knowing, the only thing that matters in the end. So what is it?

If human life is finally a mystery, the key in the lock is not that hard to figure out: it’s the sense of a Presence larger than oneself, and beyond the grasp of the rational intellect. For hunter-gatherers, this was a presence with a small “p”: their reality was immanent, was the environment itself. (The “great spirit” of the Plains Indians was typically the wind.) With agricultural civilization and the rise of religion, the Presence became transcendent, exalted to a “vertical” reality: God. Yet this presence, or Presence–this irreducible otherness–is finally within us. In Damage, the central character projects this “divinity” onto a perfectly ordinary person, which he comes to understand only years later. Yet the photo remains on the wall, and the daily “worship” remains the central activity of his life. Love gives us some sense of the unknowable, and the unknowable–even though it arises as an interaction between the self and the outer world–is unfathomable, as is the interaction. Hence, the enormous fascination, born out of the conviction people have that the experience embodies some great truth; which it does. Yet no amount of analysis or contemplation can resolve it; it just is.

Damage can be framed in many ways. I have already referred to Greek tragedy, but we can see it through the lens of Christian allegory as well. We have a man–say, Saul of Tarsus–going through the motions of a meaningless, ritualistic life. Suddenly, he is blinded on the road to Damascus, and that vision, or apparition, redefines his reality. In so doing, it tears up the old life, lifts him on transcendent wings, and finally deposits him in a chair in front of a photograph, bathed in the light of Christ. It’s not likely that the central character of Damage, even beyond the death of his son, would ultimately have it any other way. After all, he went from No Meaning to Total Meaning–not exactly a trivial adventure. As he says to himself at the end, as he walks the streets of his new home, “I found a life that was my own.” Who of us doesn’t want that?

Of course, much of a political nature could be added at this point. Mesmerization by the cross, no less than by the swastika or the hammer and sickle (to name but a few), has caused many millions of deaths. This is just a matter of historical fact, and I don’t mean to treat it lightly. But it seems to me that it has a larger context, an anthropological one: namely, that the need to feel a part of something greater–even if immanent, as in the case of hunter-gatherers–has been with us for at least 100,000 years, ever since the first light of self-conscious awareness glimmered in the brain of Cro-Magnon Man. The birth of the self, in a word, may have been coincident with the desire to immerse the self, and thereby to endow it with meaning. This is the very essence of Homo sapiens sapiens, and it is not likely to leave us any time soon.

Viewed from this perspective, the “damage” of human relations, if not quite forgivable, is at least explicable. How much of our lives is driven by this archaic impulse? A good bit of it, I would guess; maybe even most of it. It is tempting to say, of course, that the solution is then fairly obvious: we have to get a “handle” on this impulse, to channel or control it in some way. This is the path (ideally, at least) of organized religion or psychoanalysis, and it is not completely in error. But it does tend to omit the central point, that of the need to experience the phenomenon–to be “taken” by life, to let life “come and get you” and make of you what it will, so that you might get washed up on the shore of some small town, staring at a picture on the wall–if that should happen to be your fate.

How many of us are willing to take that chance?


©Morris Berman, 2009



*This text does not occur in the novel by Josephine Hart, on which the film is based, but in the screenplay of the movie, by David Hare.

December 09, 2009

Be Here Now

I often wonder how many people in their sixties or seventies have had this experience: you awake one morning and realize that forty years went by in the blink of an eye. You can barely remember them; it all seems like one big blur. What stands out is the suddenness of this passage of time. One day you were a young adult, and the next day (seemingly) you are a Senior Citizen. What in the world? you ask yourself. Where was I all that time?

As Proust told us, the past can only be recaptured kinesthetically, as a somatic memory that is largely fragmentary in nature. Much of this is intertwined with desire. But since desire pursues a “moving target,” so to speak, our personal history seems like an illusion, or even a hallucination. It is for this reason that the fantasy of wanting to return to an earlier point in our lives and “do it over again,” but with our present adult consciousness intact, is a common one–the subject of a number of films, in fact.

All this has led me to think of how I might avoid repeating this feeling that I wasn’t present for most of my life. Proust’s solution to this conundrum was what he called “intermittence”–submission to the “great turning wheel of experience.” It has an obvious affinity with the Buddhist concept of awareness. But it poses certain problems. First, how can I get myself to be aware of my experience as I am having it? It seems more likely that I shall forget to remember. Second, even if I were to become successful at doing this, there is no guarantee that on my deathbed, I won’t experience the past twenty or thirty years, once again, as a blur that went by like a speeding train. Either way, life will have passed me by.

Recently, motivated by some odd form of nostalgia, I searched the Internet for my first love, the girl I dated during my sophomore year in college. Much to my surprise, I found her: she was a successful architect, living in a small town in Virginia. Ironically enough, I had worked in that very town for several years; she and I probably passed each other on the street, or sat in the same cafés together without realizing it. The picture that came up on the screen, of a woman in her early sixties, confronted me with a parallel universe, as it were: had things “worked out,” this person could have been my life. Not that I had any regrets, or sadly missed that alternative possibility; but that she had floated out of my life, only to reappear as a virtual image forty-five years later, was a weird sensation. I thought of writing her, but finally decided against it. What did we really have to say to one another, after all this time? That relationship was someone else’s life.

Truth be told, I have found my “actual” life to be not very different from my dream life. I recall one dream I had when I was thirty...I was taking a bus to a different city, ostensibly to begin a new life, and, having gotten on the bus, discovered that my luggage was sitting out on the curb. I told the driver I needed to get it before we took off, but he told me that as this was a Sunday, the rule (for some strange reason) was that I had to travel without it or not at all. While I was trying to decide what to do, I looked just behind the area where the bus driver was sitting, and noticed a large circular badge or button, made out of metal, on which the words were written, in Spanish, “Doctor of Bone Medicine Aboard.” (This was particularly odd, because at that point in my life, I knew practically no Spanish.) I told the bus driver I would stay on the bus, and leave my baggage behind; whereupon he started the engine, and the bus left the station.

When I awoke, I had the feeling that the message in Spanish was related to a then-popular tune by Paul Simon, which goes like this:

Paraphernalia
Never hides your broken bones
And I don't know why
You want to try
It's plain to see you're on your own.
Oo-ee, spare your heart
Everything put together
Sooner or later falls apart.

One thing that I felt the dream was telling me was that in order to undertake a journey of freedom, of unfolding consciousness, I had to leave my emotional “baggage” behind. The second message seemed equally clear: nobody could heal me but myself; I was the “Doctor of Bone Medicine” accompanying myself on this journey. I would mend my broken bones, and I would not use “paraphernalia” (gimmicks or substitute satisfactions) to do it. And finally, the notion that life was fleeting, impermanent. We want things to last, but they don’t.

Some years later, I ran across a poem by Juan Ramón Jiménez that seemed to echo this existential reality:

I am not I.
I am this one
who walks beside me, without my seeing him
whom I sometimes see
and whom at other times I forget
who is quiet, serene, when I talk
who forgives me, gently, when I hate
who walks where I am not
who will remain standing, when I die.

Perhaps this is the “intermittence” that Proust was referring to; I’m not sure.

In any case, I find myself thinking about death a lot these days, and wondering what that will be like. Since I don’t believe in an afterlife, I imagine it as a letting go into nothingness–not a pleasant prospect. The Zen idea of being fully present in every experience doesn’t have much attraction for me, in this case, and I always admired the total honesty and simplicity of Zen master Shunryu Suzuki’s last words: “I don’t want to die.” When all is said and done, none of us escapes the human condition.

Last words, of course, say a lot about the person who utters them. “Tell them I’ve had a wonderful life,” said Wittgenstein, as he slipped out of consciousness and into the Great Void. I can’t imagine I’ll enjoy the same peace of mind. My guess is it will be more like, “You mean, this is it?” All those fragments–the girl I dated in college and found decades later on the Net, the dream about the “Bone Doctor,” the time I was seven years old and sat on the beach at Lake Ontario, playing in the sand with a pail and shovel–what, finally, did it all mean? Quite obviously, there is no forcing things to make sense: either they do or they don’t, and there is no guarantee that they will.

I recall, in 1973, visiting Prague (I was living in London at the time), and being politely accosted, in English, by an elderly Czech gentleman sitting on a bench on a street just off Wenceslas Square. He was wearing a suit, overcoat, and hat, and explained to me that he often hung out in the touristy sections of the city so that he might get a chance to practice his foreign language skills–French, Russian, German, English. His name–I remember it to this day–was Jan Horna, and there was something very dignified about him, very self-contained. We chatted for about half an hour, after which I asked him if I might take his photograph. He agreed, on condition that I write down his address and send him a copy of the photo; which I subsequently did. The picture captured him exactly, sitting on the bench with a look that was both wistful and questioning. I assume he is long gone by now, buried in some cemetery in Prague. I think about him from time to time, and wonder how the rest of his life turned out.

©Morris Berman, 2009

December 04, 2009

Audio of My Berkeley Poetry Reading, 16 Nov 09

Dear Friends:

Here 'tis; enjoy!

http://moesbooks.com/shop/moes/091116.html

ps: a much clearer and more audible version was subsequently engineered by a friend of mine, and can be accessed at http://www.juliollosa.com/#topic=mb&sub=03

November 29, 2009

The Lacuna

So let me recommend another book, this time a novel, by Barbara Kingsolver: The Lacuna. This book is an unbelievable tour de force, in any number of ways; and it unexpectedly turns very political at the end, darkly hinting (it seemed to me) at right-wing forces that are likely to engulf us in the course of the next decade (I hope I'm wrong). It also indicates that when such things happen, an empire is on its last legs.

However, it wasn't the politics that attracted me here; it was the prose. What a writer! Comparing her power as a novelist with my own feeble efforts in this direction, I could see what a long row I'd have to hoe, if I continue writing fiction, to come even close to this level of achievement.

At least half the novel is set in Mexico, and her genius for Mexican slang, for the nuance of the Mexican way of life, is equally impressive. The following paragraphs capture some of the things I adore about this country--things I felt from the first time I visited in 1979, but which I could never seem to put into words. In the hands of a master like Kingsolver, the ineffable is 'made flesh', as it were:

"In the afternoon when the sun lights the stucco buildings across the street, it's possible to count a dozen different colors of paint, all fading together on the highest parts of the wall: yellow, ochre, brick, blood, cobalt, turquoise. The national color of Mexico. And the scent of Mexico is a similar blend: jasmine, dog piss, cilantro, lime. Mexico admits you through an arched stone orifice into the tree-filled courtyard of its heart, where a dog pisses against a wall and a waiter hustles through a curtain of jasmine to bring a bowl of tortilla soup, steaming with cilantro and lime. Cats stalk lizards among the clay pots around the fountain, doves settle into the flowering vines and coo their prayers, thankful for the existence of lizards. The potted plants silently exhale, outgrowing their clay pots. Like Mexico's children they stand pinched and patient in last year's too-small shoes.

"Here life is strong-scented, overpowering. Even the words. Just ordering breakfast requires some word like toronja, triplet of muscular syllables full of lust and tears, a squirt in the eye. Nothing like the effete 'grapefruit,' which does not even mean what it says."

Arriba!

November 28, 2009

Feeding the Beast

The following quotations are from ch. 3 of Chris Hedges’ most recent book, Empire of Illusion.

“The elite universities disdain honest intellectual inquiry, which is by its nature distrustful of authority, fiercely independent, and often subversive....The established corporate hierarchies these institutions service...come with clear parameters, such as the primacy of an unfettered free market....Those who critique the system itself–people such as Noam Chomsky, Howard Zinn, Dennis Kucinich, or Ralph Nader–are marginalized and shut out of the mainstream debate. These elite universities have banished self-criticism. They refuse to question a self-justifying system. Organization, technology, self-advancement, and information systems are the only things that matter.”

“The bankruptcy of our economic and political systems can be traced directly to the assault against the humanities...A culture that does not grasp the vital interplay between morality and power, which mistakes management techniques for wisdom, which fails to understand that the measure of a civilization is its compassion, not its speed or ability to consume, condemns itself to death. Morality is the product of a civilization, but the elites know little of these traditions. They are products of a moral void.”

“There has been a concerted assault on all forms of learning that are not brutally utilitarian.... Only 8 percent of college graduates...now receive degrees in the humanities....Business majors since 1970-1971 have risen from 13.6 percent of the graduating population to 21.7 percent....Any form of learning not strictly vocational has at best been marginalized and in many schools abolished....[The] defense of knowledge for its own sake, as a way to ask the broad moral and social questions, has been shredded and destroyed. Most universities have become high-priced occupational training centers.”

“And as small, liberal arts schools have folded–at least 200 since 1990–they have been replaced with corporate, for-profit universities....The myopic and narrow vision of life as an accumulation of money and power...has become education’s dominant ideology....The flight from the humanities has become a flight from conscience.”

“Our elites...do not have the capacity to fix our financial mess. Indeed, they will make it worse. They have no concept, thanks to the educations they have received, of how to replace a failed system with a new one....Their entire focus is numbers, profits, and personal advancement. They lack a moral and intellectual core. They are as able to deny gravely ill people medical coverage to increase company profits as they are to use taxpayer dollars to peddle costly weapons systems to blood-soaked dictatorships.”

“People like Lawrence Summers, Henry Paulson, Robert Rubin, Ben Bernanke, Timothy Geithner, AIG’s Edward Liddy, and Goldman Sachs CEO Lloyd Blankfein, along with most of our ruling class, have used corporate money and power to determine the narrow parameters of the debate in our classrooms, on the airwaves, and in the halls of Congress–while looting the country. Many of these men appear to be so morally and intellectually stunted that they are incapable of acknowledging their responsibility for our decline.”

“Obama is a product of this elitist system. So are his degree-laden cabinet members. They come out of Harvard, Yale, Wellesley, and Princeton. Their friends and classmates made huge fortunes on Wall Street and in powerful law firms. They go to the same class reunions. They belong to the same clubs. They speak the same easy language of privilege, comfort, and entitlement....Our power elite has a blind belief in a decaying political and financial system that has nurtured, enriched, and empowered it. But the elite cannot solve our problems. It has been trained only to find solutions, such as paying out trillions of dollars of taxpayer money to bail out banks and financial firms, to sustain a dead system. The elite, and those who work for them, were never taught how to question the assumptions of their age. The socially important knowledge and cultural ideas embodied in history, literature, philosophy, and religion, which are at their core subversive and threatening to authority, have been banished from public discourse.”

“The elite...know only how to feed the beast until it dies. Once it is dead, they will be helpless. Don’t expect them to save us. They don’t know how. They do not even know how to ask the questions. And when it collapses, when our rotten financial system with its trillions in worthless assets implodes and our imperial wars end in humiliation and defeat, the power elite will be exposed as being helpless, and as self-deluded, as the rest of us.”

November 23, 2009

Ways of Knowing

There are a few books one encounters in the course of one’s life that prove to be transformative. In most cases, one is not expecting this. But it happens, and you know that you’ll never look at the world in quite the same way. For me, one text that was particularly life-changing was a slender volume by the classical scholar John Finley, entitled Four Stages of Greek Thought. It was as if, within its pages, I discovered what kind of writer I wanted to be; even, what kind of life I wanted to lead.

Finley distinguishes between the heroic-visionary world of the Homeric Greeks and the theoretical-rational world of their successors. There is a scene in the Iliad, he tells us, in which Hector briefly leaves the battlefield and returns to Troy, to visit his wife and infant son. Standing in front of his house, he reaches out to take the child in his arms, but the boy draws back, frightened at Hector’s helmet with its horsehair crest. Hector laughs, takes off the helmet, and puts it down; and Homer then records how the helmet sits there on the ground, all shiny and motionless, reflecting the light of the sun. The Homeric world, says Finley, is one of brilliant particulars, fixed entities that are what they are, nothing more or less. It is not an especially comforting world, he tells us, but it is at least this: absolutely clear. “Happiness, one sometimes thinks, is clarity of vision, moments when things stand clear in sharpest outline...as if revealed for the first time.” He goes on: “However intoxicating the attractions of intellect, and however essential to the structures by which we live, something in us wants also the clear signals of the senses by which alone the world is made fresh and definite.”

This is, I suppose, the world of childhood, made magical by its very realism; and there certainly is something intoxicating about it: the wind in one’s hair, the shock of a cool lake on a warm summer’s day, the dry texture of an autumn leaf. Yet Finley uses the word “intoxicating” not to refer to the world of sensual immediacy, but to that of the intellect, which has its own siren song. Once we enter the world of Socrates and Plato, and the “sunlit tangibility of the fourth century” (fine phrase, that), there is no going back. The experience of rationality, of conceptual clarity, is so overwhelming that once “infected,” the mind will settle for nothing less. When Archimedes (allegedly) cried “Eureka!” in his bathtub, his excitement was over having discovered a pattern (in this case, the law of specific gravity), not over the sensual impact of the water on his skin.

This issue of pattern is the key to the phenomenon of intellectual intoxication, and probably first occurs, in a formal sense, in the work of Plato. “Noetic” understanding, the job of the philosopher-king, moves along a vertical line, upwards toward the gods. Indeed, it is widely accepted that this vertical model is based on the shamanic or revealed knowledge of the Mystery Religions that were popular in ancient Greece. One application of it can be seen in Plato’s Republic, in the famous “parable of the cave,” in which people sit with their backs to the light and take the shadows cast on the wall for reality. Such individuals are asleep, says Plato, whereas the true philosopher, the one who is awake, turns to the light, the actual source of the perceived phenomena. What you see, then, is not what you get; real knowledge requires this type of “vertical” understanding, this digging beneath the surface. It is not for nothing that Freud compared his own analytical method to the science of archaeology. (Indeed, Heinrich Schliemann was digging up the ruins of Troy during Freud’s lifetime.) What is on the surface, for Freud, is social behavior; what lies underneath this is repressed sexuality (hence the title of one of his most famous books, The Psychopathology of Everyday Life). In the case of Marx, the surface consists of class relations; the reality, the underlying pattern, is the mode of production of a society at any given stage in its history. For Gassendi, Descartes, and Newton, gross objects were mere appearances; the reality was atomic particles. A sunset may be beautiful, but the “truth” of the situation is refracted light. And so on. Cognition of this sort can hit you with the force of a hurricane.

The alternative mode of knowing is more “horizontal”: what you see is what you get. Or as Wittgenstein once put it, “depths are on the surface.” The whole phenomenological school–I am thinking of Husserl and Merleau-Ponty in particular–argues for direct physical experience as the key to the world (the sun gleaming off of Hector’s helmet, for example). The power of this type of understanding derives from the sheer “is-ness” of things, their pure ontology. To “know” a sunset as refracted light may be to not know it at all.

My oldest friend and I discovered, soon after we met, that we shared the same dilemma: we were torn between these two worlds. Both of them were intoxicating, in their own special way; so much so that we found it impossible to give either of them up. His solution was to create two separate, consecutive lives. Thus he spent three decades as a professional scientist, after which he retired to devote himself to photography, yoga, and jazz piano. My solution was to try to bring the two worlds together, and it cost me dearly. No university department could figure out what the hell I was doing, and typically regarded my writing as weird. In a culture severely split between mind and body, I could only be regarded as some sort of “cult figure,” at best. And really, what else could I expect? If you are going to insist that the dominant culture is ontologically crippled, it is not likely that that culture is going to stand up and cheer.

Reading Finley, in any case, provided me with a keen sense of validation, because he doesn’t end his analysis with a description of the two worlds and leave it at that. The “character of a great age,” he writes, is when the two worlds come together, and when, as a result, “meanings seem within people’s reach.” According to him, this unity found its greatest expression in Greece in the fifth century B.C., somewhere between Homer and Aristotle:

"Part of the grip on the imagination that fifth-century Athens
never ceases to hold is that these two kinds of worlds met
then, the former culminating as the latter came into being.
Aeschylus and Sophocles spoke for the older outlook that
saw things through shape; Socrates and Thucydides for the
nascent mind that saw them through idea."

It seems unlikely that we shall ever have such an age again, though who knows what the world will be like five hundred years hence. For now, at least, the integration of mind and body will probably remain a private experience: the intellect that feels, the sensuality that thinks. But ultimately, the commitment of the writer, or of anyone invested in the world of letters, the larger culture, cannot be restricted to individual experience, for solipsism is not an answer to anything. Putting meaning “within people’s reach” is finally what it is all about.

©Morris Berman, 2009

October 05, 2009

Interview with Scott Noble

Scott Noble is an independent Canadian filmmaker who is making a film of interviews with a number of critics of the American empire. I can't imagine something like this getting into American theaters, but I admire him for trying. Anyway, he sent me the links for the footage he did of me (this was in Boston, Dec. 2008), and I thought I'd post them for those of you who might be interested. I should warn you, however, that downloading might be a problem: the files are quite large, and if you can indeed download them, it will take a fair amount of time. That said, here's the info:

http://www.megaupload.com/?d=JT9IROPG

and

http://www.megaupload.com/?d=9UM0W6JW

Enjoy (maybe)!

-mb

September 28, 2009

Not To Be Missed

Dear Friends,

Check out a Harrison Ford film entitled "Crossing Over". It's a pretty good portrait of what we have finally become, and of the violence that I believe now lies at the center of the American soul--not exactly a happy place these days. The movie is running in Mexico under the title of "Persecución inminente," which comes closer to the mark. When the film ended and the lights went up, I noticed that I was the only gringo in the theater, and felt more than a little embarrassed about it.

Of course, the film got panned by most US newspapers. Gee, what a shock.

Anyway, "enjoy"!

mb

September 18, 2009

Oh fer...So now he's a poet?

Dear Friends,

As many of you know, I relocated to Mexico a little over three years ago. There were many reasons for this, both personal and political, which were often intertwined. In terms of daily life, I found the United States oppressive: the workaholism, the running on a treadmill and never getting anywhere, the feeling that one never has a moment to oneself to think, relax, just be. All of that changed when I crossed the border; the sensation was "libre por fin!" (free at last). And it proved to be true. Mexico is not in a rush; the rhythm of life is much slower down here, and a lot healthier, imo. You can believe what you want about crime and swine flu; the fact is, unlike Americans, these folks have their priorities straight. Slowly, I began to unwind. Slowly, I was much more in touch with my body, and with the sensuality of the environment. Slowly, I began to feel happy (what a thought).

All of this had a very large, and unexpected, impact on my writing. One year after my immigration, I suddenly wrote a novel. The bulk of it just poured out in a few days, and then it took another six months to complete the rest of it and start the editing process. I'm currently looking for a publisher for it (well, an agent to start with), and hope I can get it into print sometime next year. The title is Destiny, and it explores the question of whether we can change our lives as an act of will.

I was also asked to be a regular columnist for two upscale literary journals here in Mexico, and began contributing articles on a variety of topics, some of which have been posted on this blog. My Mexican publisher intends to bring it out in book form (in Spanish) as a collection of essays sometime next year, and I'm hoping I can find a US publisher for the English edition as well. These articles reveal a somewhat different form of nonfiction writing, for me, I think: more direct, more visceral, occasionally lyrical. Title: A Question of Values.

Finally: nearly two years ago I suddenly wrote a volume of poetry. This literally erupted while I was sitting in a small cafe; I had no idea I had all this in me, and it moved through my body like a tidal wave. I wrote the bulk of the collection on napkins over a period of three days, and then spent a few months editing it with the generous help of Paul Christensen, who teaches creative writing at Texas A&M, and who contributed a Foreword to the volume as well. And then a small, fine literary press in the Boston area responded positively to a query letter, and the book is going to be published next year by the Cervena Barva Press. The title: Counting Blessings. You get the idea.

Well, I guess good news travels fast. I was subsequently asked to read from this collection by the Poetry Center at San Francisco State University, and will be doing a week of presentations for them this November. The schedule is as follows:

-Nov. 16, 7:30 pm: Poetry Reading at Moe's Books, 2476 Telegraph Ave, Berkeley
-Nov. 18, 7:30 pm: Lecture at the Unitarian Church, 1187 Franklin St, SF (I'll be reading 2 or 3 essays from the collection I referred to above)
-Nov. 19, 3:30 pm: Interview at SF State U, Hum 512 (1600 Holloway Ave)

The theme of the Poetry Center's work this year is "The Poetics of Healing". It fits well with what I've been going through, I think. For those of you living in or near the Bay Area, I'd be delighted if you could join us.

And finally, let me thank all of you who have contributed to this blog over the past few years. In some indefinable but very real way, I think the intelligent back-and-forth that we've engaged in had something to do with my growth as a writer. Gracias and

L'Chaim!

-mb




September 10, 2009

The Parable of the Frogs

One who knows “enough is enough” always has enough.

Tao Te Ching


What does it take to produce large-scale social change? Most historians, if you catch them in an honest moment, will admit that the popular levers of social change, such as education or legislation, are bogus; they don’t really amount to very much. What does make a difference–and then only potentially–is massive systemic breakdown, such as occurred in the United States in the fall of 2008. It was the greatest market crash since 1929, leading to widespread unemployment (something like 20% of the population, in real–as opposed to official–statistics) and the loss of billions of dollars in retirement savings. It had been many decades since the middle class found itself in soup kitchens, and yet there they were. In the face of all this, however, very little seems to have changed. Americans are still committed to the dream of unlimited abundance as a “reasonable” goal, when in reality it is (and always has been) the dream of an addict. President Obama’s $12 trillion bailout and stimulus plan is funneling money into the very banking establishment that gave us the disaster; it rescues the wealthy, not those who really need the money. And while he could have appointed economic advisers such as Paul Krugman and Joseph Stiglitz (both Nobel laureates), who would have attempted to put the nation on a different economic path, he chose instead two traditional neoliberal ideologues, Timothy Geithner and Lawrence Summers, who believe in the very policies that led to the crash. “Change we can believe in” has never sounded more hollow.

The metaphor of addiction is extremely relevant to situations such as these, because addicts always seek to maximize their intake (or behavior) rather than optimize it, even though the former leads to self-destruction. In the face of what seems to be biologically driven activity, reason doesn’t have much of a chance. An experiment with frogs some years ago demonstrated this quite clearly. They were wired up with electrodes in the pleasure center of the brain, and could stimulate that center–i.e., create a “rush”–by pressing a metal bar. Not only did the frogs keep pressing the bar over and over again, but they didn’t stop even when their legs were cut off with a pair of shears. And if you are going to object that human beings are not frogs, then you obviously haven’t been reading the daily newspapers, or observing the behavior of the people around you.

There are, of course, a few intelligent frogs around, ones who struggle to point out the difference between optima and maxima. They don’t have much of an audience, as you might expect, but inasmuch as this column has no pretensions to making a difference in the real world, let’s put the matter of popularity aside and concentrate on the ideas instead.

The first intelligent frog who comes to mind is the anthropologist Gregory Bateson, perhaps most famous for having been married to Margaret Mead. For Bateson, the issue was an ethical one. As he himself put it, “the ethics of optima and the ethics of maxima are totally different ethical systems.” The ethics of maxima knows only one rule: more. More is better, in this scheme of things; words such as “limits” or “enough” are either foolish or meaningless. Clearly, the “American Way of Life” is a system of maxima, of indefinite expansion.

But what if the reality of all social systems is that they are homeostatic, which is to say, designed to stay in balance? In that case, said Bateson, the attempt to maximize any single variable (for example, wealth) will eventually push the system into runaway, such that it will destroy itself. To take a physiological analogy, we recognize that the human body needs only so much calcium per day. We do not say, “The more calcium I ingest, the better off I’ll be,” because we recognize that past a certain point any chemical element becomes toxic to an organism. Yet we seem to be unable to extend this insight to the social or economic realm. We do not say, for example, “That company is making too much profit,” or “That individual (Bill Gates, Carlos Slim) has too much money for one person,” or “The Gross Domestic Product is spinning out of control.” Rather than being interested in balance, in stability, we are fascinated by asymptotes–frogs at the bar of pleasure, even while our legs are being cut off. We don’t get it, that if you fight the ecology of a system, you lose, especially when you “win”.

Maximizing a single variable, wrote Bateson, can seem like an ingenious adaptation, but over time it typically turns into pathology. The saber teeth of a tiger may have had short-range survival value, but this development weakened its flexibility in other situations that proved to be crucial. The “favored” species became so “favored” that it destroyed its own ecological niche, and disappeared. A gain at one level became a calamity at another.
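Bateson’s runaway can be put in the form of a toy model. What follows is a minimal sketch in Python (my own construction, with invented numbers; it is not anything from Bateson): a resource that regenerates logistically and is harvested at a fixed rate. A modest rate settles into balance, while pushing the single variable "harvest" past the system’s capacity to regenerate drives the stock into collapse.

# A toy homeostatic system: logistic regrowth plus proportional extraction.
# All parameters are arbitrary, chosen only to make the contrast visible.

def simulate(harvest_rate: float, steps: int = 200) -> float:
    stock = 100.0      # starting resource level
    capacity = 100.0   # carrying capacity
    regen = 0.10       # regeneration rate per step
    for _ in range(steps):
        stock += regen * stock * (1 - stock / capacity)  # homeostatic regrowth
        stock -= harvest_rate * stock                    # extraction
        stock = max(stock, 0.0)
    return stock

if __name__ == "__main__":
    print(round(simulate(0.05), 1))  # optimizing: stock settles near 50
    print(round(simulate(0.20), 1))  # maximizing: stock collapses toward 0

Below the regeneration rate, the system absorbs the pressure and stays in balance; above it, "more" becomes runaway, and the gain at one level is a calamity at another.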

In recent months, two American scholars of the intelligent frog variety began to understand this line of reasoning and to conclude from it that Adam Smith, with his theory of the “invisible hand”, was wrong. An early version of Gordon Gekko, with his eulogy of greed (in Oliver Stone’s 1987 film, Wall Street), Smith argued that the collective result of individual self-interest was the prosperity of the whole. But the economist Robert Frank, writing in the New York Times (12 July 2009), argues that “traits that help individuals are harmful to larger groups. For instance,” he goes on,

“a mutation for larger antlers served the reproductive interests of an individual male elk, because it helped him prevail in battles with other males for access to mates. But as this mutation spread, it started an arms race that made life more hazardous for male elk over all. The antlers of male elk can now span five feet or more. And despite their utility in battle, they often become a fatal handicap when predators pursue males into dense woods.”

In the case of the market, says Frank, individual reward structures undermine the invisible hand. “To make their funds more attractive to investors,” he writes, “money managers create complex securities that impose serious, if often well-camouflaged, risks on society. But when all managers take such steps, they are mutually offsetting. No one benefits, yet the risk of financial crises rises sharply.”

Similarly, U.S. Appeals Court Judge Richard Posner, in A Failure of Capitalism, points out that the crash of 2008 was brought about by individual actions that were actually quite rational: bankers and investors pursuing their own interests. Reckless behavior was quite consistent, he says, with being well informed about the risks involved in the context of an economic bubble, and so a great many money managers took those risks. The problem is that what was rational on the individual level was irrational on the collective level, thus leading to a systemic collapse.

We are thus led, quite naturally, from a consideration of optima vs. maxima to the question of individual vs. collective behavior. Which brings me to one of the twentieth century’s most intelligent frogs, the biologist Garrett Hardin, who posed the dilemma in a famous essay entitled “The Tragedy of the Commons” (1968). Consider, said Hardin, the example of a pasture shared by local herders. They all understand that the commons belongs to no one in particular, but supports the well-being of all and is the responsibility of all. One day, however, one of the herders puts an additional animal out to graze, with the result that he increases his yield. The pasture, however, is slightly degraded by this. Meanwhile, other herders come to the same conclusion, and as each makes the rational decision to take advantage of the situation for personal gain, the net result is the overgrazing, and ultimately the destruction, of the common area. In short, the system favors selfish individuals over those with greater foresight and restraint. Or as Hardin put it, “Freedom in a commons brings ruin to all.” Frogs, in a word, are not to be trusted.
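Hardin’s logic is simple arithmetic, and a few lines of code make it explicit. The payoff numbers below are mine, invented purely for illustration: each herder who adds an animal keeps the whole gain but bears only a tenth of the damage, so defecting is individually rational even though universal defection is ruinous.

# Hardin's commons as payoff arithmetic (illustrative numbers only).

HERDERS = 10
GAIN_PER_ANIMAL = 1.0    # private benefit of one more animal
DAMAGE_PER_ANIMAL = 3.0  # total cost to the pasture, shared by all

def private_payoff(adds_animal: bool) -> float:
    """Net payoff to one herder for his own decision."""
    if not adds_animal:
        return 0.0
    # He keeps the whole gain but bears only his share of the damage.
    return GAIN_PER_ANIMAL - DAMAGE_PER_ANIMAL / HERDERS

def collective_payoff(num_adding: int) -> float:
    """Net payoff to the village as a whole."""
    return num_adding * (GAIN_PER_ANIMAL - DAMAGE_PER_ANIMAL)

if __name__ == "__main__":
    print(private_payoff(True))        # +0.7: rational to add the animal
    print(collective_payoff(HERDERS))  # -20.0: ruin when everyone does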

How, then, can excess be curbed in a free democratic system? For we can be sure that the intelligent frogs, who are really quite exceptional, are not going to be listened to, and certainly have no power to enforce their insights. True, there are certain countries–the Scandinavian nations come to mind–where for some reason the concentration of intelligent frogs is unusually high, resulting in decisions designed to protect the commons. But on a world scale, this is not very typical. More typical, and (sad to say) a model for most of Latin America, is the United States, where proposed “changes” are in fact cosmetic, and where the reality is business as usual. In the context of 306 million highly addicted frogs, the voices of the smart ones–Bateson, Frank, Posner, Hardin, et al.–aren’t going to have much impact or, truth be told, even get heard.

Of course, authoritarian systems don’t have these problems, which is a good indicator of how things will probably develop. Under the name of “harmony”, for example, China regulates its citizens for what it perceives to be the common good. Hence the famous one-child policy: introduced in 1979, it supposedly prevented more than 300 million births over the next 29 years in a country that was threatened by its own population density. In the case of the United States, the imposition of rules and limits on individual behavior to protect the commons is not, at present, a realistic prospect; the population is simply not having it, end of story. But how much longer before this freedom of choice is regarded as an impossible luxury? In fact, no crystal ball is required to predict the future here. The tragedy of the commons–what Hardin called “the remorseless working of things”–is that a society like the U.S. won’t undertake serious changes even when it is sitting on the edge of an abyss. It has to actually be in the abyss before it will entertain such changes; i.e., it has to be faced with no choice at all. It seems unlikely now, but things are probably moving faster than we realize. In terms of population, food, resources, water, social inequality, public health, and environmental degradation, a crunch of the type I am referring to may be only twenty years away.

In Shakespeare’s Two Gentlemen of Verona, the character Valentine is confronted by an outlaw, who asks him if he is content “To make a virtue of necessity/And live, as we do, in this wilderness?” That may prove to be the only “choice” we have. As Thomas Hobbes put it, a few decades after Shakespeare, “Hell is truth seen too late.”


©Morris Berman, 2009

June 30, 2009

Not Coming To Your Local Cineplex

Dear Friends:

This 2006 film is a bit dated (they actually believe the Democratic Party is going to turn things around), but worth a view nevertheless. One hopes Liberty News TV will do a follow-up on the emptiness and spinelessness of Barack Obama--his pandering to the Pentagon, bailout of corporations, and continuation of neoliberal economic policy--but for some odd reason I'm not holding my breath. Enjoy!

http://www.archive.org/details/LibertyNewsTVEpisode14

June 22, 2009

Tribal Consciousness and the Enlightenment Tradition

At one point in his work, Proust advances a theory of development that goes back to Goethe, and ultimately to Plato. It was Goethe who coined the term “morphology,” by which he meant the science of form, and the crucial idea was that the adult manifestation of an organism was already encoded in its earliest structural arrangement. Thus the entire oak tree, for example, was already present in the acorn; growth and development were basically a process of “unfolding” from an original archetype. It is a teleological theory, a theory of predestination; and Proust comments that if you see a pretty girl next to her mother, you can already discern in the daughter the pattern of ageing, the adult features, “in the wings,” as it were. Extending the theory from the biological to the social realm, Proust argues that we should hardly be surprised, for example, to learn that some Jewish person we might know (this around 1900, say) is heatedly on the side of Alfred Dreyfus.* For this is pre-ordained, he says; it’s in the blood. Our mistake is to believe that we are making rational decisions, when the truth is that “our minds possess in advance...the characteristic that we imagine ourselves to be selecting.” He goes on:

“For we grasp only the secondary ideas, without detecting the primary cause (Jewish blood, French birth or whatever it may be) that inevitably produced them....We take from our family, as [adult plants] take the form of their seed, as well the ideas by which we live as the malady from which we shall die.”

The theory, then, is one of genetic memory, and for Proust it applies to the biological development of human beings as well as plants. It also, Proust is saying, applies to the mental and supposedly intellectual function of human beings, in the form of what we might call “tribal consciousness.” Of course, Dreyfus was innocent and his enemies were a bunch of liars and antisemites, but for Proust that is not the point. The claim here is that we would expect Jews to be on the side of Dreyfus without worrying too much about the evidence pro or con, in the same way that it is not too much of a shock to learn that 96% of the black American population voted for Barack Obama. These are not really freely chosen rational decisions, in short, and we are kidding ourselves if we think they are.

This matter of tribal consciousness is enormously significant, it seems to me, and Jewish identity is as good an illustration of it as any. Suppose, at the height of the Dreyfus Affair, God had waved a magic wand and all of the Jews in France suddenly became Christian, and all the Christians, Jews. I can’t prove it, of course, but I’m guessing that a large percentage of the new Christians would suddenly regard Dreyfus as guilty, and a large percentage of the new Jews would now find him innocent. It is depressing to think that evidence gets marshaled in the service of emotions, but hard to avoid that conclusion. What happened in the aftermath of the Israeli attack on Gaza during December 2008-January 2009, for example, which was nothing less than the wholesale massacre of Palestinian civilians, was quite Orwellian: one heard Israeli spokesmen and apologists claiming that Israel (the occupying power) was somehow the victim in all of this–and they actually believed it. But again, if a magic wand suddenly rendered the Israelis Palestinians and vice versa, wouldn’t the former Israelis now be on the Palestinian side, and the former Palestinians now be convinced that yes, Israel was indeed the victim in this tragedy? That blood, rather than evidence, is the issue constitutes the essence of tribal consciousness. We need to examine this more closely.

I remember, some years ago, pondering this question of how tribal allegiance colonizes the brain when I ran across an intriguing work of science fiction by the American author Neal Stephenson, entitled Snow Crash. The core of the book is what might be called the “viral theory of religion,” in which the brain is taken over or possessed by a certain set of religious ideas. The virus replicates itself inside the individual mind, and it also jumps from one person to the next. Stephenson spends a lot of time applying this theory of infection to ancient Sumer, the thought process of which can be regarded as a kind of trance phenomenon. (Egypt would fall into the same category, it seems to me.) There were, he says, various attempts to break out of the trance, Judaism being the most notable. Thus the Torah was also a virus, says Stephenson, but a benign one; a counter-virus to the ancient mythological world, which was stuck in a rut. Scribes copied it; people came to the synagogue to read it. Judaism was basically the first rational religion, then, but eventually it hardened into legalism, whereupon it was challenged by Christ...whose ideas got taken over by viral influence almost immediately, becoming a new theocracy. The Reformation, fifteen centuries later, was then the counter-virus to this. Etc. The idea is that we become “hosts” for self-replicating information, and as further examples Stephenson points to mass hysteria, jokes, catchy tunes, and ideologies.

As it turns out, Snow Crash is the fictionalized version of the theory of memes, first put forward by the British biologist Richard Dawkins in 1976. The dictionary defines “meme” as “an idea, behavior, style, or usage that spreads from person to person within a culture.” It’s basically an information virus. Dawkins regarded it as a “unit” of cultural ideas that moves by way of imitation, and saw things such as catch phrases, fashion in clothing, and the technology of building arches (to take three unrelated examples) as falling into this category. Memes are essentially replicators, and their mode of transmission can be likened to contagion. As in the case of Stephenson, the virus/meme invades the “host,” takes it over; and this is not, said Dawkins, necessarily positive: in terms of replication, a successful meme can actually be detrimental to the host body. (Just think of what neoliberalism and the Milton Friedman-virus–the “shock doctrine,” in Naomi Klein’s memorable phrase–have done to North and South America, for example.)
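The contagion metaphor, too, can be sketched in a few lines (again my own toy construction, with made-up parameters; it is not code from Dawkins or Stephenson): each "host" exposes a few people per round, some exposures take hold, and the meme spreads roughly exponentially until the pool of potential hosts runs dry.

# Meme-as-contagion: a crude imitation-spread model (illustrative only).

import random

def spread(population: int = 1000, contacts: int = 3,
           catchiness: float = 0.2, rounds: int = 12) -> list:
    """Return the number of 'hosts' after each round of imitation."""
    random.seed(1)   # fixed seed, for reproducibility
    infected = {0}   # one mind starts with the idea
    history = []
    for _ in range(rounds):
        new = set()
        for _host in infected:
            for _ in range(contacts):
                other = random.randrange(population)
                if other not in infected and random.random() < catchiness:
                    new.add(other)
        infected |= new
        history.append(len(infected))
    return history

if __name__ == "__main__":
    print(spread())  # growth accelerates, then saturates as hosts run out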

Now quite frankly, there is a lot to be said against the theory, most notably that it sets up a kind of pseudoscience that ultimately doesn’t explain very much. There was, for example, a period in the history of science in which the concept of “instinct” was extended from biology to sociology and psychology. It was a total explanation: there was a death instinct, a love instinct, an artistic instinct, a criminal instinct, a nesting instinct, an instinct for sailing the high seas, and on and on. It took a while for social scientists to realize that these “explanations” were completely circular. As one philosopher observed, it was like labeling a bird that went around in circles a “rotopedist,” and then when asked why the bird went around in circles, “explaining” that it did so because it was a rotopedist! Obviously, if everything is an instinct, or a meme, then nothing is.

Second, the meme theory itself can be seen as a meme, moving through society like a virus. But this takes us into a classic situation known as “Mannheim’s paradox,” because then the scientific status of the theory is called into question (it too is a fad, in other words). Karl Mannheim, the German sociologist, developed a mode of investigation known as the Sociology of Knowledge, whereby one studies how ideas get accepted in an intellectual community. Foreshadowing T.S. Kuhn, Mannheim argued that this acceptance did not occur on a rational basis, but rather on an ideological one. However, we then have to ask if this applies to the Sociology of Knowledge as well. After all, why should it alone get a free pass? If it does apply (and Mannheim unsuccessfully tried to argue that it didn’t), the rug is pulled out from under the theory. It begins to look like the ancient “Liar’s paradox”: A Cretan said, “All Cretans are liars.” Was he telling the truth?

Finally, and related to this, is the phenomenon whereby the counter-virus becomes, in short order, the new virus. Judaism becomes Pharisaism, Christ becomes St. Paul becomes the Vatican, the Reformation becomes Protestant rigidity, and New Age spirituality becomes Oprah and Chopra. The old mimetic system gets cracked open, and then the opener becomes The Opener. This means that in effect, with the exception of the briefest of moments, there is no such thing as a non-meme world. As I argued in an earlier essay (“The Hula Hoop Theory of History”), we seem to be caught up in one form of “hula-hoop” or another; we never seem to get a handle on any kind of objective reality. But can that really be the case? I mean, we know that Galileo was right about falling bodies and Aristotle wrong; we know that severe population pressure leads to hierarchical social systems; we know that syphilis is caused by a particular bacterium and that, if left untreated, it will result in insanity and death; and we know that Alfred Dreyfus was innocent and that the French army was corrupt. Objectively speaking, we know things–a lot of things. And yet, there is no getting around the fact that tribalism–mimetic thinking–is the rule rather than the exception. Thus while there are a number of soldiers in the Israeli army who refuse to serve in the occupied territories, and Israeli peace organizations such as Yesh Gvul (“There is a limit!”) who support them, the majority of the population does indeed see itself as victims, and votes for a prime minister who can be guaranteed to continue the dead-end policies of oppression and occupation–until the demographics of the situation finally render Israeli rule untenable, and things will change not by reason, but by force. One tribe, in short, will defeat another. What a triumph!

What our discussion comes down to is this: Leaving aside, for now, the first two (philosophical) objections to the meme-virus theory, and granting the fact that tribal consciousness really is the norm for the human race, what are the chances that mimetic behavior could be seriously disrupted, once and for all? This was, after all, the goal of the Scientific Revolution and the Enlightenment tradition; but as one political scientist once pointed out, “It’s not that the Enlightenment failed; rather, it’s that it has never been tried.” This is, of course, not entirely true; but when you have an “advanced” industrial nation with 59% of its adult population sitting around and waiting for the “Rapture” and the Second Coming, 29% thinking that the sun revolves around the earth or not knowing which revolves around which, and 45% believing that extra-terrestrials have visited the planet, you realize that this commentator has a point.

It all comes down to reflexivity: Can we break the hold of the meme-trance, and look at things from the “outside”? After all, intuitively speaking, heavy bodies should hit the earth faster than light ones when dropped from the same height, and we can plainly see the sun “rise” in the East and “set” in the West. Getting outside of the (medieval) meme here means that we look at evidence that is counter-intuitive; that we recognize that there is an objective truth to the situation that doesn’t give a damn about our personal or tribal belief system; that one can stand outside a situation and evaluate it, and extend this analytical mode to our own beliefs, and to who we are. “O would some power the gift to give us/To see ourselves as others see us,” wrote the Scottish poet Robert Burns in the eighteenth century. This external evaluation–what I have referred to elsewhere as “nonparticipating consciousness”–was, as Neal Stephenson correctly notes, the stellar contribution of the ancient Hebrews; and it was also characteristic of the ancient Greeks (their ties to the Mystery religions notwithstanding). After all, when you have Heraclitus talking about the problem of subjective judgment, and Democritus asserting that it is only by convention that we can talk about sweet, bitter, hot, and cold, “but in reality there are only atoms and the void,” you know you’re in a different kind of world than that of blind mimetic belief.

I am not, I should add, claiming that nonparticipating consciousness is without its problems; indeed, that was the entire point of my book The Reenchantment of the World. But it is also the case that there is too much that simply cannot be solved from within a strictly mimetic framework, and this is why we need to ask if the Enlightenment tradition can ever be made to “stick.” Reading its late twentieth-century representatives–I am thinking of philosophers such as Peter Singer and John Rawls–I am often frustrated at how naïve they are, because they are clearly talking about how people “ought” to behave (i.e., rationally) and not how they actually behave (i.e., tribally). What planet are you guys on? is the annoyed reaction I frequently have. And yet, this is the crucial point: Controlling the excesses of tribal consciousness really does mean taking the Enlightenment tradition seriously, breaking the “trance,” and standing outside the particular meme we are caught up in (whatever it is) and evaluating it rationally and empirically. Singer and Rawls don’t have any clear ideas on how to get to such a place, and frankly, neither do I. My guess is that force, not reason, will be the deciding factor in a whole host of areas as the twenty-first century wears on. But it’s challenging to think about what a non-mimetic path might consist of.

Here is a single example, something I can’t really do myself, but at least aspire to. A very long time ago, when I first got interested in Karl Marx, I ran across a biography of the man by Isaiah Berlin. At the time I had no idea who Isaiah Berlin was, but as I was keen to learn more about Marx, I read the book from cover to cover. It was a very sympathetic portrait of the great German philosopher; the author managed to get inside his head, enabling you to see the world through Marx’s eyes. I came away impressed with Marx as a thinker; really, as a heroic figure. And then I subsequently learned that Communism was complete anathema to Berlin, who was a Russian (actually, Latvian) émigré; and that if there was one single political ideology he hated, it was that. I still retain a great admiration for Marx, of course, and confess I have some reservations about the work of Isaiah Berlin in general. But that is neither here nor there. Given his own mimetic background, it is hard not to regard his portrait of Marx as a type of heroism all its own.



©Morris Berman, 2009


*Captain Alfred Dreyfus was a French Jewish artillery officer falsely convicted of treason in 1894, and sent to the Devil’s Island penal colony in French Guiana, where he spent two years in solitary confinement. The real culprit, Ferdinand Esterhazy, was tried and acquitted in 1896 in what amounted to an Army cover-up (including the falsification of documents). In 1898, the famous writer Émile Zola led the public protest against the government, as the “Dreyfus Affair” tore the nation apart. Eventually, all the charges against Dreyfus were dropped, and he was finally exonerated in 1906. All in all, not exactly France’s finest hour.

April 29, 2009

How Chic Was My Progress

When it was hip to be hep, I was hep.

–From “I’m Hip,” by Dave Frishberg and Bob Dorough


At one point in The Labyrinth of Solitude, Octavio Paz quotes the German philosopher Max Scheler, who asked, “What is progress?” It’s a crucial question, and in the United States there is basically only one answer, involving the visible expression of technological innovation and economic expansion. Paz was not impressed with this notion of progress in 1950, when he wrote his famous essay, and it is a safe bet that he was increasingly disenchanted with the American model as the years wore on. Although he saw the flaws of his own culture quite clearly, he never felt that the American Way of Life was any kind of solution for Mexico or, indeed, the rest of the world. Paz was prescient: at a time when everyone was celebrating America as an unrivaled success, he correctly pegged it as a wounded civilization, one that saw the future strictly in terms of novelty and never questioned what it was doing.

This extremely limited notion of the good life, combined with almost total unconsciousness, presents itself as daily reality in the U.S. I recall a friend of mine telling me, a few years ago, about a train trip she took up the California coast, during which she decided to walk through the cars very slowly, from back to front, almost pretending to be an invalid, so that she could eavesdrop on conversations. Every last one of these, she said, was about some gadget, some aspect of consumer technology–software, computer attachments, iPods, cell phone variations, etc. This is where, she concluded, Americans put their attention; it is what really excites them, makes them feel alive. Nor is this limited to Americans, of course. In the mid-eighties, when I was teaching at a Canadian university, my colleagues were literally ecstatic over the introduction of personal computers, firmly believing that these machines would write papers and books for them, perhaps help them get tenure or upgrade their entire careers (promises that failed to materialize, needless to say). As for south of the border, I was recently riding around Mexico City with a colleague of mine when we saw a huge billboard ad for some cell phone, with the caption, in three-foot-high block capitals (in English, for some strange reason), KILL SILENCE. “Well,” I remarked to my colleague, “at least they are being honest about it.” “Oh,” he quipped, “you are fixated on cell phones.”

It’s hard to know how to reply to a dismissive remark of this kind, since even the brightest people don’t get it, and usually have no idea what George Steiner meant when he called modernity “the systematic suppression of silence.” Silence, after all, is the source of all self-knowledge, and of much creativity as well. But it is hardly valued by societies that confuse creativity with productivity. What I am fixated on, in fact, is not technology but the fixation on technology, the obsession with it. Unfortunately, it is hard to persuade those caught up in the American model of progress that it is they who are living in an upside-down world, not Octavio Paz.

For it doesn’t have to be this way. Notions of progress might conceivably revolve around how we treat each other in social situations, for example, not around the latest electronic toy. Some years ago I taught in the sociology department of a major American university, and marveled at my colleagues, who were constantly interrupting their conversations with each other to take cell phone calls–as if a conversation with someone who was not physically present were more important than one with someone who was. They had no idea of how rude they were on a daily basis, and regarded my own views on technology as “quaint.” Considering the damage this behavior was doing to human social interaction, and the fact that these folks were sociologists, I was impressed by the irony of it all. It was like being at a convention of nutritionists, each of whom weighed more than 300 pounds. After all, if obesity is the new health, what is there left to say?

This brings to mind the famous phrase coined by the late Senator Daniel Patrick Moynihan, “defining deviancy down.” Moynihan pointed out that there was a process in American culture by which behavior traditionally regarded as selfish or vulgar–e.g., abruptly breaking off a conversation with one person to initiate one with someone else–rapidly becomes acceptable if enough people start doing it. Deviancy, in short, goes down to the lowest common denominator, finally becoming the norm. Indeed, the vulgarization and “narcissization” of American society had become so widespread by the mid-1990s that books were being written on incivility, and conferences held on the subject as well. But none of this made any difference to actual behavior, as even the most casual observation of contemporary American society reveals.

I remember, some years ago, then Secretary of State Condoleezza Rice talking about American (non)relations with Cuba, and stating that “we don’t want that model to be able to replicate itself”–the old contagion theory of communism, as it were. Well, I’m not big on dictatorships myself, but what about the danger of the American model replicating itself? When you go to New Zealand and see the Maori people talking on cell phones and watching American sitcoms, you know that Moynihan’s prediction about the world turning into trash is not very far off.

China, which is all set to replace the U.S. as the next hegemonic power, is of course replicating the American model with a vengeance. “To get rich is glorious,” declared Deng Xiaoping, and the 1990s witnessed the stripping away of time-worn (non-Maoist) Chinese models of good citizenship and moral participation in collective goals. The race was on to crank out as many cell phones, DVD players, televisions, shopping malls, and highways as possible. Monthly car production went from 20,000 in 1993 to 250,000 in 2004, and Wal-Mart and McDonald’s have spread through the country like wildfire. In China Pop, Jianying Zha gives us a vivid (read: garish and appalling) portrait of a country wallowing in mass consumerism, from soap operas to pornography and beyond. China is now dotted with privileged consumption zones, theme parks, and beauty pageants. Cosmetic surgery clinics abound, promising to give young women more rounded, Western eyes. In fact, the beauty industry grosses more than $24 billion a year. “Consumerism became a religion,” writes Rachel DeWoskin in Foreign Babes in Beijing, as “street kiosks made way for sleek boutiques and cafés, where Chinese and foreigners lounged together, drinking lattes and Italian sodas.” Companies arrived like missionaries, she recalls, seducing the average Chinese with products they never knew they needed. In the progressive China of today, everyone, according to the British geographer David Harvey, “speculates on the desires of others in the Darwinian struggle for position.”

This is why we have more to fear from the American model of progress, and its replication on a world scale, than from some aged caudillo in Cuba. For what does it consist of, finally, when “freedom” means little more than “free enterprise”? As Harvey tells us in his remarkable study, A Brief History of Neoliberalism,

“that culture, however spectacular, glamorous, and beguiling, perpetually plays with desires without ever conferring satisfactions beyond the limited identity of the shopping mall and the anxieties of status by way of good looks (in the case of women) or of material possessions. ‘I shop therefore I am’ and possessive individualism together construct a world of pseudo-satisfactions that is superficially exciting but hollow at its core.”

This beguiling quality–the notion of culture as chic–is an enormous shell game, as Harvey demonstrates in his summary of what happened to New York City during the 1970s. A fiscal crisis arose, the product of rapid suburbanization that was destroying the tax base of the city. Financial institutions were prepared to bridge the gap between income and expenditure in the city budget, and expansion of public employment via federal funding was also being considered. But in 1975 a powerful group of investment bankers, led by Citibank, refused to roll over the debt and left the city technically bankrupt. Union activity was curtailed; cutbacks took place in education, public health, and transportation; and wealth got redistributed upward, to the rich and super rich. It was, says Harvey, “a coup by the financial institutions against the democratically elected government of New York City.” Both the social and the physical infrastructure of the city deteriorated, and the city government, the municipal labor movement, and working-class New Yorkers were stripped of their power.

That wasn’t the end of it, however. The next step on the part of the business community was to turn New York into a “good investment opportunity.” “Corporate welfare,” writes Harvey, “substituted for people welfare.” The idea was to sell New York as a tourist destination, and “I ♥ New York” swept through the town as the new logo. As Harvey notes:

“The narcissistic exploration of self, sexuality, and identity became the leitmotif of bourgeois urban culture. Artistic freedom and artistic licence, promoted by the city’s powerful cultural institutions, led, in effect, to the neoliberalization of culture. ‘Delirious New York’...erased the collective memory of democratic New York....New York became the epicentre of postmodern cultural and intellectual experimentation. Meanwhile the investment bankers reconstructed the city economy around financial activities...and diversified consumerism (gentrification and neighbourhood ‘restoration’ playing a prominent and profitable role). City government was more and more construed as an entrepreneurial rather than a social democratic or even managerial entity.”

Progress (so-called) has to be chic, in other words, and this integrates well with the neoliberal equation of freedom with lifestyle choice, which effectively kills democracy, or renders it irrelevant. Again, it’s a question of how you define it. Home visits by doctors, for example (the norm when I was a child), have vanished almost completely, and Americans would hardly regard the return of this practice as progress. It may well be a lifesaver, but it’s not particularly hip. SUVs that destroy the environment are chic; mass transit is not. Dog-eat-dog competition is chic; a social safety net, or a health system that actually works, is not. Best sellers praising globalization are chic; community and friendship, rather passé. And so on. Children get excited by toys, bright colors, and the latest gimmick; adults, by the prospect of a truly healthy society. As deviancy is defined downward across the planet, whether in New York or Beijing, it leaves very few adults in its wake.

As far as technology goes, the irony is that it seems to be failing in its own terms. The social and psychological damage of “life on the screen” has by now been documented by numerous studies; but when the technology is actually delivering the opposite of what was originally promised, one has to ask what it is all for. The literature on this is fairly large, so all I can do at this point is touch on some of the highlights.*

In Tyranny of the Moment, Norwegian anthropologist Thomas Hylland Eriksen argues that while the period from 1980 onward saw a rapid expansion in so-called time-saving technologies, the truth is that we have never had so little free time as we do now. The Internet has made possible a huge expansion of available information, and yet all the data show an increasingly ignorant population. Changes that were touted as boosting creativity and efficiency have actually had the opposite effect. Air travel is now so heavily congested that by 2000, fifty percent of the flights connecting major European cities were delayed. In the U.S., road traffic tripled between 1970 and 2000, and the average speed involved in getting around decreased every year. In fact, the average speed of a car in New York City in 2000 was about seven miles per hour, and we can guess that it is even less today. Etc.

One activity heavily promoted as “progressive” was multitasking, made easy by the use of a variety of compact technologies. Yet a study conducted by the University of London in 2005, according to the journalist Christine Rosen, revealed that workers who are distracted by e-mail and cell phone calls suffer a fall in I.Q. of more than twice that experienced by pot smokers. In 2007, she notes, a major U.S. business analyst (Jonathan Spira, at a research firm called Basex) estimated that multitasking was costing the American economy $650 billion a year in lost productivity, and a University of Michigan study revealed that it causes short-term memory loss. In general, writes Walter Kirn, “Neuroscience is confirming what we all suspect: Multitasking is dumbing us down and driving us crazy.” Specifically, it interferes with areas of the brain related to memory and learning; it actually slows our thinking. The problem seems to be that when you move from one task to another, you have to keep “revving up” to get back to doing what you were doing before. Hence, the quality of work gets compromised due to loss of focus and loss of time. In general, the Net lowers the brain’s capacity for concentration and contemplation; “reading on the Net” is almost a contradiction in terms. “We inevitably begin to take on the quality of those technologies,” writes Nicholas Carr; “our own intelligence...flattens into artificial intelligence.”

All in all, it now appears that endless technological innovation and economic expansion, which have only themselves as their goal, finally undermine social relations, redefine common sense, and interfere with our ability to think. Harvey hits the nail on the head when he argues for the existence of an inner connection between “technological dynamism, instability, dissolution of social solidarities, environmental degradation, deindustrialization, rapid shifts in time-space relations, speculative bubbles, and the general tendency towards crisis formation within capitalism.” We are caught in a contradiction, he says, between “a seductive but alienating possessive individualism on the one hand and the desire for a meaningful collective life on the other.”

Personally, I don’t think there is much doubt as to which of these two options is going to win out. By 2050, the planet is expected to have a population of 10 to 11 billion people. Competition for food and water will be fierce; resources in general will be scarce. The majority of this population will probably be living on less than two dollars a day, and “iron” governments will arise to manage politically unstable situations. And yet, there may be an odd silver lining to this, as Blade Runner descends on us in earnest: clutched in the hand of every man, woman, and child will be a state-of-the-art cell phone, and in front of each individual the hippest of personal computers. Granted, we may be collectively dying, but at least we’ll be chic.


©Morris Berman, 2009




*To mention a few key sources: Thomas Hylland Eriksen, Tyranny of the Moment (London: Pluto Press, 2001); Nicole Aubert, Le culte de l’urgence (Paris: Flammarion, 2003); Christine Rosen, “The Myth of Multitasking,” The New Atlantis, No. 20 (Spring 2008), pp. 105-10; Walter Kirn, “The Autumn of the Multitaskers,” Atlantic Monthly, November 2007; Nicholas Carr, “Is Google Making Us Stupid?” Atlantic Monthly, July/August 2008.

April 11, 2009

The Hula Hoop Theory of History

Above all, no zeal.

–Talleyrand


There is a curious rhythm to human affairs, or perhaps more specifically, to Western history. Some movement or idea comes along, and everyone gets swept up in its wake. This is it, then; this is the Answer we’ve been looking for. All of those previous answers were wrong; now, at long last, we’re on the right track. In the fullness of time, of course, this shiny new idea loses its luster, betrays us, or even results in the death of millions. So apparently, we were deceived. But wait: here’s the true new idea, the one we should have followed all along. This is the Answer we’ve been looking for. Etc.

The American writer Eric Hoffer described this syndrome nearly sixty years ago in The True Believer, a book that also generated a lot of zeal (for a short time, anyway). People convert quite easily, observed Hoffer; they switch from one ism to the next, from Catholicism to Marxism to whatever is next on the horizon. The belief system runs its course, then another one takes its place. What is significant is the energy involved, not the particular target, which could be anything, really. For what drives this engine is the need for psychological reassurance, for Meaning with a capital M–a comprehensive system of belief that explains everything. There is a feeling, largely unacknowledged, that without this we are lost; that life would have no purpose, and history no meaning; that both (as Shakespeare put it) would amount to little more than a tale told by an idiot, full of sound and fury, signifying nothing.

I call this the Hula Hoop Theory of History, but one could also label it the Pet Rock Theory, or any other craze that grabs our attention for a week or a century. It has a lot in common with the skeptical thinking of the sixteenth-century philosopher Montaigne, who had a great influence on Eric Hoffer, among others. In his Essays, Montaigne pointed out that the new sciences of Copernicus and Paracelsus claimed that the ancient sciences of Aristotle and Ptolemy were false. But how long, he argued, before some future scientist comes along, and says the same thing about Copernicus and Paracelsus? Do we ever really know the truth once and for all?

One might also call this the Drunken Sailor Theory of History, I suppose. Reflecting on the first flush of the French Revolution, William Wordsworth wrote: “Bliss was it in that dawn to be alive.” After Robespierre, the Terror, and the rivers of blood that flowed through the streets of Paris, however, a sober Talleyrand could only comment that what the human race needed, above anything else, was to stay clear of zeal. The path from bliss to barbarism may not be linear, but it does seem to be fairly common, historically speaking.

The latest treatise in the Montaigne-Hoffer school of history is that of the British scholar John Gray, Black Mass. Gray draws liberally on the work of the American historian Carl Becker, whose Heavenly City of the Eighteenth-Century Philosophers (1932) has never been surpassed as an analysis of modernity. Becker claimed that the notion of redemption that lay at the heart of Christianity was recast by the philosophers of the French Enlightenment in terms of progress, or secular salvation. Enlightenment utopianism, in a word, was the transformation of Christian eschatology into the belief in the perfectibility of man–heaven on earth, as it were. This would be the Second Coming, the defeat of ignorance and evil (= sin) by means of reliable knowledge, science and technology in particular.

In Gray’s view, the modern “secular fundamentalisms”–Jacobinism, Bolshevism, Fascism, and most recently, globalization–followed directly from this transformation. The result has been satanic–a black or inverted mass (i.e., one recited backwards)–in that these pseudoreligions have all caused a world of harm. The one idea common to all of them is that progress and perfectibility are within our grasp, and can be attained through an historical process whereby true knowledge will defeat ignorance (evil). Thus the world, and our psyches, are saved, no less in the modern secular world than they were claimed to be in the medieval Christian one, because history itself is imbued with Meaning.

Sad to say, the first three of these secular religions proved, in the fullness of time, not to be the Answer but rather the God that failed; and globalization (Thomas Friedman and his devotees notwithstanding) is in the process of going the same route, revealing itself to be a “false dawn.” Of course, says Gray, once globalization and neoliberalism are finally exposed for what they are, and take their proper place on the scrap heap of history, it will hardly be the case that we shall abandon notions of progress, utopia, and Meaning in history. Not a chance. We in the West will have to find another hula hoop, another pet rock, because as a Christian civilization we are simply unable to live without the myth of redemption. Hence, he concludes, the “cycle of order and anarchy will never end.” The tragedy is that we “prefer the romance of a meaningless quest to coping with difficulties that can never be finally overcome.” Hence, “the violence of faith looks set to shape the coming century.”

At the present time, it’s not clear what the next hula hoop will be; but I’m not sure it matters all that much. If the Montaigne-Hoffer-Gray school of historical analysis is correct, what is certain is that there will be no derailing the zeal in advance, no stopping the next ideological-religious binge at the second martini, so to speak. The word “some” has very little meaning in the world of secular fundamentalism; for us, it’s all or nothing. “Man cannot make a worm,” wrote Montaigne, “yet he will make gods by the dozen.”

For it is all a kind of shamanism, in a way, an attempt to become whole through magic. We are all broken, after all; that is why the promise of redemption has such a powerful hold on us. “I am he who puts together,” declared one Mazatec shaman, some years ago. It finally comes down to a (misguided) attempt at healing, which is reinforced by tribal practice (commonly known as groupthink). I recall attending a conference on postmodernism in the 1990s and being struck by how similar the lectures were, in form, to those of Communist Party members of the 1930s. The “holy names” were different–one cited de Man and Derrida instead of Marx and Lenin–but the glazed eyes and the mantra-like repetition of politically approved phrases were very much the same. Truth be told, I have observed the same hypnotic behavior at all types of academic conferences, from feminism to computer science. You watch, you listen, and you wonder: When will we finally wake up? And you know the horrible truth: never. In effect, we shall continue to erect statues to Napoleon, but never, or rarely, to Montaigne. This much is clear.

Which brings me to what I consider the bottom line, namely the structure of the brain. The frontal lobes, which make up the large neocortex that governs rational thinking and logical processes, are relative latecomers on the scene, in evolutionary terms. The limbic system, which is the center of impulse and emotion, has been around much longer. The conflict between the two is perhaps best illustrated by the case of the alcoholic sitting at a bar, staring at a frosty stein of beer in front of him. The neocortex says No; the limbic system says Go. Statistically, most drunks die of alcohol poisoning or cirrhosis of the liver; only a very few escape from the siren song of the limbic brain. As Goethe once put it, “the world is not logical; it is psycho-logical.” And that is to put it quite mildly, it seems to me.

We will not escape the ravages of climate change; we shall not avoid the economic and ecological disasters that are integral to global capitalism; nor shall we be able to avert an oil crisis, an energy crisis, or a food and water crisis that will become extreme when the world population finally arrives at 10 or 11 billion by mid-century. These things are not going to be resolved by reason, by the neocortex, no matter how many articles are published on these subjects in learned journals or popular magazines. And they certainly can’t be resolved by the limbic brain, whose function is indulgence, not restraint. Hence, it is a fair guess that we shall start doing things differently only when there is no other choice; and even then, we shall undoubtedly cast our efforts in the form of a shiny new and improved hula hoop, the belief system that will actually be the true one, after all of those false starts; the one we should have been following all along. What to call it? Catastrophism, perhaps. Consider this the founding document.



©Morris Berman, 2009

February 26, 2009

The Moral Order

The notion that there was a way of life characteristic of modern (or industrial) societies that was qualitatively different from the way of life found in pre-modern (or folk) societies goes back, at least, to the German sociologist Max Weber. Modern societies, said Weber, were governed by bureaucracy; the dominant ethos was one of “rationalization,” whereby everything was mechanized, administered according to the dictates of scientific reason. Weber famously compared this situation to that of an “iron cage”: there was no way the citizens of these societies could break free from their constraints. Pre-modern societies, on the other hand, were permeated by animism, by a belief in magic and spirits, and governance came not through bureaucracy but through the charisma of gifted leaders. The decline of magic that accompanied the transition to modernity Weber called die Entzauberung der Welt–the disenchantment of the world.

The distinction between these two fundamental types of social orders emerged in a variety of contexts in the decades that followed. Thus Ferdinand Tönnies saw the two in terms of Gemeinschaft (community) vs. Gesellschaft (society, especially the culture of business), noting that whereas the former was characterized by bonds of kinship or friendship, the latter was notable for the preponderance of impersonal or contractual relations. Linguist Edward Sapir, in turn, cast the dichotomy in terms of “genuine” vs. “spurious” cultures, and eventually the American anthropologist Robert Redfield would label it the “moral vs. the technical order.” In one of his last books, The Primitive World and Its Transformations, Redfield tried to argue that the technical order would eventually give rise to a new moral order; but the argument was finally not very convincing. Ultimately, Redfield believed that while the human race had made great advances in the technical order, it had made virtually no progress in the moral order–the knowledge of how to live, as it were–and that because of this, the human prospect was rather dim.

Indeed, for all one can say about the scientific inaccuracy of the pre-modern world, at least it was imbued with meaning. This is not the case with the modern industrial-corporate-consumer state, which expands technologically and economically, but to no other end than expansion itself. As the sociologist Georg Simmel wrote over a century ago, if you make money the center of your value system, then finally you have no value system, because money is not a value. All of these writers (a list that includes Franz Boas, Arthur Koestler, Jacques Ellul, and Lewis Mumford, inter alia) were pessimistic because they could see no way of reversing the direction of historical development. It was obvious that as time went on, the technical order was not merely overtaking the moral order, but actually obliterating it. This loss of meaning does much to account for the rise of the secular-religious movements of the twentieth century, including Communism, Fascism, Existentialism, Postmodernism, and so on. It also accounts for the depth and extent of fundamentalist Christianity in the United States. For there is no real meaning in the corporate-consumer state, which is at once empty and idiotic. On some level, everybody knows this.

We might, then, characterize the crashes of 1929 and 2008 as spiritual rather than strictly economic in nature. John Maynard Keynes saw the fluctuations of the stock market as being governed by human psychology, i.e., by faith and fear. So while in the case of both crashes one can point to financial “bubbles” and hyperinflated investments, the core of meaninglessness at the center of the consumer-driven economy means that a boom-and-bust cycle is inevitable. In the case of the Depression, it took a war–which involved a huge mobilization of Meaning–to pull us out of it. At the present time, the situation is very different: American wars are now neo-colonial and self-destructive, a drain on the economy. They can only make the situation worse. Hence, the U.S. government has turned to massive bailouts of financial institutions as a solution, but this is analogous to putting band-aids on the body of a cancer patient: the core of the problem remains untouched.

And what is the core of the problem? Basically, that the technical order is meaningless; that the American Way of Life finally has no moral center. Indeed, I doubt whether it ever did. In Freedom Just Around the Corner, historian Walter McDougall characterizes the United States as a “nation of hustlers,” going back to its earliest days. What began as trade and opportunism finally issued out into a full-blown crisis of meaning, and it is this that now constitutes the crisis of late capitalism.

It is with this understanding that the political scientist Benjamin Barber recently (9 February 2009) published an article in The Nation magazine claiming that the only thing that could save us now was “a revolution in spirit.” Barber points out that President Obama’s economic advisory team (which includes Timothy Geithner and Lawrence Summers) is squarely in the tradition of neoliberalism and the Corporate State. How, then, can we possibly expect the “change that makes a difference” that Obama promised the American people during his presidential campaign? As Barber notes, “it is hard to discern any movement toward a wholesale rethinking of the dominant role of the market in our society. No one is questioning the impulse to rehabilitate the consumer market as the driver of American commerce.” His solution is to “refashion the cultural ethos” by shifting our values from shopping to the life of the mind. We need, he says, a new cabinet post for the arts and humanities, which will somehow get Americans to think in terms of creativity and the imagination, not in terms of mindless consumerism. “Imagine,” writes Barber, “all the things we could do without having to shop: play and pray, create and relate, read and walk, listen and procreate–make art, make friends, make homes, make love.” “Idealism,” he concludes, “must become the new realism.”

How is this change going to happen? What are the political forces that will bring it about? Barber doesn’t say, and I confess that when I read his article, I couldn’t help wondering if the man had recently suffered some kind of mental lapse. What also came to mind was a book written in 1977 by the American sociologist John Robinson, entitled How Americans Use Time. Robinson discovered that on an average daily basis, five minutes were spent on reading books (of any kind), one minute on making music, thirty seconds on attending theater and concerts, and less than thirty seconds on visits to art galleries or museums. As depressing as these figures are, they are surely much worse thirty-two years later, given the heavy corporatization of the culture, the dramatic increase in the attention paid to television and video screens in general, and the widely acknowledged decay of the American educational system. Indeed, the square footage of shopping malls in the U.S.–4 billion square feet as of ten years ago–vastly exceeds that of schools and churches. All of the available data show that the typical American citizen has about as much interest in the life of the mind as your average armadillo. Rather than being on the verge of some possible cultural renaissance, or a reversal of our entire history, what we are now witnessing is the slow-motion suicide of the nation, with Mr. Obama guiding us, in a genteel and intelligent way, into the grave. Indeed, what more can he, or anybody, do at this point? For despite appearances to the contrary, Professor Barber must know that substantive political change is not a matter of voluntarism or exhortatory messages or a purported cabinet post in the arts and humanities. These are little more than jokes. To buck 200-plus years of history requires massive political power moving in the opposite direction, and no such force has emerged on the horizon.

Nor will it. There is no record of a dying civilization reassessing its values (or lack of values, in our case) and altering its trajectory. Whether the type of moral order that Professor Barber imagines can ever become a reality somewhere on the planet is certainly worth debating. But what is not worth debating is whether such a moral order might make an appearance on American soil. History is about many things, but one thing it is not about is miracles.



©Morris Berman, 2009

February 08, 2009

The Asian Road to Victory

OK, folks; this is the 3rd in the series, which will also be posted at the Cyrano Online Journal (www.bestcyrano.org).


There is by now a growing consensus that as the sun is setting in the West, it is simultaneously rising in the East. When Mao Zedong called the United States a “paper tiger” back in the 1950s, everybody laughed. Fifty years later, the remark doesn’t seem so funny.

Consider: by 2005, the trade imbalance between China and the United States was 202 billion dollars, having multiplied nearly twenty-fold in just fifteen years. China now holds 922 billion dollars’ worth of U.S. Treasury bills, part of dollar holdings totaling almost two trillion. Its economy expands nearly 10% a year, while the American economy is hovering on the edge of a full-scale depression, and will need Chinese loans to bail it out. And while the American manufacturing sector gets weaker with each passing day, China has become the workshop of the world. It won’t be long before it starts to flex its muscles militarily as well.

Such are the conclusions of a number of distinguished economists and political scientists. What few of them provide, however, is an explanation for this turn of events. A notable exception is a recent book by the Irish journalist Eamonn Fingleton, In the Jaws of the Dragon, which makes the point that while the Americans spend like there is no tomorrow, the top-down bureaucratic system of China forces its citizens to save rather than consume. In this authoritarian, state-capitalist arrangement, a number of policies make consumer spending very difficult, with the resulting savings generating huge cash reserves that are then deployed in boosting key industries. It’s a coercive system, says Fingleton, and it works. (In fact, Franklin Delano Roosevelt did something similar during World War II, and the U.S. savings rate went from 5% to 25% in three years. The resulting capital was used to pay for armaments manufacture.)

Yet as Fingleton recognizes, the policy of restricted consumption and enforced savings has a deeper root to it, what he refers to as the “Confucian truth ethic.” Although there are real differences among the various schools of Eastern philosophy, they do have a number of important things in common; and as with the Judeo-Christian ethic of the West, these things go very deep. Whether we are talking about the I Ching, the Tao Te Ching, the Analects, or the Chuang Tzu, two items in particular stand out as central to this way of thinking: the notion that the truth is relative, or provisional; and that harmony is the ultimate end of society. Before I say any more about contemporary China, it might be worth our while to explore these themes in a bit more detail.

In a sense, harmony and radical relativism form the shadow side of the Western tradition, which prizes individualism and the reliability of (binary) logic and empirical evidence. This lends Eastern thought a “forbidden fruit” aspect, an exotic aura that exerted a strong influence on many young people in the U.S. during the sixties and seventies, especially. I remember my own introduction to it during that time, and the sense that a great weight had been lifted from my shoulders. For Western individualism and scientific reasoning can finally seem oppressive, too tight a box to live in; in which case thought systems such as Taoism and Buddhism appear to be a breath of fresh air. “Go with the flow,” we all told each other during those heady days in California.

A particularly significant milestone of the genre during that time was the publication, in 1974, of Robert Pirsig’s Zen and the Art of Motorcycle Maintenance. It was Pirsig’s claim that this Eastern shadow tradition showed up in ancient Greece as Sophism, the bête noire of Plato and his school. In fact, so forbidden was this fruit for Pirsig that he finally went insane in the pursuit of the “lost” tradition. Whether or not the Sophists really were Taoists, however, is not the point. What matters historically is that they represented an alternate fork in the road to Platonic doctrine, and one which Plato did his best to squash. The founder of the school, Protagoras (after whom Plato named one of his dialogues), was fond of saying that “man is the measure of all things”; by which he meant that every person has his or her own truth, and that all of these are equally valid. Rhetoric was the issue, he taught his disciples, not logic; persuasion, not reason, was what counted in any given argument. For Plato, this was the philosophy of the mob, of people who were morally and intellectually dead and interested only in acquiring the gift of gab. As Pirsig notes, Plato won the battle–at least in theory–and the Western notion of truth (postmodernism and perhaps law courts excepted) is that it really does exist, and is not merely a function of who is speaking or how persuasive an orator he or she is. As the British philosopher A.N. Whitehead famously remarked, Western philosophy is essentially “a series of footnotes to Plato.”

But the East went in a different direction, and for those accustomed to only one way of thinking, it definitely casts a spell. “Choosing is a disease of the mind,” as one Eastern text puts it. All is in flux; there is no Yes or No. We must avoid getting attached to Right or Wrong, because they fluctuate depending on the person and the circumstances. “For each individual there is a different ‘true’ and a different ‘false’,” says the Chuang Tzu. By following the Tao, going with the flow, one attains the best possible outcome. As the former Chinese leader Deng Xiaoping once put it, the Chinese are “crossing the river by feeling for the stones.”

That the truth is contextual, says Fingleton, means that expediency, or the optimization of what is regarded as beneficial, is the true priority. Thus Zhou Enlai, the consummate Chinese politician, was said to have never told the truth–or a lie. In effect, he made no distinction between the two; he just “felt for the stones.”

Again, on an individual level, Eastern philosophy can afford a large measure of relief. The Western reality system exalts notions of intentionality and deliberate action; it holds that the world can and should be bent to the human will. But this doesn’t really work in human life, does it? We all eventually have to confront the fact that there are many things in life–perhaps the most important ones–that are simply beyond our control. “Those who would take hold of the world and act on it,” wrote Lao Tzu, “never, I notice, succeed.” Hence the Chinese concept of wu wei, or not-doing; which, properly understood, is not the same thing as passivity. Rather, it refers to surrender, to letting things take their course, following the Tao.

The word “Tao” appears for the first time in the Analects of Confucius, and means the right way of conduct for both the individual and society. According to the American sinologist Herrlee Creel, contemplative Taoism, which operates on the individual level, strives for inner harmony. But there is also, he says, a purposive Taoism, which seeks to use the techniques of nonaction and nonjudgmentalism as a means to power. In other words, be without desire in order to get what you desire. This theme–which is essentially one of pure manipulation–looms large in the Tao Te Ching, a book that (like The Prince, by Machiavelli) gives advice to kings and lords, and sees the Tao as a technique of control. “The sage, in governing,” says the Tao Te Ching, “empties the people’s minds and fills their bellies, weakens their wills and strengthens their bones.” We are starting to approach the political philosophy of the Chinese state, in which 97% of a population of 1.3 billion have full bellies (no mean achievement, by the way).

This, then, is a system of “soft authoritarianism,” in which relationships take precedence over laws–which are, as Fingleton points out, only selectively enforced anyway. Confucianism, he says, is “every enlightened despot’s perfect ideology.” Its emphasis on harmony is easily twisted into an insistence on conformity. (“The nail that stands out is likely to get hit down,” as the Japanese like to say.) It enjoins the people to passivity, and legitimizes authoritarian leadership. Indeed, it is hard to dissent from a system in which there is no right or wrong, true or false, but only that which supposedly promotes the commonweal. Those who try–like the Falun Gong movement that was founded in 1992, and whose doctrines are basically Buddhist–become the target of government crackdown in short order. The Communist Party’s monopoly of power is presented to the Chinese people as a “natural” fact of life: the way, the Tao.

Much of Fingleton’s concern in his book is over the way in which he sees America becoming “Confucianized,” the way U.S. corporations play ball with the Chinese state so as to acquire influence and get on the gravy train. Thus Yahoo, Google, and Microsoft all agreed to abide by China’s censorship rules in serving Chinese Internet users–for example, to expunge all references to Tiananmen Square and Taiwanese independence. In addition, top technology firms in the U.S. contracted with China to develop firewalls that block Chinese citizens’ access to “dangerous” information, including important Western websites. Under the influence of the China lobby, pro-Chinese journalists and academics in the United States get their reputations enhanced, go to exclusive dinner parties, and receive lavish fees for lectures. Those who are critical are quickly left out of the loop, and barred from sources of research and information. In general, the Chinese system is one of institutionalized bribery, in which corruption functions like legitimate payment for services rendered. The process, says Fingleton, is destroying American values (Enron executives did wind up in jail, after all). It is China that is changing us, he concludes, not we who are changing China. We are not democratizing them–far from it. Rather, they are Confucianizing us.

All this is probably true, but it seems to be part of a larger, graver loss, that of the Enlightenment tradition itself. Eastern philosophy may be the shadow side of that tradition, but it should be clear by now that the shadow has a shadow. How can the West confront a nation whose government is endlessly slippery, and that meets confrontation with Sophism, in effect? And if, as Mao Zedong predicted, “the East wind will prevail over the West,” what will it be like to live in a world dominated by an ethos in which the truth doesn’t, for all practical purposes, exist, and in which everyone is expected to fall in step with some enforced “harmony”? There is a word for this type of regime: Orwellian. The loss of the Enlightenment yardstick of truth to some kind of pervasive amorality would represent a loss far deeper than an economic one, it seems to me. A Confucianized society in which truth is nothing more than expediency is its own kind of prison; “go with the flow” can become its own form of ego, and of repression.

Some time ago, I was talking with a Mexican colleague of mine, a very brilliant teacher and administrator who had read up on China and was aware of some of these issues. “There may come a time,” he said with a sigh, “when we shall actually miss the gringos.”

What a thought, eh?



©Morris Berman, 2009