Friday, January 30, 2009

Freitag...Friday...Free Day

Glendower: I can call spirits from the vasty deep.

Hotspur: Why, so can I, or so can any man;
But will they come when you do call for them?

Henry IV, Part 1

1. This isn't the first time I've lived in North Carolina, but I was surprised to learn the other day that a Eugenics Commission was established in the state in 1929 and not formally eliminated until 1977 (yikes). Under the auspices of the program, 7,600 men and women, most of them "feeble-minded" or mentally ill, were sterilized, if not forcibly, then under considerable pressure (e.g., either consent or your family would lose welfare benefits). A significant number of these were minors. The explicit intent, of course, was to "clean up the gene pool," and a number of other states apparently had similar projects, although North Carolina's lingered a few years longer than most. Currently the state legislature is debating compensation for past victims, although given economic conditions, it sounds unlikely.

This is appalling, of course, and is a good reminder that moral progress in society is possible. And yet...any psychiatrist can think of past patients, both men and women, who continued to produce offspring whom they were unable to take care of or who kept getting removed by social services. Since it's hard to remember to use birth control when you're high on drugs, one can't help but suggest to these people a more definite form of family planning (forcing it upon them is obviously another matter altogether). And while Oliver Wendell Holmes is forever notorious for his remark (vis-à-vis a sterilization case) that "Three generations of imbeciles are enough," we have all encountered families that seemed particularly unfortunate in their collective risk for mental disorder. But naturally we cannot become too zealous in our attempts at prevention.

2. Considering the way that television is so often spoken of as a stupefying scourge, particularly for children, the consternation over the impending switch from analog to digital signals (which may leave large numbers of people unable to afford service, at least for a time) is puzzling. Think of it: large swaths of the population unable to access this mindless, soul-destroying diversion! What's next, leaving asthmatics unable to afford cigarettes? Is television service some kind of right, an integral part of the pursuit of happiness?

3. How do you manage to see sure-fire no-shows in an outpatient clinic? You shuttle them directly from state hospital discharge to the clinic door, as we do occasionally here to try to foster compliance and prevent recidivism. The idea is, by orienting them to the clinic they would otherwise never get around to visiting, to say in effect, "If you keep coming here--and take your meds--you won't have to keep going there (to the 'big house')."

How do you know they'll no-show? A hint is when they say, "I told that doctor in the hospital that I would take the medication so that I could get out, but I'm not going to take it because I don't need medication." We appreciate your honesty. This from an older woman with bipolar disorder who kept dismissing her manic episode as "an incident" at Wal-Mart. I am continually amazed by how often "Wal-Mart" pops up in symptom histories.

4. A curiosity: a young woman with no other psychiatric symptomatology whose primary complaint sounded like hypnagogic hallucinations, which were already getting better on the Lunesta she had been taking for a few days. It's interesting at this kind of clinic to be able to say, rarely, "You can get that taken care of at your primary care clinic" (and to think: you are not one of ours).

5. I see a 20-something fellow who is accompanied by his mother. In gathering the history it emerges that he was diagnosed with ADHD as a child, and his mother said he was treated with Mellaril until he was 9. "Probably methylphenidate," I suggest. But no, she insists that it was Mellaril, for hyperactivity and learning problems, and she even recalls the dose, 75 mg. (He had no history of psychotic symptoms either then or since). Mellaril? I'll bet it did calm him down, and maybe his learning did improve, with respect to one lesson at least.

Wednesday, January 28, 2009


What are you changing?
What do you think you're changing?
You can't change things, we're all stuck in our ways
It's like trying to clean the ocean
What do you think you can drain it?
Well it was poison and dry long before you came

But you can wake up younger under the knife
And you can wake up sounder if you get analyzed
And I better wake up
There but for the grace of God, go I

Jenny Lewis

Okay, let's try this again. I'm reminded of the performing violinist who, puzzled but pleased by repeated calls of "Encore!" from the audience, obligingly played his piece several times over. Finally he heard a hectoring voice cry, "Encore! You're going to sit there and play it until you get it right!" (I've decided to institute an annual joke here at the blog; that was it for 2009).

So my musings yesterday about the blog title brought to mind one of several oddities about being a psychiatrist (and by psychiatrist I always mean a therapist of any kind). (Sometimes people ask if I mind the designation "shrink." I really don't find it offensive, but as a word it has always struck me as somehow stale and antiquated, very 1970s, sort of like "groovy," although I'm not in fact sure when the word arose.)

Many are fascinated by what drives a person to pursue psychiatry, or, by implication, what sorts of people are drawn to the field, but fewer reflect on how the active practice of psychiatry, or the sustained adoption of the role, could change a person (for better or worse). I will defer that deeper issue, but merely observe that, compared to other professions, that of psychiatrist has the potential, in terms of social perception at least, to hijack the identity.

Perhaps this is because the presence of a psychiatrist tends to make the layperson self-conscious. I would imagine that apart from celebrities of any sort, this is true of only a small number of occupations: priests or ministers mainly, but perhaps also police officers, judges, and teachers (and tellingly, psychiatry could be said to comprise elements of all those roles).

I can think of plenty of other occupations that would provoke greater admiration or even interest, but not necessarily more self-consciousness. Not long ago I read a blog post by an English professor who complained that everyone she met suddenly seemed apologetic about their grammar or knowledge of books. But I don't think that if I were a plumber (talk about a road not taken!), people I met would feel self-conscious about their pipes. Psychiatrists, preachers, law enforcement types, and teachers foster transferences (of the psychological, religious, legal, and academic kinds, respectively) wherever they go.

After I got married some years ago I was surprised by how many women asked my wife what it was like to be married to a psychiatrist. Can you imagine asking anyone what it is like to be wed to a teacher or an engineer? No, because they're assumed to be regular folks. No, this was like asking what it was like to be mated with a giant millipede, or perhaps a demented taxidermist. The answer is presumed to be titillating, but likely not pretty. No, my wife always answers (with, I never fail to note, something less than delectation in her voice) that it's pretty much like being married to any other man (as if that weren't the lily that couldn't be gilded in the first place).

Tuesday, January 27, 2009


"I must create my own system or be enslav'd by another man's."

William Blake

1. Inspiration...or the lack thereof. Like any good postmodernist, when no lively subject compels me, I can always write about the act of writing.

2. For a while now I have been cranking out a post per day, despite work and other responsibilities (Mrs. Novalis might say, "What other responsibilities exactly?" but don't mind her). Sometimes I think the self-imposed "pressure" (a bit melodramatic, I know, for a hobby) does help to keep the ideas flowing, but at other times it seems constricting and arbitrary (why not a post every other day, or two posts per day?).

3. Why call this "Psychiatrica" when mental health topics per se constitute only a minority of posts? Well, good question. To some degree I think that being a psychiatrist certainly shapes my approach to a number of issues, so I guess psychiatry is meant to be a kind of central organizing principle, although that doesn't mean that being an M.D. gives me any special insight into poetry or politics or whatever (except when it does).

While work is not disagreeable and sometimes is even interesting, inasmuch as the blog is a vehicle for creativity I often find myself wanting to express myself about matters other than work (otherwise it might start to seem like, well, work). And particularly after I added my name to the site, the use of clinical information, no matter how disguised, became more problematic.

It is often said that, more than others, Americans identify themselves with what they do for a living, so maybe that motivated the title as well.

4. Even before I added my name to the site, of course, a number of friends and family were aware of it, and this obliged some discretion. It would be a rather different experience I suppose to have a truly anonymous blog, identifiable by no one (except for the Google/Blogger voyeurs), in which case it would be a strangely quasi-private cyberjournal. On blogs where everyone pretty much uses his or her own name (like some of the arts or literature blogs I follow), it seems rather absurd to leave a comment under a pseudonym--sort of like showing up at an office party in a mask.

5. I think I'll take a day off from the blog every now and then (applause). The blog may be undergoing a reincarnation at some point, but I'm not sure. One can't rush inspiration in that respect either.

Monday, January 26, 2009

The Curious Case of William Joel

If music be the food of love, play on,
Give me excess of it; that surfeiting
The appetite may sicken, and so die.

Twelfth Night

High seriousness too shall pass. Like many others of my generation, I first came to know popular music in the form of Top 40, early 1980's material; I have spent the quarter century since trying to work through that aesthetic trauma. It took impressionable years merely to realize that there was much more in heaven and earth than such eminently ephemeral acts as Asia, Toto, and Tears for Fears. Shakespeare may be right in writing, above, that music can go from intoxicating to cloying in an instant, but all the same Ron Rosenbaum's dyspeptic diatribe against Billy Joel in Slate cannot go unanswered.

Why bash Peter Parker for not being Bruce Wayne, or Agatha Christie for not being Virginia Woolf? It is possible to enjoy high, middle, and low modes of music as with anything else. In my mind I tend to place Billy Joel on a soft-to-rugged continuum bounded by Elton John and John Mellencamp; all three were distinctly middlebrow even in pop-musical terms, alternately earnest and bombastic (as if bombast has no place in the arts). All three were derivative and impossible to take seriously, until one couldn't stop humming their irresistible, irrepressible tunes; the fact is that all three in their prime were inimitable composers of that contemptible genre, the pop song. They scampered amid the looming colossi of Dylan and the Beatles, but they couldn't help that.

From Piano Man in 1973 through An Innocent Man in 1983, Billy Joel generated a series of albums as good as any during that time; if that damns with faint praise, then so be it. To some degree, of course, he suffered from his own success--if Pink Floyd and Fleetwood Mac could be ruined through radio overplay, then how much more vulnerable would Joel's sometimes hothouse concoctions have to be? He could be saccharine (the execrable "Just the Way You Are" does not improve upon the hundredth hearing), and he could be crass (the, I suppose, forthright "Sometimes a Fantasy" from Glass Houses). He often came across as self-righteous and self-absorbed, but he was a pop star, for goodness' sake.

Sure, some may find "Say Goodbye to Hollywood" more pompous than anthemic, but one could say the same of Dylan ("Idiot Wind") or Springsteen (pick any song). But Joel could flat out come up with tunes, and his voice was strong and pure. If one can find no affection for "Piano Man" or "Scenes from an Italian Restaurant," then one doesn't like Joel's genre, for he perfected it in such pieces. The chip on his shoulder could be off-putting, but his "Honesty" was refreshing at times.

I would rank Turnstiles, The Stranger, 52nd Street, and the superb live album Songs in the Attic at Joel's apogee, such as it is. By The Nylon Curtain the strain was starting to show ("Pressure" indeed), and An Innocent Man was a charming late spate of creativity that pretty much exhausted his musical ideas. Everything since has been unfortunate. Every now and then an artist, like Rossini or Salinger, lapses into long silence after extravagant shows of genius; they are outnumbered, alas, by the many who should follow their example but don't.

In my opinion music critics, logocentrists as they would have to be, tend to overemphasize the lyrics of songs--their "message" or "statement"--far more than is warranted. Vocal music is not poetry that happens to have a tune; it is music first and foremost, and that's why we go to it. Similarly, the experience of opera is not primarily about plot. To be sure, the greatest operas have librettos that are of interest in their own right, but one needn't spend every moment with the greatest.

I am also convinced that musical affinity and aversion are intensely subjective; at times one is attracted or repelled by something for no rational reason. For instance, I tend to like introspective, countercultural female singer-songwriters; Joni Mitchell is reputed to be a great example; I should like her music. It does nothing for me. I could intellectualize about why that is, but I'm not sure it would have any wider validity.

This is not to say that criticism has no place in music. Musically and culturally, Joel is no Dylan, and Rosenbaum's article points out some reasons why. But on a more visceral level, for better or worse, he just doesn't "get" Joel's music; that's fine, but he needn't rain on the whole parade. Granted, my soft spot for Joel developed at a susceptible age; I rarely listen to him anymore, and while he wouldn't make my desert-island iPod, one could do worse. The world would be poorer without Joel's music; in the grand scheme, not very much poorer, perhaps, but poorer nonetheless.

So: three cheers for Piano Man, right?

Sunday, January 25, 2009

Just This

Yesterday I visited a local Zen center for the first time since moving here a few months back, and I was thinking this morning about subtle correspondences between Zen meditation and classical psychoanalytic practice. As psychological technologies, they are like half-siblings, sharing distinct affinities and oppositions. (I am neither a Zen devotee, exactly, nor an analyst, although I have long had an ambivalent fascination for both).

Both attempt to go beyond both everyday and logical concepts, which are viewed as limiting and distorting. In psychoanalysis, ordinary conceptual language is defensive and repressive, whereas in Zen language is mostly beside the point, an endless distraction. In psychoanalysis the instruments of language are turned ruthlessly upon language itself, in a kind of psychological self-vivisection; in Zen the swords of verbal intelligence are turned into the ploughshares of somatic awareness.

Suppose that you're looking for the needle in the haystack, where the needle bears a close resemblance to a piece of straw. In psychoanalysis, one painstakingly sorts through the hay, examining and comparing each piece exhaustively; in Zen, one steps back and focuses upon the haystack, appreciating the gleam of the sun upon it. The Zen question is: do you really need to find the needle in there? What if it doesn't exist? Even if it does, why do you assume it would make you happy?

Both psychoanalysis and Zen share a disconcerting austerity, a withholding of what we think of as the verbal and interpersonal creature comforts of life. No reassurance is to be had, only the paradoxical reassurance that one doesn't, finally, need reassurance. I therefore find in both a faintly sepulchral ambience: the reclining, quasi-solitary position of analysis, the still silence of Zen. Both are profoundly out of step with both conventional religion and consumerist (post)modernity.

Given the choice between an analytic hour, a meditation session, and some time with Shakespeare, I would always choose the last. But the other two have their own roles to play; how could we appreciate Falstaff without those elements that are opposed to Falstaff: iron discipline, limitation, and the void?

Friday, January 23, 2009

The Missing All

I know that He exists.
Somewhere -- in Silence --
He has hid his rare life
From our gross eyes.

'Tis an instant's play.
'Tis a fond Ambush --
Just to make Bliss
Earn her own surprise!

But -- should the play
Prove piercing earnest --
Should the glee -- glaze --
In Death's -- stiff -- stare --

Would not the fun
Look too expensive!
Would not the jest --
Have crawled too far!


I find Dickinson to be the most striking of poets because she inhabits the knife edge of agnosticism (about matters both metaphysical and theological), compared to which atheism is a cozy nook of fundamentalist certainty. From a purely external point of view, with respect to, say, church attendance or time spent in prayer, the agnostic and the atheist may appear indistinguishable, but subjectively there is all the difference in the world. Like any poet ought to, Dickinson sees this mystery, among other wonders, through fresh eyes, but with words that detonate upon reading, compounding gunpowder with the austerity of haiku. She may not have gazed upon God, but she sure did see the gaping God-shaped hole in the universe.

Jerry Coyne has written, in The New Republic, another devastating critique of both intelligent design and the often touted compatibility of religion and science. His lengthy but rewarding essay begins by reminding us that Charles Darwin and Abraham Lincoln were born on the very same day, February 12, 1809. The universe held its breath--can there have been any more world-historically significant day in the two centuries since? I will not attempt to summarize his trenchant analysis; a few comments will suffice.

Coyne notes that the fact that some scientists harbor conventional religious beliefs says more about the psychological inconsistency and talent for rationalization of human beings than it does about any real philosophical family resemblance. As he puts it:

True, there are religious scientists and Darwinian churchgoers. But this does not mean that faith and science are compatible, except in the trivial sense that both attitudes can be simultaneously embraced by a single human mind. (It is like saying that marriage and adultery are compatible because some married people are adulterers.)

Science does not directly disprove religion, but it has gradually removed the rational need for religion to attempt to explain any number of things about how the universe works. To be sure, there are things that science cannot explain, at least not yet, such as the origins of life or of the universe. But religion does not explain these either; it does offer purported descriptions of these origins, but this is not rational explanation, which depends on testability and experiment. To say that God created the universe is not to explain how the universe is created; it is, rather, to say, "Stop asking, in rational terms, how the universe was created."

Anyone who cares about psychology ought to take religion seriously, given that the great majority of human beings who have ever lived have been religious. The fact that a minority of people are unpersuaded no more undermines the centrality of religiosity to human nature than the occasional occurrence of voluntary celibacy negates the significance of sexuality. Religion arguably meets powerful needs for spiritual integrity of the self and the world, as well as underpinning morality and communal life. This is not to say that these needs cannot, some day, be met in different sorts of ways than they traditionally have been, but we ought not to overlook the needs themselves.

The problem, as Coyne writes, is that most religions go beyond moral, communal, and spiritual experiences to make specific empirical claims about the universe (involving, of course, creation, virgin births, heaven and hell, etc.). It is these claims that are specifically incompatible with science because they are immune to possible disproof by experiment (these claims have also led to much of the historical factionalism and violence that gives religion a bad name). The question is whether religion in the future can relinquish these empirical claims without losing its benignant influence for many people.

It's awfully hard to prove that something doesn't exist. Even so, I am virtually certain that, say, centaurs do not exist anywhere in the universe. I am so certain because the likelihood of intelligent life elsewhere (already a huge long shot of course) taking the form of such earth-generated mythological creatures is very close to nil, even if it cannot logically be shown to be zero.

I am very much less certain about God than about centaurs, largely because throughout history the notion of a supreme being has been so amorphous, and has taken so many diverse forms, that it is hard to know what we are talking about. In fact, it may be more accurate to liken the possibility of God not to any specified phenomenon such as a centaur, but to the potentiality of other intelligent life of whatever (in)conceivable form anywhere in the universe.

I personally have not encountered incontrovertible evidence, whether in my own subjective experience or available objective data, for the existence either of God or of extra-terrestrial intelligence. I cannot disprove their existence, and I have no interest in doing so, but I also don't go about my affairs under any specific assumption that they do exist. I try to do what's right, but I do it because it seems right, regardless of alleged supernatural justification. The fascinating and sometimes bewildering part is that while only a very small and eccentric fraction of my fellow human beings believe in UFO's, a rather large proportion of my earthly cohorts believe in God, so it behooves me to try to ascertain why that is and how we can all get along on this, the only world we really know.

Thursday, January 22, 2009

The Magpie

"I seem to have been only like a boy playing on the seashore, and diverting myself in now and then finding a smoother pebble or a prettier shell than ordinary, whilst the great ocean of truth lay all undiscovered before me."

Isaac Newton

Just a few tidbits from the Web today:

1. Following up on yesterday (I just can't seem to let this go), it is worth remembering that social life requires a certain minimum of formalized compulsion known as ritual in order to keep the threat of dissolution at bay. Thus President Obama and Chief Justice John Roberts performed an encore of the swearing-in, accurately this time. It is pure symbolism, but no less necessary for that.

2. While it's hard to fault any poet or any poem granted pride of place at an inauguration, I was not impressed by Elizabeth Alexander's effort, "Praise Song for the Day" (a transcript of her words is here, but apparently we don't yet have her original format). I suppose she was trying to be accessible, and the sentiments expressed were laudable, but the piece seemed both prosy and platitudinous; the language did not crackle or excite. As Horace would say, it instructed but did not delight. Or as Harold Bloom argued, noteworthy literature must be both deeply strange and absolutely true--strange needn't be bizarre or loopy, but it must be stranger than this poem.

3. William Saletan in Slate discusses the couple who are up for charges of reckless endangerment after their 11-year-old diabetic daughter died when they chose faith healing over traditional medicine. As he notes, this is pretty crude religion (as if medicine can't be a tool of God); it sounds like child abuse leading to manslaughter to me.

4. Mark Bauerlein at Brainstorm joins the lament over what he calls the "tsunami" of digital media threatening to submerge book culture altogether, although he contends that educators and other defenders of the faith (my term) must fight back all the harder. It seems to me that this kind of concern, which gathered steam in the 1990's but still seemed a bit cranky at that point, has reached a certain critical mass these days, although that doesn't mean anything will change. After all, it is now a culture-wide phenomenon and not the doings of any particular interest or industry; it has its own internal momentum now. (His metaphor struck me because the first grader at home loves watching videos of tsunamis and other natural disasters on YouTube; he's not a morbid child, really).

5. Jonathan Oberlander considers the prospects for health care reform under Obama in a commentary in The New England Journal of Medicine (one line summary: the financial and political obstacles for change remain massive, but the economic crisis could shake up traditional interests and power blocs just enough to actually get something done this time around).

6. I speculate a lot about identity from time to time. Donald Winnicott is famous for his distinction between the "true" and "false" self, and we all feel that we know people who fall more into one or the other category. Identity comprises both natural endowments and the narratives we aspire to, and we hope that the two go together in a way that seems at least remotely organic. David Hajdu in The New Republic makes the case for Lucinda Williams as the genuine article, as a musician who has remained true to herself in a business notorious for fakery (she is compared favorably in this respect to Taylor Swift and Beyoncé).

Wednesday, January 21, 2009


"I'll follow him around the Horn, and around the Norway maelstrom, and around perdition's flames before I give him up."

Captain Ahab

I'm not big on citing reviews of books I haven't read, but this one, Brian Dillon reviewing Obsession, by Lennard J. Davis, provokes thought at least. Arguably monomania is a great human and historical subject that has been very much neutered and watered down (there I go, mixing metaphors again) into the current constructs of obsessive-compulsive disorder and its lite counterpart, obsessive-compulsive personality disorder (if I only had a dollar for every time a medical student talked about having OCPD).

OCD is interesting both in a mainstream clinical way and also as a reflection of human experience. Clinically, I have always found it to be an overlooked diagnostic stepchild; every now and then one glances to the corner and thinks, "Oh, is he still here?" And some in psychiatry feel that it is vastly underdiagnosed. But doesn't it seem like all mental disorders, depending on who you ask, are either greatly exaggerated or woefully underrecognized? It almost seems like a separate criterion for being considered a mental disorder.

But I would say that OCD is often overlooked for two good reasons. One is that most psychiatrists begin their training in inpatient settings in which they learn to attend to a small constellation of very salient presentations: suicidality, severe depression, mania, psychosis, substance abuse, and personality disorder. Virtually no one is hospitalized specifically for OCD, although to be sure it can be associated with depression and other quite serious problems.

Unfortunately, available medications often forcefully shape diagnostic thinking, and OCD is not specifically targeted by any one class of medications. That is, it does respond to serotonin reuptake inhibitors, but these obviously treat depression and other sorts of anxiety as well. Most OCD of clinical significance will be associated with depressive symptoms also, and the prescriber takes comfort in the knowledge that the medication that will treat one will usually treat the other (there are a few non-serotonergic antidepressants such as bupropion and nortriptyline that will not treat OCD very well). Many with mixed depression and anxiety have vague ruminative propensities that may fall short of technical OCD.

In a wider human sense obsessiveness is a dark mirror image of our necessarily limited attention to the environment, a limitation that increasingly earns the label of ADHD--both are problems of assessing and assigning priority. In general I think we underestimate the profundity of attention as a neurological capacity; after all, it is the means by which we come to attach salience and value to the world and to our own lives and identities.

Attention was hugely complicated by the advent of consciousness. Other organisms don't have to deliberate over whether to turn toward the sun, to flee the predator, or whatever. Of course, we don't usually obsess over such basic functions, except when we do, when we have an eating disorder, a sexual conflict, or a masochistic streak. It's fascinating to speculate about some hapless hominid eons ago who could have been the very first one to experience some very primitive glimmer of awareness. The reassuring gray tunnel of instinct, the original one-track mind, suddenly split into two tunnels, and unlike all the previous times, it wasn't obvious which way was best. How terrifying that cognitive precipice must have been--or is to everyone when they reach their first Decision, whatever that may be.

What most interests me about people isn't so much what has happened to them, but what they have come to value and care about, what they attend to. Depression is a syndrome of inattention as well in a way--it drains the world of value. Indecisiveness is a central depressive symptom that is often overlooked; the depressive world loses its contrast, such that it becomes harder to make necessary distinctions. All the decisional tunnels look dismayingly similar.

I'm intrigued by what makes people choose to be generalists or specialists, whether in academia or in life. I'm more the former by temperament, but sometimes I envy those who can devote their whole lives to, say, Paul Klee, Jonathan Swift, or Jenny Lewis. Okay, no one devotes his whole life to Jenny Lewis, but would that necessarily be wrong?

Well, this was a late afternoon ramble, and it shows. If there is any medium that defies obsession, it must be the humble and ephemeral blog, which must be revealed to the world daily, sometimes even hourly, in a wretchedly imperfect state. I should start a blog which has precisely one post per year, a wonder of wit and insight, part diamond and part supernova (a Silmaril, if you will), that I can polish and fuss and putter over through the months until every facet is like a window onto God. I should, but one morning that jewel would lose its luster, and I would lose interest. And the next post beckons (as does family, food, etc.).
Post revision #1 5:19 P. M. E. S. T.
Post revision #2 6:19 P. M. E. S. T.

Tuesday, January 20, 2009

Poe Country

The Carolinas are shut down by a devastating winter storm. Snow up to the eaves...okay, they're expecting just three inches here, but in Tar Heel terms that would seem to be a blizzard; considering the two-lane country roads all around, and drivers inexperienced on snow, I'm thinking I'll be going in late if at all.

Any unfortunate double-entendre in today's header is unintended; after all, politically if not economically speaking, we are looking at a relative embarrassment of riches starting today at noon.

No, the title refers of course to Edgar Allan Poe (1809-1849), who, I happened to see, turned 200 yesterday. He led an interesting life, if one marked by much suffering and a premature death that deprived us of who-knows-what wonders. As part of a venerable literary tradition, he was, of course, severely alcoholic.

I had forgotten, until I reviewed his bio, that he spent some ill-suited time at West Point (a less likely military man can hardly be imagined) and that he married his cousin Virginia Clemm when she was thirteen. I am sure some psychobiographers have worked up that relationship. Come to think of it, Samuel Taylor Coleridge ludicrously signed up for military service as well--is there something about poets and martial reaction formation? It's a bit like cats presenting for obedience training.

Poe has a special place in my reading remembrance--at an impressionable time he was one of several "gateway" authors for me. Encountering stories like "The Tell-Tale Heart," "The Cask of Amontillado," and "The Masque of the Red Death" in Junior High School was a kind of dark revelation (well, there were a number of those in Junior High, but this was of the good variety). His work opened up vistas of alternative reality, Bizarro-worlds in which everything was as it is "here," but rotated one quarter turn, with highly morbid consequences.

The plots were ingenious, but literature has never been primarily about plot for me, but rather about place and about character. Like the infernally enchanted worlds of Robert E. Howard, but refined beyond literal and figurative barbarism, Poe's places seemed to exude an eldritch kind of energy, powered by the use of words like "eldritch" that are irresistible to many thirteen-year-olds with a Y chromosome. And I suppose his tales may have provoked an early interest in abnormal psychology--who would not want to treat the distinctly dysfunctional dynamics of the Usher family?

What attracted me in Poe's mystery was not the puzzle of ratiocination, but rather the deeper, metaphysical mind-benders that have no solution. So my adolescent literary interests proceeded more to fantasy and science fiction than to detective or horror fiction; I have always found wonder to be a more primal and a more interesting experience than either fear or wordplay.

Some authors are great, but great chiefly at a certain crucial age, after which their pleasures diminish. So when I reread Poe now it is still with interest, but an interest imbued with a good deal of reminiscence. As is the case with much of so-called genre fiction, he seems more gimmicky now--one can hear the creak of the literary machinery. An interesting comparison for me is H. P. Lovecraft; I somehow never read him until I broke down and bought the Library of America edition a few years ago. I appreciate the deep strangeness of his style, and am glad that I have finally encountered Cthulhu in the original, but somehow it is not something that I can take seriously at this point (but had I read him twenty-five years ago I would have thought him fantastic).

It may seem obvious, but it bears noting that the most significant thing about Poe's worlds is that death is not merely always a heartbeat away, but indeed keeps intruding inconveniently into life. His young cousin-wife died early from tuberculosis, and considering his lifestyle and his milieu he cannot have expected a comfortable old age. So we have "The City in the Sea:"

Lo! Death has reared himself a throne
In a strange city lying alone
Far down within the dim West,
Where the good and the bad and the worst and the best
Have gone to their eternal rest.
There shrines and palaces and towers
(Time-eaten towers that tremble not!)
Resemble nothing that is ours.
Around, by lifting winds forgot,
Resignedly beneath the sky
The melancholy waters lie.

And the grievous "Annabel Lee:"

For the moon never beams without bringing me dreams
Of the beautiful Annabel Lee;
And the stars never rise but I see the bright eyes
Of the beautiful Annabel Lee;
And so, all the night-tide, I lie down by the side
Of my darling, my darling, my life and my bride
In her sepulchre there by the sea--
In her tomb by the side of the sea.

And this cheering lyric:

Out--out are the lights--out all!
And, over each quivering form,
The curtain, a funeral pall,
Comes down with the rush of a storm,
While the angels, all pallid and wan,
Uprising, unveiling, affirm
That the play is the tragedy, "Man,"
And its hero the Conqueror Worm.

May he inspire adolescents, including the grown-up kind, for another two centuries at least. He is for the ages now, as was said of another American of his time, and if he was obsessed with death, he could now say, with John Donne, "And Death shall be no more; Death, thou shalt die!"

Monday, January 19, 2009

The Sense of an Ending

This being the Martin Luther King, Jr. holiday here in the States, I was going to extend my Pre-Inaugural Poetry Break for another day, but a typically unsentimental commentary by Stanley Fish in the Times warrants notice. Fish has long advocated for the self-sufficient, thought-for-thought's sake status of the humanities in education; that is, a la Harold Bloom, he decries any notion that the humanities make us "better" people in any broad or conventional way, but he sees this as their great merit, that they are (in ways that I would view as quasi-spiritual) ends in themselves and not means to (allegedly greater) social ends.

He likely still thinks this about the general nature of the humanities, but he has decided that their place in the contemporary university is all but gone, except for a few lonely redoubts that he quaintly terms "museums." Universities are increasingly blatant in their roles as essentially customer-friendly vocational schools, aiming to endow students with the "skills" necessary for "today's economy." Because the arts, literature and philosophy do not obviously or simplistically provide such skills, they are increasingly dispensable and barely even demand lip-service.

So I am reminded not only of my post of a few days ago, in which reading in general threatened to become an "arcane hobby," but also of a couple of trends in medical education. One is the inevitable erosion of psychoanalytic and, in a wider sense, humanistic thought in psychiatry and psychiatric education over the past twenty years or more. This is convincingly analogous to the process Fish describes.

The second and ironic evolution or, rather perhaps, reaction, has been the attempt to restore interest in narrative to medical education and discourse. This quixotic project, in which I played an exceedingly obscure role in the past, seems a bit like trying to open a lending library inside a video arcade, but stranger things have happened. Yet in general the sense is that we have entered a more crass and mercantile age--but this too shall pass, even if after our lifetimes. Meanwhile, enclaves of enlightenment will go on.

But, for the poem I had in mind for today, befitting this final day of a dank, dim, and dispiriting age in American politics, I summon that sour, dour Californian Robinson Jeffers (1887-1962):

Shine, Perishing Republic

While this America settles in the mould of its vulgarity, heavily thickening
to empire,
And protest, only a bubble in the molten mass, pops and sighs out, and the
mass hardens,

I sadly smiling remember that the flower fades to make fruit, the fruit rots
to make earth.
Out of the mother; and through the spring exultances, ripeness and
decadence; and home to the mother.
You making haste haste on decay: not blameworthy; life is good, be it
stubbornly long or suddenly
A mortal splendor: meteors are not needed less than mountains: shine,
perishing republic.

But for my children, I would have them keep their distance from the
thickening center; corruption
Never has been compulsory, when the cities lie at the monster's feet there
are left the mountains.

And boys, be in nothing so moderate as in love of man, a clever servant,
insufferable master.
There is the trap that catches noblest spirits, that caught--they say--
God, when he walked on earth.

(Note: poem format, but not content, altered to fit blog platform).

If this republic was "perishing" in 1925, what is it now--darkest undead? Is resurrection possible, by degrees? Perhaps eight years will suffice.

Sunday, January 18, 2009

Solar Plexus

Busy, busy, busy...The best thing for a Sunday really seems to be plenitude, the sense that there is not merely enough for us, but more than we could possibly take in; and plenitude ought to lead naturally to gratitude. Four and a half billion years old, and still blazing away with appalling violence, cushioned to a caress by distance and atmosphere.

So I leave you with this poem I like from Mary Oliver:

The Sun

Have you ever seen
in your life
more wonderful

than the way the sun,
every evening,
relaxed and easy,
floats toward the horizon
and into the clouds or the hills,
or the rumpled sea,
and is gone--
and how it slides again

out of the blackness,
every morning,
on the other side of the world,
like a red flower

streaming upward on its heavenly oils,
say, on a morning in early summer,
at its perfect imperial distance--
and have you ever felt for anything
such wild love--
do you think there is anywhere, in any language,
a word billowing enough
for the pleasure

that fills you,
as the sun
reaches out,
as it warms you

as you stand there,
or have you too
turned from this world--

or have you too
gone crazy
for power,
for things?

When it is so cold and dark this time of year it is easy to forget that such incomprehensible radiance is there, where it always was, for us.

Saturday, January 17, 2009

Nature, Red in Beak and Talon

After last post I was thinking about cute and cuddly birds and airborne carnage, and was reminded of a rather different view from D. H. Lawrence:


I can imagine, in some otherworld
Primeval-dumb, far back
In that most awful stillness, that only gasped and hummed,
Humming-birds raced down the avenues.

Before anything had a soul,
While life was a heave of Matter, half inanimate,
This little bit chipped off in brilliance
And went whizzing through the slow, vast, succulent stems.

I believe there were no flowers then,
In the world where the humming-bird flashed ahead of creation.
I believe he pierced the slow vegetable veins with his long beak.

Probably he was big
As mosses, and little lizards, they say, were once big.
Probably he was a jabbing, terrifying monster.

We look at him through the wrong end of the long telescope of Time,
Luckily for us.

When this was written, in 1923, did many people suspect that birds descended from dinosaurs? They may look charming, but they may not be very nice, no, not at all.

Silly Goose

Just a poem today, but first an ecological aside. The Hudson River landing was epic all around, but a couple of the early reports contained speculations that birds had struck the plane. Well, let's get our direct objects right--who struck whom, exactly? In another article it was said that "both engines ingested multiple birds," which is more gruesome but also more accurate. No, I am no Audubon fanatic (I ride in those bird-killing machines too), but there would seem to be a moral of modernity here. "Are birds a problem?" we all suddenly wonder. Who knew that avian death lurked so close overhead?

So, in a bit of a crotchety and curmudgeonly mood, and reflecting on the frigid landscape (for the Carolinas anyway) and the passing of quasi-folk artist Andrew Wyeth, my thoughts came to roost on Robert Frost. Frost is often thought of as a folksy guy, and perhaps he was, but in a dark sort of way. The "black dog" hounded his life: he, his mother, and his wife suffered from depression; his daughter was committed to an institution, and his son committed suicide at age 38.

No "snowy evenings" here, but instead this:

An Old Man's Winter Night

All out-of-doors looked darkly in at him
Through the thin frost, almost in separate stars,
That gathers on the pane in empty rooms.
What kept his eyes from giving back the gaze
Was the lamp tilted near them in his hand.
What kept him from remembering what it was
That brought him to that creaking room was age.
He stood with barrels round him--at a loss.
And having scared the cellar under him
In clomping here, he scared it once again
In clomping off--and scared the outer night,
Which has its sounds, familiar, like the roar
Of trees and crack of branches, common things,
But nothing so like beating on a box.
A light he was to no one but himself
Where now he sat, concerned with he knew what,
A quiet light, and then not even that.
He consigned to the moon--such as she was,
So late-arising--to the broken moon,
As better than the sun in any case
For such a charge, his snow upon the roof,
His icicles along the wall to keep;
And slept. The log that shifted with a jolt
Once in the stove, disturbed him and he shifted,
And eased his heavy breathing, but still slept.
One aged man--one man--can't keep a house,
A farm, a countryside, or if he can,
It's thus he does it of a winter night.

A bracing reminder for Inauguration Day--one man can't keep a house. It takes a village--no that's not what I mean. "Be a light unto yourself," the Buddha said; ideally, unto others too.

Friday, January 16, 2009

By the Book...worm

"A little in nature's infinite book of secrecy I can read."

Soothsayer (Antony and Cleopatra)

I pick up a book every now and then, so I was interested to see another elegy on reading's long-foreseen demise, this one by Christine Rosen in The New Atlantis. She cogently presents the standard arguments for reading's unique and endangered virtues; bibliophile that I am, part of me sympathizes, yet I'm not sure that I buy the overall panic. But there must be many stages of moderation between her the-Dark-Ages-have-come-again gloom and the simplistic better-living-through-screens views of Internet boosters.

The problem is that like other book apologists, Rosen begins with distinctly moral advocacy of reading (and we are talking habitual and volitional reading here, not literacy in the basic and technical sense), but glides inevitably into wistful appreciations of the aesthetic culture of books. The two points of view raise rather different issues.

Rosen argues, like other literary eulogists, that reading uniquely exercises powers of concentration and attention, which in turn enable facility with abstract concepts, the kind of faculty which, as she points out, has powered modernity for the past five centuries. How does reading do this? Well, the central point seems to be that reading requires humility and "submission" (Rosen's word) to the author, that is, a certain openness to another's reality. In other words, in undertaking reading one acknowledges one's ignorance about some corner of reality and approaches a teacher in the guise of a learner.

To make sense of a book, one has to patiently proceed from sentence to sentence, paragraph to paragraph, page to page, much like Dante being guided through hell by Virgil. In contrast, in a computer format one is immediately "empowered" to scan however few words per page one wishes and to click away with barely a thought. Unlike a book experience, one doesn't take the author whole, but rather skims over whatever bits and pieces one thinks are relevant; Rosen would argue that one often doesn't know ahead of time what is relevant and what isn't--that's part of what a book is supposed to teach.

The argument reminds me of the lament over the rise of music singles over albums; while one "submits" to hearing a full album to learn what "teachings" it may hold, with an mp3 player one is free to jump around from track to track, even from chord to chord. In a metaphorical version of this argument, if book-reading is gliding above a forest, dipping low from time to time to appreciate individual trees, then screen-reading is like driving among the trees at high speed--one may feel more in control and more intimate with the experience, but the actual result is disorientation.

Rosen even quotes a description of screen-users as "promiscuous, diverse, and volatile." Wow, that would certainly seem to be directed at bloggers--I'm thinking of myself differently already. Forget the library--where's the party?

I don't know that reading is mainly about submission. On the contrary, books offer enormous liberation as compared, say, to attending live performances (at a theater today or, in pre-literate cultures, at some oral storytelling event). One is free to take in a book at one's own speed, to reread, to skim at times, and of course to say the hell with a particular work without having to offend anyone.

There is no doubt that reading is a very worthwhile and even (trans)formative experience for those who relish it; the question is whether it inculcates skills and virtues that necessarily carry over into non-literary avenues of life. As Rosen notes, Harold Bloom among others has thought not, that reading is a deeply personal and spiritual activity, but not one that enhances the wider culture in any straightforward way.

So far as I know, the alleged cognitive and emotional virtues of reading are based largely on speculation and anecdote ("kids these days just seem more distractible"). I know we're talking about prose above a certain moderate level of complexity, but are we talking only fiction here? Is philosophy better or worse than history or fiction, and does biography count?

And as I said, such concerns always seem to segue into panegyrics about the aesthetics of books--the feel and even the smell of them and their paper, the pleasures of scanning a bookshelf, etc. While Rosen seems aware of being deemed an old fogey (I don't know her age or if she is in fact a fogey), she can't resist references to young people being noisier in libraries than they used to be (kids these days!). Well, I think books are sexy too, but the aesthetic details are irrelevant to the wider cultural concerns; I'm sure that with the advent of the automobile some mourned the loss of leisurely rides in an open carriage drawn by fine horses (an experience for which people continue to pay today, it is worth pointing out).

There is no doubt that things are changing, but it remains an open question whether for the Worse. Book-reading seems unlikely to become "an arcane hobby," as Rosen puts it, and I would guess that books and screens will be complementary, that is, used for different purposes, for most folks who read at all (and recall that there have always been people who, even if they could read, have not done so for fun). But avid book-lovers probably will become more uncommon. Many opera-lovers probably assume that society would be much better off if more people loved opera; some book folks are likely the same way.

Rosen's essay notes that one study showed regular book-readers to be high Internet-users as well, but I wonder what age groups that involved, and whether that is a transitional phenomenon. My family got our first computer, a Neolithic Commodore 64, probably when I was in my early teens, so my generation is unusual in getting computer exposure at an impressionable time, but also in having pre-computer memories.

So let's see--if you visit this site you know your way around the Internet but presumably have some off-the-beaten-search-engine interests. Are the days of books "fallen into the sere?"

Thursday, January 15, 2009


"The test of a first-rate intelligence is the ability to hold two opposed ideas in the mind at the same time, and still retain the ability to function. One should, for example, be able to see that things are hopeless and yet be determined to make them otherwise."

F. Scott Fitzgerald

"The Crack-Up"

That quote came to mind yesterday when I was thinking about the paradoxes of psychiatry (the medication is making me sick but seems to be helping; I must accept myself yet learn how to change myself; I have lost control over my drinking but must stop drinking). I hadn't read Fitzgerald's great essay "The Crack-Up" in some years; rereading it, I see how poignantly it chronicles a classic mid-life crisis and depressive episode.

F. Scott Fitzgerald (1896-1940) had some contacts with psychiatry, not only through his own severe alcoholism, but more extensively through his wife Zelda's descent into diagnosed schizophrenia beginning in the late 1920's. The Fitzgeralds were a hard-partying literary glamour couple in the decade of the novelist's acclaimed success with This Side of Paradise (1920) and The Great Gatsby (1925).

Around 1930 things started to go sour all around for them: the era of good feelings gave way to the Great Depression, Zelda began to go in and out of sanatoria, and Scott's writing came less easily. Tender is the Night (1934) is considered strong in retrospect, but at the time it garnered critical disappointment, something the author was not at all used to. The story of an ambitious young psychiatrist's ill-fated marriage to one of his patients (we avoid that for good reason), the book was well-known to draw upon real-life marital details. In 1936 Zelda entered Highlands Hospital in Asheville, just a few hours down the road from here, where she remained until 1948, when she and eight other patients were tragically killed in a fire.

So when Fitzgerald published "The Crack-Up" in 1936, things were not going at all well, and overall it is a striking account of personal disintegration. The ever macho Ernest Hemingway (who would of course later receive ECT and eventually suicide) was said to be appalled by the vulnerability on display. The essay is substantial and is written in a puzzlingly (or not) elliptical style, but it certainly merits reading in its entirety. Fitzgerald died of a heart attack four years later. That was tragedy enough, but to my mind it begat an even greater one when Nathanael West was killed in an auto accident on his way home from a trip to Fitzgerald's funeral service.

"The Crack-Up" is filled with arresting images of emotional struggle and disappointment. Fitzgerald is remarkably vague about the specific causes of the breakdown, although he claims early on to have been sober for six months when it happened. But he alludes cryptically to "drawing on resources that I did not possess...mortgaging myself physically and spiritually up to the hilt." His own model for it seems to be a process of depleted energies more than of psychological conflict; if anything, then, his view is more allied to today's depression than to the psychoanalytic paradigms of his time.

More than any DSM checklist, he depicts the withdrawal, the self-isolation, and the fading of general passions that depression entails. Once routine undertakings become herculean challenges, and the range of interests narrows. He writes, "All rather inhuman and undernourished, isn't it? Well, that, children, is the true sign of cracking up." When a friend tries to boost his spirits, he counters that "of all the natural forces, vitality is the incommunicable one" (so the deeply depressed view the "power of positive thinking" with incomprehension).

One gets the sense that early success may have come to him too completely and with too little effort. Life beyond early adulthood can only be a relative disappointment compared to this:

"My own happiness in the past often approached such an ecstasy that I could not share it even with the person dearest to me but had to walk it away to quiet streets and lanes with only fragments of it to distil into little lines in books."

I thought of Wordsworth's "Immortality Ode" ("But yet I know, where'er I go/ That there hath past away a glory from the earth"). The joys of mid-life and later are not lesser necessarily, but they may be more subtle, and may require more conscious labor in their attainment.

Fascinatingly in view of recent concerns about video and the decline of reading, Fitzgerald mourns the ascent of movies:

"It was an art in which words were subordinate to images, where personality was worn down to the inevitable low gear of collaboration. As long past as 1930, I had a hunch that the talkies would make even the best selling novelist as archaic as silent pictures."

This was in 1936, in an era which, compared to today, is held to have been a veritable Golden Age of American literature!

"The Crack-Up" does not end with any great affirmation, whether reassuring or corny. On the contrary: "This is what I think now: that the natural state of the sentient adult is a qualified unhappiness. I think also that in an adult the desire to be finer in grain than you are, "a constant striving" (as those people say who gain their bread by saying it) only adds to this unhappiness in the end -- that end that comes to our youth and hope."

While he lived four more years in physical form, it is tempting to see a moral and psychological death here, the fall of an idealist. He had written, of an early life disappointment, "A man does not recover from such jolts -- he becomes a different person and, eventually, the new person finds new things to care about." If he's lucky, which in the end, perhaps F. Scott Fitzgerald was not. Or perhaps we should say: call no man unlucky who wrote, in The Great Gatsby, what many deemed "the perfect novel."

Wednesday, January 14, 2009

'Nuff Said

The readers of blogs are insatiable; "What have you done for us lately (i.e. since yesterday)?" they clamor (as one of them, I should know). Well, I have in recent days espied kind words about this here blog at Shrink Rap, Jung at Heart, and Dr. X; and today Maggie's Farm has included Ars Psychiatrica among its best blogs of 2008. Not sure I'm worthy, but much obliged to y'all!

Clinical Notes

It's become just like a chemical stress
Traces the lines in my face for
Something more beautiful than is there

Rilo Kiley

1. Speaking of ambiguity in psychiatry, benzodiazepines are a great example. A certain and significant subset of patients crave them, and sometimes they seem to be more trouble--and harm--than they are worth. It might be such a relief to be a fundamentalist and determine never, ever to prescribe them; but there is a moral microcosm here about personal purity and meeting the world as it is. Not only do lots of patients come to a new psychiatrist already on benzodiazepines, but for every person who may abuse them there seems to be at least one who is an ideal candidate, and for whom no other treatment will do. So the peculiar dance, part waltz, part tug of war, goes on. Xanax is where I draw the line though; and yet...never say never.

2. While I don't know of a good colloquial or clinical term for it, I am convinced that there is a kind of psychological hypochondriasis, that is, overconcern for psychological rather than physical conditions, that is far harder to treat than either pure anxiety or classic hypochondriasis relating to physical symptoms. You can't really give yourself cancer by worrying about the possibility, but can you worry yourself into having a rip-roaring depression or anxiety disorder by brooding on the potential of such maladies? Unfortunately, yes.

3. There was moral panic in the 1990's about Prozac and other newer antidepressants making people "happier than happy." The theoretical concerns persist, but by now I think most people realize that antidepressants aren't strong enough to afford much euphoria to the normal (they're only moderately good at treating the unambiguously depressed). However, the situation appears to be different with so-called "stimulants" like Ritalin and Adderall, classically used for ADHD.

As William Saletan at Slate notes, stimulants may be the new enhancement drug in baseball. Compared to a 3-4% prevalence in the general population, apparently upwards of 8% of pro baseball players are submitting medical justification (in the form of a diagnosis of adult ADHD) for taking stimulants. The problem is, unlike antidepressants, stimulants may well enhance useful qualities like energy and attention in normals, and as with depression, the boundaries of ADHD are quite fuzzy and subjective, vulnerable to being stretched all sorts of ways. However, one might object that ADHD folks might be expected to be overrepresented in sports settings, because the syndrome often inclines those who have it to pursue active lifestyles (when accountants start claiming adult ADHD in large numbers, we'll know something is fishy).

4. I have found humor in psychiatry to be a funny thing (so to speak)--okay that wasn't very funny. As one would expect from human nature, psychiatric humor does go on, and widely, but so far as patients are concerned it is behind the scenes, surreptitious. Because of the long and lamentable chronicle of the stigma and mishandling of the mentally ill, psychiatric humor is sort of like racial humor, practiced above board only if one is of an appropriate ethnicity and patienthood. That is how it should be--respect and professional ethics demand it. And compared to other professions, those in psychiatry (and I include here nurses, therapists, etc.) see more than their share of the absurd, which, human nature being what it is, provokes involuntary amusement. It is a stressful profession at times, and, well, we creatures need to find humor in situations, if we're not allowed to find it in people.

So I mention this vignette not for the sake of any unseemly humor, but only as a break from my endless harping on ambiguity in psychiatric diagnosis...

Well, I started to write a full vignette, but even with details removed, it ain't right, for this venue anyway. Suffice it to say though that if you have been psychiatrically hospitalized twenty times, if you store your feces in your freezer, if you bear said feces to church (freezer being full, presumably), and, having been hauled to the emergency room, become extremely agitated and transform your room there into a scene of scatological mayhem, then we can safely leave diagnostic subtlety behind and say, ladies and gentlemen, we have a diagnosis. I appreciate the symbolism and possible psychological meanings of what the fellow was doing, but he really, really, really needs to stay on some medication this time.

Tuesday, January 13, 2009

Everyone a "Philosopher"

First (the Dodo) marked out a race-course, in a sort of circle ('the exact shape doesn't matter,' it said) and then all the party were placed along the course, here and there. There was no 'One, two, three, and away,' but they began running when they liked, and left off when they liked, so that it was not easy to know when the race was over. However, when they had been running half an hour or so, and were quite dry again, the Dodo suddenly called out 'The race is over!' and they all crowded round it, panting, and asking, 'But who has won?'

This question the Dodo could not answer without a great deal of thought, and it sat for a long time with one finger pressed upon its forehead (the position in which you usually see Shakespeare, in the pictures of him), while the rest waited in silence. At last the Dodo said, 'everybody has won, and all must have prizes.'

Lewis Carroll

Courtesy of Arts and Letters Daily, I found a compellingly spot-on take on contemporary moral relativism by Anthony Daniels at New Criterion (for such a profound issue, his analysis is remarkably brief and easy to read). Basically, his thesis is that moral relativism results when we grow too impatient for absolute and irrefutable philosophical justifications for ethics and morality. When no secular version of the Ten Commandments is unanimously forthcoming, the automatic fallback position is, "Well, then everything is relative."

It is a kind of black and white thinking: if ethical guidelines do not club us over the head with irresistible force, then they must be wispy and dispensable. We crave an ethical absolute that, outside of religious strictures, is no longer available. It's a bit like violating a diet one is on and then deciding, "Well, what the hell, I may as well eat everything in sight."

Why is relativism increasingly a problem? Daniels has some interesting answers. One is the relative decline, in global terms, of Western political and cultural influence as compared to Asia (and, as I would add, Islam--could it be that some Westerners, viewing the fanaticism that characterizes a small but highly visible fraction of Muslim culture, implicitly decide that maybe relativism isn't so bad by comparison?). Second, the broadly decried emphasis upon individualism and consumption for several decades now has coarsened moral considerations.

But third, and interestingly, he suggests that, well, increasingly educated populations are less satisfied with the kinds of simplistic (but potentially useful) moral doctrines that prevailed in centuries past (think the Ten Commandments, the Golden Rule). That is, the average man on the street is much better than his ancestors at sophistry--at bending philosophical arguments to suit his own inclinations. He has just enough knowledge to be dangerous.

The problem is that, as many philosophers have of course argued, ethics broadly considered draws on far more than the working out of logical syllogisms in some university library. It depends on evolutionary psychology, which we can assume hasn't much changed. But it also rests on the cultural consensus of shared tradition and decorum, on a kind of moral common sense that we increasingly seem to have lost. As individuals and as a culture, we have become clever enough to question and undermine communal common sense--but as my son teaches me anew every day, it's a lot easier to take things apart than to put them together.

I see an interesting correlate here with psychiatric diagnosis. Some seem to think that if mental disorders aren't as structured and as clearly laid out as, say, the periodic table of the elements, then they don't exist, or are merely social constructions. But there are levels of cultural common sense, the sense of ethics and the sense of illness, both of them falling under the wide rubric of the sense of the proper way to live. The common theme here is the tolerance, or lack thereof, for ambiguity, which is the natural habitat of ethics, law, and medicine.

The question is how much whole cultures can live with ambiguity. Socrates died because he was accused of blurring the clear black and white signposts that were "the gods." Those signposts served useful purposes, but they were also used to bludgeon people to death (just ask the Grand Inquisitor). But did Socrates have a substitute that whole cultures can live with? Can people live amid infinite shades of gray, without going blind or starting to hallucinate?

What is the good life anyway? Supposedly W. H. Auden said or wrote somewhere, and I paraphrase, "The purpose of my life is to help other people. What they themselves are good for I have no idea."

George W. Bush as Aphasic Bully

Two interesting Bush links, one sobering and one hilarious. The former, a clinical take-down of sorts by Justin Frank at The Daily Beast, argues that Bush is a classic bully, combining deep inadequacy with a cold and distant upbringing, and a bully whose callous and sadistic ways have in fact carried forward into his Presidency.

I am not a student of Bush--I didn't even see Oliver Stone's W. So I will defer comment on the individual. But to my eyes the bullying attitude has gained remarkable ascendancy in both political and popular culture. In my opinion this has been the particular province of the Republican Party and Fox News as its media mouthpiece.

By comparison, the Democratic Party and MSNBC are not innocent of this sort of thing, but they are as the moon to the sun, the one merely reflecting the intensity of the other. Rather than respectful disagreement about issues, bullying involves a sneering demeanor, an implacable self-regard, and a gleeful contempt for vulnerability. The explosion of reality TV, which usually seems to involve humiliation of contestants, entails this kind of sadism as well (let's all laugh at how poorly he or she sings on American Idol).

The second piece is Jacob Weisberg's collection of the 25 greatest Bushisms at Slate. How could the Broca's area of one brain produce so many stunning malapropisms, even over a period of years? We don't want to pathologize it of course, so the only thing is to be amused. Highly recommended.

Monday, January 12, 2009

"I Am"

Mundanity, like inanity, should be carefully titrated in a blog. What charms a parent encounters incomprehension in a world of (potential) readers. For instance, like any young children, ours occasionally revel in bodily function humor that, to put it mildly, is not ready for prime time.

But the other day we were on the way to get the first grader's haircut, and he started in on infinity again:

"Daddy, what is the biggest number?"

"I've already told you, infinity."

"But I thought you said numbers get bigger forever."

"They do--infinity isn't really a number, it's the idea that numbers keep getting bigger and bigger and never stop."

"What's the biggest number that isn't infinity?"

"Infinity minus one, I suppose, but that isn't really the name of a number either."

"Oh, Daddy!"

Sometimes if you look at an astronomical image of a star or nebula, perhaps, that is a few thousand light years away, you'll see blobs of light that appear to be stars but that on closer inspection reveal themselves to be minute swirls, galaxies that could be two billion rather than two thousand light years away. It's like being pulled into an infinite regress, like somehow falling off the earth and plummeting the length of the solar system, or like those dreams most people have of limitless free fall.

I was pondering childhood, puzzles of identity, and unbridgeable distances, and what popped out of the conceptual labyrinth but...John Clare. Clare (1793-1864) was sort of on the Junior Varsity team of British Romantic poets (subbing for William Blake perhaps).

Clare was intriguing for a couple of reasons. Many famous folks hail from humble beginnings, of course, but Clare was unusual in coming from a situation of nearly absolute indigence. His farming family was literally dirt poor; when I read a full biography of him a few years ago, I recall learning that some of his early poems were written on scraps of bark because there was no paper. He had very little formal education because he started work in the fields early (the Abraham Lincoln of poetry I suppose).

Clare struggled with his identity throughout his career because while he was patronized by publishers as a woodland wunderkind, they also tried to "correct" the rural dialect in which he naturally wrote. Over time he never really felt at home among the literary class, but he was by then estranged from the illiterate folk from whom he had sprung. He became unhappy and began drinking too much.

In mid-life he developed erratic and at times delusional behavior. It sounds like he suffered from bipolar disorder, for which there would be no specific treatment for nearly a hundred years. The last twenty years of his life were spent at the Northampton General Lunatic Asylum, where he wrote a number of his late poems, including his famous lyric "I Am."

I am--yet what I am, none cares or knows;
My friends forsake me like a memory lost:--
I am the self-consumer of my woes;--
They rise and vanish in oblivion's host,
Like shadows in love's frenzied stifled throes:--
And yet I am, and live--like vapours tost

Into the nothingness of scorn and noise,--
Into the living sea of waking dreams,
Where there is neither sense of life or joys,
But the vast shipwreck of my life's esteems;
Even the dearest, that I love the best
Are strange--nay, rather stranger than the rest.

I long for scenes, where man hath never trod
A place where woman never smiled or wept
There to abide with my Creator, God;
And sleep as I in childhood, sweetly slept,
Untroubling, and untroubled where I lie,
The grass below--above the vaulted sky.

That last stanza is what gets me.

Sunday, January 11, 2009

Who Are You?

The Props assist the House
Until the House is built
And then the Props withdraw
And adequate, erect,
The House support itself
And cease to recollect
The Auger and the Carpenter --
Just such a retrospect
Hath the perfected Life --
A past of Plank and Nail
And slowness -- then the Scaffolds drop
Affirming it a Soul.

Emily Dickinson

After such knowledge, what forgiveness? Think now
History has many cunning passages, contrived corridors
And issues, deceives with whispering ambitions,
Guides us by vanities.
T. S. Eliot

A few links to identity today. Blogs are buzzing over Steven Pinker's typically bravura piece in the Times about "personal genomics." Early in the essay, he notes how, in trying to determine how we got to be who we are, we concoct stories of meaningful connection that may in fact be fictions cloaking our deep ignorance (the controversy over this phenomenon pretty much comprises the history of psychoanalysis).

Pinker is plenty modest about the very limited accuracy and reliability of current genetic tests (and of course the fact that identity goes well beyond the nucleotide sequence). But even when, at some point in the future, we have much more sophisticated genetic testing, that is, when we supposedly will "know" ourselves far better than we do now, there will always be the question of what to do with that information. How to live, what to do? Science does not answer existential questions.

Once we have as much data as, realistically, it is possible to have, it will be necessary to tell new stories of identity. That meaning-making narrative is the realm of freedom, which may be metaphysically illusory, but it's the best we've got, which is quite a lot I think. That's what psychotherapy is about, not pinning down how things are, but deciding how they could be. Naturally, you have to have some general notion of where you're starting from, but you're like an electron--you can never know your position absolutely.

What is most crucial to personal identity, the content of one's history, relationships, and consciousness, or the form of one's dispositions to think, feel, and behave in certain ways? When I think of who I am, what automatically comes to mind is the unique experiences I have had: the family I grew up in, the friends I have had, the places I have seen, the work I have done, the books and music I have absorbed, the family I have now. "Psychological testing" would show none of these things, but rather my impersonal propensities of intelligence, sociability, and neuroticism or the lack thereof. Which is more truly me, the subjective or objective, or is this trying to distinguish the mind from the brain when what is inside the skull is really a unitary phenomenon?

Similarly, when most people think of human history they think of the parade of nation-states, of conquests, of charismatic leaders (biography is a popular genre). When we think of the identity of the United States we think of Presidents, of wars, of assassinations. But arguably all this is quite superficial compared to the deeper currents that drive history.

I've been reading Europe Between the Oceans: 9000 B.C. to 1000 A.D., by Barry Cunliffe, an Oxford archeologist. It is fascinating to me to think of all the now nameless people whose migrations, joys and sufferings, works and deaths over millennia shaped who we are today. Just as they chipped stones for long hours to make tools, so human identity was being very slowly but inexorably chipped by the vagaries of climate and human interaction. The gene pool was being stirred and strained in ways too complex to fathom. For evolutionary psychologists, our identity is shaped more by the experiences of deep antiquity than by, say, the events of the past few decades (as social and cultural psychologists would maintain). But maybe the question is what kind of identity we mean.

Tolstoy infamously ended War and Peace with long disquisitions advocating this deep and impersonal view of history, arguing that while we fasten on a figure like Napoleon as supposedly shaping an age, in truth he himself was merely shaped by wider influences. This was a supreme irony insofar as he had just, in the preceding 1000 pages, constructed fictional individuals as lifelike and convincing as could possibly be done, individuals for whom one could only care and grieve as if they had really lived, only to argue that for the purposes of history their actions, indeed their decisions, had no significant force. If that were true, one would be tempted to say history be damned. But of course it isn't wholly true, and it is a matter of perspective--we are not interested in individuals because of their putative world-historical importance.

I'll end with whimsy here--Christopher Hitchens has an uncharacteristically light-footed piece in The Atlantic that points out the various feline characteristics of Barack Obama. He sees this as generally a good thing, although he laments that, unlike more substantial animals presumably, cats don't generally leave much trace of their passing (apparent message: Obama as lightweight). But I would point out that they're impressively clean, they leap tall fences in a single bound, they resist herding, and--they have claws.

Think of the endless dichotomies we use to sort identity: young/old, boy/girl, extrovert/introvert, secular/religious, conservative/liberal, canine/feline.

Feline. Except when canine.

Saturday, January 10, 2009

Once Again, Spirituality

I went to the cobbler
To fix a hole in my shoe
He took one look at my face
And said, "I can fix that hole in you."

I beg your pardon
I'm not looking for a cure
Seen enough of my friends
In the depths of the godsick blues

Jenny Lewis

Courtesy of Neuronarrative I saw the interesting post at Rationally Speaking about the elusive phenomenon of spirituality that is free of supernatural content. This is a question that has bedeviled me (so to speak) for some time.

I take the liberty here of quoting myself--a few years ago I wrote a paper on this topic in the context of the medical encounter. I am not savvy enough for a direct link to the pdf file, but you can easily go here and then scroll down to the third article which is mine (as articles go it is not long, but it is rather longer than a blog post--and fair warning, the style is, well, academic). I got a surprising amount of feedback to it at the time, both pro and con. So if your weekend is slow (I offer both commiseration and condolence) I hope you'll take a look.

For those without the luxury of time (or who, come to think of it, couldn't care less), I would only briefly say that I think a phenomenon of naturalistic spirituality does exist, but that the term "spirituality" itself is probably burdened with too many centuries of bogeymen to be useful; yet it is awfully hard to come up with a suitable alternative. Arguably a temperamental and individually variable capacity for religiosity does prevail, whether it is given metaphysical content or not.

Reflective and philosophical seem a bit weak for what we have in mind. A commenter at Rationally Speaking offered transcendent, which is quite good. Another option that occurs to me is sacred. For I do think that secular folks do hold certain things sacred, whether they acknowledge it or not (those who hold nothing sacred we call psychopaths). Logic and science can help to delimit the realm of the sacred, but the latter itself beckons when the tools of logic and science have done all they can do.

So when one of the kids, having absorbed some of the endless squalor of popular culture, exclaims "Oh, God!" I always correct to "Oh, gosh," not because of literal belief, and not only because it reflects better manners, but also out of respect for a concept that, whatever its controversy, has been a vessel for the sacred over time. Similarly I wouldn't let them clamber over the pews in a church even if no one were there. But one can find the sacred lots of other places as well, in places some people find more congenial.

A few famous aphorisms by Wittgenstein help me here:

"What has to be accepted, the given, is--so one could say--forms of life."

"The meaning of life, i.e. the meaning of the world, we can call God."

"If I have exhausted the justifications, I have reached bedrock and my spade is turned. Then I am inclined to say: 'This is simply what I do.'"

I particularly like this last one, as it illustrates my ambivalent attitude toward philosophy over the years. As I argued in a post on suicide a while back, logical justification goes only so far, beyond which lies the sacred, that is, what we choose to be, to embrace, and to revere (or that we find ourselves unable not to be or to revere). But be careful, theologians--don't try to sneak metaphysics and justification back into the sacred circle where they have no power.

Friday, January 9, 2009


All the good that won't come out of me
And all the stupid lies I hide behind.

Rilo Kiley

Well, it's Friday and most of us are relatively free for two days, but if you're like me (and you may not be, at all) you're wondering whether your will is free as well. Interesting word, will, denoting the power of volition, the future tense, a posthumous bequest. Potent stuff.

I'm as big a booster for individual empowerment as anyone, but on the other hand, somewhere Nietzsche wrote that the notion of free will is a terrific means of controlling others, by inducing them to comply with our expectations of how they ought to be able to behave. Nietzsche was full of perverse ideas like that--how can the great good of free will be a method of social control? Well, free will is a heuristic that guides any individual's behavior, and by implication, no viable social group can do without it. If you really think about it, only determinism can be true, but only in the way that only nihilism can be true--you can't really live that way; every natural impulse cries out against it.

That's all tonight really, except for some quotes from old Will Shakespeare--man, that guy could write. Interestingly, he seemed mightily preoccupied with the conundrum of free will, but while he was no determinist, only his most heinous villains were implacably insistent upon the power of self-determination, whereas his heroes advocated a more nuanced and lenient view of what Homo sapiens is capable of.

So here's the psychopathic Iago (Othello):

'Tis in ourselves that we are thus or thus. Our bodies are our gardens, to the which our wills are gardeners; so that if we will plant nettles or sow lettuce, set hyssop and weed up thyme, supply it with one gender of herbs or distract it with many, either to have it sterile with idleness or manured with industry, why, the power and corrigible authority of this lies in our wills.

And the equally appalling Edmund (King Lear):

This is the excellent foppery of the world: that when we are sick in fortune--often the surfeit of our own behaviour--we make guilty of our disaster the sun, the moon, and the stars, as if we were villains by necessity, fools by heavenly compulsion, knaves, thieves, and treachers by spherical predominance, drunkards, liars, and adulterers by an enforced obedience of planetary influence, and all that we are evil in by a divine thrusting on. An admirable evasion of whoremaster man, to lay his goatish disposition to the charge of stars!

In stark contrast, here is the forbearing Portia (The Merchant of Venice):

The quality of mercy is not strained.
It droppeth as the gentle rain from heaven
Upon the place beneath. It is twice blest:
It blesseth him that gives, and him that takes.
'Tis mightiest in the mightiest. It becomes
The throned monarch better than his crown.
His sceptre shows the force of temporal power,
The attribute to awe and majesty,
Wherein doth sit the dread and fear of kings;
But mercy is above this sceptred sway.
It is enthroned in the hearts of kings;
It is an attribute to God himself,
And earthly power doth then show likest God's
When mercy seasons justice.

I couldn't have said it better.