Friday, July 30, 2010

Concerning Natural Religion


The pensive man...He sees that eagle float
For which the intricate Alps are a single nest.

Wallace Stevens, from "Connoisseur of Chaos"


Two recent links set out the basic aims of religion--Ron Rosenbaum, in a welcome apologia for agnosticism (as opposed to blustery atheism), points out that science has never (yet) delivered on the central question: why is there something rather than nothing? Marcelo Gleiser, over on NPR's commendably cosmic 13.7 blog, proposes death (i.e., the valorization of life) as a basis for a transcendent spiritual ecology. Basically, how did we get here, and why do we have to leave so fast?

In the currently popular effort to bridge science and religion, Gleiser seeks some universal principle within human nature and religious impulse that will supersede differences. And yet, if one can forgive mere contrariness, it occurred to me: what if the most universal principle of human religiosity is in fact diversity of belief? What if that which "unites" us is actually our tendency to form "our" worldviews in opposition to "theirs?"

Much research in evolutionary psychology, echoing millennia of intuitive observation of human nature, suggests that human beings may be wired for "us and them" thinking, for thinking in binaries, for forming "in" groups and "out" groups. We define ourselves by contrast, for better or worse (we may think it is mainly the latter, but in evolutionary terms it may be otherwise). Male vs. female. White vs. black. Old vs. young. Liberal vs. conservative. Rich vs. poor. Yankees vs. Red Sox.

This needn't mean that humanity is doomed to internecine nastiness, but it does suggest that populations will tend to resist conformity of thought. Wouldn't it be nice to rise above the science vs. religion debate? Sure, just as it would be great if Republicans could pal around with Democrats, or if Protestants and Catholics could agree to bury doctrinal differences.

History arguably consists of the vicissitudes and machinations of contrasts. The now generally mocked "end of history" forecast in the 1990's was apparently premature. Why do we mourn the declining diversity of culture or language, yet celebrate the notion of religious diversity merging into some fuzzy "reverence for life?" The latter risks both the incommunicably idiosyncratic and the coldly abstract; successful religions generally depend on persuasive personal narratives that bind communities together.

In other words, religion is not contingently parochial, it is necessarily so, presupposing the community of those who "get it" versus those who don't. And agnostics and atheists define themselves in opposition most of all; arguably belief preceded disbelief. If a totalizing belief system were ever to take over the earth, whether it be the Catholic Church, a Muslim caliphate, or an atheistic regime, then the next day heresy would break out somewhere. Unless or until the seemingly unanswerable questions are answered, incompatible belief systems will persist, which should be okay.

Monday, July 26, 2010

The Age of Anomie?


"Message: I care."

George H. W. Bush


The mad have always been with us--convincing descriptions of severe mood disorders in particular exist from antiquity. But neuroses and "problems of living," or at least the recognition thereof, are largely inventions of modernity, as set forth by Ronald Dworkin in his account of what he calls "the caring industry." His article is not brief but presents some fascinating arguments and warrants careful reading. I have time for only a short summary and response here.

As Dworkin describes, since the 1950's a massive population of therapists of every stripe and, now, "life coaches," has sprung up to manage emotional issues that until then had been the domain of traditional peer groups, lines of authority, and cultural connectedness. The rise of mobility and suburbanization and the decay of natural communities began this trend, which was reinforced and hastened by the social turmoil of the 1960's and onward. Therapists do the work (or try to) that used to be done (if it was done at all) by families, neighbors, and the clergy.

So far this is pretty familiar territory--Dworkin documents the increasing ubiquity of the caring industry in the military, the schools, and everyday life. He points out that while the 1950's are often recalled as a calm before the storm of contemporary upheaval, the decade was, beyond its infamous conformity, actually marked by a deep malaise, reflected in its moniker, "the Age of Anxiety." It was as if the plunge into collective and rapid social and technological change, still a few years off, could nonetheless be glimpsed and feared.

But Dworkin's more creative and radical claim is that, presumably in response to the civilization-wide catastrophes of the two world wars, people in the West at mid-century underwent a profound change in their emotional engagement, one that we still haven't recovered from. He suggests that for the better part of a millennium, love had grown as an ideal in personal, religious, and national life. But the extent and depravity of the world wars showed that love stops at the nation-state and cannot be extended to humanity as a whole; the implication is that love is not only insufficient for global security, it is positively dangerous, and an unaffordable luxury, inasmuch as it fosters tribalistic nationalism in a nuclear age.

The consequence is that people have grown more detached and cynical in their attachments. Much of the residual lure of love, intensified by the entertainment industry, has come to reside in individual erotic connection, upon which more and more seems to ride even as it becomes more fragile. Dworkin argues that people have largely given up on the organic, intense, but volatile attachments to natural peer groups in favor of cooler, more easily manageable relationships obtainable from therapists in finely titrated 50-minute increments. It is an elaborate theoretical version of the therapist-as-friend-substitute claim.

There is much that could be said in response to this fascinating version of history. First, while he does not come out and say so, Dworkin's elegiac tone certainly evokes some Golden Age when people were happier--is that really the case? While he suggests that the masses are lonely and miserable, most surveys of Americans at any rate show that the majority consider themselves basically happy (I am aware of the myriad nuances surrounding a fraught construct such as happiness). To be sure, attachments are not so simple, so monolithic, or so geographically given as they long were, but are they worse? Perhaps they are more dynamic, more flexible.

For some reason Dworkin refers primarily to the traumas of the war-ridden early 20th century, but it seems to me that the nuclear age contributed more directly to civilization-wide angst. For 65 years now the possibility of nuclear holocaust has been a gun held to the head of humanity, and it has not gone away, even if we no longer envision a Soviet premier with his finger on the button. For those so inclined, the environmental threats of recent decades have added to the perceived risk of the earth being ruined beyond repair. These things, it seems to me, could sap social trust and confidence more than the legacy of the trenches and the Holocaust, horrible though they were.

Dworkin maintains that people utilize "the caring industry" on a massive scale because they have grown more sad and alienated. But by the same logic one could argue that people use air conditioning because they have grown hotter than people of previous centuries. That claim would of course be wrong, but one can make a case that psychotherapy is not so much a response to cultural calamity as it is just another modern innovation that people find useful and reassuring. The comparison to air conditioning may sound trivializing, but it isn't entirely far-fetched; after all, in heat waves such as we are having now in my part of the world, AC does save lives, even if for most it "merely" adds to quality of life.

It is true that air conditioning has made people less tolerant of the heat, just as automobiles make people on average less tolerant of exercise. But these are side effects and trade-offs that the majority seem willing to make. And the relative softness and sensitivity of modern populations are arguably consequences of unprecedented prosperity--there is a bit of a princess-and-the-pea phenomenon whereby the better off one is, the less tolerant one becomes of imperfection. Most people in history had no need of therapy because they were too busy trying to stave off starvation and disease.

Modern psychotherapy arose in Europe in the 19th century, which was, on average, a time of great prosperity on that continent; suddenly there was a larger middle class with the time and disposable income to worry about the kinds of "problems of living" that traditionally could be "suffered" only by the rich. It is significant, then, that psychotherapy really took off in the U. S. in the 1950's, which, while a time of war weariness and nuclear anxiety, was also the start of a globally unprecedented degree of economic well-being.

Despite its inequalities (more stark in recent years), over the past half-century the U. S. has been, in sum, the most prosperous nation in the history of the world, so arguably its vices--the breakdown of the family, the hypertrophic media and entertainment industries, the rise of obesity, and yes the "triumph of the therapeutic"--are largely ailments of prosperity. But these ailments are very real, and attended by real suffering--just ask the morbidly obese. The caring industry may be an inevitable result of liberal capitalism. As a culture we may have become less trusting, and perhaps even less loving, than our great-grandparents, but arguably we are also more knowing and less naive.

Sunday, July 25, 2010

Medicine and Metaphor


It seems as if the honey of common summer
Might be enough, as if the golden combs
Were part of a sustenance itself enough,

As if hell, so modified, had disappeared,
As if pain, no longer satanic mimicry,
Could be borne, as if we were sure to find our way.

Wallace Stevens, from "Esthetique du Mal"


To judge from the vituperative comments section of his blog, Daniel Carlat, M.D. is now apparently the Most Hated Psychiatrist in America, having offended fellow shrinks and patients alike by being honest. In his recent book, Unhinged, he bravely and forthrightly discusses the obvious, i.e., that we don't know nearly as much about the brain and about psychotropic medications as we would like, and that psychiatrists don't do psychotherapy nearly as much as they used to.

Until we have a full neurophysiological and philosophical understanding of consciousness, we will lack a complete understanding of psychotropic drugs, whether marijuana or Thorazine. Deal with it. We don't even have a full grasp of sleep, which is one of the most basic behaviors in the animal kingdom. Do we fully fathom how anesthetics work? And yet we submit to them, and trust people to cut into our bodies while doing so. Can someone tell me exactly how the experience of pain works, and therefore how Tylenol is effective?

Indeed, one of the commenters on Carlat's blog likens psychotropics to aspirin, and I think that is an apt comparison. In all of medicine, psychiatric conditions are most similar to pain syndromes, and psychiatrists are most similar to pain management specialists. In both cases we have an imperfect account of underlying pathophysiology, but the resulting distress is clear, so we do our best to alleviate it. The DSM is a bit of a red herring; in their everyday work most psychiatrists treat symptoms.

There are of course legitimate philosophical debates about the proper place of pain in life, and no doubt we have become less tolerant of both physical and psychic distress in contemporary times, but the whole enterprise of medicine is predicated on the supposition that suffering is not inherently redemptive (it is a shame that Nietzsche lives on in the public imagination mainly with "What does not kill me makes me stronger," which is plainly wrong, or at least very selectively true). If patients come to a psychiatrist looking for a one-time fix or an indisputable explanation, they will likely be disappointed. But I would suggest that what most are really looking for is someone who will understand and validate their suffering and offer some relief.

Any decent doctor should know that pharmacology and other technological interventions have their limitations and drawbacks--insight and behavior change should be attempted first, just as they should before, say, considering bariatric surgery for obesity. The psychiatrist's message should amount to: "I see that you are hurting. You're not alone and you're not a freak; I have seen a lot of people with similar problems. It's not your fault. I can't completely explain where your pain comes from, but I know of some things that could help."

Unless they are of the tiny minority subject to involuntary commitment, folks are free to avoid psychotropic drugs just as they may choose to steer clear of analgesics, but they shouldn't condemn those who find them helpful. The problem, as I've written here many times, is not that mental illness doesn't exist or that psychotropic drugs don't work even for symptoms (they unequivocally do, although not as completely or as often as we would like); the problem is that psychiatry has been far too confident and grandiose in its claims of diagnostic specificity and treatment efficacy.

People see psychiatrists who declare that their Major Depression is as clear and unambiguous as appendicitis might be, and that Prozac will definitely be the solution. But the majority of patients we see present with symptoms more like chronic back pain than like appendicitis. In a perfect world we would have a full understanding of back pain, such that it could be eliminated directly (or--gasp--tolerated) rather than treated with clumsy analgesic regimens. But that is a heroic ideal, not the world we live in. We need more realistic expectations, which is what Dr. Carlat has tried to supply in his blog and his book. As someone once said (oh right, Buddha), life is suffering. Since the dawn of civilization human beings have fermented, distilled, and smoked whatever they could get their hands on in an effort to tinker with flawed consciousness; psychiatry, like pain management, is an attempt to undertake this in a way that is, yes, civilized.

Saturday, July 24, 2010

The Soul, Explained


"Why, look you now, how unworthy a thing you make of me! You would play upon me, you would seem to know my stops, you would pluck out the heart of my mystery, you would sound me from my lowest note to the top of my compass; and there is much music, excellent voice, in this little organ, yet cannot you make it speak. 'Sblood, do you think I am easier to be played on than a pipe? Call me what instrument you will, though you can fret me, you cannot play upon me."

Hamlet


In the latest entry in the Times's philosophy blog, Galen Strawson trenchantly outlines both the inevitable logic gainsaying free will and the practical necessity supporting it. Rationally and scientifically, free choice can only be a mirage: a decision of what to do must stem from what one is. But what one is must in turn result from an interaction of genetic composition and environmental context; there is no room in the causal chain for some kind of ex nihilo self-creation. How can one be responsible for one's biologically and culturally contingent dispositions?

Interestingly, Strawson answers this by appeal to a novelist (Ian McEwan), who suggests another way of looking at free will, as that for which one is obliged to accept ownership. This reflects the central sociability of free will; ownership is an inherently social concept, and the conviction of free will that attends consciousness is an evolutionary means of modulating individual behavior. Free will involves that for which we must answer before our peers; human groups cannot function insofar as individuals disown their own conduct. Free will is therefore a matter of reason and language; a choice is always accompanied by at least an implicit self-justification before a virtual tribunal (this is internalized with varying degrees of integrity as conscience). From the vantage point of science, free will is a fiction, but practically speaking only the fatalistically depressed and the psychopathic disavow personal choice. Free will is located not in neurobiology, but in sociology--no emotions are more fundamentally interpersonal than guilt and shame. The mythical human being raised by wolves, without human contact or language, is by definition unfree.

So free will is located not in the brain, but "in" the group or human culture. What does this imply for the soul? I would argue that just as free will is not some kind of metaphysically magical legerdemain, the soul is not some kind of ethereal "stuff" existing separate from the brain. But the soul is not merely another name for the brain. Rather, the soul is massively distributed, consisting of vast networks of social and experiential contacts (another name for it would be "identity" or "the self").

My soul, like anyone's soul, does contain my irreducibly subjective experiences, but beyond that it entails every kind of relationship I have ever had, the work I have done, the influences I have taken in or given out. And crucially the soul has a history and a rationally supported system of values, both of which depend upon language. This is why animals, who presumably have moment-to-moment conscious subjectivity, do not have souls (that doesn't mean we should eat them though). Infants do not begin to develop a soul until language acquisition begins, and arguably the severely demented have largely lost their souls (both groups are still deserving of care and respect not only because they can still feel pain, but because regard for them is part of our souls). Having a soul is not like being pregnant--it is a matter of degree and diversity (although from the point of view of human rights and dignity, we consider all Homo sapiens to be fundamentally ensouled).

So the soul is not some kind of mysterious stuff, it is a network of ideas and relations. Most people's souls die when their bodies die or within a generation or two, but some souls live on in perpetuity, like Plato's or Muhammad's or Shakespeare's. Hitler's soul, regrettably, will have a very long life. Similarly, if we inquire about the identity of some country, the United States for example, it is obviously true that the U.S. consists of a certain part of the land mass of North America plus the bodies of some 300 million inhabitants as well as their various artifacts and structures. However, the U.S. is not merely these material conditions; for better or worse, its identity comprises a history and a system of ideals--all of this goes to make up its collective soul.

People almost universally have an intuition, whether right or wrong, that they are something beyond their bodies and brains, and in the sense in which I am speaking they are right. For the soul depends crucially upon actually lived history and relations with widely dispersed persons and phenomena. My soul contains not only my body/brain, but also my family, my profession, the Big Bang theory, the Andromeda Galaxy, the Louvre, and countless other entities. And my understanding of my soul requires that these things really exist, that they are not merely phantom firings of a deluded nervous system. The soul is very much a developmental beast: what I did yesterday defines my current soul much more than what I did twenty years ago.

Even though, from a scientific standpoint, I could not have ended up (in this universe anyway) with any other soul than the one I have, I conduct myself in explicit or implicit view of others with whom I share a mutually negotiated system of values, such that I feel obliged to keep my soul in what I consider to be the best possible condition. As with anyone, results may vary. And at times one maintains the soul with an eye not so much toward the currently available group as toward a stipulated or ideal community.

Psychology of the Artist as a Young Man

Inspired by current local conditions, I picked up Heat Waves in a Swamp, an exhibition volume on the works of the visionary American watercolorist Charles Burchfield (1893-1967; his Four Seasons is above). Discussing Burchfield's decision to remain in small-town Ohio and then rural western New York rather than going for the bohemian big-city life, Dave Hickey writes:

First, he was far from a genuine rustic, enamored of his isolation. His daily life was that of a cosmopolitan intellectual who has isolated himself, as a secret drinker might, to conceal his weakness. In Burchfield's case, this weakness was his bond with the landscape of his youth, a place that, for Burchfield, was less beloved than genetically imprinted--like the promise of water on a baby duck--that was less a theatrical setting for his art than an inextricable, mysterious extension of his selfhood, or he an extension of it. Over the years, I have tried to put a name to this particular malady. Many authors, writers, and performers have suffered from it. It is characterized by a shift of centeredness. The ground shifts and the barrier between ourselves and the world disappears. We feel ourselves to have become possessed--to have become an extension of the world, a particle in that whirlwind. The self is obliterated in this instant. The effect is akin to Stendhal's syndrome, to the wooziness we feel when we are captured by pictorial illusion. It is akin to dancing, to the loss of self that accompanies our giving ourselves up to the music. I call it the curse of the soft self--the unwilling dissolution of one's identity into its environment. It is the malady of painters, writers, actors, musicians, and critics. It always brings with it the terror of not being able to reassemble one's identity in the wake of having lost it.

I found this to be remarkably perceptive, and I think I can help Hickey out with the name of the peculiar malady: schizoid personality, of which this passage is an apparently unknowingly evocative description. The schizoid type's instinctive human craving for connection is in conflict with a hypersensitivity toward physical and emotional contact; what is feared is smothering, intrusion, definition by others. The world as it is is too much with them; the self is perpetually under siege. The modus operandi of the schizoid person is withdrawal into an internal world of imagination and intellectualization (William Blake's "I must create my own world or be enslaved by another man's"). Schizoids are lovers of distance, which Burchfield apparently found far from the hip, crowded art scene. As Nancy McWilliams writes in Psychoanalytic Diagnosis (1994):

The most adaptive and exciting capacity of the schizoid person is creativity. Most truly original artists have a strong schizoid streak--almost by definition, since one has to stand apart from convention to influence it in a new way. Healthier schizoid people turn their assets into works of art, scientific discoveries, theoretical innovations, or spiritual pathfinding, while more disturbed individuals in this category live in a private hell where their potential contributions are preempted by their terror and estrangement. The sublimation of autistic withdrawal into creative activity is a primary goal of therapy with schizoid patients.

Friday, July 23, 2010

Mad Scientists at Work


I have a new hero after reading a profile of neuroscientist and writer David Eagleman:

Eagleman rejects not only conventional religion but also the labels of agnostic and atheist. In their place, he has coined the term possibilian: a word to describe those who "celebrate the vastness of our ignorance, are unwilling to commit to any particular made-up story, and take pleasure in entertaining multiple hypotheses."

Sign me up--I want to be a possibilian.

The "guinea pig" complaint is far and away the most common one I hear about previous psychiatrists (and I'm sure it is said about me by those patients who move on to other prescribers). A new medication is tried every month, seemingly willy-nilly, without a sense of an overall framework or plan. A psychiatrist, plainly, is no auto mechanic. Psychotherapy, truth be told, really is a neverending experiment, but medication somehow is supposed to be different.

We know a vast amount about the effect of medications over large populations, but idiosyncratic variation in drug response remains too great to predict outcomes for individuals. In that sense medication reactions are almost like an extension of the assessment process. Treatment, diagnosis, and prognosis become one. Medication trials obviously don't give the same kind or precision of information that a brain MRI will give a neurologist, but they are very informative about a patient's dynamics and likely outcome.

The problem of prognosis is fundamental to psychiatry. Neurologists, even though they often can do relatively little about sometimes appalling diseases (MS, ALS, Huntington's Disease), nonetheless enjoy a greater stature than psychiatrists because, even if they can't do more, they know more. A patient would obviously prefer to get well, but if he can't get well, he wants to know what the future holds so that he can wrap his mind around it and plan accordingly.

There are of course crude measures of prognosis: general intelligence, education, financial and social support, and the presence or absence of past hospitalizations, suicide attempts, personality disorder, and substance abuse. But on a more subtle level, in psychiatry prognosis declares itself only over time, as the myriad variables involved in a mind interact with unique life circumstances. The physiological systems generating identity, behavior, and other aspects crucial to psychiatry are far more complex and unpredictable than those giving rise to, say, motor or sensory control.

The milder or more subtle a condition is, the harder prognosis can be to pin down. I can no more predict how a low-grade dysthymia will behave over decades than I can predict when a person might get married, or how much money they'll be making ten years from now. This isn't to say that I can't predict at all, but such prognostications are based as much on common sense (the past tends to predict the future, etc.) as on any grand professional expertise. Since I'm not allowed to keep a crystal ball in the office, I'm limited to indirect measures of understanding.

Maybe more psychiatrists should aspire to be possibilians, to "celebrate the vastness of our ignorance" rather than pretending to more knowledge than we actually have. Prescribing a medication isn't like performing an oil change--it is accompanying a patient in an experience of self-discovery. Physicians differ from drug dealers in that the substances we purvey, by social contract, must have minimal standards of safety, uniformity, and usefulness. In addition, we are expected to be wise and discerning students of human nature. Beyond that, things get interesting.

Thursday, July 22, 2010

Calling Sir Galahad



"Nothing will come of nothing: speak again."

King Lear


An older fellow presents with chronic depression, personality disorder, social isolation, medical problems, and a deeply ingrained sense of bitterness and entitlement. However, he is highly intelligent, erudite, and possessed of an acidic, acerbic wit, making a session with him a bleakly endearing clinical approximation of reading Samuel Beckett.

He declares that while he has no imminent plan or intent, he deems it "92%" likely that within five years he will be dead "by my own hand." And yet, two minutes before the end of our meeting, he asks whether I know of any wisdom that he ought to keep in mind if or when the suicidal bug should bite him in the future. Interesting bit of intellectualization, that.

Ah, for a transcendent mantra that could tear the scales from the suicide's eyes and show the world in all of its eminently worthwhile glory! I'm thinking of a phrase that, when uttered, would silence the most raucous city street and bring the mighty to their knees, that would be like a compound of: the Holy Grail, the ultimate Om, fragments of the True Cross, the Philosopher's Stone, the meaning of Zen, the Ark of the Covenant, the Fountain of Youth, the proof of the existence (or non-existence) of God, the proof that Shakespeare wrote (or did not write) Shakespeare, the Theory of Everything, a perpetual motion machine, the Aleph of Borges, and the One Ring of Sauron (which, recall, did not permit its wearers to die).

But alas, that fantasy is akin to keeping a tower upright by proposing to blast it perpendicularly into space, when in reality its supports are far more prosaic, comprising deep-seated attachments between stone or steel and the earth from which they came. And of course that's what the request entailed: attachment, not anti-gravity thrust. "We'll talk about it next week."

Tuesday, July 20, 2010

The Heart of the Matter



"It's hard to make that change
When life and love
Turn strange
And old."

Neil Young


Hats off to Dr. Rob at Musings of a Distractible Mind, who muses (of course) on suffering and the ends of medicine. Every now and then it is good to look up toward the remote (and ultimately inaccessible) peak at whose base one labors. To switch metaphors, each morning we march out upon the beach, brooms at the ready, braced to sweep back the tide. Physicians should know their Sisyphus. Heroic? Not usually. Futile? One hopes not. For some it seems to be the worst possible profession, except (a la Churchill) for all of the others.

Monday, July 19, 2010

David Foster Wallace Lives



Life, friends, is boring. We must not say so.
After all, the sky flashes, the great sea yearns,
we ourselves flash and yearn,
and moreover my mother told me as a boy
(repeatingly) 'Ever to confess you're bored
means you have no

Inner Resources.' I conclude now I have no
inner resources, because I am heavy bored.
Peoples bore me,
literature bores me, especially great literature,
Henry bores me, with his plights & gripes
as bad as achilles,

who loves people and valiant art, which bores me.
And the tranquil hills, & gin, look like a drag
and somehow a dog
has taken itself & its tail considerably away
into mountains or sea or sky, leaving
behind: me, wag.

John Berryman, "Dream Song 14"


This poem, by a suicide who was also the son of a (paternal) suicide, came to mind as I was thinking more about David Foster Wallace. Even now, nearly two years later, I occasionally come across reflections on his demise, the shock and dismay of which still linger in the reading community. It occurs to me--as I'm sure it has occurred to many although I haven't actually seen mentions of it--that Wallace was the Sylvia Plath of this generation, the literary light snuffed out by the Black Dog.

There are obvious differences: gender, genre, and age--while Plath was only 30ish, Wallace was 46, and it's possible that his best work was already behind him in any event. But why did his suicide generate more consternation than the deaths of, say, Nathanael West or Albert Camus by motor vehicle at similar ages?

The inspirational quote by Wallace cited in Retriever's post of yesterday struck me perhaps because I had recently come across a very different sort of Wallace quote; the contrast of the two pretty much sums up the conundrum of genius and madness. The quote I mean was in Daniel Carlat's Unhinged (about which more eventually I'm sure), apparently an account of depression from a Wallace story:

You are the sickness yourself....You realize all this...when you look at the black hole and it's wearing your face. That's when the Bad Thing just absolutely eats you up, or rather when you eat yourself up. When you kill yourself. All this business about people committing suicide when they're "severely depressed": we say, "Holy cow, we must do something to stop them from killing themselves!" That's wrong. Because all these people have, you see, by this time already killed themselves, where it really counts....When they "commit suicide," they're just being orderly.

This is pretty grim and implacable stuff. Does anyone know a good joke? But really, black as it is, it is a spot-on description of severe depression. The difficulty of separating the illness from the self. The relentless but narrow logic of self-destruction. And Wallace is right in implying that suicide is merely a (final) symptom, and obstructing the symptom without addressing the disease is arguably unhelpful. Pre-empting a suicide is not in itself a solution to anything except insofar as it enables attention to underlying pathology.

In his book Carlat refers to Wallace in arguing that psychiatry still has a long way to go in terms of effective treatment. It is interesting that in the past decade or two, both the explosion of antidepressant advertising (especially on television) and the controversy over widespread medicating practices have fostered the notion that antidepressants are powerful drugs. That may have been part of the Wallace surprise--while people on the street understand on some level that suicides still happen, the notion that in 2008, twenty years after the introduction of Prozac (the threat of which was thought to be that of making too many people "better than well"), a writer of the caliber of Wallace would go and kill himself seemed, well, so 19th century, or at least, so 1963.

Wallace and Plath also had in common the fact that neither suicide was, in purely clinical terms, surprising. Both had been hospitalized and had received ECT earlier in their lives. Plath was on an early antidepressant, and Wallace had been on many of them, including the monoamine oxidase inhibitor Nardil. I don't know the details of their treatment, but it appears they had plenty of it. While terminal cases are common (indeed inevitable, right?) in all areas of medicine, terminal psychiatric cases remain frustratingly abstract--we do not have the ominous scans, biopsies, or lab results that provide the paradoxical balm of the inexorable. Oh yes, we get a feeling that a certain case isn't likely to end well, but all we can do is fight on, calling the prognosis "guarded."

Suicide continues to shock, even as understanding of depression grows; a stigma persists. It is deemed "selfish" and demeaning, and that is on the whole probably a good thing. It keeps some people, I think, from reaching the point of no return. For as Wallace's quote argues, past that point, there is no weighing of potential alternatives, there is only a sense of overwhelming necessity, no more resistible than gravity. That is why involuntary commitment exists--logic has failed.

Just as suicide was long considered a sin against God, it continues to be seen by some as a sin against Life. The suicide rejects what most of us hold most dear, and there may even be a trace of egoism in this view, on two levels (not only "How could he consider himself too good for this world?" but also "How could he not want to be part of a world that contains...me?"). In social terms, suicide threatens, as the saying goes, to startle the horses in the street. I'm capable of viewing depression as medically as anyone, but this moral dimension means that depression will never be as simple a matter as diabetes--the former's sinister distant cousin is acedia, or willful blindness of the true light. As many have noted, the psychiatric/therapeutic office has an element of the confessional.

Sunday, July 18, 2010

One-Sided Conversations?



"And there's some evil mothers, well they're gonna tell you that everything is just dirt."

Velvet Underground

My great reader Retriever adduces a fine David Foster Wallace quote and suggests the necessity of worship--of something, anything--even among atheists. I agree--agnostics/atheists shouldn't hold themselves to be above reverence or the idea of the sacred. Worship is humility before the sublime coupled with self-acceptance, existing as spiritual experience.

I would argue that what distinguishes the agnostic/atheist is not worship, but rather prayer, that is, the lack thereof. Prayer is the essence of theistic religion because, no matter how subtly it is undertaken, it presupposes the existence of a divine respondent "out there," and one with a personal understanding of and care for human beings. If, as I wrote a few posts ago, religion is chiefly about relationships, then prayer expresses those relationships. The title of Retriever's post--"Who do you worship?" (italics mine)--reflects this exactly. The essence of religion is faith in an unseen interlocutor.

Atheists are not inherently nihilistic or hopeless; as their name suggests, they are properly considered "godless," unpersuaded by the myriad contingently cultural conceptions of supernatural agency. They needn't be against God; they merely have no convincing philosophical or personal experience of (her), so they feel it makes sense to live their lives under the assumption that (She) does not exist. An atheist may, in the broad sense, worship many things (ideals, humanity, objects of beauty), but s/he worships no one, that is, no person or specific stipulated agent.

As usual Emily put it well:

My period had come for Prayer --
No other Art -- would do --
My Tactics missed a rudiment --
Creator -- Was it you?

God grows above -- so those who pray
Horizons -- must ascend --
And so I stepped upon the North
To see this Curious Friend --

His House was not -- no sign had He --
By Chimney -- nor by Door
Could I infer his Residence --
Vast Prairies of Air

Unbroken by a Settler --
Were all that I could see --
Infinitude -- Had'st Thou no Face
That I might look on Thee?

The Silence condescended --
Creation stopped -- for Me --
But awed beyond my errand --
I worshipped -- did not "pray" --

Saturday, July 17, 2010

The Stars, Like Dust


"If our titles recall the known myths of antiquity, we have used them again because they are the eternal symbols upon which we must fall back to express basic psychological ideas."

Mark Rothko


"Twelve hundred miles its length and breadth,
That four-square city stands.
Its gem-set walls of jasper shine,
They're not made by human hands."

Iris Dement


This afternoon I enjoyed the Whitney Museum's exhibition of Charles Burchfield, a latter-day William Blake of small-town America, whose watercolors blaze with a similar spiritual energy (above is his "Sphinx and the Milky Way").

And I couldn't have said it better, as regards plenitude and significance, than this blog post by Adam Frank. Size matters. An atheist could be defined as one who finds all existing conceptions of God to be inadequate.

Thursday, July 15, 2010

Still On Vacation


We are stardust.
Billion year old carbon.
We are golden.
Caught in the devil's bargain
And we've got to get ourselves back to the garden.

Joni Mitchell


In their sheer magnitude, extremes of urbanization return humanity to the fold of natural history: the closest we can come to the pullulating condition of the sea and stars, featuring oppositions of emptiness and plenitude. Even here, of all places, there is nothing outside of nature.

Saturday, July 10, 2010

No End of Books

"I...had always thought of Paradise
In form and image as a library."

Jorge Luis Borges, "Poem of the Gifts"


"My library
Was dukedom large enough."

Prospero, The Tempest


If Thoreau was right that one's wealth may be measured by how much one can do without, that is, the smallness of one's needs rather than the mass of one's possessions, then the Internet and Kindle, etc. are making readers rich beyond the dreams of avarice. As the (then) seven-year-old put it recently, "Daddy, why do you have so many books in the house? Don't you know that you can read books on the computer?"

And yet Nathan Schneider, among many others I'm sure, claims a role for books going well beyond the textual information they contain. For him they are a "theater of memory," a personal (and collectively, a cultural) record of imaginative and intellectual development. And David Brooks argues that books, in contrast to the chaotically egalitarian Internet, embody standards of intellectual hierarchy and wisdom.

What of print culture will survive? Not physical newspapers, surely. For many years a faithful newspaper subscriber (and long ago a delivery "boy"), I am now appalled that, daily, countless trees are sacrificed, and countless gallons of gasoline burned, so that bundles of paper may be dropped on porches to be briefly perused before passing (one hopes) to the recycling bin, when the same information may be transmitted at the energy cost of turning on the computer for a few minutes. To be sure, this information should be paid for, and it remains economically perplexing that one has access to so much free information online.

While not a hoarder per se, I used to stockpile magazines and journals (no more). Will weeklies and monthlies survive in paper form, if only for doctors' offices? They would seem to have a better chance than newspapers, but probably not, still, a good chance. Their value does not significantly transcend the textual information they contain.

What of books then? I have been considering another winnowing of my library--what is indispensable in paper form? Three categories, I would argue. First, and most straightforwardly, are the art, photography, and graphic design books, which are frank objects of beauty in their own right. Second, there are books of, one might trivially and misleadingly say, "sentimental" interest, that is, books that, when they catch my eye on the shelf, evoke a memorable experience of imagination or thought in my life (ranging from the relatively picayune, such as Stephen R. Donaldson's Thomas Covenant series, read in the summer of 1982, to quasi-respectable professional works of recent years). Third are the classics, however considered, that is, those texts that embody such value for me that at any given time I wouldn't want to have to depend on anything beyond visible light for the reading of them, and certainly not upon electrical power, Internet access, or the vicissitudes of any electronic device. Once acquired, a book provides near-perfect freedom which, short of frank theft or destruction, cannot be taken away.

Books will survive in some form for the same reason that, even in the age of reproduction, original art and live performance survive: as bodily creatures, we crave and require more than just information. We need tactile experience. Books will become fewer and more expensive, and people will become far more selective in what they choose to shelve, but they will survive. What will vanish into readable cyberspace will be the teeming fields of routine fiction and fact, texts that are diverting or useful for a month or a year. Those that prove their worth to following generations will pass into paper. For those who care about such things, personal and cultural libraries will remain anchors of intellectual and imaginative memory and identity. Like churches, museums, and parks, books will persist, even in diminished numbers, as quasi-sacred artifacts of thinking bodily creatures.

Friday, July 9, 2010

The Analog Life

"The formula for my happiness: a Yes, a No, a straight line, a goal."

Friedrich Nietzsche


Not long ago I wrote a post contrasting ambiguity and clarity; this came to mind today in response to a video by Dan Ariely that was on Arts and Letters Daily. His point was that online dating is less helpful than it could be because it sorts people by digital, searchable characteristics (think height, religious affiliation, etc.) when our actual experience of others is far more subtly graded: analog.
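
To make the contrast concrete, here is a minimal sketch in Python (with invented profile fields and an arbitrary similarity formula--an illustration of the general idea, not anything from Ariely's video or any actual dating site): digital matching filters by hard, searchable cutoffs, while analog matching ranks candidates along graded similarities.

    # Hypothetical illustration: "digital" matching filters by hard cutoffs,
    # while "analog" matching scores graded similarity. Fields are invented.
    from dataclasses import dataclass

    @dataclass
    class Profile:
        height_cm: float
        religion: str
        warmth: float  # 0.0-1.0, the kind of trait no search box captures
        humor: float   # 0.0-1.0

    def digital_match(candidates, min_height, religion):
        # Binary yes/no: anyone who misses a single criterion vanishes.
        return [c for c in candidates
                if c.height_cm >= min_height and c.religion == religion]

    def analog_score(a, b):
        # Graded similarity in [0, 1]: nobody is excluded outright.
        height_sim = 1 - min(abs(a.height_cm - b.height_cm) / 50, 1)
        warmth_sim = 1 - abs(a.warmth - b.warmth)
        humor_sim = 1 - abs(a.humor - b.humor)
        return (height_sim + warmth_sim + humor_sim) / 3

    me = Profile(175, "agnostic", 0.8, 0.9)
    others = [Profile(160, "catholic", 0.9, 0.95),
              Profile(180, "agnostic", 0.1, 0.2)]
    # The hard filter keeps only the second profile; the graded score
    # prefers the first--Ariely's point in miniature.
    print(digital_match(others, min_height=170, religion="agnostic"))
    print(max(others, key=lambda o: analog_score(me, o)))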

Indeed, personal sensibilities can be (digitally!) grouped into the analog and the digital. The former sees nothing but continua and shades of gray, while the latter craves contrasts. Think yes and no, good and evil, male and female, liberal and conservative, young and old, rich and poor, sick and well. The mind variably demands and abhors such simplifying, absolute goalposts as frames for experience.

Literature is analog; psychology is digital. Psychoanalysis is analog; medical-model DSM-IV psychiatry is digital. History is analog; politics is digital. The tension between humility and conviction: both seem to be necessary.

Saturday, July 3, 2010

Ghost in the Machine

Or: the metaphysics of childhood. (Paraphrased) actual conversation:

Seven-year-old: Daddy, how could Sandman [a Spider-Man adversary] survive since he's just made out of sand? I mean, wouldn't he need internal organs (sic)?

Me: Yes he would, that's why he's just make-believe. That's why ghosts aren't real either [seven-year-old clings fervently to a belief in the supernatural].

Seven-year-old: But ghosts are invisible.

Me: So, they would still need internal organs, wouldn't they?

Seven-year-old: No, Daddy, ghosts are made out of the afterlife (sic).

It never ceases to amaze me that matter has evolved to the point of deeming its own role in consciousness to be dispensable in this way. Sort of like ascending a tower and kicking the ladder away--the view is excellent, until one gets hungry or restless...