Tuesday, June 28, 2011

That (Self-) Blaming Feeling

"Why, worthy thane,
You do unbend your noble strength to think
So brainsickly of things. Go get some water,
And wash this filthy witness from your hand."

Lady Macbeth


In a compelling argument for the congruence of brain and mind, and the ethics that ought to follow therefrom, David Eagleman maintains that blame derives from a misguided and outmoded belief in free will. He claims that blame is basically backward-looking, implying that one could and should have done differently (than one just did). But when we face up to monism and the brain-as-mechanism, we realize that, after the fact, there is nothing to do but acknowledge that given the conditions that prevailed at any past time, one could not in fact have acted differently.

Eagleman argues that shame and blame are not, in fact, very good at modifying behavior, and what we need is a more rational and forward-looking attempt to achieve desired outcomes, in ourselves and others. A la B. F. Skinner, he proposes that we approach brains as we would approach engines or computers that are on the blink. Both sticks and carrots may be necessary to shape desired behaviors, but they should be undertaken in a dispassionate way, free of messy or reckless vindictiveness.

There is nothing inherently objectionable about his advocacy of what he calls "the frontal workout," that is, an updated biofeedback project whereby one might learn (or teach) better control over impulses. But he might have said more about the phenomenology of guilt and blame, which are, after all, very deep aspects of human experience. These are very distinct and familiar subjective phenomena, and arguably they are far from arbitrary or nonsensical.

Blame is the social group's means of imposing its norms, and blame works most effectively when it is internalized as guilt and shame. Blame is a deterrent, plain and simple. And as is so often the case, it works best when it is involuntary (when blame is reflected on too carefully, one arrives at Hamlet). This is not to say that shame and blame are generally good things, merely that they are natural (and many perfectly natural human behaviors are odious). However, even in Eagleman's handyman-of-the-brain world, some impetus and motivation for change must exist, and I don't know where that motive would come from if not from those primeval emotions of guilt and shame. Like other emotions, they exist in both healthy and pathological forms. Guilt and shame may seem to be primarily about the past, but really they project forward into the future; like pain, they are the brain's message to itself: That didn't go well, so try something different. Blame and guilt are modes of moral (self-)argument.

And in a follow-up to the recent post about reading, the literati are a bit atwitter about Philip Roth's declaration that he has stopped reading fiction. In a Salon article Laura Miller speculates that inasmuch as fiction provides insight into character and human subjectivity, perhaps some do reach a point at which they have all the insight they need. After all, the novel isn't called the novel for nothing, and some readers do believe there is nothing new under the sun. But then again, one could paraphrase Samuel Johnson and say that "He who is tired of fiction is tired of life."

Sunday, June 26, 2011

The Tree of Life

Solitude from mere outward condition of existence becomes very swiftly a state of soul in which the affectations of irony and skepticism have no place...After three days of waiting for the sight of some human face, Decoud caught himself entertaining a doubt of his own individuality. It had emerged into the world of cloud and water, of natural forces and forms of nature. In our activity alone do we find the sustaining illusion of an independent existence as against the whole scheme of things of which we form a helpless part.

Joseph Conrad, Nostromo


Terrence Malick's The Tree of Life consistently defies expectations of coherent narrative, instead implanting myriad images implacably in the mind. One could be haunted by this film. As David Thomson wrote in his review in The New Republic: "Less than a framework of story, we have a situation, and this is itself not just fair, but an enlightening novelty. Most of us do not feel that we are living stories (at least not until later); we believe we are getting on with a situation."

As the movie's epigraph from Job suggests, the situation is one of inevitable suffering and loss, albeit experienced in a perpetual haze of existential glory. The tone of the work is continually exalted, which probably accounts for its controversial and varied reception. For those predisposed to its message, irony is silenced; the sacred is always a puzzle to the intelligentsia.

The situation in The Tree of Life is, most mundanely, that of a family in 1950's Texas, but really Malick is concerned with the situation of human life and its vexed relation to life, broadly considered. Much has been made, both derisively and respectfully, of Malick's depiction of the history of the universe and the pre-human earth (dinosaurs even!), but I'm not sure why. Narratively, this is merely the use of a very wide-angle lens, and a salutary use at that--there is more to heaven and earth than is dreamt of in Manhattan. Indeed, a few aerial shots of early hominids would not have been out of place. Psychologically, the "family romance" may seem endlessly interesting, but neither man nor woman lives by interpersonal relationships alone. There is that which preceded us and that which will outlast us.

In the first few minutes of the film, as we get our first impressionistic views of the O'Brien family, a female voice-over poses the contrast of nature and grace, asserting that the way of nature is domination and self-indulgence, whereas the way of grace is care and endurance. Much of the film unforgettably documents the necessity of nature--deep space, inscrutable water, arboreal visions, scathing light, barren rock, towering glass and steel--but the realm of grace is uniquely human. Consciousness is dualistic not in substance (body and soul, brain and mind) but in moral experience, in what we have no option but to choose.

Only human beings, in all of life that we know of, can fail the test of grace, and we see the risk and stakes of such failure in the boy, Jack, of 1950's Waco and the contemporary man, Jack (a ravaged Sean Penn). Violence and predation antedated humanity by many millions of years, but only with the first glimmer of consciousness did the storyline of Cain and Abel come into the world. We see it in the boy Jack's sullen resentment of his father and his acts of petty boyhood mayhem (breaking windows, mistreating frogs, stealing lingerie). Similarly, only humanity is prey to despair, of which contemporary Jack appears to be a classic example, suffering Kierkegaard's "sickness unto death."

Some reviews I've read suggested that the culminating beach scene was some kind of Rapture-like representation of the end of the world, but to me it seemed a symbolic depiction of redemption, as Jack somehow breaks through his granitic alienation. The idea and the ideal of the sacred presume that amid seemingly endless tawdriness or trauma there are still spaces and times of grace, if we can only find them.

Saturday, June 25, 2011

Drill Imagination Right Through Necessity

Play

Nothing's going to become of anyone
except death:
therefore: it's okay
to yearn
too high:
the grave accommodates
swell rambunctiousness &

ruin's not
compromised by magnificence:

that cut-off point
liberates us to the

common disaster: so
pick a perch--
apple branch for example in bloom--
tune up
and

drill imagination right through necessity:
it's all right:
it's been taken care of:

is allowed, considering


A. R. Ammons

Sunday, June 19, 2011

Missing

"How can one transmit to others the infinite Aleph, which my timorous memory can scarcely contain?"

Jorge Luis Borges


Some time ago Linda Holmes at NPR wrote a wonderful piece observing that, by virtue of sheer plenitude of space and time, each of us is destined to miss out on the vast majority of whatever it is we love in life. Far from being a downer, it is comforting and even self-transcending to realize that no matter how assiduous or dynamic one may be, there are just more people to meet, books to read, films to see, or sunsets to witness than any one life can manage. It is a reminder that even if, as the cliche goes, the world is much shrunken owing to the speed of travel and communication, one can divide infinity many times over and still be left with infinity. To live a lifetime is to gaze upon an ocean of experience, yet be allowed to dip one's hand in the water only once.

One consequence of having a large "physical" library (as opposed to having a Kindle sitting unobtrusively on the table) is that the many hundreds of tomes mutely gaze outward, as if in reproach of my all-too-human forgetfulness. My eight-year-old has asked before, "Daddy, why do you have all of these books if you can get them all on the computer?" One reason is that my recall isn't what it once was, and my library is one kind of living personal record. Many volumes I do dip into now and again--a poem here, an essay or short story there--but how many, realistically, will I live to reread altogether?

For some 20 years--roughly, from 15 to 35--I was a prolific reader, of all genres, but particularly fiction. You know: the canon, the great books (and many that were not-so-great). While I still read, of course, typical life circumstances have much reduced the time available for it. Whether by coincidence or not, I find myself less patient with fiction, and more given to non-fiction, than used to be the case, but I continue to fight that. Proust I feel sure I will live to reread, all 3000 pages. But the 1000 pages of Les Miserables? Probably not. Much of Dickens I hope to reread, but probably not Barnaby Rudge. Recently I read Harold Bloom claiming that rereading Samuel Richardson's Clarissa was a great priority. Really? I've never read Richardson even once. Do I need to read him before expending time on rereading Jonathan Swift? And should I do that before, or after, I brush up on American history?

Inasmuch as there is nothing outside of reality, fiction is merely a peculiar branch of non-fiction, reality's myriad conscious self-reflections. Per Stendhal, a novel is a mirror carried along a main road, but it is a puzzling kind of mirror, with surprising concavities and convexities. Fiction seeks reflections that reverberate and recreate reality in microcosm, a la Borges's Aleph. A successful work of art achieves a unity that symbolically reproduces the completeness of reality. Non-fiction is always a magnifying glass, if not a microscope--clarity is purchased at the expense of breadth. Fiction is a necessarily distorting mirror, since any simple mirror or magnifying glass capable of capturing everything we care about would have to be as large as the universe itself.

The stakes are high in the arts--the potential payoff is high, but when fiction seriously fails, it is upsetting, because it is as if reality itself is being mocked or even maimed. Bad non-fiction is like a lie, which is bad enough, but bad fiction is like blasphemy. I forever vacillate between Plato--who saw the arts as begetting deceptive images (among the myriad shadows on the wall of the cave), distractions from the pursuit of truth--and Aristotle, who argued that poetry at its best reveals necessary truths, while history merely documents contingencies. Perhaps it is just a matter of epistemological and existential focus, the iris of the inquisitive mind.

Wednesday, June 15, 2011

Is Psychiatry Like Acupuncture?

As I've discussed a few times here, this is the worst of times for antidepressants and other psychiatric medications; considering questionable efficacy and likely side effects, their popular esteem is at a low ebb. This makes them...a great deal like various alternative medicine treatments that remain highly popular and widely used (and paid for) despite the disdain of evidence-based medical critics.

In The Atlantic David H. Freedman discusses the persistent popularity of alternative medicine and its unlikely cohabitation with conventional research even at the Mayo Clinic and other hallowed institutions. He points out that while medicine made its reputation in the first half of the twentieth century with the significant (if not complete) conquest of infectious disease, its efforts to extend its domain to the kinds of chronic diseases that plague us today (diabetes, heart disease, cancer, Alzheimer's disease) have been frankly disappointing. What, exactly, has medicine done for us lately?

Psychotherapy and psychiatric medication have been targets of critical and cultural derision on the part of many for decades, yet millions of patients seem to derive some kind of healing experience from the pill or the couch, as the case (and the personal inclination) may be. The same could be said of the masses flocking to chiropractors, homeopaths, and, yes, acupuncturists in defiance of the conventional medical wisdom. We spend years in medical school learning about physiology, when, practically speaking, healing arguably has more to do with constructing a healing ritual than with one's board scores. The "chemical imbalance," absurdly oversimplified though we hold it to be, may be like the acupuncturist's "lines of force," a necessary if fictional semantic scaffold on which to mount a clinical encounter. The shaman lives!

Monday, June 6, 2011

History of a Suicide

"I perceive I have not really understood any thing, not a single object, and
that no man ever can,
Nature here in sight of the sea taking advantage of me to dart upon me and
sting me,
Because I have dared to open my mouth to sing at all."

Walt Whitman


In puzzling over an unexpected suicide (and how many suicides are not, at some level, surprises?), we often ask empirical questions, as a detective might. How did this come about? Who or what is the primary culprit? But arguably the challenges suicide poses are chiefly existential and interpersonal, not factual. That is, the suicide, in rejecting life itself, dissents from values that we hold very dear.

And the question of "How could we not have known?" is more relational than epistemological. That is, suicide reminds us of the perturbing basic inscrutability of human relationships. If we do not know something so basic as whether someone is suicidal, what do we really know about them? That's why psychotherapeutic relationships can be the most intimate of all--not obviously in a physical sense, but in an existential one. The therapist often hears things that no one else in a person's life hears.

I just finished Jill Bialosky's History of a Suicide, which considers the suicide of her younger sister Kim some twenty years ago at the age of 21. It is a worthwhile and reflective addition to the suicide memoir shelf, but Bialosky is, like many, preoccupied with questions of causation. The problem is that completed suicide is complex and rare (relative to the numbers of the depressed); why would we expect suicide to be any more fathomable or predictable than other atypical behaviors, such as murder or sudden religious conversion? If we had the technology or insight to predict individual suicides, what other behaviors might we be able to foretell?

Bialosky seeks out a suicide specialist who tellingly conducts a "psychological autopsy," as if we can answer the dilemma of suicide using the tools of pathology. Unsurprisingly, a number of potential contributing factors come to light: a family history of mental illness and even suicide, a father who abandoned the family and ignored or rejected Kim, a depressed and withdrawn mother, an abusive boyfriend, and alcohol and drugs. This list is noteworthy for its obviousness and for the fact that every one of these things is objectionable in its own right even apart from any possible relation to suicide. The things we might do to reduce suicide risk--maintain family integrity, shore up communities, limit drug use, and increase awareness and treatment of depression--are things that we ought to be doing anyway. These influences ultimately tell us nothing, because we do not know which is necessary or sufficient.

The other thing that suicide teaches is how little we sometimes know of ourselves. It appears that in a certain fraction of suicides, the final determination to act, at least, is impulsive. If we could interview completed suicides after the fact, I suspect that a significant number would express surprise, if not dismay, that they actually went through with it.

Sunday, June 5, 2011

Who Needs Psychiatrists?

I have seen a medicine
That's able to breathe life into a stone,
Quicken a rock, and make you dance canary
With spritely fire and motion, whose simple touch
Is powerful to araise King Pippen, nay,
To give great Charlemain a pen in's hand
And write to her a love-line.

All's Well that Ends Well


The criticisms of contemporary psychiatry are coming fast and furious now, and not just from the fringe any more. Cheryl Fuller at Jung at Heart refers to a review by Marcia Angell of three recent anti-psychiatry volumes (of which I have read Daniel Carlat's Unhinged and Robert Whitaker's Anatomy of an Epidemic, but not Irving Kirsch's The Emperor's New Drugs). And while it's not specifically about psychiatry, an American Scholar article by Harriet Washington documents the discouraging corruption of medical research and publishing by so-called Big Pharma.

The mounting charges are of the most serious kind, and warrant a full-on response from the profession (which this blog post does not aspire to be). To very briefly summarize, the basic effectiveness of antidepressant drugs (and to greater or lesser extents, all psychiatric medications) is increasingly dubious as the integrity of research purportedly showing their efficacy is called into question. Critics maintain that for decades (antidepressants came into general use in the 1960's), thousands of psychiatrists (and of course other physicians as well) and millions of patients have prescribed and taken non-therapeutic compounds based on an underestimation of the placebo effect.

As for neurobiology, critics point out, correctly, that there is no evidence for any specific "chemical imbalance" that antidepressants allegedly alleviate. However, this is not the crux of the issue, for other central nervous system agents (e.g. anticonvulsants and anesthetics) have mechanisms of action that remain somewhat mysterious. And depression is in fact correlated with specific neurobiological states, but only because every psychological state--falling in love, undergoing religious conversion--can only be based in the brain. The question is not whether any given psychological phenomenon has a biological correlate (of course it does); the question is whether said phenomenon is best understood and potentially modified in chemical as opposed to other (psychological, interpersonal, social) terms.

It is one thing to claim that antidepressants are overblown and oversold; it is quite another, of course, to claim that they are useless or even pernicious. For instance, Robert Whitaker's arguments can lead only to the conclusion that antidepressant drugs should be expunged from the earth, and that psychiatrists are either unwitting or cynical quacks for prescribing them. And of course, as psychologists and social workers have taken over much of the psychotherapy territory that used to belong to psychiatry, the profession's identity has been ever more given over to psychopharmacology. After all, Freud didn't think psychoanalysts needed to be physicians, and there is no evidence that psychiatrists make better therapists than those with other degrees, so absent real results from biological treatment, why does psychiatry exist, exactly, beyond a function as a research program?

As someone who has, regrettably, long recognized the limitations of existing drugs but who still prescribes them, what do I believe? And can what I believe be remotely legitimate inasmuch as my current livelihood (by no means opulent in doctorate-level terms, but reasonable) depends on these medications having a role? Intellectual honesty demands that if one has a pressing self-interest in believing something, one should subject that belief to fierce and insistent criticism. There is no sin greater than tendentiousness.

This discussion derives from the valorization of the randomized, placebo-controlled trial as the ultimate arbiter of medical outcome, very much at the expense of individual clinical judgment. After all, many hold that clinical judgment is subjective and idiosyncratic, and therefore open to bias and not to be trusted. If all that needs to be known about medications can be inferred from statistical trials, then anyone (such as Whitaker, a journalist) can know more about them than a physician. Indeed, on this view only the non-physician can accurately appraise medical treatments because his view is not warped by self-interest. And yet there is considerable question as to whether patients (or "patients") in rigidly controlled research studies are truly representative of real-world clinical encounters.

What, then, do I believe? I believe, with the Buddhists, that life is suffering (but not only that); the long history of humanity is one of untold miseries of anxiety and depression that were either merely endured (there being no other choice) or compensated for by relationships, religion, art, or alcohol. Like the agonies of even routine childbirth or the ravages of even typical old age, mental disorders have always been part of the human condition; only relatively recently have we tried to modify them. One can make an argument that all of these things should, again, be merely endured, but I don't think history has a rewind button. Yet the expectations regarding mood and anxiety have exceeded all bounds, as has the expectation that one has some right to reach ninety with sound mind and body.

I believe that existing drugs do not counteract specific or discrete physiological processes, but (like psychotherapy) are nonspecific mental balms. SSRI's and benzodiazepines are to mental distress as NSAID's and opiates are to physical distress: they are often disappointing and attended by sometimes dismaying side effects, but millions of patients have found them of some use. I believe that in a modest way they reduce suffering, by no means always or even often, but on average. I believe this on the basis not of research studies, but of my clinical experience and that of many others. And the day I stop believing that is the day I will stop prescribing.

Wednesday, June 1, 2011

Who Needs Narrative?

Arguing for the psychological uses of narrative, Bill Benzon at The Valve distinguishes the "autobiographical self" (i.e. identity over time) from the "core self" (i.e. one's integrated psycho-physiological state at any given time). He claims that the "core self," influenced as it may be by intense situational and physical factors (he uses hunger and sexual desire as examples), not to mention its transient nature, threatens to disrupt the autobiographical self. He suggests that narrative (he specifically mentions "play-acting" and "storytelling") usefully provides an overarching frame within which to understand and evaluate our dispositions and behaviors over time.

The account leaves out a lot of course (for instance, it would seem that temperament straddles both kinds of self). And his case seems a bit extreme--as if even a starving man would look back on his life as having been little more than an ultimately unsuccessful quest for food--but there may be something to it. After all, someone in a deep depression may view much of his past "through a glass darkly" in a way that lightens considerably when the episode relents. And obviously the two selves affect each other reciprocally and continuously.

Staying with Benzon's schema, it would seem that psychological distress occurs in two varieties. Unhappiness is a malady of the autobiographical self, a dismayed sense that one's story has somehow gone awry through vicissitudes of sensibility or circumstance. One seeks in a therapist a kind of narrative catalyst that will open up unimagined possibilities, including the often profound possibility of actually being listened to and perhaps even understood. Dysfunction of the core self manifests as symptoms that may actively impede functioning. There is considerable overlap between the two, but arguably we resort to psychotropic medication inasmuch as symptoms appear to be beyond the power of narrative to reframe. But nothing is more frustrating than to try to treat unhappiness with meds or to tackle narrative-resistant symptoms with more narrative. Diagnostic confusions and controversies arise from the difficulty of distinguishing symptoms from unhappiness.

It occurs to me that like certain other phenomena such as religion and even music, narrative broadly considered (that is, interest in all stories whether contained in books, film, gossip, or hearsay) is hard to explain because it is very widespread but not truly universal. Some faculties, such as hunger and thirst, are obviously ubiquitous because their absence is not compatible with life. Others, such as basic senses or sexuality, are not imperative for individual life but are so typical of the species that their absence is uncontroversially deemed pathological.

Inasmuch as existence is necessarily temporal, some interest in narrative is presupposed, even if only speculation as to where the next meal will come from. But sophisticated narrative--that is, at least at the level of communal folk tales--has, like religion, been found to exist in virtually every human society. And yet just as there is a reliable minority of individuals who are irreligious, there are of course people here and there who are relatively free of the narrative bug, who may be more invested in other domains of experience (facts, ideas, bodily experience, etc.). If religion and narrative truly are central to (individual and species) human identity, then how is it that even a (non-pathological) small minority more or less escape their purview? Perhaps diversity itself is such a powerful evolutionary engine that it constantly throws out alternatives to the prevailing cultural trajectory, suggesting of course that those faculties we view as indispensable are actually contingent.