Monday, December 1, 2008

Psychiatry...in One Page or Less



I said, "Hold it, Doc, a World War passed through my brain."
He said, "Nurse, get your pad, this boy's insane."
He grabbed my arm, I said "Ouch."
As I landed on the psychiatric couch
He said, "Tell me about it."

Bob Dylan, "Talkin' World War III Blues"


(Continued from last post.) As I see it, whenever I meet a patient for the first time, any of the following general considerations counts for more than nailing the best of 297 possible DSM-IV diagnoses.

1. Does the patient need to be in a hospital?
2. Regardless of the answer to #1, is the patient suicidal (or, very rarely, homicidal) at all?
3. Very likely the patient presents with some aspect of high neuroticism (anxiety, depression, and/or eating disorder) as this constitutes the bulk of general practice. If not, why is he here? If so, is there anything in addition?
4. Is the patient bipolar, or at least bipolar enough to affect prescribing decisions?
5. Is the patient psychotic?
6. Is there a substance abuse issue?
7. Is the patient cognitively impaired, by mental retardation, dementia, or delirium?
8. Could a medical condition or medication of the patient's be generating the symptoms?
9. Is there a personality disorder, or at least sufficient character pathology to affect prognosis and the clinical relationship?
10. Could the patient have ADHD?
11. Is the patient impaired enough to warrant medication treatment at all?
12. Does the patient have the curiosity and determination to pursue psychotherapy, or is he seeking a pill?
13. Does the patient have financial and logistical access to treatment, both psychotherapy and medication?

Obviously, the answers to these often are not straightforwardly yes or no. The question is always whether there is enough bipolarity or psychosis or whatever to affect clinical decision-making. Whether to call it bipolar I or bipolar II on the one hand, or schizophrenia or schizoaffective disorder or delusional disorder on the other, is hair-splitting; indeed, it's academic. This sort of hair-splitting was fascinating in medical school, somewhat less so in residency, and not at all anymore (as Cleopatra put it, those were "My salad days, when I was green in judgment...").

I mentioned last post that diagnosis can matter for prognosis more than for treatment. That is true, but each of the following factors might influence prognosis even more:

1. Has the patient ever maintained a significant long-term relationship (say, longer than a year)?
2. Has the patient been able to hold down a job?
3. How much formal education has the patient had?
4. Is the patient able to live independently?
5. Is the patient on disability?

In general I do not, unsurprisingly, make a big deal of diagnosis. The exception is when doing so might influence motivation for treatment. If a person is seriously depressed or psychotic, but has poor insight or motivation, then it can be useful to pull the somewhat paternalistic medical card of emphasizing Diagnosis (perhaps while brandishing a caduceus and declaring, "By the power vested in me, I name you Sick."). If someone really needs psychotherapy but is reluctant, even the Borderline Personality Disorder card can be played, with sensitivity. But in general, the DSM is for researchers and insurance companies.

I don't expect any of this to change, of course, with DSM-V in 2012. There is a long way to go.

Yin and Yang



Some time ago a crazy dream came to me
I dreamt I was walkin' into World War Three
I went to the doctor the very next day
To see what kinda words he could say
He said it was a bad dream
I wouldn't worry 'bout it none, though
They're dreams and they're only in your head.

Bob Dylan, "Talkin' World War III Blues"


I did a double-take when I saw this NPR headline a few minutes ago: "Manufacturing shrinks at fastest rate since 1980." My instant and automatic thought was, "Who is? Are we manufactured?" and then, upon further reflection, "Who uses 'shrink' as a verb?"

Psychiatrists are famously (among ourselves anyway) divided into "lumpers" and "splitters" when it comes to diagnosis. Lumpers assort symptoms into a relatively few (often very broad) categories, while splitters delight in, among other things, getting the DSM code just right. But I would argue that good psychiatrists manage to be both very coarse lumpers and extremely discerning splitters at the same time. In my opinion psychiatry does not (yet) benefit from the intermediate kind of splitting that other areas of medicine depend on, and that psychiatry pretends to have by means of the DSM. I will try to explain what I mean.

I am, I suppose, an ultimate splitter inasmuch as I try to appreciate the uniqueness of each patient and his or her life story. In that basic respect there are as many "diagnoses" as there are patients, and the clinical relationship, which in psychiatry has always been a crucial part of the treatment process, stems from irreducible idiosyncrasies. Even when formal psychotherapy is not going on, how I might phrase or cast an explanation or recommendation should rely on my supposition of what might be most effective given the very specific life story involved. To be sure, all physicians should do this, and many do, but psychiatrists are supposed to be especially good at it.

But when it comes to what we might call the medical aspect of psychiatry, and particularly the use of medication, I, like many clinicians I think, end up being a pragmatic lumper. The reason is that our medication treatments are not (yet) specific enough to make highly fine-tuned medical diagnosis useful. Consider "antidepressants," which actually are modestly helpful for a wide range of depressive symptoms, but also for all the anxiety disorders and even eating disorders in some cases. Or consider "antipsychotics," which are used for bipolar disorder and sometimes for refractory anxiety syndromes or even for sleep.

Psychopharmacology remains a highly pragmatic business of weighing symptoms against prospective side effects. Of course, diagnosis comes in handy as a way to bundle symptoms together in a useful way, but arguably the average clinical psychiatrist keeps only a very few of these bundles in mind at any one time. Diagnosis can be helpful for determining prognosis, but much less so for treatment. In terms of decisions I make in the office, it makes little difference whether I call something panic disorder, generalized anxiety disorder, social phobia, or anxiety disorder not otherwise specified. The specific symptoms, in conjunction with the unique person behind them, will determine the treatment more than the diagnosis.

For DSM-style diagnosis to matter more significantly, our treatments would have to be much more specifically pegged to diagnosis, and they aren't. If a medication emerged that was helpful for, say, bulimia, but not at all for depression or anxiety, well, that would be major news. The same general principle is true for psychotherapies as well. Cognitive-behavioral psychotherapy is no more specific than an antidepressant; it ends up being guided more by a specific patient's symptom profile and by his priorities and idiosyncrasies than by the diagnostic code.

Most other medical disciplines are quite different. A person who goes to an internist with chest discomfort could have any one of a significant number of possible diagnoses, and pinning down the right one makes a huge difference not only for prognosis but for treatment. If it is in fact a heart attack, the internist had surely better not treat it as gastroesophageal reflux, costochondritis, pleurisy, or a gall bladder episode. This kind of intermediate diagnostic splitting is crucial to most medical disciplines, and it is one reason why we are making major inroads into the treatment of heart disease, cancer, etc. We are not there yet in psychiatry--neither our diagnostic "system" nor our treatment options are specific enough for intermediate splitting.

But we also do not just listen to someone for fifty minutes and reach for a medication at random. In the next post I'll consider the relatively few, but very weighty, "diagnostic" considerations that any psychiatrist should bear in mind with any patient.

Sunday, November 30, 2008

Breathe In, Breathe Out



The mind is its own place, and in itself
Can make a heav'n of hell, a hell of heav'n.


Satan, Paradise Lost



A miserable day in the East, and I chose to face the holiday-shopping hordes, unbowed by cold rain and recession. I drove around for hours in search of the lump-of-coal megastore. In vain.


One must cleanse the palate before a Monday.


"I have no name;
I am but two days old."
What shall I call thee?
"I happy am,
Joy is my name."
Sweet joy befall thee!

Pretty joy!
Sweet joy, but two days old.
Sweet joy I call thee:
Thou dost smile.

William Blake, "Infant Joy"

Evil: The Leftovers



"Come, you spirits

That tend on mortal thoughts, unsex me here,

And fill me from the crown to the toe top-full

Of direst cruelty! make thick my blood;

Stop up the access and passage to remorse,

That no compunctious visitings of nature

Shake my fell purpose, nor keep peace between

The effect and it!"

Lady Macbeth


Or should it be "Leftovers: The Evil"? At any rate, moral outrage, like revenge, is probably a dish best served cold.

When I think about the Mumbai murderers, I realize that indignation, in its simplifications and its threatened demarcations of "us" and "them," can be too pleasurable for our own good. But Scott Simon's commentary at NPR agrees, vis-à-vis evil, that some acts are so heinous that no other adjective will serve better. As he points out, for the truly evil, there are no innocents; in that sense, perhaps evil is itself a theory of human nature (a theory that, despicable in itself, views human beings as inevitably despicable).

Ironically, these Mumbai horrors feel worse than suicide bombings (even those that, like 9/11, killed far more people) because they were so much more cold-blooded and required sustained, ongoing deliberation. The suicide bomber must of course massively rationalize his act, but he knows that he won't be around to witness the suffering and mayhem he generates. Accounts in Mumbai agree that these people went out of their way to kill indiscriminately at point-blank range--the old, the young, women, men, it didn't matter. Consider how much effort it surely must have taken to suppress any stirrings of empathy as the killers methodically went from room to room. These actions were evil in a very intellectual sort of way.

That these acts presumably had political ends makes them no less evil. I used to think that the routine denunciations of such atrocities by the President and other heads of state were fairly absurd, stating the obvious. But I'm starting to think that nothing can be taken for granted morally any longer, and the world needs these "routine" restatements of what the civilized realm holds to be justified or unjustified. If we want to "despise the sin, not the sinner," that is fine with me. Evil is the ultimate diagnosis, I suppose, so let us label behavior, not persons. But let us label certain kinds of behavior unambiguously and not bring in extenuating factors of upbringing, biology, or political ideology.

A recent book I need to pick up is the philosopher Susan Neiman's Moral Clarity (her Evil in Modern Thought of a couple of years ago was both accessible and illuminating). Without resorting to simple-minded dichotomies, we need guidelines in a world of increasing moral murk, in which what we know, or think we know, about the infinite complexities of culture and biology threatens to generate Hegel's "night in which all cows are black."

I am no Jonathan Swift (1667-1745), but his epitaph has been on my mind lately:

Hic depositum est Corpus
JONATHAN SWIFT S.T.D.
Hujus Ecclesiae Cathedralis
Decani,
Ubi saeva Indignatio
Ulterius
Cor lacerare nequit,
Abi viator.
Et imitare, si poteris,
Strenuum pro virili
Libertatis vindicatorem.

Translated by William Butler Yeats as:

Swift has sailed into his rest,
savage indignation there
cannot lacerate his breast.
Imitate him if you dare,
world-besotted traveller,
He served human liberty.

Addendum 11:39: It occurs to me that evil is the obscene negative, in the moral realm, of what God is in the metaphysical realm. It seems that both must exist, even if, disappointingly, as sociological necessities.

Saturday, November 29, 2008

Time Out of Joint



"Demand me nothing; what you know, you know:

From this time forth I never will speak word."


Iago



We have had a month of relative, even sentimental political hopefulness, and a few days of giving thanks. Now the moral rhythm, as well as recent events, demands a return to the tremendous subject of human evil. (Ho Ho Ho to you too, dear reader, by the way; well, okay, I never was asked to be Santa Claus at work, and I always wondered why.)

The immediately infamous incident of the hapless Wal-Mart worker trampled to death during a pre-dawn Black Friday shopping stampede was enough to get me brooding (okay, it doesn't take much). Not only did the initial pressure of the mob lead to the appalling turn of events, but individual shoppers reportedly resisted attempts to clear the area for rescue efforts and for the sake of at least minimal dignity owed to the dead. It is hard to think of a more grotesque reflection of contemporary capitalist consumption; may those shoppers relish their plasma TVs in complete moral equanimity! As commentary on our economy, it is a visceral, freakish, and microcosmic counterpart of the recent Wall Street mayhem. Our culture satirized itself perfectly. I hear Sweden is awfully nice...in June and July.

I've always been interested in the way psychology has struggled to deal with the hulking fact of human depravity. We try to stow it away in little boxes like psychopathy, antisocial personality disorder, and situational and contextual predispositions to malfeasance (see Walmart, stampedes). The poets and artists have been much better, recognizing that this is, along with love, one of the great subjects. Shakespeare's Hamlet, King Lear, and Othello derive much of their irresistible, infernal power from the central fact of nefarious humanity. It doesn't make for very pleasant drawing room conversation, however, and perhaps not for proper blogging, I don't know. So many other things are easier to talk about.

Moral indignation can be a very strong human emotion, and one that very likely has deep evolutionary roots. Particularly outside of the United States (well, and China), capital punishment is politically incorrect, but it can be surprising how often relatives of murder victims not only clamor for the death penalty, but want to be present when it is carried out. Reportedly one of the ten Mumbai terrorists was taken alive. How we wish he could provide some eloquent window into the darkness of those deeds! Unfortunately he will probably be, like Iago, morally mute; evil cannot ultimately justify itself, but neither does it have to.

The child psychoanalyst Melanie Klein postulated that we experience evil in ourselves first but, unable to bear it, project it upon others, in what she called the "paranoid-schizoid position" of infancy; for the baby, the Other is either all good (breast available) or all bad (breast withdrawn). She argued that moral complexity develops in the "depressive position," in which we recognize the Other as, inevitably, an amalgam of good and evil (it is "depressive" because the rage that, one hoped, could eradicate the Evil Other would also, it turns out, eliminate the Good Other as well; moral complexity is difficult and therefore disheartening).

But evil, like beauty, artistic capacity, or love, is distributed unequally among individuals and, arguably, among cultures and epochs as well. Freud, famously shaken by the epic brutalities of World War I, was driven to postulate the "Death Instinct" as an explanation for the darker regions of our nature that he felt he had previously not adequately accounted for. This formulation turned out to be one of the most controversial and least substantiated of his provocations (and with Freud, that's saying a lot), but it does embody a psychological attitude that, I think, we don't have a good name for. It is the confrontation, not of the primitive babe, but of the average adult, with the moral abyss. Moral outrage we might call it, or moral disappointment. Or perhaps it is a kind of moral grief, a mourning of an ideal of human nature; of course, we know that idealists fall, hard.

Okay, no succumbing to the "spirit of gravity," but I also know I won't be setting foot in Walmart this holiday season. Actually, I am always surprised by how many patients mention trips to Walmart as a virtually pathognomonic stressor; no experience seems to set off a smoldering panic disorder, or sometimes something worse, so reliably. Of course, is this "mere" faulty anxiety, or rather the canary in the coal mine, a tell-tale sign of deep evolutionary wisdom?

"Keep up your bright swords, for the dew will rust them."

Othello

Wednesday, November 26, 2008

Wishful Thinking



Edvard Munch, The Sun (1912)
(Giving thanks four months in advance).


At the earliest ending of winter,
In March, a scrawny cry from outside
Seemed like a sound in his mind.

He knew that he heard it,
A bird's cry, at daylight or before,
In the early March wind.

The sun was rising at six,
No longer a battered panache above snow...
It would have been outside.

It was not from the vast ventriloquism
Of sleep's faded papier-mâché...
The sun was coming from the outside.

That scrawny cry--It was
A chorister whose c preceded the choir.
It was part of the colossal sun,

Surrounded by its choral rings,
Still far away. It was like
A new knowledge of reality.

Wallace Stevens, "Not Ideas About the Thing but the Thing Itself"

Tuesday, November 25, 2008

Scrapblogs




"They have been at a great feast of languages, and have stolen the scraps."

Love's Labour's Lost


The New Republic features a review of a history of scrapbooks in America. In the Table of Contents the topic didn't initially draw my interest, but I found myself reading it and am glad that I did. Like many hands-on pursuits, the endeavor may have waned in recent years, but it has a wider history and extent than I was aware of. I never did keep a scrapbook per se, although I have been an inveterate collector of various things, this side of hoarding, I hope.

The article brought to mind the strange hybrid identity of the weblog, though I was surprised that the author didn't make the obvious connection to blogging. People blog for myriad reasons, of course, including the advancement of political points of view, academic arguments, or quasi-professional self-expression or activism. And the blog has features of a public (or, in the case of anonymous bloggers, semi-public) journal: a record of personal events or observations.

But blogs also serve as digital scrapbooks, asserting both general value and individuality, often in non-verbal ways. Both blogs and scrapbooks take what may seem to be personal or cultural ephemera (in one case, digital, in the other, paper) and lend them some permanence. Some are more personal and some are more generally cultural, but all are declarations to the world: "I am a person; this is what I care about." The implication is usually, as Mr. Rogers might say, "Wouldn't you like to care about it too?" It has been said that most writing, most art in general, bears an element of seduction, in a sense far more broad and subtle than the erotic. As a hybrid journal/scrapbook, the blog is like this too, reaching out and documenting items that are somehow poignant, compelling, or lovely, and expressing the hope that someone else out there will agree.

In terms of style, few things in a blog are incidental or accidental. Given the finite number of templates available (for those of us too cheap or time-pressed to use custom platforms), I am continually amazed by the diversity of verbal and pictorial worlds in the blogosphere. The basic template; the ratio of text to image; the tone, frequency, and content of posts all convey an unmistakable individuality (even if pseudonymous). Even a blog that, like a therapist's office, might aspire to seem "neutral" would in fact convey far more. Of course, some blogs are endearing, some plain, some downright off-putting.

The amazing thing is how the Internet has leveled psychological barriers to self-exposure. How many people would walk down the street asking random people to look at their journal or scrapbook? Not many, but the blogosphere is a virtual street in which those "random" people are in fact in search of scrapbooks to look at. If eBay enables the exchange of merchandise, and dating services enable relationships, the blogosphere enables a weird kind of communal digital scrapbook that both reflects and refracts the world in real time.