"All comes by the body, only health puts you rapport with the universe."
Walt Whitman, from "By Blue Ontario's Shore"
This quote--which could serve as a credo for integrative medicine--is the kind of thing that occasionally reminds me why I went into psychiatry. Beyond the often questionable DSM diagnoses, the vagaries of therapy, and the imperfect biological treatments, what we are after is a state of attunement and acceptance in which a biological being achieves transcendence of the merely physical without, necessarily, any recourse to the supernatural.
It is not the work of poetry to answer all our questions, of course, and one can legitimately wonder what sorts of subjective states, interpersonal relationships, and achievements of meaning must come together to constitute "rapport." But even if we say that health is merely the absence of disease (or disorder), if only to trim the ambitions of restless and overweening doctors (and their many accomplices and handmaidens in the behemoth that is the health care industry), it is nonetheless true that it is typical of consciousness to aspire to something more than just the absence of suffering. Perhaps poets pick up about where physicians trail off.
Wednesday, September 7, 2011
Mental Illness Is Whatever We Say It Is
"Psychology, which explains everything
explains nothing,
and we are still in doubt."
Marianne Moore, from "Marriage"
By "we" I don't mean we psychiatrists, at least not primarily, but rather "we the people." Caseness, or the determination of what counts as a mental disorder and what doesn't, is not something we go out and discover in nature; rather, it is a social category arrived at both explicitly and implicitly through cultural debate. The psychiatric profession obviously has opinions about caseness, but these do not go unanswered or unlimited by society at large.
In large part, antipsychiatry critique has been aimed at the extent of psychiatric diagnoses, both the numbers of diagnoses themselves (larger in every succeeding edition of DSM, we are reminded) and of course the numbers of people given those diagnoses. Suddenly it seems as if every other kid has ADHD and/or autism. Several psych blogs recently cited a survey claiming that 38% of a European sample suffers a mental disorder in a given year. This included substance abuse and dementia, but nonetheless it seems like a high number (the 5- or 10-year prevalence would be significantly higher).
I think that 38% seems like a high number for reasons both illegitimate and legitimate. Even now there is a tendency, more latent in some than in others, to view those with mental disorders as the mad, an appalling but surely very small minority safely stowed away in institutions. The notion that "the mentally ill" walk the streets and even have jobs and families like you and me remains foreign to some. But there is also the real concern that the sick role, a transaction that officially relieves the patient of at least some social responsibility, loses its meaning when used too widely. In that respect, there is too little appreciation of the great variation in severity of mental disorders; just as one may go to an internist for a touch of gastritis or for cancer, a technical psychiatric diagnosis may or may not involve significant disability or the use of the sick role.
Whether medical or psychiatric, diagnosis, when applied liberally enough, approaches the condition of enhancement. For Freudians neurosis was an inescapable condition of humanity, so at certain times and places (and with sufficient economic resources) to be in analysis did not mark one as "sick" so much as self-aware and ambitious. Similarly, in those older than 85, significant dementia is closer to the rule than to the exception, so statistically speaking the effective treatment (which we don't yet have) of dementia in the very old would in fact qualify as enhancement. And for modern medicine, mortality itself has virtually become a disease (which, as The Onion occasionally reminds us, retains its 100% prevalence despite our best efforts). When we seriously discuss mental disorders having a prevalence greater than 50%, we start to consider syndromes that are, in toto, to be expected of the human condition, at least at this place and time.
Enhancement may well be justified, depending on the circumstances. The question is always: is treating any given phenomenon clinically (that is, as a syndrome worthy of specific medical intervention) likely to be helpful (that is, to lead to better functional outcomes, in the case of those problems for which we really do have treatments, or to better understanding of ourselves and others, in the case of those problems that remain intractable)? Or would it be better to consider the issue as a social/moral/cultural/existential difficulty? That is really the question, and not one that neuroscience can shed any light on whatsoever. Biologically, all human capacities appear to exist on dimensional continua, and the point at which we indicate "pathology" or "caseness" is a social and interpretive outcome.
Tuesday, September 6, 2011
The Religion of the Good, Part 2
A recent New Yorker profile of the philosopher Derek Parfit mentioned that the late Bernard Williams once dismissed the ideal of a universally compelling moral code as (I paraphrase) "something you use on the men who come to take you away." Indeed, implied in the "problem of evil" is the conviction (or fantasy perhaps) that if we could only find the right combination for the great moral mystery vault, the ponderous door of error would swing open, releasing a radiance that would burn away the scales from the eyes of the benighted.
I imagine that some religious believers have a similar feeling that if they could only depict or praise God rightly, his existence and glory would be as plain to everyone else as they are to them. The holy grail of thought is the proposition (or grand scheme of propositions) that is as self-evident as 2 + 2 = 4 but as transcendent and as life-changing as the existence of God. That is the constructed idea(l) that we imagine would stop the bad men in their tracks and bring them to their knees. If God does not exist, then it will be necessary to invent (it)--this is the project that is at least implicit in non-relativistic philosophy. As Wallace Stevens wrote, "One day they will get it right at the Sorbonne."
I once read a review by Helen Vendler in which she claimed that the role of the critic is not only (or even primarily) to explain or to justify, but also to celebrate. Similarly, I think that for anyone who reflects seriously about the moral life, explanation and justification go only so far, beyond which point one can only aspire to praise and embody one's views. The barbarians who burn down the monastery are unfazed by the crucifix; likewise, no secular moral system achieves the potency of a talisman. To accept this is also to accept a troubling existential diversity in human nature--other people see the great questions in the same way that I do, except when they don't do so at all. Perhaps the Tower of Babel is the central metaphor for humanity, making us the most atypical species. There is a strain in philosophy that seeks to tear down the tower in favor of a second Garden of Eden, done rightly this time.
The problem is that many men (most of them, alas, have been men) have been sure that they beheld the Truth, and terrible things have been done in the name of Truth. The point is to religiously (in the generic sense) embrace a system of meaning while avoiding clinical or moral insanity. Just as Satanism may be an internally consistent religion, so may there be functioning philosophies of evil (National Socialism, al Qaeda, etc.). We denounce them not because they have no justification (they do have their internal justifications), but because we find them pernicious and repugnant. Our grounds for doing so may be ultimately contingent on the creatures that we evolved to be, but that is the best we can do--we can never escape history by inventing ourselves de novo. By and large, we also happen contingently to find the blues and golds of sea, sky, and sun to be gratifying, and we can only be grateful that we do so. The truth is not given in any simplistic way, but there is also no truth that does not derive, in some fantastically complicated way and filtered through many generations of human consciousness, from our origin.
Monday, September 5, 2011
The Religion of the Good
A couple of weeks ago, in the New York Times philosophy feature "The Stone," Joel Marks confessed his loss of faith in objective morality:
"I thought I was a secularist because I conceived of right and wrong as standing on their own two feet, without prop or crutch from God. We should do the right thing because it is the right thing to do, period. But this was a God too. It was the Godless God of secular morality, which commanded without commander--whose ways were thus even more mysterious than the God I did not believe in, who at least had the intelligible motive of rewarding us for what He wanted us to do." (Italics in original).
Marks goes on to claim that even if we withdraw the quasi-theistic vehemence of our confidence in objective morality, and thus acknowledge the mere contingency of our beliefs, this needn't change our actual practice. We continue to believe what we believe and have the right to advocate our views in accord or in competition with others, but according to Marks, we can never claim that the views of others are wrong, only that they lead to different consequences. Such advocacy would seem able to achieve moral consistency, but not full justification. For instance, Marks notes animal welfare as one of his central preoccupations. Alluding to the basic moral tenet that avoidable suffering is wrong, one may educate others about animals' lives in factory farms, but not add the emotional force of moral disapprobation (which, Marks maintains, may provoke resistance or resentment as much as anything).
I think that this is wrong and that it mistakes human moral development. At a certain level we embrace certain traditions, rituals, and moral standards not because we pretend to ultimate moral justification of them, but because the alternative is chaos. We raise our children to believe that certain behaviors are not merely different from what we happen to do--they are wrong. We watch football rather than soccer by virtue of mere geographic contingency; while we may prefer football, we recognize that this is likely due to acculturation and habituation. But when we say that it is not right to abuse animals, we assert that this is true everywhere and for everyone.
Secular morality does therefore partake of the emotional conviction of religious faith, but this reflects its fervor, not its groundlessness, and hence is a mark of strength and not weakness. The "God" of secular morality is an impersonal ideal that we collectively construct, not a personal interlocutor that we discover. There are, of course, many versions of this "God" just as there are many versions of the God of the Christian church (and obviously Islam and Judaism). But I think there can therefore be a fundamental secular referent of the term "Godless," which denotes not merely he who lacks faith in the supernatural, but he who is unable or unwilling to shape his behavior according to moral ideals and/or the suffering of others (conduct which we may designate as psychopathic or evil).
Near the beginning of Terrence Malick's "The Tree of Life," the narrator comments that there are those who live in a "state of nature" and those who live in a "state of grace." We live in a "state of nature" insofar as we merely gratify our impulses, even if to the detriment of others, or complacently embrace our (evolutionarily) contingent dispositions. And there is a secular version of the "state of grace" whereby we believe ourselves to be free to (collaboratively) fashion a moral ideal.
The "religion" of secular ethics is prey to the same pathologies as conventional religion, i.e. propensities to rigidity, dogma, self-righteousness, hypocrisy, and exclusion. But it is also offers the same potential for affiliation and transcendence (if not, granted, the same degree of narrative interest or life-after-death consolation). I consider myself agnostic because I do not find any of the world's supernatural deities to be existentially compelling, but my attachment to, say, the Golden Rule (among other moral precepts) does have, as Joel Marks rightly argues, a good deal of faith to it. But inasmuch as there can really be no doubt as to whether the Golden Rule exists, this my attitude could be said to involve love more than belief.
"I thought I was a secularist because I conceived of right and wrong as standing on their own two feet, without prop or crutch from God. We should do the right thing because it is the right thing to do, period. But this was a God too. It was the Godless God of secular morality, which commanded without commander--whose ways were thus even more mysterious than the God I did not believe in, who at least had the intelligible motive of rewarding us for what He wanted us to do." (Italics in original).
Marks goes on to claim that even if we withdraw the quasi-theistic vehemence of our confidence in objective morality, and thus acknowledge the mere contingency of our beliefs, this needn't change our actual practice. We continue to believe what we believe and have the right to advocate our views in accord or in competition with others, but according to Marks, we can never claim that the views of others are wrong, only that they lead to different consequences. Such advocacy would seem able to achieve moral consistency, and not full justification. For instance, Marks notes animal welfare as one of his central preoccupations. Alluding to the basic moral tenet that avoidable suffering is wrong, one may educate others about animals' lives in factory farms, but not add the emotional force of moral disapprobation (which, Marks maintains, may provoke resistance or resentment as much as anything).
I think that this is wrong and that it mistakes human moral development. At a certain level we embrace certain traditions, rituals, and moral standards not because we pretend to ultimate moral justification of them, but because the alternative is chaos. We raise our children to believe that certain behaviors are not merely different from what we happen to do--they are wrong. We watch football rather than soccer by virtue of mere geographic contingency; while we may prefer football, we recognize that this is likely due to acculturation and habituation. But when we say that it is not right to abuse animals, we assert that this true everywhere and for everyone.
Secular morality does therefore partake of the emotional conviction of religious faith, but this reflects its fervor, not its groundlessness, and hence is a mark of strength and not weakness. The "God" of secular morality is an impersonal ideal that we collectively construct, not a personal interlocutor that we discover. There are, of course, many versions of this "God" just as there are many versions of the God of the Christian church (and obviously Islam and Judaism). But I think there can therefore be a fundamental secular referent of the term "Godless," which denotes not merely he who lacks faith in the supernatural, but he who is unable or unwilling to shape his behavior according to moral ideals and/or the suffering of others (conduct which we may designate as psychopathic or evil).
Near the beginning of Terence Malick's "The Tree of Life," the narrator comments that there are those who live in a "state of nature" and those who live in a "state of grace." We live in a "state of nature" insofar as we merely gratify our impulses, even if to the detriment of others, or complacently embrace our (evolutionarily) contingent dispositions. And there is a secular version of the "state of grace" whereby we believe ourselves to be free to (collaboratively) fashion a moral ideal.
The "religion" of secular ethics is prey to the same pathologies as conventional religion, i.e. propensities to rigidity, dogma, self-righteousness, hypocrisy, and exclusion. But it is also offers the same potential for affiliation and transcendence (if not, granted, the same degree of narrative interest or life-after-death consolation). I consider myself agnostic because I do not find any of the world's supernatural deities to be existentially compelling, but my attachment to, say, the Golden Rule (among other moral precepts) does have, as Joel Marks rightly argues, a good deal of faith to it. But inasmuch as there can really be no doubt as to whether the Golden Rule exists, this my attitude could be said to involve love more than belief.
Monday, July 25, 2011
Captain America
"The denial of moral absolutism leads not to relativism, but to nihilism."
Paul Boghossian, "The Maze of Moral Relativism"
I never thought I'd see a decent Captain America film in my lifetime, but this time Marvel has managed brio without ponderousness. When I was into comics in the 1980's, Cap was, it must be said, my favorite. While I enjoyed a number of titles, he eschewed the smart-alecky goofiness of Spider-Man, the self-involvement of the X-Men, and the contrived contortions of the Fantastic Four; sober but spirited, he was neither the hipster Batman nor the staid Superman (that George Washington of superheroes).
In the 1980's, shrouded by the forgetfulness of his reading public, Captain America bore little resemblance to the "old-growth superhero" (in A. O. Scott's memorable phrase) of the 1940's. Making up in steadfastness for what he lacked in flamboyance, he merely did his workaday thing month after obdurate month. In the new movie he reclaims a bit of the Nazi-slugging romance (Red Skull always was the villain par excellence, implacable and inscrutable without being ridiculous, compared to which Darth Vader was a clown).
Needless to say, Cap also embodies American exceptionalism as well as the absolute injunction to act morally. As Boghossian compellingly argues in his piece, if one wishes to avoid believing in nothing, it is logically necessary to believe in something. For the non-psychopath there is no evading moral dialogue (or in the case of comic book films, moral combat).
Paul Boghossian, "The Maze of Moral Relativism"
I never thought I'd see a decent Captain America film in my lifetime, but this time Marvel has managed brio without ponderousness. When I was into comics in the 1980's, Cap was, it must be said, my favorite. While I enjoyed a number of titles, he eschewed the smart-alecky goofiness of Spider-Man, the self-involvement of the X-Men, and the contrived contortions of the Fantastic Four; sober but spirited, he was neither the hipster Batman nor the staid Superman (that George Washington of superheroes).
In the 1980's, shrouded by the forgetfulness of his reading public, Captain America bore little resemblance to the "old-growth superhero" (in A. O. Scott's memorable phrase) of the 1940's. Making up in steadfastness for what he lacked in flamboyance, he merely did his workaday thing month after obdurate month. In the new movie he reclaims a bit of the Nazi-slugging romance (Red Skull always was the villain par excellence, implacable and inscrutable without being ridiculous, compared to which Darth Vader was a clown).
Needless to say, Cap also embodies American exceptionalism as well as the absolute injunction to act morally. As Boghossian compellingly argues in his piece, if one wishes to avoid believing in nothing, it is logically necessary to believe in something. For the non-psychopath there is no evading moral dialogue (or in the case of comic book films, moral combat).
Sunday, July 24, 2011
Eccentrics
Leon Wieseltier at The New Republic is the rare prophet with subtlety, arguing with great ingenuity but always in opposition, whether to thoughtlessness, smug certitude, or superficial sociability. He is the rare intellectual insider who dares to be deeply and skeptically unfashionable; as such, he steers a tight course between the curmudgeonly, the lugubrious, and the devastating.
In his most recent piece (not available online except to TNR subscribers), he uses the metaphor of birds that sing at night (because they can't get a tune in edgewise in the growing cacophony of the urban day) to lament his growing disconnection from the insulted and humiliated of the world:
"Not long ago I surprised myself with the embarrassing thought that I no longer know any lonely people...But I am cut off from the ones who are cut off, from the disconnected and the un-networked (our technology of communications is supposed to have made such marginalizations obsolete, but I do not believe it: our culture is filling up with evidence of the lonely digital crowd), the ones who lead lives of radical solitariness, of aloneness without appeal, with no bonds to console them and no prospects to divert them, who struggle for stimulation and expression, whose beds are deserts, whose phones almost never ring, who march through their difficulties without any expectation of serendipity or transcendence. Their absence from my experience makes me feel disgracefully narrow."
This is a brave admission, and an acknowledgement by Wieseltier that he is, despite himself, one of the elite. But as one who gets to know many such people (as many physicians and most social workers do), I see a risk in extolling the lives of the disaffected and alienated. There seems to be romanticizing here, as of the overlooked poet scribbling in his garret, the anchorite glorying in his desert cave, or the oppressed dissident in the labor camp. Wieseltier seems to be claiming the inherent dignity of suffering, and while there is that, does this mean we should be any less assiduous in our struggle to alleviate distress? Suffering has the potential to lead to wisdom, but arguably in actuality it most commonly does not.
The prophet (whether secular or religious) is always positioned somewhere between the eccentric and the crank. The eccentric lingers "away from the center" of human experience, but can still engage in dialogue with a significant part of his fellows, whereas the crank has been cut off, as when a man goes into the desert for transcendence but never makes it back to relate the tale.
Monday, July 18, 2011
The Elegiac Mode
For W. G. Sebald, the modern world, a composite of contemporary detritus and forlorn nature, is a kind of forme fruste of the historical human condition. Sebald was a past-intoxicated writer, and in The Rings of Saturn a ramble through eastern England yields disquisitions on Joseph Conrad, herring fisheries, imperial decline, and the silkworm industry. The entropy is inescapable. Here are a few choice quotes:
(On fishermen): "I do not believe that these men sit by the sea all day and all night so as not to miss the time when the whiting pass, the flounder rise or the cod come in to the shallower waters, as they claim. They just want to be in a place where they have the world behind them, and before them nothing but emptiness."
(On the writer Michael Hamburger): "Perhaps we all lose our sense of reality to the precise degree to which we are engrossed in our own work, and perhaps that is why we see in the increasing complexity of our mental constructs a means for greater understanding, even while intuitively we know that we shall never be able to fathom the imponderables that govern our course through life."
(On Thomas Abrams, who devoted his life to a minute reconstruction of the Temple of Jerusalem): "In the final analysis, our entire work is based on nothing but ideas, ideas which change over the years and which time and again cause one to tear down what one had thought to be finished, and begin again from scratch."
(On the melancholy of medieval weavers): "It is difficult to imagine the depths of despair into which those can be driven who, even after the end of the working day, are engrossed in their intricate designs and who are pursued, into their dreams, by the feeling that they have got hold of the wrong thread."
(On the destructiveness of civilization): "Like our bodies and like our desires, the machines we have devised are possessed of a heart which is slowly reduced to embers. From the earliest times, human civilization has been no more than a strange luminescence growing more intense by the hour, of which no one can say when it will begin to wane and when it will fade away."
If, as many say, we now live in the Anthropocene era, in which the activities of Homo sapiens directly affect planet-wide processes, why can't we regard humanity with kindness, as we might regard any natural force? Just as a levee is meant to withstand the flood, one's mourning, indignation, and even resentment are meant to withstand and, if possible, to divert the human flood from that which one holds dear.
(On fishermen): "I do not believe that these men sit by the sea all day and all night so as not to miss the time when the whiting pass, the flounder rise or the cod come in to the shallower waters, as they claim. They just want to be in a place where they have the world behind them, and before them nothing but emptiness."
(On the writer Michael Hamburger): "Perhaps we all lose our sense of reality to the precise degree to which we are engrossed in our own work, and perhaps that is why we see in the increasing complexity of our mental constructs a means for greater understanding, even while intuitively we know that we shall never be able to fathom the imponderables that govern our course through life."
(On Thomas Abrams, who devoted his life to a minute reconstruction of the Temple of Jerusalem): "In the final analysis, our entire work is based on nothing but ideas, ideas which change over the years and which time and again cause one to tear down what one had thought to be finished, and begin again from scratch."
(On the melancholy of medieval weavers): "It is difficult to imagine the depths of despair into which those can be driven who, even after the end of the working day, are engrossed in their intricate designs and who are pursued, into their dreams, by the feeling that they have got hold of the wrong thread."
(On the destructiveness of civilization): "Like our bodies and like our desires, the machines we have devised are possessed of a heart which is slowly reduced to embers. From the earliest times, human civilization has been no more than a strange luminescence growing more intense by the hour, of which no one can say when it will begin to wane and when it will fade away."
If as many say, we now live in the Anthropocene era, in which the activities of Homo sapiens directly affect planet-wide processes, why can't we regard humanity with kindness, as we might regard any natural force? Just as a levee is meant to withstand the flood, one's mourning, indignation, and even resentment are meant to withstand and, if possible, to divert the human flood from that which one holds dear.
Tuesday, June 28, 2011
That (Self-) Blaming Feeling
"Why, worthy thane,
You do unbend your noble strength to think
So brainsickly of things. Go get some water,
And wash this filthy witness from your hand."
Lady Macbeth
In a compelling argument for the congruence of brain and mind, and the ethics that ought to follow therefrom, David Eagleman maintains that blame derives from a misguided and outmoded belief in free will. He claims that blame is basically backward-looking, implying that one could and should have done differently (than one just did). But when we face up to monism and the brain-as-mechanism, we realize that, after the fact, there is nothing to do but acknowledge that given the conditions that prevailed at any past time, one could not in fact have acted differently.
Eagleman argues that shame and blame are not, in fact, very good at modifying behavior, and what we need is a more rational and forward-looking attempt to achieve desired outcomes, in ourselves and others. A la B. F. Skinner, he proposes that we approach brains as we would approach engines or computers that are on the blink. Both sticks and carrots may be necessary to shape desired behaviors, but they should be undertaken in a dispassionate way, free of messy or reckless vindictiveness.
There is nothing inherently objectionable about his advocacy of what he calls "the frontal workout," that is, an updated biofeedback project whereby one might learn (or teach) better control over impulses. But he might have said more about the phenomenology of guilt and blame, which are, after all, very deep aspects of human experience. These are very distinct and familiar subjective phenomena, and arguably they are far from arbitrary or nonsensical.
Blame is the social group's means of imposing its norms, and blame works most effectively when it is internalized as guilt and shame. Blame is a deterrent, plain and simple. And as is so often the case, it works best when it is involuntary (when blame is reflected on too carefully, one arrives at Hamlet). This is not to say that shame and blame are generally good things, merely that they are natural (and many perfectly natural human behaviors are odious). However, even in Eagleman's handyman-of-the-brain world, some impetus and motivation for change must exist, and I don't know where that motive would come from if not from those primeval emotions of guilt and shame. They simply exist in both healthy and pathological forms. Guilt and shame may seem to be primarily about the past, but really they project forward into the future; like pain, they are the brain's message to itself: That didn't go well, so try something different. Blame and guilt are modes of moral (self-)argument.
And in a follow-up to the recent post about reading, the literati are a bit atwitter about Philip Roth's declaration that he has stopped reading fiction. In a Salon article Laura Miller speculates that inasmuch as fiction provides insight into character and human subjectivity, perhaps some do reach a point at which they have all the insight they need. After all, the novel isn't called the novel for nothing, and some readers do believe there is nothing new under the sun. But then again, one could paraphrase Samuel Johnson and say that "He who is tired of fiction is tired of life."
Sunday, June 26, 2011
The Tree of Life
Solitude from mere outward condition of existence becomes very swiftly a state of soul in which the affectations of irony and skepticism have no place...After three days of waiting for the sight of some human face, Decoud caught himself entertaining a doubt of his own individuality. It had emerged into the world of cloud and water, of natural forces and forms of nature. In our activity alone we find the sustaining illusion of an independent existence as against the whole scheme of things of which we form a helpless part.
Joseph Conrad, Nostromo
Terrence Malick's The Tree of Life consistently defies expectations of coherent narrative, instead implanting myriad images implacably in the mind. One could be haunted by this film. As David Thomson wrote in his review in The New Republic: "Less than a framework of story, we have a situation, and this is itself not just fair, but an enlightening novelty. Most of us do not feel that we are living stories (at least not until later); we believe we are getting on with a situation."
As the movie's epigraph from Job suggests, the situation is one of inevitable suffering and loss, albeit experienced in a perpetual haze of existential glory. The tone of the work is continually exalted, which probably accounts for its controversial and varied reception. For those predisposed to its message, irony is silenced; the sacred is always a puzzle to the intelligentsia.
The situation in The Tree of Life is, most mundanely, that of a family in 1950's Texas, but really Malick is concerned with the situation of human life and its vexed relation to life, broadly considered. Much has been made, both derisively and respectfully, of Malick's depiction of the history of the universe and the pre-human earth (dinosaurs even!), but I'm not sure why. Narratively, this is merely the use of a very wide-angle lens, and a salutary use at that--there is more to heaven and earth than is dreamt of in Manhattan. Indeed, a few aerial shots of early hominids would not have been out of place. Psychologically, the "family romance" may seem endlessly interesting, but neither man nor woman lives by interpersonal relationships alone. There is that which preceded us and that which will outlast us.
In the first few minutes of the film, as we get our first impressionistic views of the O'Brien family, a female voice-over poses the contrast of nature and grace, asserting that the way of nature is domination and self-indulgence, whereas the way of grace is care and endurance. Much of the film unforgettably documents the necessity of nature--deep space, inscrutable water, arboreal visions, scathing light, barren rock, towering glass and steel--but the realm of grace is uniquely human. Consciousness is dualistic not in substance (body and soul, brain and mind) but in moral experience, in what we have no option but to choose.
Only human beings, in all of life that we know of, can fail the test of grace, and we see the risk and stakes of such failure in the boy, Jack, of 1950's Waco and the contemporary man, Jack (a ravaged Sean Penn). Violence and predation antedated humanity by many millions of years, but only with the first glimmer of consciousness did the storyline of Cain and Abel come into the world. We see it in the boy Jack's sullen resentment of his father and his acts of petty boyhood mayhem (breaking windows, mistreating frogs, stealing lingerie). Similarly, only humanity is prey to despair, of which contemporary Jack appears to be a classic example, suffering Kierkegaard's "sickness unto death."
Some reviews I've read seemed to infer that the culminating beach scene was some kind of Rapture-like representation of the end of the world, but to me it seemed a symbolic depiction of redemption, as Jack somehow breaks through his granitic alienation. The idea and the ideal of the sacred presume that amid seemingly endless tawdriness or trauma there are still spaces and times of grace if we can only find them.
Saturday, June 25, 2011
Drill Imagination Right Through Necessity
Play
Nothing's going to become of anyone
except death:
therefore: it's okay
to yearn
too high:
the grave accommodates
swell rambunctiousness &
ruin's not
compromised by magnificence:
that cut-off point
liberates us to the
common disaster: so
pick a perch--
apple branch for example in bloom--
tune up
and
drill imagination right through necessity:
it's all right:
it's been taken care of:
is allowed, considering
A. R. Ammons
Sunday, June 19, 2011
Missing
"How can one transmit to others the infinite Aleph, which my timorous memory can scarcely contain?"
Jorge Luis Borges
Some time ago Linda Holmes at NPR wrote a wonderful piece observing that, by virtue of sheer plenitude of space and time, each of us is destined to miss out on the vast majority of whatever it is we love in life. Far from being a downer, it is comforting and even self-transcending to realize that no matter how assiduous or dynamic one may be, there are just more people to meet, books to read, films to see, or sunsets to witness than any one life can manage. It is a reminder that even if, as the cliche goes, the world is much shrunken owing to the speed of travel and communication, one can divide infinity many times over and still be left with infinity. To live a lifetime is to gaze upon an ocean of experience, yet be allowed to dip one's hand in the water only once.
One consequence of having a large "physical" library (as opposed to having a Kindle sitting unobtrusively on the table) is that the many hundreds of tomes mutely gaze outward, as if in reproach of my all-too-human forgetfulness. My eight-year-old has asked before, "Daddy, why do you have all of these books if you can get them all on the computer?" One reason is that my recall isn't what it once was, and my library is one kind of living personal record. Many volumes I do dip into now and again--a poem here, an essay or short story there--but how many, realistically, will I live to reread altogether?
For some 20 years--roughly, from 15 to 35--I was a prolific reader, of all genres, but particularly fiction. You know: the canon, the great books (and many that were not-so-great). While I still read, of course, typical life circumstances have much reduced the time available for it. Whether by coincidence or not, I find myself less patient with fiction, and more given to non-fiction, than used to be the case, but I continue to fight that. Proust I feel sure I will live to reread, all 3000 pages. But the 1000 pages of Les Miserables? Probably not. Much of Dickens I hope to reread, but probably not Barnaby Rudge. Recently I read Harold Bloom claiming that rereading Samuel Richardson's Clarissa was a great priority. Really? I've never read Richardson even once. Do I need to read him before expending time on rereading Jonathan Swift? And should I do that before, or after, I brush up on American history?
Inasmuch as there is nothing outside of reality, fiction is merely a peculiar branch of non-fiction, reality's myriad conscious self-reflections. Per Stendhal, a novel is a mirror carried along a main road, but it is a puzzling kind of mirror, with surprising concavities and convexities. Fiction seeks reflections that reverberate and recreate reality in microcosm, a la Borges's Aleph. A successful work of art achieves a unity that symbolically reproduces the completeness of reality. Non-fiction is always a magnifying glass, if not a microscope--clarity is purchased at the expense of breadth. Fiction is a necessarily distorting mirror, since any simple mirror or magnifying glass capable of capturing everything we care about would have to be as large as the universe itself.
The stakes are high in the arts--the potential payoff is high, but when fiction seriously fails, it is upsetting, because it is as if reality itself is being mocked or even maimed. Bad non-fiction is like a lie, which is bad enough, but bad fiction is like blasphemy. I forever vacillate between Plato--who saw the arts as begetting deceptive images (among the myriad shadows on the wall of the cave), distractions from the pursuit of truth--and Aristotle, who argued that poetry at its best reveals necessary truths, while history merely documents contingencies. Perhaps it is just a matter of epistemological and existential focus, the iris of the inquisitive mind.
Wednesday, June 15, 2011
Is Psychiatry Like Acupuncture?
As I've discussed a few times here, this is the worst of times for antidepressants and other psychiatric medications; given their questionable efficacy and likely side effects, their popular esteem is at a low ebb. This makes them...a great deal like various alternative medicine treatments that remain highly popular and widely used (and paid for) despite the disdain of evidence-based medical critics.
In The Atlantic David H. Freedman discusses the persistent popularity of alternative medicine and its unlikely cohabitation with conventional research even at the Mayo Clinic and other hallowed institutions. He points out that while medicine made its reputation in the first half of the twentieth century with the significant (if not complete) conquest of infectious disease, its efforts to extend its domain to the kinds of chronic diseases that plague us today (diabetes, heart disease, cancer, Alzheimer's disease) have been frankly disappointing. What, exactly, has medicine done for us lately?
Psychotherapy and psychiatric medication have been targets of critical and cultural derision on the part of many for decades, yet millions of patients seem to derive some kind of healing experience from the pill or the couch, as the case (and the personal inclination) may be. The same could be said of the masses flocking to chiropractors, homeopaths, and, yes, acupuncturists in defiance of the conventional medical wisdom. We spend years in medical school learning about physiology, when, practically speaking, healing arguably has more to do with constructing a healing ritual than with one's board scores. The "chemical imbalance," absurdly oversimplified though we hold it to be, may be like the acupuncturist's "lines of force," a necessary if fictional semantic scaffold on which to mount a clinical encounter. The shaman lives!
Monday, June 6, 2011
History of a Suicide
"I perceive I have not really understood any thing, not a single object, and
that no man ever can,
Nature here in sight of the sea taking advantage of me to dart upon me and
sting me,
Because I have dared to open my mouth to sing at all."
Walt Whitman
In puzzling over an unexpected suicide (and how many suicides are not, at some level, surprises?), we often ask empirical questions, as a detective might. How did this come about? Who or what is the primary culprit? But arguably the challenges suicide poses are chiefly existential and interpersonal, not factual. That is, the suicide, in rejecting life itself, dissents from values that we hold very dear.
And the question of "How could we not have known?" is more relational than epistemological. That is, suicide reminds us of the perturbing basic inscrutability of human relationships. If we do not know something so basic as whether someone is suicidal, what do we really know about them? That's why psychotherapeutic relationships can be the most intimate of all--not obviously in a physical sense, but in an existential one. The therapist often hears things that no one else in a person's life hears.
I just finished Jill Bialosky's History of a Suicide, which considers the suicide of her younger sister Kim some twenty years ago at the age of 21. It is a worthwhile and reflective addition to the suicide memoir shelf, but Bialosky is, like many, preoccupied with questions of causation. The problem is that completed suicide is complex and rare (relative to the numbers of the depressed); why would we expect suicide to be any more fathomable or predictable than other atypical behaviors, such as murder or sudden religious conversion? If we had the technology or insight to predict individual suicides, what other behaviors might we be able to foretell?
Bialosky seeks out a suicide specialist who tellingly conducts a "psychological autopsy," as if we can answer the dilemma of suicide using the tools of pathology. Unsurprisingly, a number of potential contributing factors come to light: a family history of mental illness and even suicide, a father who abandoned the family and ignored or rejected Kim, a depressed and withdrawn mother, an abusive boyfriend, and alcohol and drugs. This list is noteworthy for its obviousness and for the fact that every one of these things is objectionable in its own right even apart from any possible relation to suicide. The things we might do to reduce suicide risk--maintain family integrity, shore up communities, limit drug use, and increase awareness and treatment of depression--are things that we ought to be doing anyway. These influences ultimately tell us nothing, because we do not know which is necessary or sufficient.
The other thing that suicide teaches is how little we sometimes know of ourselves. It appears that a certain fraction of suicides, at least the final determination to act, are impulsive. If we could interview completed suicides after the fact, I suspect that a significant number would express surprise, if not dismay, that they actually went through with it.
Sunday, June 5, 2011
Who Needs Psychiatrists?
I have seen a medicine
That's able to breathe life into a stone,
Quicken a rock, and make you dance canary
With spritely fire and motion, whose simple touch
Is powerful to araise King Pippen, nay,
To give great Charlemain a pen in's hand
And write to her a love-line.
All's Well that Ends Well
The criticisms of contemporary psychiatry are coming fast and furious now, and not just from the fringe any more. Cheryl Fuller at Jung at Heart refers to a review by Marcia Angell of three recent anti-psychiatry volumes (of which I have read Daniel Carlat's Unhinged and Robert Whitaker's Anatomy of an Epidemic, but not Irving Kirsch's The Emperor's New Drugs). And while it's not specifically about psychiatry, an American Scholar article by Harriet Washington documents the discouraging corruption of medical research and publishing by so-called Big Pharma.
The mounting charges are of the most serious kind, and warrant a full-on response from the profession (which this blog post does not aspire to be). To very briefly summarize, the basic effectiveness of antidepressant drugs (and to greater or lesser extents, all psychiatric medications) is increasingly dubious as the integrity of research purportedly showing their efficacy is called into question. Critics maintain that for decades (antidepressants came into general use in the 1960's), thousands of psychiatrists (and of course other physicians as well) and millions of patients have prescribed and taken non-therapeutic compounds based on an underestimation of the placebo effect.
As for neurobiology, critics point out, correctly, that there is no evidence for any specific "chemical imbalance" that antidepressants allegedly alleviate. However, this is not the crux of the issue, for other central nervous system agents (e.g. anticonvulsants and anesthetics) have mechanisms of action that remain somewhat mysterious. And depression is in fact correlated with specific neurobiological states, but only because every psychological state--falling in love, undergoing religious conversion--can only be based in the brain. The question is not whether any given psychological phenomenon has a biological correlate (of course it does); the question is whether said phenomenon is best understood and potentially modified in chemical as opposed to other (psychological, interpersonal, social) terms.
It is one thing to claim that antidepressants are overblown and oversold; it is quite another, of course, to claim that they are useless or even pernicious. For instance, Robert Whitaker's arguments can lead only to the conclusion that antidepressant drugs should be expunged from the earth, and that psychiatrists are either unwitting or cynical quacks for prescribing them. And of course, as psychologists and social workers have taken over much of the psychotherapy territory that used to belong to psychiatry, the profession's identity has been ever more given over to psychopharmacology. After all, Freud didn't think psychoanalysts needed to be physicians, and there is no evidence that psychiatrists make better therapists than those with other degrees, so absent real results from biological treatment, why does psychiatry exist, exactly, beyond a function as a research program?
As someone who has, regrettably, long recognized the limitations of existing drugs but who still prescribes them, what do I believe? And can what I believe be remotely legitimate inasmuch as my current livelihood (by no means opulent in doctorate-level terms, but reasonable) depends on these medications having a role? Intellectual honesty demands that if one has a pressing self-interest in believing something, one should subject that belief to fierce and insistent criticism. There is no sin greater than tendentiousness.
This discussion derives from the valorization of the randomized, placebo-controlled trial as the ultimate arbiter of medical outcome, very much at the expense of individual clinical judgment. After all, many hold that clinical judgment is subjective and idiosyncratic, and therefore open to bias and not to be trusted. If all that needs to be known about medications can be inferred from statistical trials, then anyone (such as Whitaker, a journalist) can know more about them than a physician. Indeed, on this view only the non-physician can accurately appraise medical treatments because his view is not warped by self-interest. And yet there is considerable question as to whether patients (or "patients") in rigidly controlled research studies are truly representative of real-world clinical encounters.
What, then, do I believe? I believe, with the Buddhists, that life is suffering (but not only that); the long history of humanity is one of untold miseries of anxiety and depression that were either merely endured (there being no other choice) or compensated for by relationships, religion, art, or alcohol. Like the agonies of even routine childbirth or the ravages of even typical old age, mental disorders have always been part of the human condition; only relatively recently have we tried to modify them. One can make an argument that all of these things should, again, be merely endured, but I don't think history has a rewind button. Yet the expectations regarding mood and anxiety have exceeded all bounds, as has the expectation that one has some right to reach ninety with sound mind and body.
I believe that existing drugs do not counteract specific or discrete physiological processes, but (like psychotherapy) are nonspecific mental balms. SSRI's and benzodiazepines are to mental distress as NSAID's and opiates are to physical distress, that is, they are often disappointing and attended by sometimes dismaying side effects, but millions of patients have found them of some use. I believe that in a modest way they reduce suffering, by no means always or even often, but on average. I believe this on the basis not of research studies, but of my clinical experience and that of many others. And the day I stop believing that is the day I will stop prescribing.
Wednesday, June 1, 2011
Who Needs Narrative?
Arguing for the psychological uses of narrative, Bill Benzon at The Valve distinguishes the "autobiographical self" (i.e. identity over time) from the "core self" (i.e. one's integrated psycho-physiological state at any given time). He claims that the "core self"--influenced as it is by intense situational and physical factors (he uses hunger and sexual desire as examples), and transient by nature--threatens to disrupt the autobiographical self. He suggests that narrative (he specifically mentions "play-acting" and "storytelling") usefully provides an overarching frame within which to understand and evaluate our dispositions and behaviors over time.
The account leaves out a lot of course (for instance, it would seem that temperament straddles both kinds of self). And his case seems a bit extreme--as if even a starving man would look back on his life as having been little more than an ultimately unsuccessful quest for food--but there may be something to it. After all, someone in a deep depression may view much of his past "through a glass darkly" in a way that lightens considerably when the episode relents. And obviously the two selves affect each other reciprocally and continuously.
Staying with Benzon's schema, it would seem that psychological distress occurs in two varieties. Unhappiness is a malady of the autobiographical self, a dismayed sense that one's story has somehow gone awry through vicissitudes of sensibility or circumstance. One seeks in a therapist a kind of narrative catalyst that will open up unimagined possibilities, including the often profound possibility of actually being listened to and perhaps even understood. Dysfunction of the core self manifests as symptoms that may actively impede functioning. There is considerable overlap between the two, but arguably we resort to psychotropic medication inasmuch as symptoms appear to be beyond the power of narrative to reframe. But nothing is more frustrating than to try to treat unhappiness with meds or to tackle narrative-resistant symptoms with more narrative. Diagnostic confusions and controversies arise from the difficulty of distinguishing symptoms from unhappiness.
It occurs to me that like certain other phenomena such as religion and even music, narrative broadly considered (that is, interest in all stories whether contained in books, film, gossip, or hearsay) is hard to explain because it is very widespread but not truly universal. Some faculties, such as hunger and thirst, are obviously ubiquitous because their absence is not compatible with life. Others, such as basic senses or sexuality, are not imperative for individual life but are so typical of the species that their absence is uncontroversially deemed pathological.
Inasmuch as existence is necessarily temporal, some interest in narrative is presupposed, even if only speculation as to where the next meal will come from. But sophisticated narrative--that is, at least at the level of communal folk tales--has, like religion, been found to exist in virtually every human society. And yet just as there is a reliable minority of individuals who are irreligious, there are of course people here and there who are relatively free of the narrative bug, who may be more invested in other domains of experience (facts, ideas, bodily experience, etc.). If religion and narrative truly are central to (individual and species) human identity, then how is it that even a (non-pathological) small minority more or less escape their purview? Perhaps diversity itself is such a powerful evolutionary engine that it constantly throws out alternatives to the prevailing cultural trajectory, suggesting of course that those faculties we view as indispensable are actually contingent.
Monday, May 30, 2011
Human Experience
"(We) occupy landscapes of values--worlds made up not of quantum lattice structures, but of opportunities and obstacles, affordances and hindrances."
Alva Noe
The full 13.7 post is here and worth reading.
I think that this, the sensation of swimming in a sea of significance(s), whether noxious or gratifying, is a major reason I wound up a psychiatrist. It is no wonder that our species yields paranoids and creates deities to worship. We do not generally perceive the universe as what it would be without us--that is, a constellation of infinite facts--but rather as a shifting drama of desire and revulsion, of affirmation and repudiation.
Tuesday, May 24, 2011
You (Don't) Say It's Your Birthday
Happy birthday, Bob.
Alex Ross posted some favorite lines.
There is something about Dylan--the musician, the poet, the cryptic cultural figure--that is gratuitously compelling. If grace were to exist, it would feel something like listening to Dylan, to that "thin wild mercury sound" of 45 years ago.
It seems miraculous that he is still alive, both literally and figuratively.
Favorite line? At random:
"Well, the comic book and me, just us, we caught the bus"