Ars Psychiatrica: The Arts of Psychiatry...Psychiatry of the Arts (Novalis)

Moving Day (2012-08-25)
I have decided to embark on a new blog, this time on Typepad--it can be found <a href="http://narrativepsychiatry.typepad.com/">here</a>.

The Thousand Natural Shocks (2012-01-21)
<div> John Gray <a href="http://www.prospectmagazine.co.uk/2011/12/freud-the-last-great-enlightenment-thinker/">argues</a> that Freud is out of fashion these days owing to his basically tragic view of human nature, according to which we are fundamentally conflicted creatures condemned to interpersonal (and intrapersonal) struggle. The point, to Freud, was to learn to live productively with that state of affairs--no "chicken soup for the soul" here! Full social and personal harmony and ultimate existential consolation are ideals we cannot achieve, so our best and only redemption is to learn to do without them.</div><div> </div><div> </div><div>Personally I happen to find this aspect of Freud appealing, much more so at any rate than his overweening dogmatism or his far-fetched psychosexual speculation. But if Freud was in fact a modern Stoic, his decline in influence and popularity merely reflects the fact that Stoicism as a way of life has never been a mainstream ethos, at least in Western civilization. We'll be waiting a long time to hear a presidential candidate declare that his favorite philosopher is Epictetus. 
Whether wisely or not, human nature seems to crave more than what some of the more dour tenets of psychoanalysis can provide.</div><div> </div><div> </div><div>In a similar vein, Daniel Smith in the Times wonders (having trouble with the hyperlink function, sorry) at the persistently high level of anxiety in Western culture, which objectively speaking is one of the more successful civilizations in history. Even the poor in the United States enjoy levels of material comfort unimagined by all but the most wealthy in most past eras. Whereas we experience "stress," countless past generations endured the miseries of labor, climate, poverty, and the random death of children from infectious disease. This goes to show (unless we want to assume that our forebears experienced their lives as appalling affliction) that beyond a visceral baseline, suffering is never objective or absolute, but rather relative to our expectations, both for ourselves and in comparison with others. </div>

Nature and Conservatism (2012-01-18)
In the <em>Times</em> Richard Friedman, M.D. <a href="http://www.nytimes.com/2012/01/17/health/depression-defies-rush-to-find-evolutionary-upside.html?_r=1&ref=health">questions</a> the widely debated evolutionary origins and/or advantages of depression. While happiness may not have been selected for survival advantage over the eons (emotional hypersensitivity, paranoia, and compulsivity have their uses in certain environments), he reminds us of the naturalistic fallacy, that is, we shalt not derive an ought from an is. 
We do not hesitate to decry genocide, bacterial infection, or "nature red in tooth and claw" even though such phenomena are eminently natural.<br /><br />Theoretically there is nothing, not even cheesecake or YouTube, outside of nature (there is only one reality after all), but practically human beings have always distinguished between realms of culture (that which we believe we have some power to modify) and nature (about which, like the weather, we can ultimately only talk and not do anything). And one doesn't have to be a tree-hugger to acknowledge some sublimity of nature as the realm from which we came and which remains ultimately beyond us. Insofar as nature has accommodated the evolution of human beings over a million years (and of life in general over several billion years), it constitutes a kind of metaphysical cradle that we do well to rock only gently. It is a comfort to know that countless galaxies are beyond the capacity of humanity to despoil. Confronted with nature's nearly infinite array of figurative knobs and levers, we eagerly push this or switch that, but it still remains quite possible that human civilization will drive life on earth into an ecological ditch over the next millennium. The birth of consciousness may turn out to have been a tragedy for the biosphere--or not.<br /> <br />And yet one does commit the naturalistic fallacy every day, every moment, as life itself is the fundamental is from which we derive the ought. Nietzsche's ideal of the "eternal recurrence," the willingness to live one's life over again, in every inevitable detail and infinitely many times, is the absolute expression of the naturalistic fallacy. Some fallacy. 
If the ought has no connection to the is, where else could it come from?

In The Beginning (2012-01-11)
As I read about the <a href="http://www.3quarksdaily.com/3quarksdaily/2012/01/the-accidental-universe-sciences-crisis-of-faith.html">physicists' dismay</a> at the possibility of our "accidental universe" amid myriad possible universes, I am puzzled over what we expect to find, ultimately, in so-called fundamental particles or laws. After all, what physical law could be so fundamental as to entail the existence of something rather than nothing?<br /><br />The human mind has two explanatory needs, one for cause-and-effect and the other for narrative meaning, but it seems to me that both cannot be satisfied at the same time. Science does a marvelous job of explaining the behavior of matter within the range of conceivable human experience, but as we pursue cause-and-effect into the remoteness of time and abstraction, science leads to infinite regress. At a certain point, neither the Big Bang nor the infinite multiverse suffices as explanation; one can only say that there is something rather than nothing and that is that. We don't know why.<br /><br />Narrative accounts, on the other hand, may gratify the basic emotional need for explanation, but then science goes out the window. 
There is something rather than nothing because God is in all places and all times--on this view a warm glow of necessity takes the place of the implacably arbitrary.<br /><br />We have evolved as both calculating and valuing creatures, but these local faculties, while estimable in the human milieu, bear diminishing power into the deeps of space and time.

The Shaman Speaks (2011-09-12)
"All comes by the body, only health puts you rapport with the universe."<br /><br />Walt Whitman, from "By Blue Ontario's Shore"<br /><br /><br />This quote--which could serve as a credo for integrative medicine--is the kind of thing that occasionally reminds me why I went into psychiatry. Beyond the often questionable DSM diagnoses, the vagaries of therapy, and the imperfect biological treatments, what we are after is a state of attunement and acceptance in which a biological being achieves transcendence of the merely physical without, necessarily, any recourse to the supernatural.<br /><br />It is not the work of poetry to answer all our questions, of course, and one can legitimately wonder what sorts of subjective states, interpersonal relationships, and achievements of meaning must come together to constitute "rapport." And even if we say that health is merely the absence of disease (or disorder), if only to trim the ambitions of restless and overweening doctors (and their many accomplices and handmaidens in the behemoth that is the health care industry), it is nonetheless true that it is typical of consciousness to aspire to something more than just the absence of suffering. 
Perhaps poets pick up about where physicians trail off.

Mental Illness Is Whatever We Say It Is (2011-09-07)
"Psychology, which explains everything<br />explains nothing,<br />and we are still in doubt."<br /><br />Marianne Moore, from "Marriage"<br /><br /><br />By "we" I don't mean we <span style="font-style: italic;">psychiatrists</span>, at least not primarily, but rather "we the people." <span style="font-style: italic;">Caseness</span>, or the determination of what counts as a mental disorder and what doesn't, is not something we go out and discover in nature; rather, it is a social category arrived at both explicitly and implicitly through cultural debate. The psychiatric profession obviously has opinions about caseness, but these do not go unanswered or unlimited by society at large.<br /><br />In large part, antipsychiatry critique has been aimed at the extent of psychiatric diagnosis, both the number of diagnoses themselves (larger in every succeeding edition of the DSM, we are reminded) and of course the number of people given those diagnoses. Suddenly it seems as if every other kid has ADHD and/or autism. Several psych blogs recently cited a <a href="http://psychcentral.com/news/2011/09/06/study-finds-nearly-2-in-5-europeans-suffer-from-mental-disorders/29177.html">survey</a> claiming that 38% of a European sample suffers a mental disorder <span style="font-style: italic;">in a given year</span>. This included substance abuse and dementia, but it nonetheless seems like a high number (the 5- or 10-year prevalence would be significantly higher).<br /><br />I think that 38% seems like a high number for reasons both illegitimate and legitimate. 
Even now there is a tendency, more latent in some than others, to view those with mental disorders as <span style="font-style: italic;">the mad</span>, an appalling but surely small minority safely stowed away in institutions. The notion that "the mentally ill" walk the streets and even have jobs and families like you and me remains foreign to some. But there is also the real concern that the<span style="font-style: italic;"> sick role,</span> a transaction that officially relieves the patient of at least some social responsibility, loses its meaning when used too widely. In that respect, there is too little appreciation of the great variation in severity of mental disorders; just as one may go to an internist for a touch of gastritis or for cancer, a technical psychiatric diagnosis may or may not involve significant disability or the use of the sick role.<br /><br />Whether medical or psychiatric, diagnosis when applied liberally enough approaches the condition of <span style="font-style: italic;">enhancement</span>. For Freudians neurosis was an inescapable condition of humanity, so at certain times and places (and with sufficient economic resources) to be in analysis did not mark one as "sick" so much as self-aware and ambitious. Similarly, in those older than 85, significant dementia is closer to the rule than to the exception, so statistically speaking the effective treatment (which we don't yet have) of dementia in the very old would in fact qualify as enhancement. And for modern medicine, mortality itself has virtually become a disease (which, as the Onion occasionally reminds us, retains its 100% prevalence despite our best efforts). When we seriously discuss mental disorders having a prevalence greater than 50%, we start to consider syndromes that are, <span style="font-style: italic;">in toto</span>, to be expected of the human condition, at least at this place and time.<br /><br />Enhancement may well be justified, depending on the circumstances. 
The question is always: is treating any given phenomenon <span style="font-style: italic;">clinically</span> (that is, as a syndrome worthy of specific medical intervention) likely to be helpful (that is, to lead to better functional outcomes, in the case of those problems for which we really do have treatments, or to better understanding of ourselves and others, in the case of those problems that remain intractable)? Or would it be better to consider the issue as a social/moral/cultural/existential difficulty? That is really the question, and not one that neuroscience can shed any light on whatsoever. Biologically, all human capacities appear to exist on dimensional continua, and the point at which we designate "pathology" or "caseness" is a social and interpretive outcome.

The Religion of the Good, Part 2 (2011-09-06)
A recent <span style="font-style: italic;">New Yorker</span> profile of the philosopher Derek Parfit mentioned that the late Bernard Williams once dismissed the ideal of a universally compelling moral code as (I paraphrase) "something you use on the men who come to take you away." Indeed, implied in the "problem of evil" is the conviction (or fantasy perhaps) that if we could only find the right combination for the great moral mystery vault, the ponderous door of error would swing open, releasing a radiance that would burn away the scales from the eyes of the benighted.<br /><br />I imagine that some religious believers have a similar feeling that if they could only depict or praise God rightly, his existence and glory would be as plain to everyone else as they are to them. The holy grail of thought is the proposition (or grand scheme of propositions) that is as self-evident as 2 + 2 = 4 but as transcendent and as life-changing as the existence of God. 
That is the constructed idea(l) that we imagine would stop the bad men in their tracks and bring them to their knees. If God does not exist, then it will be necessary to invent (it)--this is the project that is at least implicit in non-relativistic philosophy. As Wallace Stevens wrote, "One day they will get it right at the Sorbonne."<br /><br />I once read a review by Helen Vendler in which she claimed that the role of the critic is not only (or even primarily) to explain or to justify, but also to <span style="font-style: italic;">celebrate</span>. Similarly, I think that for anyone who reflects seriously about the moral life, explanation and justification go only so far, beyond which point one can only aspire to praise and embody one's views. The barbarians who burn down the monastery are unfazed by the crucifix; likewise, no secular moral system achieves the potency of a talisman. To accept this is also to accept a troubling existential diversity in human nature--other people see the great questions in the same way that I do, except when they don't do so at all. Perhaps the Tower of Babel is the central metaphor for humanity, making us the most atypical species. There is a strain in philosophy that seeks to tear down the tower in favor of a second Garden of Eden, done rightly this time.<br /><br />The problem is that many men (most of them, alas, have been men) have been sure that they beheld the Truth, and terrible things have been done in the name of Truth. The point is to religiously (in the generic sense) embrace a system of meaning while avoiding clinical or moral insanity. Just as Satanism may be an internally consistent religion, so may there be functioning philosophies of evil (National Socialism, al Qaeda, etc.). We denounce them not because they have no justification (they do have their internal justifications), but because we find them pernicious and repugnant. 
Our grounds for doing so may be ultimately contingent on the creatures that we evolved to be, but that is the best we can do--we can never escape history by inventing ourselves <span style="font-style: italic;">de novo</span>. By and large, we also happen contingently to find the blues and golds of sea, sky, and sun to be gratifying, and we can only be grateful that we do. The truth is not given in any simplistic way, but there is also no truth that does not derive, in some fantastically complicated way and filtered through many generations of human consciousness, from our origin.

The Religion of the Good (2011-09-05)
A couple of weeks ago, in the <span style="font-style: italic;">New York Times</span> philosophy feature "The Stone," Joel Marks <a href="http://opinionator.blogs.nytimes.com/2011/08/21/confessions-of-an-ex-moralist/">confessed</a> his loss of faith in objective morality:
<br />
<br /><span style="font-weight: bold;">"I thought I was a secularist because I conceived of right and wrong as standing on their own two feet, without prop or crutch from God. We should do the right thing because it is the right thing to do, period. </span><span style="font-style: italic; font-weight: bold;">But this was a God too</span><span style="font-weight: bold;">. It was the Godless God of secular morality, which commanded without commander--whose ways were thus even more mysterious than the God I did not believe in, who at least had the intelligible motive of rewarding us for what He wanted us to do." (Italics in original).</span>
<br />
<br />Marks goes on to claim that even if we withdraw the quasi-theistic vehemence of our confidence in objective morality, and thus acknowledge the mere contingency of our beliefs, this needn't change our actual practice. We continue to believe what we believe and have the right to advocate our views in accord or in competition with others, but according to Marks, we can never claim that the views of others are <span style="font-style: italic;">wrong</span>, only that they lead to different consequences. Such advocacy could at best achieve moral consistency, not full justification. For instance, Marks notes animal welfare as one of his central preoccupations. Alluding to the basic moral tenet that avoidable suffering is wrong, one may educate others about animals' lives in factory farms, but not add the emotional force of moral disapprobation (which, Marks maintains, may provoke resistance or resentment as much as anything).
<br />
<br />I think that this is wrong and that it mistakes human moral development. At a certain level we embrace certain traditions, rituals, and moral standards not because we pretend to ultimate moral justification of them, but because the alternative is chaos. We raise our children to believe that certain behaviors are not merely different from what we happen to do--they are wrong. We watch football rather than soccer by virtue of mere geographic contingency; while we may prefer football, we recognize that this is likely due to acculturation and habituation. But when we say that it is not right to abuse animals, we assert that this is true everywhere and for everyone.
<br />
<br />Secular morality does therefore partake of the emotional conviction of religious faith, but this reflects its fervor, not its groundlessness, and hence is a mark of strength and not weakness. The "God" of secular morality is an impersonal ideal that we collectively construct, not a personal interlocutor that we discover. There are, of course, many versions of this "God" just as there are many versions of the God of the Christian church (and obviously Islam and Judaism). But I think there can therefore be a fundamental secular referent of the term "Godless," which denotes not merely he who lacks faith in the supernatural, but he who is unable or unwilling to shape his behavior according to moral ideals and/or the suffering of others (conduct which we may designate as psychopathic or evil).
<br />
<br />Near the beginning of Terrence Malick's "The Tree of Life," the narrator comments that there are those who live in a "state of nature" and those who live in a "state of grace." We live in a "state of nature" insofar as we merely gratify our impulses, even to the detriment of others, or complacently embrace our (evolutionarily) contingent dispositions. And there is a secular version of the "state of grace" whereby we believe ourselves to be free to (collaboratively) fashion a moral ideal.
<br />
<br />The "religion" of secular ethics is prey to the same pathologies as conventional religion, i.e., propensities to rigidity, dogma, self-righteousness, hypocrisy, and exclusion. But it also offers the same potential for affiliation and transcendence (if not, granted, the same degree of narrative interest or life-after-death consolation). I consider myself agnostic because I do not find any of the world's supernatural deities to be existentially compelling, but my attachment to, say, the Golden Rule (among other moral precepts) does have, as Joel Marks rightly argues, a good deal of faith to it. But inasmuch as there can really be no doubt as to whether the Golden Rule exists, my attitude could be said to involve love more than belief.
Captain America (2011-07-25)
"The denial of moral absolutism leads not to relativism, but to nihilism."<br /><br />Paul Boghossian, <a href="http://opinionator.blogs.nytimes.com/2011/07/24/the-maze-of-moral-relativism/?hp">"The Maze of Moral Relativism"</a><br /><br /><br />I never thought I'd see a decent Captain America film in my lifetime, but this time Marvel has managed brio without ponderousness. When I was into comics in the 1980's, Cap was, it must be said, my favorite. While I enjoyed a number of titles, Cap eschewed the smart-alecky goofiness of Spider-Man, the self-involvement of the X-Men, and the contrived contortions of the Fantastic Four; sober but spirited, he was neither the hipster Batman nor the staid Superman (that George Washington of superheroes).<br /><br />In the 1980's, shrouded by the forgetfulness of his reading public, Captain America bore little resemblance to the "old-growth superhero" (in A. O. Scott's memorable phrase) of the 1940's. Making up in steadfastness for what he lacked in flamboyance, he merely did his workaday thing month after obdurate month. In the new movie he reclaims a bit of the Nazi-slugging romance (Red Skull always was the villain par excellence, implacable and inscrutable without being ridiculous, compared to which Darth Vader was a clown).<br /><br />Needless to say, Cap also embodies American exceptionalism as well as the absolute injunction to act morally. As Boghossian compellingly argues in his piece, if one wishes to avoid believing in nothing, it is logically necessary to believe in something. 
For the non-psychopath there is no evading moral dialogue (or, in the case of comic book films, moral combat).

Eccentrics (2011-07-24)
Leon Wieseltier at <span style="font-style: italic;">The New Republic</span> is the rare prophet with subtlety, arguing with great ingenuity but always in <span style="font-style: italic;">opposition</span>, whether to thoughtlessness, smug certitude, or superficial sociability. He is an intellectual insider who dares to be deeply and skeptically unfashionable; as such, he steers a tight course between the curmudgeonly, the lugubrious, and the devastating.<br /><br />In his most recent piece (not available online except to TNR subscribers), he uses the metaphor of birds that sing at night (because they can't get a tune in edgewise in the growing cacophony of the urban day) to lament his growing disconnection from the insulted and humiliated of the world:<br /><br /><span style="font-style: italic;">"Not long ago I surprised myself with the embarrassing thought that I no longer know any lonely people...But I am cut off from the ones who are cut off, from the disconnected and the un-networked (our technology of communications is supposed to have made such marginalizations obsolete, but I do not believe it: our culture is filling up with evidence of the lonely digital crowd), the ones who lead lives of radical solitariness, of aloneness without appeal, with no bonds to console them and no prospects to divert them, who struggle for stimulation and expression, whose beds are deserts, whose phones almost never ring, who march through their difficulties without any expectation of serendipity or transcendence. 
Their absence from my experience makes me feel disgracefully narrow."</span><br /><br />This is a brave admission, and an acknowledgement by Wieseltier that he is, despite himself, one of the elite. But as one who gets to know many such people (as many physicians and most social workers do), I see a risk in extolling the lives of the disaffected and alienated. There seems to be some romanticizing here, as of the overlooked poet scribbling in his garret, the anchorite glorying in his desert cave, or the oppressed dissident in the labor camp. Wieseltier seems to be claiming the inherent dignity of suffering, and while there is that, does this mean we should be any less assiduous in our struggle to alleviate distress? Suffering has the potential to lead to wisdom, but arguably in actuality it most commonly does not.<br /><br />The prophet (whether secular or religious) is always positioned somewhere between the eccentric and the crank. The eccentric lingers "away from the center" of human experience, but can still engage in dialogue with a significant part of his fellows, whereas the crank has been cut off, as when a man goes into the desert for transcendence but never makes it back to relate the tale.

The Elegiac Mode (2011-07-18)
For W. G. Sebald, the modern world, a composite of contemporary detritus and forlorn nature, is a kind of <span style="font-style: italic;">forme fruste</span> of the historical human condition. Sebald was a past-intoxicated writer, and in <span style="font-style: italic;">The Rings of Saturn</span> a ramble through southeastern England yields disquisitions on Joseph Conrad, herring fisheries, imperial decline, and the silkworm industry. The entropy is inescapable. 
Here are a few choice quotes:<br /><br />(On fishermen): "I do not believe that these men sit by the sea all day and all night so as not to miss the time when the whiting pass, the flounder rise or the cod come in to the shallower waters, as they claim. They just want to be in a place where they have the world behind them, and before them nothing but emptiness."<br /><br />(On the writer Michael Hamburger): "Perhaps we all lose our sense of reality to the precise degree to which we are engrossed in our own work, and perhaps that is why we see in the increasing complexity of our mental constructs a means for greater understanding, even while intuitively we know that we shall never be able to fathom the imponderables that govern our course through life."<br /><br />(On Thomas Abrams, who devoted his life to a minute reconstruction of the Temple of Jerusalem): "In the final analysis, our entire work is based on nothing but ideas, ideas which change over the years and which time and again cause one to tear down what one had thought to be finished, and begin again from scratch."<br /><br />(On the melancholy of medieval weavers): "It is difficult to imagine the depths of despair into which those can be driven who, even after the end of the working day, are engrossed in their intricate designs and who are pursued, into their dreams, by the feeling that they have got hold of the wrong thread."<br /><br />(On the destructiveness of civilization): "Like our bodies and like our desires, the machines we have devised are possessed of a heart which is slowly reduced to embers. 
From the earliest times, human civilization has been no more than a strange luminescence growing more intense by the hour, of which no one can say when it will begin to wane and when it will fade away."<br /><br />If, as many say, we now live in the Anthropocene era, in which the activities of <span style="font-style: italic;">Homo sapiens</span> directly affect planet-wide processes, why can't we regard humanity with kindness, as we might regard any natural force? Just as a levee is meant to withstand the flood, one's mourning, indignation, and even resentment are meant to withstand and, if possible, to divert the human flood from that which one holds dear.

That (Self-) Blaming Feeling (2011-06-28)
"Why, worthy thane,<br />You do unbend your noble strength to think<br />So brainsickly of things. Go get some water,<br />And wash this filthy witness from your hand."<br /><br />Lady Macbeth<br /><br /><br />In a <a href="http://www.theatlantic.com/magazine/archive/2011/07/the-brain-on-trial/8520/1/">compelling argument</a> for the congruence of brain and mind, and the ethics that ought to follow therefrom, David Eagleman maintains that blame derives from a misguided and outmoded belief in free will. He claims that blame is basically backward-looking, implying that one could and should have done differently (than one just did). But when we face up to monism and the brain-as-mechanism, we realize that, after the fact, there is nothing to do but acknowledge that given the conditions that prevailed at any past time, one could not in fact have acted differently.<br /><br />Eagleman argues that shame and blame are not, in fact, very good at modifying behavior, and that what we need is a more rational and forward-looking attempt to achieve desired outcomes, in ourselves and others. A la B. F. 
Skinner, he proposes that we approach brains as we would approach engines or computers that are on the blink. Both sticks and carrots may be necessary to shape desired behaviors, but they should be applied in a dispassionate way, free of messy or reckless vindictiveness.<br /><br />There is nothing inherently objectionable about his advocacy of what he calls "the frontal workout," that is, an updated biofeedback project whereby one might learn (or teach) better control over impulses. But he might have said more about the phenomenology of guilt and blame, which are, after all, very deep aspects of human experience. These are very distinct and familiar subjective phenomena, and arguably they are far from arbitrary or nonsensical.<br /><br />Blame is the social group's means of imposing its norms, and blame works most effectively when it is internalized as guilt and shame. Blame is a deterrent, plain and simple. And as is so often the case, it works best when it is involuntary (when blame is reflected on too carefully, one arrives at <span style="font-style: italic;">Hamlet</span>). This is not to say that shame and blame are generally good things, merely that they are natural (and many perfectly natural human behaviors are odious). However, even in Eagleman's handyman-of-the-brain world, some impetus and motivation for change must exist, and I don't know where that motivation would come from if not from those primeval emotions of guilt and shame, which exist in both healthy and pathological forms. Guilt and shame may seem to be primarily about the past, but really they project forward into the future; like pain, they are the brain's message to itself: <span style="font-style: italic;">That didn't go well, so try something different</span>. 
Blame and guilt are modes of moral (self-)argument.<br /><br /><span style="font-weight: bold;">And in a follow-up</span> to the recent post about reading, the literati are a bit atwitter about Philip Roth's declaration that he has stopped reading fiction. In a <span style="font-style: italic;">Salon</span> <a href="http://www.salon.com/books/readers_and_reading/index.html?story=/books/laura_miller/2011/06/28/stopped_reading_fiction">article</a> Laura Miller speculates that inasmuch as fiction provides insight into character and human subjectivity, perhaps some do reach a point at which they have all the insight they need. After all, the novel isn't called the novel for nothing, and some readers do believe there is nothing new under the sun. But then again, one could paraphrase Samuel Johnson and say that "He who is tired of fiction is tired of life."Novalishttp://www.blogger.com/profile/10501890494890617030noreply@blogger.com2tag:blogger.com,1999:blog-4425732352511468694.post-332962035552421842011-06-26T12:03:00.000-07:002011-06-26T13:07:48.439-07:00The Tree of LifeSolitude from mere outward condition of existence becomes very swiftly a state of soul in which the affectation of irony and skepticism have no place...After three days of waiting for the sight of some human face, Decoud caught himself entertaining a doubt of his own individuality. It had emerged into the world of cloud and water, of natural forces and forms of nature. In our activity alone do we find the sustaining illusion of an independent existence as against the whole scheme of things of which we form a helpless part.<br /><br />Joseph Conrad, <span style="font-style: italic;">Nostromo</span><br /><br /><br />Terrence Malick's <span style="font-style: italic;">The Tree of Life</span> consistently defies expectations of coherent narrative, instead implanting myriad images implacably in the mind. One could be haunted by this film. 
As David Thomson wrote in his review in <span style="font-style: italic;">The New Republic</span>: "Less than a framework of story, we have a situation, and this is itself not just fair, but an enlightening novelty. Most of us do not feel that we are living stories (at least not until later); we believe we are getting on with a situation."<br /><br />As the movie's epigraph from <span style="font-style: italic;">Job</span> suggests, the situation is one of inevitable suffering and loss, albeit experienced in a perpetual haze of existential glory. The tone of the work is continually <span style="font-style: italic;">exalted</span>, which probably accounts for its controversial and varied reception. For those predisposed to its message, irony is silenced; the sacred is always a puzzle to the intelligentsia.<br /><br />The situation in <span style="font-style: italic;">The Tree of Life</span> is, most mundanely, that of a family in 1950's Texas, but really Malick is concerned with the situation of human life and its vexed relation to life, broadly considered. Much has been made, both derisively and respectfully, of Malick's depiction of the history of the universe and the pre-human earth (dinosaurs even!), but I'm not sure why. Narratively, this is merely the use of a very wide-angle lens, and a salutary use at that--there is more to heaven and earth than is dreamt of in Manhattan. Indeed, a few aerial shots of early hominids would not have been out of place. Psychologically, the "family romance" may seem endlessly interesting, but neither man nor woman lives by interpersonal relationships alone. There is that which preceded us and that which will outlast us.<br /><br />In the first few minutes of the film, as we get our first impressionistic views of the O'Brien family, a female voice-over poses the contrast of nature and grace, asserting that the way of nature is domination and self-indulgence, whereas the way of grace is care and endurance. 
Much of the film unforgettably documents the necessity of nature--deep space, inscrutable water, arboreal visions, scathing light, barren rock, towering glass and steel--but the realm of grace is uniquely human. Consciousness is dualistic not in substance (body and soul, brain and mind) but in moral experience, in what we have no option but to choose.<br /><br />Only human beings, in all of life that we know of, can fail the test of grace, and we see the risk and stakes of such failure in the boy, Jack, of 1950's Waco and the contemporary man, Jack (a ravaged Sean Penn). Violence and predation antedated humanity by many millions of years, but only with the first glimmer of consciousness did the storyline of Cain and Abel come into the world. We see it in the boy Jack's sullen resentment of his father and his acts of petty boyhood mayhem (breaking windows, mistreating frogs, stealing lingerie). Similarly, only humanity is prey to despair, of which contemporary Jack appears to be a classic example, suffering Kierkegaard's "sickness unto death."<br /><br />Some reviews I've read seemed to infer that the culminating beach scene was some kind of Rapture-like representation of the end of the world, but to me it seemed a symbolic depiction of redemption, as Jack somehow breaks through his granitic alienation. 
The idea and the ideal of the sacred presume that amid seemingly endless tawdriness or trauma there are still spaces and times of grace if we can only find them.Novalishttp://www.blogger.com/profile/10501890494890617030noreply@blogger.com0tag:blogger.com,1999:blog-4425732352511468694.post-22203575862356381752011-06-25T06:35:00.000-07:002011-06-25T06:40:04.894-07:00Drill Imagination Right Through Necessity<div style="text-align: center;">Play<br /></div><br />Nothing's going to become of anyone<br />except death:<br /> therefore: it's okay<br />to yearn<br />too high:<br />the grave accommodates<br />swell rambunctiousness &<br /><br />ruin's not<br />compromised by magnificence:<br /><br />that cut-off point<br />liberates us to the<br /><br />common disaster: so<br /> pick a perch--<br />apple branch for example in bloom--<br />tune up<br />and<br /><br />drill imagination right through necessity:<br />it's all right:<br />it's been taken care of:<br /><br />is allowed, considering<br /><br /><br />A. R. AmmonsNovalishttp://www.blogger.com/profile/10501890494890617030noreply@blogger.com1tag:blogger.com,1999:blog-4425732352511468694.post-35343666110953605142011-06-19T09:42:00.000-07:002011-06-19T10:51:37.021-07:00Missing"How can one transmit to others the infinite Aleph, which my timorous memory can scarcely contain?"<br /><br />Jorge Luis Borges<br /><br /><br />Some time ago Linda Holmes at NPR wrote a <a href="http://www.npr.org/blogs/monkeysee/2011/04/21/135508305/the-sad-beautiful-fact-that-were-all-going-to-miss-almost-everything">wonderful piece</a> observing that, by virtue of sheer plenitude of space and time, each of us is destined to miss out on the vast majority of whatever it is we love in life. Far from being a downer, it is comforting and even self-transcending to realize that no matter how assiduous or dynamic one may be, there are just more people to meet, books to read, films to see, or sunsets to witness than any one life can manage. 
It is a reminder that even if, as the cliche goes, the world is much shrunken owing to the speed of travel and communication, one can divide infinity many times over and still be left with infinity. To live a lifetime is to gaze upon an ocean of experience, yet be allowed to dip one's hand in the water only once.<br /><br />One consequence of having a large "physical" library (as opposed to having a Kindle sitting unobtrusively on the table) is that the many hundreds of tomes mutely gaze outward, as if in reproach of my all-too-human forgetfulness. My eight-year-old has asked before, "Daddy, why do you have all of these books if you can get them all on the computer?" One reason is that my recall isn't what it once was, and my library is one kind of living personal record. Many volumes I do dip into now and again--a poem here, an essay or short story there--but how many, realistically, will I live to reread altogether?<br /><br />For some 20 years--roughly, from 15 to 35--I was a prolific reader, of all genres, but particularly fiction. You know: the canon, the great books (and many that were not-so-great). While I still read, of course, typical life circumstances have much reduced the time available for it. Whether by coincidence or not, I find myself less patient with fiction, and more given to non-fiction, than used to be the case, but I continue to fight that. Proust I feel sure I will live to reread, all 3000 pages. But the 1000 pages of <span style="font-style: italic;">Les Miserables</span>? Probably not. Much of Dickens I hope to reread, but probably not <span style="font-style: italic;">Barnaby Rudge</span>. Recently I read Harold Bloom claiming that rereading Samuel Richardson's <span style="font-style: italic;">Clarissa</span> was a great priority. Really? I've never read Richardson even once. Do I need to read him before expending time on rereading Jonathan Swift? And should I do that before, or after, I brush up on American history? 
<br /><br />Inasmuch as there is nothing outside of reality, fiction is merely a peculiar branch of non-fiction, reality's myriad conscious self-reflections. Per Stendhal, a novel is a mirror carried along a main road, but it is a puzzling kind of mirror, with surprising concavities and convexities. Fiction seeks reflections that reverberate and recreate reality in microcosm, a la Borges's Aleph. A successful work of art achieves a unity that symbolically reproduces the completeness of reality. Non-fiction is always a magnifying glass, if not a microscope--clarity is purchased at the expense of breadth. Fiction is a necessarily distorting mirror, since any simple mirror or magnifying glass capable of capturing everything we care about would have to be as large as the universe itself.<br /><br />The stakes are high in the arts--the potential payoff is high, but when fiction seriously fails, it is upsetting, because it is as if reality itself is being mocked or even maimed. Bad non-fiction is like a lie, which is bad enough, but bad fiction is like blasphemy. I forever vacillate between Plato--who saw the arts as begetting deceptive images (among the myriad shadows on the wall of the cave), distractions from the pursuit of truth--and Aristotle, who argued that poetry at its best reveals necessary truths, while history merely documents contingencies. Perhaps it is just a matter of epistemological and existential focus, the iris of the inquisitive mind.Novalishttp://www.blogger.com/profile/10501890494890617030noreply@blogger.com3tag:blogger.com,1999:blog-4425732352511468694.post-22638250406025659662011-06-15T16:49:00.000-07:002011-06-15T17:05:38.786-07:00Is Psychiatry Like Acupuncture?<div>As I've discussed a few times here, this is the worst of times for antidepressants and other psychiatric medications; considering questionable efficacy and likely side effects, their popular esteem is at a low ebb. 
This makes them...a great deal like various alternative medicine treatments that remain highly popular and widely used (and paid for) despite the disdain of evidence-based medical critics.<br /><br /></div><div> </div><div>In <em>The Atlantic</em> David H. Freedman <a href="http://www.theatlantic.com/magazine/archive/2011/07/the-triumph-of-new-age-medicine/8554/">discusses</a> the persistent popularity of alternative medicine and its unlikely cohabitation with conventional research even at the Mayo Clinic and other hallowed institutions. He points out that while medicine made its reputation in the first half of the twentieth century with the significant (if not complete) conquest of infectious disease, its efforts to extend its domain to the kinds of chronic diseases that plague us today (diabetes, heart disease, cancer, Alzheimer's disease) have been frankly disappointing. What, exactly, has medicine done for us lately?<br /><br /></div><div> </div><div>Psychotherapy and psychiatric medication have been targets of critical and cultural derision on the part of many for decades, yet millions of patients seem to derive some kind of healing experience from the pill or the couch, as the case (and the personal inclination) may be. The same could be said of the masses flocking to chiropractors, homeopaths, and, yes, acupuncturists in defiance of the conventional medical wisdom. We spend years in medical school learning about physiology, when practically speaking, healing arguably has more to do with constructing a healing ritual than with one's board scores. The "chemical imbalance," absurdly oversimplified though we hold it to be, may be like the acupuncturist's "lines of force," a necessary if fictional semantic scaffold on which to mount a clinical encounter. The shaman lives! 
</div>Novalishttp://www.blogger.com/profile/10501890494890617030noreply@blogger.com2tag:blogger.com,1999:blog-4425732352511468694.post-80793615100218884072011-06-06T10:34:00.000-07:002011-06-07T03:03:39.082-07:00History of a Suicide"I perceive I have not really understood any thing, not a single object, and<br />that no man ever can,<br />Nature here in sight of the sea taking advantage of me to dart upon me and<br />sting me,<br />Because I have dared to open my mouth to sing at all."<br /><br />Walt Whitman<br /><br /><br />In puzzling over an unexpected suicide (and how many suicides are not, at some level, surprises?), we often ask empirical questions, as a detective might. How did this come about? Who or what is the primary culprit? But arguably the challenges suicide poses are chiefly existential and interpersonal, not factual. That is, the suicide, in rejecting life itself, dissents from values that we hold very dear.<br /><br />And the question of "How could we not have known?" is more relational than epistemological. That is, suicide reminds us of the perturbing basic inscrutability of human relationships. If we do not know something so basic as whether someone is suicidal, what do we really know about them? That's why psychotherapeutic relationships can be the most intimate of all--not obviously in a physical sense, but in an existential one. The therapist often hears things that no one else in a person's life hears.<br /><br />I just finished Jill Bialosky's <span style="font-style: italic;">History of a Suicide</span>, which considers the suicide of her younger sister Kim some twenty years ago at the age of 21. It is a worthwhile and reflective addition to the suicide memoir shelf, but Bialosky is, like many, preoccupied with questions of causation. 
The problem is that completed suicide is complex and rare (relative to the numbers of the depressed); why would we expect suicide to be any more fathomable or predictable than other atypical behaviors, such as murder or sudden religious conversion? If we had the technology or insight to predict individual suicides, what other behaviors might we be able to foretell?<br /><br />Bialosky seeks out a suicide specialist who tellingly conducts a "psychological autopsy," as if we can answer the dilemma of suicide using the tools of pathology. Unsurprisingly, a number of potential contributing factors come to light: a family history of mental illness and even suicide, a father who abandoned the family and ignored or rejected Kim, a depressed and withdrawn mother, an abusive boyfriend, and alcohol and drugs. This list is noteworthy for its obviousness and for the fact that every one of these things is objectionable in its own right even apart from any possible relation to suicide. The things we might do to reduce suicide risk--maintain family integrity, shore up communities, limit drug use, and increase awareness and treatment of depression--are things that we ought to be doing anyway. These influences ultimately tell us nothing, because we do not know which is necessary or sufficient.<br /><br />The other thing that suicide teaches is how little we sometimes know of ourselves. It appears that a certain fraction of suicides, at least the final determination to act, are impulsive. 
If we could interview completed suicides after the fact, I suspect that a significant number would express surprise, if not dismay, that they actually went through with it.Novalishttp://www.blogger.com/profile/10501890494890617030noreply@blogger.com4tag:blogger.com,1999:blog-4425732352511468694.post-660078876050538392011-06-05T13:14:00.000-07:002011-06-05T16:03:14.110-07:00Who Needs Psychiatrists?I have seen a medicine<br />That's able to breathe life into a stone,<br />Quicken a rock, and make you dance canary<br />With spritely fire and motion, whose simple touch<br />Is powerful to araise King Pippen, nay,<br />To give great Charlemain a pen in's hand<br />And write to her a love-line.<br /><br /><span style="font-style: italic;">All's Well that Ends Well</span><br /><br /><br />The criticisms of contemporary psychiatry are coming fast and furious now, and not just from the fringe any more. Cheryl Fuller at <span style="font-style: italic;">Jung at Heart</span> <a href="http://www.jung-at-heart.com/jung_at_heart/good-review.html">refers</a> to a <a href="http://www.nybooks.com/articles/archives/2011/jun/23/epidemic-mental-illness-why/">review</a> by Marcia Angell of three recent anti-psychiatry volumes (of which I have read Daniel Carlat's <span style="font-style: italic;">Unhinged</span> and Robert Whitaker's <span style="font-style: italic;">Anatomy of an Epidemic</span>, but not Irving Kirsch's <span style="font-style: italic;">The Emperor's New Drugs</span>). And while it's not specifically about psychiatry, an <span style="font-style: italic;">American Scholar</span> <a href="http://www.theamericanscholar.org/flacking-for-big-pharma/">article</a> by Harriet Washington documents the discouraging corruption of medical research and publishing by so-called Big Pharma.<br /><br />The mounting charges are of the most serious kind, and warrant a full-on response from the profession (which this blog post does not aspire to be). 
To very briefly summarize, the basic effectiveness of antidepressant drugs (and to greater or lesser extents, all psychiatric medications) is increasingly dubious as the integrity of research purportedly showing their efficacy is called into question. Critics maintain that for decades (antidepressants came into general use in the 1960's), thousands of psychiatrists (and of course other physicians as well) and millions of patients have prescribed and taken non-therapeutic compounds based on an underestimation of the placebo effect.<br /><br />As for neurobiology, critics point out, correctly, that there is no evidence for any specific "chemical imbalance" that antidepressants allegedly alleviate. However, this is not the crux of the issue, for other central nervous system agents (e.g. anticonvulsants and anesthetics) have mechanisms of action that remain somewhat mysterious. And depression is in fact correlated with specific neurobiological states, but only because <span style="font-style: italic;">every</span> psychological state--falling in love, undergoing religious conversion--can only be based in the brain. The question is not whether any given psychological phenomenon has a biological correlate (of course it does); the question is whether said phenomenon is best understood and potentially modified in chemical as opposed to other (psychological, interpersonal, social) terms.<br /><br />It is one thing to claim that antidepressants are overblown and oversold; it is quite another, of course, to claim that they are useless or even pernicious. For instance, Robert Whitaker's arguments can lead only to the conclusion that antidepressant drugs should be expunged from the earth, and that psychiatrists are either unwitting or cynical quacks for prescribing them. 
And of course, as psychologists and social workers have taken over much of the psychotherapy territory that used to belong to psychiatry, the profession's identity has been ever more given over to psychopharmacology. After all, Freud didn't think psychoanalysts needed to be physicians, and there is no evidence that psychiatrists make better therapists than those with other degrees, so absent real results from biological treatment, why does psychiatry exist, exactly, beyond a function as a research program?<br /><br />As someone who has, regrettably, long recognized the limitations of existing drugs but who still prescribes them, what do I believe? And can what I believe be remotely legitimate inasmuch as my current livelihood (by no means opulent in doctorate-level terms, but reasonable) depends on these medications having a role? Intellectual honesty demands that if one has a pressing self-interest in believing something, one should subject that belief to fierce and insistent criticism. There is no sin greater than tendentiousness.<br /><br />This discussion derives from the valorization of the randomized, placebo-controlled trial as the ultimate arbiter of medical outcome, very much at the expense of individual clinical judgment. After all, many hold that clinical judgment is subjective and idiosyncratic, and therefore open to bias and not to be trusted. If all that needs to be known about medications can be inferred from statistical trials, then anyone (such as Whitaker, a journalist) can know more about them than a physician. Indeed, on this view <span style="font-style: italic;">only</span> the non-physician can accurately appraise medical treatments because his view is not warped by self-interest. And yet there is considerable question as to whether patients (or "patients") in rigidly controlled research studies are truly representative of real-world clinical encounters.<br /><br />What, then, do I believe? 
I believe, with the Buddhists, that life is suffering (but not only that); the long history of humanity is one of untold miseries of anxiety and depression that were either merely endured (there being no other choice) or compensated for by relationships, religion, art, or alcohol. Like the agonies of even routine childbirth or the ravages of even typical old age, mental disorders have always been part of the human condition; only relatively recently have we tried to modify them. One can make an argument that all of these things should, again, be merely endured, but I don't think history has a rewind button. Yet the expectations regarding mood and anxiety have exceeded all bounds, as has the expectation that one has some right to reach ninety with sound mind and body.<br /><br />I believe that existing drugs do not counteract specific or discrete physiological processes, but (like psychotherapy) are nonspecific mental balms. SSRI's and benzodiazepines are to mental distress as NSAID's and opiates are to physical distress, that is, they are often disappointing and attended by sometimes dismaying side effects, but millions of patients have found them of some use. I believe that in a modest way they reduce suffering, by no means always or even often, but <span style="font-style: italic;">on average</span>. I believe this on the basis not of research studies, but of my clinical experience and that of many others. And the day I stop believing that is the day I will stop prescribing.Novalishttp://www.blogger.com/profile/10501890494890617030noreply@blogger.com3tag:blogger.com,1999:blog-4425732352511468694.post-84427454374372478592011-06-01T07:34:00.000-07:002011-06-01T08:31:19.451-07:00Who Needs Narrative?Arguing for the psychological uses of narrative, Bill Benzon at <em>The Valve</em> <a href="http://www.thevalve.org/go/valve/article/neurochemistry_and_autobiography_on_the_benefits_of_narrative_for_a_coheren/">distinguishes</a> the "autobiographical self" (i.e. 
identity over time) from the "core self" (i.e. one's integrated psycho-physiological state at any given time). He claims that the "core self," influenced as it may be by intense situational and physical factors (he uses hunger and sexual desire as examples), not to mention its transient nature, threatens to disrupt the autobiographical self. He suggests that narrative (he specifically mentions "play-acting" and "storytelling") usefully provides an overarching frame within which to understand and evaluate our dispositions and behaviors over time.<br /><br />The account leaves out a lot, of course (for instance, it would seem that temperament straddles both kinds of self). And his case seems a bit extreme--as if even a starving man would look back on his life as having been little more than an ultimately unsuccessful quest for food--but there may be something to it. After all, someone in a deep depression may view much of his past "through a glass darkly" in a way that lightens considerably when the episode relents. And obviously the two selves affect each other reciprocally and continuously.<br /><br />Staying with Benzon's schema, it would seem that psychological distress occurs in two varieties. Unhappiness is a malady of the autobiographical self, a dismayed sense that one's story has somehow gone awry through vicissitudes of sensibility or circumstance. One seeks in a therapist a kind of narrative catalyst that will open up unimagined possibilities, including the often profound possibility of actually being listened to and perhaps even understood. Dysfunction of the core self manifests as symptoms that may actively impede functioning. There is considerable overlap between the two, but arguably we resort to psychotropic medication inasmuch as symptoms appear to be beyond the power of narrative to reframe. But nothing is more frustrating than to try to treat unhappiness with meds or to tackle narrative-resistant symptoms with more narrative. 
Diagnostic confusions and controversies arise from the difficulty of distinguishing symptoms from unhappiness.<br /><br />It occurs to me that like certain other phenomena such as religion and even music, narrative broadly considered (that is, interest in all stories whether contained in books, film, gossip, or hearsay) is hard to explain because it is very widespread but not truly universal. Some faculties, such as hunger and thirst, are obviously ubiquitous because their absence is not compatible with life. Others, such as basic senses or sexuality, are not imperative for individual life but are so typical of the species that their absence is uncontroversially deemed pathological.<br /><br />Inasmuch as existence is necessarily temporal, some interest in narrative is presupposed, even if only speculation as to where the next meal will come from. But sophisticated narrative--that is, at least at the level of communal folk tales--has, like religion, been found to exist in virtually every human society. And yet just as there is a reliable minority of individuals who are irreligious, there are of course people here and there who are relatively free of the narrative bug, who may be more invested in other domains of experience (facts, ideas, bodily experience, etc.). If religion and narrative truly are central to (individual and species) human identity, then how is it that even a (non-pathological) small minority more or less escape their purview? 
Perhaps diversity itself is such a powerful evolutionary engine that it constantly throws out alternatives to the prevailing cultural trajectory, suggesting of course that those faculties we view as indispensable are actually contingent.Novalishttp://www.blogger.com/profile/10501890494890617030noreply@blogger.com0tag:blogger.com,1999:blog-4425732352511468694.post-86276158997123117322011-05-30T17:28:00.000-07:002011-05-30T17:39:59.333-07:00Human Experience"(We) occupy landscapes of values--worlds made up not of quantum lattice structures, but of opportunities and obstacles, affordances and hindrances."<br /><br />Alva Noe<br /><br />The full <span style="font-style: italic;">13.7</span> post is <a href="http://www.npr.org/blogs/13.7/2011/05/28/136726099/home-sweet-home-finding-ourselves">here</a> and worth reading.<br /><br />I think that this, the sensation of swimming in a sea of significance(s), whether noxious or gratifying, is a major reason I wound up a psychiatrist. It is no wonder that our species yields paranoids and creates deities to worship. We do not generally perceive the universe as what it would be without us--that is, a constellation of infinite facts--but rather as a shifting drama of desire and revulsion, of affirmation and repudiation.Novalishttp://www.blogger.com/profile/10501890494890617030noreply@blogger.com0tag:blogger.com,1999:blog-4425732352511468694.post-63210272857219271912011-05-24T17:52:00.000-07:002011-05-25T05:34:09.250-07:00You (Don't) Say It's Your BirthdayHappy birthday, Bob.<br /><br /><br />Alex Ross <a href="http://www.therestisnoise.com/2011/05/happy-70th-bob-dylan.html">posted some favorite lines</a>.<br /><br /><br />There is something about Dylan--the musician, the poet, the cryptic cultural figure-- that is gratuitously compelling. 
If grace were to exist, it would feel something like listening to Dylan, to that "thin wild mercury sound" of 45 years ago.<br /><br /><br />It seems miraculous that he is still alive, both literally and figuratively.<br /><br /><br />Favorite line? At random:<br /><br /><br />"Well, the comic book and me, just us, we caught the bus"Novalishttp://www.blogger.com/profile/10501890494890617030noreply@blogger.com0tag:blogger.com,1999:blog-4425732352511468694.post-35963946221611631382010-11-22T16:35:00.000-08:002010-11-22T17:18:13.873-08:00PostscriptOne more thing. I ended a bit abruptly a month ago, yet I recently came across two links that encapsulate the blog's preoccupations so fittingly that I cannot resist tying this last speculative bow.<br /><br />The Wittgenstein scholar Peter Hacker <a href="http://www.philosophypress.co.uk/?p=1583">explains</a> that philosophy, unlike science, does not add to our knowledge of reality; rather, it examines the conceptual schemas through which we consider reality. Formal science is extremely successful in the relatively narrow task of documenting external reality, and it brooks no competitors; but arguably everything we most care about exists outside of science's purview. I particularly liked his comment that science yields an aggregate of facts that can be transmitted from generation to generation as a kind of epistemological bolus, whereas philosophy--like the arts--must be perpetually recreated.<br /><br />Hacker also assails the prevailing scientistic fetish for neuroscience, arguing that from the point of view of real human priorities, it is the unified human agent that counts, not his or her brain and its myriad parts. "My amygdala made me do it" is not so different from "My soul made me do it." The moral self must take ownership of its concepts and its actions, not hide from them by ascribing them to the brain. 
Neuroscience may increasingly give us the capability to tinker more viscerally with our own experience, but this is nothing but the means to an ever debatable end. Science is nothing but a method, and one which can never identify the life most worth living. The latter can only be arrived at biographically and culturally, through lived experience, dialogue, and contingency. Everything that is not a fact exists in the vast penumbra of narrative.<br /><br />Andy Martin <a href="http://opinionator.blogs.nytimes.com/2010/11/21/beyond-understanding/">looks at </a>the overlap of autism and philosophy, arguing that both phenomena (endeavors? conditions?) involve a basic inability, or perhaps unwillingness, to fathom seemingly transparent communications. He suggests a tension between a philosophy that seeks to eradicate or solve conceptual confusions and one that accepts their inevitability. The latter is what always drew me to philosophy and to literature, which to me constitute the infinite project of outlining and marveling at the fundamental riddles of (inter)subjective experience. Consciousness is interesting not despite, but precisely because, imperfect understanding cannot be avoided. A philosophy or a science that proposes to eliminate conundrums is oppressive and must be resisted; a refusal to fully understand or be understood is a kind of assertion of freedom.<br /><br />However, philosophy should not be sheer mystification. Language is the most powerful tool ever devised, and as such it can never be totally under our control; to some degree it always has a life of its own. Its spontaneous complexity is luxuriant and life-giving, as I have said, but it is well-known that metaphors can become stifling vines threatening to choke off light and space. 
Philosophy is fundamentally a linguistic pruning operation, lopping off conceptual excrescences that threaten our narrative well-being.<br /><br />Philosophical (that is, moral and aesthetic) truths can never be as unambiguous as scientific ones, but they achieve a certain pragmatic objectivity because, well, human beings are so constituted that we need certain standards that are not lightly or trivially modifiable. Where does psychology fit? Like medicine, psychology derives from science a sense of realistic empirical boundaries of what may be technically achieved, but its aims must arise through personal and cultural narrative philosophy.<br /><br />And that really is all I have to say for now.Novalishttp://www.blogger.com/profile/10501890494890617030noreply@blogger.com5tag:blogger.com,1999:blog-4425732352511468694.post-39752502513353576842010-10-19T17:43:00.000-07:002010-10-19T18:18:19.298-07:00Finis"Now my charms are all o'erthrown,<br />And what strength I have is mine own,<br />Which is most faint."<br /><br />Prospero<br /><br />In the <em>Times</em> Judith Lichtenberg <a href="http://opinionator.blogs.nytimes.com/2010/10/19/is-pure-altruism-possible/?hp">examines altruism</a>, in particular the fact that it seems impossible to isolate pure unselfishness, uncontaminated by all self-interested motives (even if only the often unconscious satisfaction of having done good). But she argues that altruism is no less desirable, individually and socially, for all its imperfections; indeed, a flawed, all-too-human altruism is the best we can hope for in this world, that is, at all.<br /><br />It seems to me that the wish for unsullied altruism is parallel to the fantasy of an absolute free will, untrammeled by ambivalence, weakness, or material considerations. 
The totally free and altruistic act would, of course, be the act of God, not of human beings.<br /><br />That seems like a fine note upon which to end this blog, which has now run for more than two years and 400 posts. A blog has no natural ending apart from the demise or sheer exhaustion of its author. I find that I have said all that I have to say in this format, and nothing would remain here but the recycling of old themes and, of course, gawking at the baubles of the Web as they flash by. I have arrived at that definite point marked not by ambivalence or by frustrated block, but by dispassion--it is time to move on.<br /><br />If Emerson was right that life consists of what a man (sic) thinks about all day, then this blog has been a reasonable record of the past two years of my life. Many posts have been tossed off, but many have been thoughtful, carefully wrought and even alarmingly personal, especially to any perceptive readers out there. It has been a transitional time, befitting a blog I suppose.<br /><br />Other projects await. I will need to prepare for a Grand Rounds presentation a few months hence (a late echo of the academic life), and I am getting closer, finally, to starting a private practice, which will take considerable doing. 
Any additional post here in the future would likely be a link to a different kind of blog, a more professionally discreet and decorous one that might support a practice.<br /><br />Thanks to readers--be well.Novalishttp://www.blogger.com/profile/10501890494890617030noreply@blogger.com10tag:blogger.com,1999:blog-4425732352511468694.post-36720088460142070972010-10-17T13:09:00.000-07:002010-10-17T14:04:33.002-07:00A Score of Scores<a href="http://users.manchester.edu/FacStaff/ssnaragon/Naragon/images/Klee2.jpg"><img style="TEXT-ALIGN: center; MARGIN: 0px auto 10px; WIDTH: 380px; DISPLAY: block; HEIGHT: 500px; CURSOR: hand" border="0" alt="" src="http://users.manchester.edu/FacStaff/ssnaragon/Naragon/images/Klee2.jpg" /></a><br /><div></div><br /><div>"Thus the whirligig of time brings in his revenges."</div><div></div><br /><div><em>Twelfth Night</em></div><div></div><br /><div></div><br /><div>As mundane commemoration of this blog's 400th post, a few points on the infinite Web:</div><div></div><br /><div>1. After I read <a href="http://www.nytimes.com/2010/10/17/magazine/17part-t.html?hpw">this profile </a>of Arvo Pärt, I went back and listened again to the wonderfully haunting "Tabula Rasa." It's spookily spiritual, scarily good, and perfect for Sunday Halloween this year. Music comes in two basic varieties: that which sets you in motion, and that which makes you more still.</div><div></div><br /><div>2. <em>The Atlantic</em> on the <a href="http://www.theatlantic.com/magazine/archive/2010/11/lies-damned-lies-and-medical-science/8269">unnerving unreliability </a>of medical research. I have not been to a primary care physician in a dozen years, and barring any new or unusual symptoms, I hope to extend that streak far into the future (do not try this at home).</div><div></div><br /><div>3. 
Melvin Konner on the likely <a href="http://www.psychologytoday.com/blog/the-tangled-wing/201009/is-adhd-disease-civilization">primeval advantages </a>of currently unfashionable distractibility and hyperactivity.</div><div></div><br /><div>4. I happened to see three great local productions of Shakespeare comedies (<em>Twelfth Night</em>, <em>As You Like It</em>,<em> A Midsummer Night's Dream</em>) over the past two weeks. His comedies, entertaining though they are, are ultimately more disturbing than even the depths of Lear or the black hole of Iago because they show us the arbitrariness of erotic attachment. I was wondering why Macbeth wasn't making an appearance in the season of ghouls and goblins, but Viola, Olivia, Orlando, Rosalind, and Demetrius & Co. are finally more frightening than Lady Macbeth. Never look to Shakespeare for consolation--even the funhouse mirror does not flatter in the end.</div><div></div><br /><div>5. The comedy club last night was uneven. Beyond a certain point, raunchiness is to true humor as bathos is to pathos; both are varieties of sentimentality, and failures of feeling. But okay for Saturday night. 
</div>Novalishttp://www.blogger.com/profile/10501890494890617030noreply@blogger.com0tag:blogger.com,1999:blog-4425732352511468694.post-88285100582874695822010-10-16T05:24:00.000-07:002010-10-16T06:15:35.793-07:00The Haunted Future<a href="http://images.worldgallery.co.uk/i/prints/rw/lg/1/6/Edward-Munch-Sommernacht-am-Strand-165741.jpg"><img style="TEXT-ALIGN: center; MARGIN: 0px auto 10px; WIDTH: 400px; DISPLAY: block; HEIGHT: 304px; CURSOR: hand" border="0" alt="" src="http://images.worldgallery.co.uk/i/prints/rw/lg/1/6/Edward-Munch-Sommernacht-am-Strand-165741.jpg" /></a><br /><div></div><br /><div>"An apple serves as well as any skull</div><div>To be the book in which to read a round,</div><div>And is as excellent, in that it is composed</div><div>Of what, like skulls, comes rotting back to ground."</div><div></div><br /><div>Wallace Stevens, from "Le Monocle de Mon Oncle"</div><div></div><br /><div></div><br /><div>I've never gone in much for ghosts, but I'm reconsidering this after reading Leon Wieseltier's <a href="http://www.tnr.com/article/politics/magazine/78192/ghosts-beltway-washington-wieseltier">meditation</a> on the presence of the unseen. He is writing about historical and cultural memory, but to be sure, there are myriad ghosts of the non-supernatural variety if we would just open our eyes and see them. Wieseltier writes, "Ghosts are the natural companions of estrangement; the invisible officers of tradition, of all the valuable things that have been declared obsolete but, in some stubborn hearts, are not obsolete. It is one of the fundamental properties of the human that the absent may be more significant than the present."</div><div></div><br /><div>Humanity has always been locked in life-and-death struggle with its various ghosts. Monotheism sought to displace the ghosts of sky, sea, and mountain in favor of one great ghost-in-chief (of all absences, perhaps the one most present). 
The Enlightenment and modernity routed the fairies and ghouls of cave, dell, and stream. Perhaps the third great usurpation has been the perennial presentism of ubiquitous 24/7 Internet media, whose blinding glare renders the pre-millennial past ever more faint.</div><div></div><br /><div>Memory, both personal and global, comprises legions of ghosts, as does the written word. Perhaps even the spoken word commemorates that which has passed--as Nietzsche wrote (and as the ever elegiac Harold Bloom was fond of quoting), "That for which we find words is something already dead in our hearts. There is always a kind of contempt in the act of speaking." In other words, we never quite catch up even to the present moment, much less the future.</div><div></div><br /><div>I think that I have always had to work hard to free myself of ghosts. I have often felt like Frodo when the Ring was on his finger: reality dimmed and retreated and he found himself in a parallel or superimposed shadow world. At any given moment or in any situation, it is difficult to remind myself that "this, here is reality," for I know that "this, here" can only be the most minuscule excerpt of Reality, an atom in the universe. The ghosts vastly outnumber the living, beyond measure.</div><div></div><br /><div>Internet media is interesting inasmuch as it locks us into a perpetual present, yet also displaces us from an actual present. The hordes of Blackberry and Facebook-checkers are not entirely "there," but they also are not haunted in any meaningful way; they are not afflicted by ghosts; rather, they and their living interlocutors are in a kind of Limbo. All virtualities are not created equal, and I prefer mine to have a history.</div><div></div><br /><div>And then there are other specters, of alternative selves (and the persons-to-ghosts those selves would have encountered or even conceived) that occupy the paths not taken. There are ghosts of the future, those beings we think we or those we love might become. 
As James Surowiecki <a href="http://www.newyorker.com/arts/critics/books/2010/10/11/101011crbo_books_surowiecki">writes</a>, procrastination can be a way of fending off or at least questioning such spirits.</div><div></div><br /><div>Procrastination can be a manifestation of mere bewilderment or self-doubt, in which case it may help to break down a daunting or nebulous project into smaller, more concrete, and more practical stages. But as Surowiecki notes, procrastination may reflect a more basic instability in motivation, as identity is somewhat fluid and we can never be entirely sure that we will want tomorrow what we want today. He also reports--news flash!--that, believe it or not, human beings are ambivalent creatures beset with inner conflict (apparently economists and behavioral psychologists are just finding this out).</div><div></div><br /><div>Inasmuch as it represents skepticism about distant, abstract goals in favor of more short-term rewards, procrastination may be a malady peculiar to modernity. Indeed there is a double-whammy here since complex societies demand deferred gratification at the same time that pleasurable and instant distractions grow more abundant. But there is a more fundamental existential issue. We often put things off because we are not yet sure of their value, and hope that the passage of time will clarify it, so that we can decide which among the plethora of "ghosts of the future" may become real. </div>Novalishttp://www.blogger.com/profile/10501890494890617030noreply@blogger.com0