Thursday, December 31, 2009
Plus ça change, plus c'est la même chose? Can't be.
A "low dishonest decade" on multiple levels.
"September 1, 1939"
W. H. Auden
Wednesday, December 30, 2009
Metafictions
A Commonplace Blog sets me to thinking, as usual, this time about literary style and its relation to genre and, by implication, to the whole point of writing and reading. D. G. Myers laments the profusion of slapdash Web chatter, likening it to street noise haphazardly translated into prose. In this I concur. But in promoting the self-conscious artifice of good writing, he claims that fiction is the "permanent home" of the best prose. It is an assertion that may come as a great surprise to many poets and essayists, among many others.
What does good writing do, and wherein is its appeal? I always come back to Horace's claim that the point of art is to delight and to instruct. Both are necessary, neither sufficient, although different genres and styles may call for more emphasis upon one or the other. The "instruction" of art involves education about the world as it is, whereas "delight" entails the delectation of perception and expression for their own sake.
Good writing, whether fiction or non-fiction, enhances our understanding of and attachment to reality, but by means of desire and acceptance, not mere resignation. Good writing is ultimately about attentiveness, about getting reality right, not in itself (as if we could know what reality is in or for itself), but crucially with respect to human needs. At a basic level, good writing is an ode to the world as beloved; this involves both accurate appraisal and enthusiasm.
Fiction does hold a special and honored place because for (self-)conscious creatures, "reality" is forever in flux and in question; it is an amalgam of what is the case and what could potentially be the case. And technically of course, there is in fact nothing outside of reality. The vast realms of fiction are--or at least began as--mere annexes built upon a prior reality. They become candidates for more or less embraced realities themselves.
What about virtual reality? Consciousness is itself the original virtual reality. There is only reality, but therein are countless regions that are more or less hospitable to human thriving and happiness. That is how we judge narrative, whether in the form of writing, film, or video game: does it either depict or stipulate (depending on whether its focus is more realistic or fantastic) reality in a way that is stimulating, enlightening, and favorable for human experience, such as it is? And of course human experience, being historical, has itself been shaped by previous depictions and stipulations. This implies that human sensibility could change, whether biologically or culturally, such that favored narrative styles may alter profoundly over time; the question is how elastic "human nature" ultimately is.
The psychologically interesting thing, of course, is that writing and reading are not like air or food; some seem to need or crave them, particularly in fictional form, far more than others do. For some, perhaps, reality that is "unenhanced," one might say, is merely dull and difficult to love, such that it is imperative to "make it new," as Pound put it, I think. This may reflect a certain critical restlessness of temperament. Others may be connoisseurs of reality, so to speak, who revel in nuance for its own sake. What these two groups seem to agree on is that reality is no settled matter. What then is bad writing? It is writing, I suppose, that either gets reality clumsily or wrong-headedly wrong, or that proposes an alternative reality that we ultimately cannot love or learn from.
As I wrote a couple of posts ago, I have found it harder to enjoy fiction in recent years, and it has been hard to pin down why. This is particularly true of contemporary fiction; as I get older, the number of books I find indispensable continues to shrink, and the attraction of the latest Pulitzer Prize winner grows ever dimmer. Perhaps in thirty years I will be down to Shakespeare alone. But then again, maybe current life circumstances will moderate such that the central preoccupations of literature--the fine-grained, yet speculative appreciation of how things stand with respect to human beings, as well as subtle suggestions of how they might or ought to be different--will again beckon as open, urgent, and interesting questions.
By the way, farewell to 2009, a year stranger than any fiction.
Tuesday, December 22, 2009
A Vision
On a Christmas card I couldn't resist yesterday, a snowman is seen, from behind, as "he" rummages in a bag he has apparently found in the snow. "What is this? Two lumps of coal?" he exclaims--in disappointment, as I inferred.
Inside, of course, the lumps of coal have become his eyes as he exclaims, "I can see! I can see!"
A mere cipher as metaphors go. And yet...
Monday, November 30, 2009
Motive for Metaphor
For years my appetite for fiction, once prodigious, has diminished, and in recent months, in reaction to what Wallace Stevens called "the pressure of reality," it has vanished altogether. Give me facts, give me philosophy; for the time being fantasy repels. I haven't read or seen, much less enjoyed, a story, novel, or film in months, but have I lived? Oh yes, I've lived a great deal these few months, for better and worse.
Intellectually I still see Aristotle's famous point that history merely deals in particulars, whereas poetry at its greatest deals in universals. However, there are stretches in life in which one is so bound by the brambles of particulars that universals are like the dark side of the moon. And fiction is, after all, a species of lying, that propensity that most distinguishes humanity from the other animals.
Fiction is a luxury of civilization, to be relished only so long as no barbarians are banging on the gates. But barbarians infiltrate the citadel all the time; one is never wholly safe... Even now I'm not sure whether art intensifies life or merely offers an escape therefrom.
Thursday, November 26, 2009
Light
The Buddha's Last Instruction
"Make of yourself a light,"
said the Buddha,
before he died.
I think of this every morning
as the east begins
to tear off its many clouds
of darkness, to send up the first
signal--a white fan
streaked with pink and violet,
even green.
An old man, he lay down
between two sala trees,
and he might have said anything,
knowing it was his final hour.
The light burns upward,
it thickens and settles over the fields.
Around him, the villagers gathered
and stretched forward to listen.
Even before the sun itself
hangs, disattached, in the blue air,
I am touched everywhere
by its ocean of yellow waves.
No doubt he thought of everything
that had happened in his difficult life.
And then I feel the sun itself
as it blazes over the hills,
like a million flowers on fire--
clearly I'm not needed,
yet I feel myself turning
into something of inexplicable value.
Slowly, beneath the branches,
he raised his head.
He looked into the faces of that frightened crowd.
Mary Oliver
Monday, November 23, 2009
Missing the Forest
"But philosophy has no direct influence on the great mass of mankind; it is of interest to only a small number even of the top layer of intellectuals and is scarcely intelligible to anyone else. On the other hand, religion is an immense power which has the strongest emotions of human beings at its service."
Freud, "The Question of a Weltanschauung"
Unlike some of the highly vocal neo-atheists like Christopher Hitchens and Richard Dawkins, the militantly unbelieving Freud did not underestimate the opposition--he recognized that religion's roots go far deeper than the relatively thin soil of the intellect. A few sentences on in the essay quoted, he argues that religion performs three crucial functions: "It gives [human beings] information about the origin and coming into existence of the universe, it assures them of its protection and of ultimate happiness in the ups and downs of life and it directs their thoughts and actions by precepts which it lays down with its whole authority."
Providing (purported) knowledge, existential succor, and morality all at once, religion is the ultimate in one-stop shopping. Its transcendental simplicity is hard to beat, not least because the experience of it is both deeply personal and reassuringly communal.
Arts and Letters Daily today provides a link to an Edge article by Rebecca Newberger Goldstein that dispassionately dismantles 36 proposed arguments for the existence of God. All of the classic ones are there--the argument from ontology, from design, from pragmatism--and the fatal flaws of all are coolly and mercilessly exposed; I have come across many alleged proofs and disproofs of God, but never so neatly summarized in one place.
It's all there, literally in black and white, and yet the believer could still say that if this is what logic and consistency demonstrate, then so much the worse for logic and consistency. Indeed, considering the overwhelmingly religious history of humanity, this kind of logical coup de grace really shows how unphilosophical (in the narrow logical sense) human beings tend to be.
For believers, faith is least of all a matter of empiricism or logic. It is a tradition, a way of life, and a profound emotional need, but it is a philosophy only in the post hoc sense that cognitive dissonance must be suppressed somehow. And as some increasingly argue, religion's pride of place in human nature may have a deep evolutionary source.
Given that religious traditions have arisen independently but in parallel patterns across millennia and across the world, it isn't hard to imagine that faith may have offered a survival advantage to groups in a range of circumstances. In this respect religion has been likened to language--the innate capacity is there, and in both cases children famously absorb the tradition in which they are raised.
How to explain agnostics then? Language is so deeply genetic that, barring severe disorders or early linguistic deprivation, its capacity is universal. Religion obviously isn't like that. I wonder if evolution could have provided not only for the propensity for religion, but also for a certain fractional dissent therefrom. I'm speculating that groups with a truly universal religious "gene" may have tended to become rigid or complacent as compared to groups with more flexible religiosity, even if that made it possible, even inevitable, for agnostic psychology to flourish at least as part of a population. Indeed, religion may benefit, even require, an agnostic opposition in order to remain viable over the long term.
Evidence shows that agnostics and atheists, taken individually, can be as healthy, happy, and productive as believers. But the question for the future of religion is whether it is morally and culturally feasible for unbelievers to constitute a majority or even all of a population. Yes, Europe is famously growing quite secular, but birthrates there have also fallen alarmingly. Are relatively agnostic societies, like agnostics themselves, the exceptions that prove the rule?
Saturday, November 21, 2009
Figuratively Speaking
It will soon be eleven years since I read this to Julia:
In courtesy I'd have her chiefly learned;
Hearts are not had as a gift but hearts are earned
By those that are not entirely beautiful;
Yet many, that have played the fool
For beauty's very self, has charm made wise,
And many a poor man that has roved,
Loved and thought himself beloved,
From a glad kindness cannot take his eyes.
May she become a flourishing hidden tree
That all her thoughts may like the linnet be,
And have no business but dispensing round
Their magnanimities of sound,
Nor but in merriment begin a chase,
Nor but in merriment a quarrel.
O may she live like some green laurel
Rooted in one dear perpetual place.
From "A Prayer for my Daughter," W. B. Yeats (full poem here)
Friday, November 20, 2009
Caged Animals
"The fault, dear Brutus, lies not in our stars, but in ourselves, that we are underlings."
Cassius
I've never found it easy to answer the occasional but predictable question, "Why did you become a psychiatrist?" but a succinct, if not simple, response is evoked for me by a Psychology Today blog post by Dr. Mark Goulston. Bluntly entitling his post "Maybe You're Just Wrong," he gives some of the people he works with--especially those with no major Axis I mental disorder--the option of being labelled ill on the one hand or, in mincing-no-words fashion, "psychologically flawed and emotionally immature" on the other. In the former case, psychotherapy and possibly medication may be indicated, whereas in the latter case, some kind of education or training may be called for.
What this speaks to for me is the ever-present question: to what degree must or can we take responsibility for our lives and identities? There is no life without suffering, that's for sure; without being lugubrious about it, it is clear that on the scale of a moment, a day, or a lifetime, existence not infrequently doesn't turn out the way we seem to feel that it should. Why is that? I went into psychiatry not in order to relieve the most suffering in any kind of generic sense; it is impossible to quantify suffering, of course, but maybe I could do more good by working in a soup kitchen or by becoming a hedge fund manager and then donating the majority of my income to charity.
No, I went into psychiatry to try to relieve a particular kind of suffering, that associated with the "mind-forg'd manacles" that prevent us from being as psychologically or emotionally free as we might. There is an odd little Arcade Fire song called "My Body is a Cage." Well, the mind is a cage too, obviously, and this is a troubling notion only if one thinks freedom could possibly be infinite. All are limited by temperament and disposition, although to be sure, some have cages that are far more spacious and pliable--and in far closer proximity to other cages--than others.
The fundamental premise of medicine is that we are not wholly responsible for our own suffering. The sick role is a socially sanctioned kind of forbearance granted to incapacity, stemming from a recognition that it is deeply unjust to pretend that "the cage" isn't there. However, it is equally unjust--in an infantilizing way--to carry on as though another's cage is more restrictive than it has to be.
In his blog post Goulston claims that most of his clients prefer to be considered wrong--and responsible for their own plight--rather than innocent victims. I wish I had his confidence in human nature (personally I think Dostoevsky's Grand Inquisitor was more on the money). Obviously these people aren't paying him with insurance, which would require a listed diagnosis. And I refer to them as clients rather than patients for a reason.
Goulston sets out the free will conundrum in stark terms, which is why the piece struck me, but clearly he constructs a false opposition. For psychiatric care, if it is enlightened, should always seek a fair balance between the claims of responsibility and "the natural shocks that flesh is heir to." After all, situations of either absolute responsibility or its absolute absence are very rare. Life is continually lived in a state of partial--and never precisely known--responsibility. And science does not help much with this. Free will is profoundly social and political, relating to what one ultimately must answer for (to hypothetical others). In that sense there is no one Free Will, but a multiplicity of free wills in different social contexts.
So for instance, it is common for a depressed person not to take care of himself. He doesn't exercise or eat healthfully, he gains weight, he isolates himself and loses friends, and perhaps even loses his job because he doesn't drag himself to work on time. To what degree is he responsible for his plight, as opposed to being a victim of the medical condition of depression? Arguably science cannot answer this question. To be sure, ever more sophisticated brain scans may show that depressed brains differ in certain ways from non-depressed brains, but inasmuch as psychology stems from neurobiology, such brain scans could theoretically also show differences between, say, lazy and selfish brains as opposed to motivated and selfless brains.
I would argue that responsibility in this case--in all cases--is a pragmatic construct. What helps this person to function better--varying degrees of encouragement, stigma, and penalty (e.g. unemployment) on the one hand, or direct support and perhaps biological intervention on the other? Of course it is likely to be a combination of the two. The depressed person may be given supportive therapy and even medication, but there is also an expectation that he will exercise and socialize more to improve his own lot. Medical and psychiatric care should aim to balance the sick role with social expectation, the latter being the mirror image of social responsibility. What does it mean, though, to "function better"? Ah, the question of how life ought to be lived is beyond the scope of this post--or this lifetime, most likely.
I've been curious about the emergence in recent years of "life coaching," "job coaching," etc. Does this stem from a lingering stigma of therapy, or does it derive from a desire for therapy that is more actively interventional? However, if one is primarily "wrong" and not "sick," then a better metaphor may be teaching or even tutoring. A "coach" implies that life is a sporting event to be won or lost, whereas life perhaps is better approached as a skill, like piano-playing or any other. Well no, life surely isn't so straightforward as that--composing may be the better metaphor. Or since this is my post and I have no head for musical composition (as opposed to appreciation), I'll say writing. Yes, life as writing.
Friday, October 30, 2009
Is This It?
Waving Adieu, Adieu, Adieu
That would be waving and that would be crying,
Crying and shouting and meaning farewell,
Farewell in the eyes and farewell at the center,
Just to stand still without moving a hand.
In a world without heaven to follow, the stops
Would be endings, more poignant than partings, profounder,
And that would be saying farewell, repeating farewell,
Just to be there and just to behold.
To be one's singular self, to despise
The being that yielded so little, acquired
So little, too little to care, to turn
To the ever-jubilant weather, to sip
One's cup and never to say a word,
Or to sleep or just to lie there still,
Just to be there, just to be beheld,
That would be bidding farewell, be bidding farewell.
One likes to practice the thing. They practice,
Enough, for heaven. Ever-jubilant,
What is there here but weather, what spirit
Have I except it comes from the sun?
Wallace Stevens
For the time being I can no longer pretend that I have the time or the motivation to continue blogging here on a regular basis. The nearly 300 posts since last year (counting the longer predecessor of "Blue to Blue") have been a fascinating project, well worth doing. But I have said many of the things I had to say, in this format at least, and circumstances have changed; there are real-life matters that need to be taken care of.
I may occasionally return if I have a poem or other bee in my bonnet that I can't resist sharing, but it won't be on any regular basis, and it would be for myself more than for anyone else. After the first of the year things may have settled down enough that I'll want to undertake something new. I will continue to follow some of the folks on the Blogroll from time to time. But as for this site, thanks for reading up to now.
Monday, October 26, 2009
Poetic Diagnosis
I wasn't familiar with this poem, which I happened upon this morning:
Neurotics
No one gives you a thought, as day by day
You drag your feet, clay-thick with misery.
None think how stalemate in you grinds away,
Holding your spinning wheels an inch too high
To bite on earth. The mind, it's said, is free:
But not your minds. They, rusted stiff, admit
Only what will accuse or horrify,
Like slot-machines only bent pennies fit.
So year by year your tense unfinished faces
Sink further from the light. No one pretends
To want to help you now. For interest passes
Always towards the young and more insistent,
And skirts locked rooms where a hired darkness ends
Your long defence against the non-existent.
Philip Larkin
In some ways this seems a harsh, unlovely, and ungenerous piece, but on the other hand it has its accuracies. When it was written, in 1949 according to my volume, "neurosis" of course was a commonplace term owing to the cultural prominence of psychoanalysis. Neurosis remains a widely recognizable term, but one no longer finds it in mainstream psychiatric diagnosis, as it has been split into myriad anxiety, mood, and perhaps personality disorders.
However, the construct of "neuroticism"--a general tendency toward emotional negativity, instability, and susceptibility to stress--still exists as one of the five major components of personality identified in psychological testing (the other four are openness vs. conventionality, conscientiousness vs. expediency, extroversion vs. introversion, and agreeableness vs. its lack). Neuroticism is correlated with increased risk for depression, anxiety, and eating disorders, among other things. To my mind, describing someone as broadly neurotic can be more helpful and convenient than listing the five DSM-IV diagnoses they may happen to meet criteria for.
As for Larkin's poem, it painfully depicts the disfiguring and ostracizing effects that neuroticism can have; no, it is not (quite) leprosy, but it can alienate almost as much. It conveys the sense of stasis and sluggishness ("clay-thick"), of emotional torpor that results not from repose, but from wasteful psychological exertion (the metaphor of wheels spinning but gaining no traction on earth is just right).
Larkin puts his finger on the core problem of neurosis, which is the lack of internal freedom; while philosophers forever debate freedom vs. determinism in the abstract, the neurotic battles fatalism on a daily basis. Cheer up; don't be afraid; eat less; exercise. How can these things seem so impossible? For one thing, the neurotic lives in a different perceptual world from the rest of us, with a mind that will "admit only what will accuse or horrify."
"Tense unfinished faces" is perfect, suggesting the way in which anxiety inhibits and blurs individuation. There is a sense in which neurosis is a disabled identity. I'm not sure that "hired darkness" works as well, but I assume Larkin means here the classic avoidance by which the neurotic seeks to fend off "non-existent" threats, although ironically the threats in question are in reality all-too-existent, merely within the neurotic's "locked rooms," and not without as he imagines.
Friday, October 23, 2009
In the Country of the Blind...
In a culture of visual and information overload, does one close one's eyes and detach, or does one peer and squint all the more intently? It depends, of course, as a couple of recent Web tidbits remind me. At Salon, a letter writer asks Cary Tennis whether he is culpable as a "blog stalker" for anonymously following a family's apparently fairly intimate but not yet exclusive website. The writer's scruples are an example of increasingly unheard-of discretion.
When I first became interested in blogs a couple of years ago I went through a period of browsing countless sites, often chosen at random, just to sample the astonishing diversity out there. It felt a bit as I imagine a bird might feel that is able to coast on updrafts, although this bird would possess supernatural sight and hearing, able to openly eavesdrop on passersby below.
As Cary Tennis noted in his answer, some websites are so open--or in some cases so shameless--that propriety suggests that one should avert one's gaze, as one would from a passionate couple in the park. But this is no ethical obligation, merely politeness. One can make a case that if people desire total privacy, they should stay in. In the open, gestures are observed and conversations are overheard, often with delectable curiosity. Similarly, anyone writing an open blog is asking for readers, some of them potentially randomly and scandalously inquisitive.
A quick dash to the dictionary reminds me that "voyeur" has an unsavory connotation, which is a bit surprising. Obviously general "curiosity" is widely commended (except among cats), but we really don't have a positive term for curiosity applying specifically to the interpersonal realm. After all, no one wants to be "nosy" or "prying" either. And yet much of what therapists do is a kind of well-intentioned voyeurism, sheer attention to the Other in ways that would seem inappropriate in many settings.
Some of the charm of personal blogs is owing to the mere sense of plenitude and excess. In an age of inane reality television, 24/7 news, and spam, who needs blogs anyway? It seems to me that blogs are one thing that makes the Internet a massive metropolis, always a click away. Blogs afford an often unintentional and unaffectedly sincere look at passersby in the digital city; they are most informative when they don't know (whom) they are informing.
If the blog browser is the flaneur "taking it all in" on a busy city street, then there is the contrary tendency to filter out extraneous crap and control access to the answer, or at least an answer one cares about. In an Atlantic Monthly piece Jamais Cascio discusses the growing power and specificity of "augmented reality."
In a philosophical sense of course we never take in reality in an unprejudiced way; there is no escaping preconceptions altogether. And in some cases one doesn't want to--if a (formerly) Red State rube like me strolls through Manhattan, it is good to have access to a guide book of some kind that tells me what I'm looking at. But there is the danger of touring the guidebook and not the city.
"Augmented reality" is the increasing capacity of phones and other devices to superimpose upon one's real-world surroundings a kind of filter that both comments on those surroundings and screens out aspects deemed undesirable beforehand. It favors control over serendipity. And again, one can't see and hear everything; it is necessary to choose. However, it behooves one to choose the method-of-choosing (the reality filter(s)) most wisely. Maybe that's what wisdom is.
If information today is an open fire hydrant, does one retreat inside a poncho, or play in the shower, or slap a hose on, perhaps to spray someone? It depends on one's goals at the time and on how comfortable the surroundings are.
In environments of information scarcity, the challenges are education and opportunity, but in environments of information excess, the challenges are prioritization and motivation.
Tuesday, October 20, 2009
In the Beginning...
There may be no better instance of "high" art meeting "low" than R. Crumb's new version of the Book of Genesis (complete and unabridged, primarily using Robert Alter's translation)--an NPR review is here. Crumb, best known for his notoriously carnal and countercultural work in underground comics, explained that the warning on the book's cover--"Adult Supervision Recommended for Minors"--is needed because of the story, not his artwork. And indeed, his version is straight up, honestly depicting the extensive sex and violence of the original without gratuitous detail.
In his generally favorable review from The New Republic, Alter himself explores and questions the extent to which Crumb's illustrations add to the power of the original. However, is this really the primary criterion by which graphic work--whether drawn, staged, or filmed--should be judged? After all, as Alter implies, any one graphic interpretation, inasmuch as it favors one visual version, steers the reader away from imagined alternatives. Indeed, such is the power of primary text that any "strong interpretation" could potentially detract as much as add.
While graphic works can't be considered altogether in isolation from their source texts, I think they also can stand or fall based on their own visual impact. For instance, I love William Blake's illustrations, but I think they retain much of their power even if the words (prodigious in themselves) are blocked out. And some primary texts seem to me to be so profound and uncontainable in themselves that visual interpretations seem almost a distracting disservice. Shakespeare is like this--while I've enjoyed a fair number of staged and filmed versions, I would much prefer to reread the original.
Another way of saying it is that graphic interpretations, while perhaps professing to complement or even enhance the textual original, cannot avoid competing with it and threatening to limit it as well. Perhaps this is why Islam forbids illustrations of the divine. But interpretation, like criticism itself, can be appreciated as a parallel pursuit, just as I might enjoy Mozart's Mass in C Minor without counting myself a believer. As Santayana, I think, put it, "There is no God, and Mary is his mother."
When I first read Genesis as a teenager I was most struck, for whatever reason, by the story of Lot's family's flight from Sodom and Gomorrah and his wife turning into a pillar of salt when she turned to look behind her. Perhaps there was some adolescent avidity to know what exactly was going on in Sodom and Gomorrah, but as someone with a weakness for nostalgia to begin with, I also took it as a minatory tale. Descending from high to low culture, I would then proceed to play Boston's "Don't Look Back" at high volume.
So the Book of Genesis, a work of towering influence, doesn't need R. Crumb to augment its stature, but the graphic work is compelling in its own right, establishing its own imaginative region even if, in narrative terms, it does little more than flatter the original. But one definition of a classic--I forget where I read it--is a work that continually generates a buzzing cloud of interpretation.
Monday, October 19, 2009
Goddess
An Everywhere of Silver
With Ropes of Sand
To keep it from effacing
The Track called Land.
Dickinson
The sea is a creature of opposites. And it has always seemed a creature, the female deity corresponding to the male sun. Tarkovsky's sublime Russian film of Stanislaw Lem's Solaris, in its depiction of a questionably sentient quasi-liquid world, brings this home.
Timeless and transient, absolutely monotonous yet infinitely various, caressing yet lethal. Above all, perhaps, solemn yet silly, a vast and inscrutable panoply contrasting with trivial sand and whimsically salty air.
The sea at night seems an affront, though, black-upon-black or death-within-death. Perhaps that is why sunrise and sunset are so much more striking at sea: rituals of redemption.
I have never allowed myself to live within two hours of the ocean. Some day perhaps.
Thursday, October 15, 2009
The Season
"Autumn Day"
Lord: it is time. The huge summer has gone by.
Now overlap the sundials with your shadows,
and on the meadows let the wind go free.
Command the fruits to swell on tree and vine;
grant them a few more warm transparent days,
urge them on to fulfillment then, and press
the final sweetness into the heavy wine.
Whoever has no house now, will never have one.
Whoever is alone will stay alone,
will sit, read, write long letters through the evening,
and wander on the boulevards, up and down,
restlessly, while the dry leaves are blowing.
Rainer Maria Rilke (trans. Stephen Mitchell)
Tuesday, October 13, 2009
The Higgs Boson, Fate, and the Future
A Times article, pondering the vicissitudes of the Large Hadron Collider in Switzerland, suggests that the great machine may be "sabotaged by its own future." That is, the energies generated may be so prodigious, and the elusive Higgs boson may be "so abhorrent to nature," that its creation ripples back in time to nullify itself. This could be viewed as a protective reflex on the part of the universe, or, for the theologically inclined, God's version of "What part of no do you not understand?"
Maybe the whole article is tongue-in-cheek (no less interesting for that), but with particle physics apparently it's hard to tell. Yet I can't help but wonder: how is the time-travelling theory any different from the proposed project simply being impossible or contrary to the laws of nature, even if we don't yet understand how? After all, if I jump off of a tall building, flapping my arms vigorously, I will fall to my death. Is the prospect of my flying unaided across the cityscape "so abhorrent to nature" that my hypothetically successful flight travels backward in time to ensure that I plunge to my doom instead? This suggests that the universe must (retro)actively cut off all kinds of transitions to metaphysically unacceptable outcomes. But if we can never directly view such outcomes, how is this different from the more prosaic notion of the inviolable laws of nature?
Friday, October 9, 2009
Obama, Laureate
Those Scandinavians, as we know from Munch, Hamsun, and Sibelius, are known for their riotous humor. And this time of year, as northern Europe begins its dive into collective Seasonal Affective Disorder, the Nobel Prizes afford an irresistible opportunity for global titters. Nonetheless, President Obama must be muttering, "With friends like these..." What's next, an honorary doctorate from Harvard, or perhaps a lifetime achievement award from ACORN?
I'm a sucker for grand symbolic gestures, but I have to believe that the Peace Prize is less about Obama himself than about the ideal of the United States as inspiring and responsible superpower, which people in most parts of the world appear to want back. Who does the world look to? Europe? With polite curiosity. China? With wariness. Russia? Please. If the 19th century lasted, politically, until 1914, it appears that the American century, while a bit dyspneic lately, hasn't yet breathed its last.
Tuesday, October 6, 2009
Affection and Affectation
"Some cynical Frenchman has said that there are two parties to a love-transaction: the one who loves and the other who condescends to be so treated."
Thackeray, Vanity Fair
Part of me--a brisk, imperious, tolerating-no-fools-gladly side of me--delights in this. This is deflationary fact, nothing in excess of what is the case. As Wallace Stevens icily wrote, "Let the lamp affix its beam."
And yet...this observation, while surely occasional fact, cannot be truth, or at least the whole truth. For as I reminded myself recently (it's not original), fact is immune to our needs and desires, while truth should not be.
And then there are the Avett Brothers, a (mostly) North Carolina trio whose new record, I and Love and You, I've been enjoying. The CD package includes a preposterous but nonetheless affecting "mission statement" that illuminates for me what I like about the band's music and overall ethos: their honesty, sincerity, and pathos, but also their naivete, painful earnestness, and bathos. Depending on the day, I have either the former virtues or the latter vices, either buoyantly bobbing on the waves or floating bombastically into the heavens. I am, at least, capable of self-puncturing (or is this merely a kind of meta-bombast?). The Avett Brothers' previous record, which was superb, was entitled Emotionalism; that says it all.
I know this "mission statement" for what it is, something I myself could have written, most likely a dozen or more years ago, perhaps after a couple of glasses of wine. Only the most desperately loyal readers of this blog will make it to the end of this Avett Brothers statement, which may belong in some twisted "Purple to Purple" site:
The words "I" and "Love" and "You" are the watermark of humanity. Strung together, they convey our deepest sense of humility, of power, of truth. It is our most common sentiment, even as the feeling of it is so infinitely uncommon: each to proclaim these three words with his or her very own heart and mindset of reason (or lack thereof); a proclamation completely and perfectly new each time it is offered. Uttered daily and nightly by millions, the words are said in an unending array of circumstances: whispered to the newborn in a new mother's arms; shared between best friends on the playground; in the form of sympathy--said by a girl to a boy as the respect continues but the relationship does not. It is said too loudly by parents to embarrassed children in the company of their friends, and by grown children--to their fading parents in hospital beds. The words are thought in the company of the photograph and said in the company of the gravestone. It is how we end our phone calls and our letters...the words at the bottom of the page that trump all those above it, a way to gracefully finish a message, however important or trivial, with the most meaningful gift of all: the communication of love. And yet the words themselves have been the victims of triviality, a ready replacement for lesser salutations among near strangers, burst forth casually as "love ya." Truly? To what degree? Why, how much, and for how long? These are questions befitting the stature of love, though not the everyday banter of vague acquaintance. The words have also been twisted by the dark nature of deceit: to say "I love you" with a dramatic measure of synthetic emotion; a snare set by those who prey upon fellow humanity, driven to whatever selfish end, to gain access to another's body, or their money, or their opportunity. In this realm, the proclamation is disgraced by one seeking to gain rather than to give. In any case, and by whatever inspiration, these words are woven deeply into the fibers of our existence. Our longing to hear them from the right place is maddeningly and simultaneously our finest strength and our most gentle weakness. The album "I and Love and You" is unashamedly defined by such a dynamic of duality. As living people, we are bound by this unavoidable parallel. We are powerful yet weak, capable yet temporary. Inevitably, an attempt to place honesty within an artistic avenue will follow suit. This is a piece which shows us as we are: products of love surrounded by struggle. The music herein is, in many ways, readable as both a milestone and an arrival. A chapter in the story of young men, it bridges the space between the uncertainty of youth and the reality of its release. The record is full with the quality of question and response. As far as questions go, there are plenty...
Oh my, I couldn't even finish transcribing it. Yes, there are plenty of questions here, like What are you thinking by putting this turgid mess into the liner notes? (Fortunately, the music is far better). Yes, some of what it expresses does capture a faint glimmer of truth, but only in the same way that, say, Bach's B Minor Mass played on solo tuba captures a faint glimmer of beauty. This is the kind of thing that might drive me back to the Beastie Boys and Licensed to Ill.
What the Avetts seem to be getting at is that "love," like "war" or "God," is a word wholly inadequate to the phenomena it purports to describe; words fail, and they are susceptible to debasement. When someone quipped that "all is fair in love and war," why did he leave out "God"? The three words arguably express the most violent emotions we can feel toward our own species: toward, against, and...up, up, and away. In the cases of love and war, at least, we can be sure that there is a there there, that there is a someone we're engaged with.
This is why poetry exists, to save words like these from exsanguination.
Sunday, October 4, 2009
The Concept of Anxiety
There is no Hope without Fear, and no Fear without Hope.
Spinoza
Today's Times, drawing on now-famous research by Jerome Kagan, features a long review article on anxiety and its relation to temperament. This is a great example of science confirming ancient intuition, in this case the fact that persons really are wired differently from conception. As the article states, any emotion has three components: physiology, subjectivity, and behavior--the question for life is how far the latter two can be unyoked from the first.
I don't have a great deal to say about anxiety except that as compared to the other major mental disorders--mood, psychosis, and substance abuse--it may be the one most likely to hide in plain sight. Anxiety can generate profound misery and grave impairment but does not, in itself, advertise by means of catatonia, craziness, or intoxication. And while all psychological symptoms exist on gray continua, the point at which anxiety becomes pathological may be the hardest of all to pinpoint. Anxiety, after all, is adaptive.
Anxiety and depression often travel together, but arguably they present differing core experiences. If the sine qua non of depression is loss (of a good, of an attachment), that of anxiety is dissatisfaction, the feeling that something is wrong that cannot be set right. In obsessive-compulsive disorder it is the immediate environment that is defective, while post-traumatic stress disorder entails a deep flaw in (social) reality itself--the world itself becomes antagonistic. For the socially anxious, it is the self that is unacceptable. And perhaps generalized and panic anxiety are the most closely aligned to fear, such that the future itself cannot be trusted.
If psychosis is metaphysical, and if the phenomenology of depression is somehow religious--the suspicion that there is not, in the end, enough good in the universe to make up for the bad--then the experience of anxiety is fundamentally moral and/or aesthetic--it is desperately important that something be made right. The other day D. G. Myers at A Commonplace Blog proposed a list of the "most depressing novels of all time" (I agreed with the choices with the notable exception of Miss Lonelyhearts, which is darkly hilarious, and not at all depressing). What are some of the most anxiogenic writings of all time? Hamlet, surely, inasmuch as his uncle's perfidy and his mother's frailty constitute the rotten taint at the root of the prince's world. "Notes from the Underground." "The Waste Land." The Trial. Others?
Thursday, October 1, 2009
Saving Face
A friend's fondness for Facebook--which I summarily sniffed at in a post a week ago--prompts me to reflect again. If the modality were good for nothing else at all, it would still be a superb email directory, and I've enjoyed hearing from all kinds of people I hadn't seen in years (in terms of getting looked up, it doesn't hurt to have a first/last name combination that is unique in the world so far as I've been able to discover--thanks Mom and Dad).
The weakness of Facebook, as I see it, lies in the Friend system and the way it complicates, despite controls, natural impulses to compartmentalize and modulate contact. If a delight of Facebook is hearing from old acquaintances, a downside is...hearing from old acquaintances; there will always be some people one is pleased to hear from, whereas in the case of others...not so much. After all, high school reunions happen only every five to ten years for a reason.
Obviously countless people have noted the way in which the Friend system encourages lame popularity contests, such that one would hardly turn down an offer of "Friendship" if the name were recognizable and not a subject of frank enmity. But in my case, it seems that the frequency and interest of posts by "Friends" is inversely proportional to the actual quality of real-life friendship. With an exception or two, the people I'm close to rarely post (if they have something to say, they'll call or email), whereas a relative by marriage (for now) churns out quasi-delusional anti-Obama posts daily, and people I haven't seen in a quarter century keep me up-to-date on their momentary experiences.
And once someone is a "Friend," it seems a bit over-the-top to Unfriend him unless some specific or alienating incident has occurred. I haven't tried the Blocking feature yet--is this really distinct from Unfriending, and is it really unknown to the "victim"? Only a consummate snob, I'm sure, would amass vast numbers of "Friends" only to block all but those he really wanted to keep in touch with.
In a great Seinfeld episode George becomes distraught when his "worlds collide," that is, the parallel realities of friends and girlfriend commingle, blowing his mind. Facebook is like assembling all your friends and family over a lifetime in one great room, which sounds charming, except that it would be dreadful. Most of these people would have nothing in common apart from having known you, and would have nothing to say to one another (and in many cases, not so much to say to you either).
I think Facebook is great for keeping up with family pictures and major life events (some of which I won't be publicizing there however), and for my purposes, it would be enjoyable if there were a critical mass of actual friends who used Facebook enough to have stimulating discussions about politics or whatever. But beyond that it remains a curiosity.
Wednesday, September 30, 2009
The Spirit of Self-Disclosure
The opening of Hawthorne's "The Custom-House":
It is a little remarkable, that--though disinclined to talk overmuch of myself and my affairs at the fireside, and to my personal friends--an autobiographical impulse should twice in my life have taken possession of me, in addressing the public. The first time was three or four years since, when I favored the reader--inexcusably, and for no earthly reason, that either the indulgent reader or the intrusive author could imagine--with a description of my way of life in the deep quietude of an Old Manse. And now--because, beyond my deserts, I was happy enough to find a listener or two on the former occasion--I again seize the public by the button, and talk of my three years' experience in a Custom-House. The example of the famous "P.P., Clerk of this Parish," was never more faithfully followed. The truth seems to be, however, that when he casts his leaves forth upon the wind, the author addresses, not the many who will fling aside his volume, or never take it up, but the few who will understand him, better than most of his schoolmates and lifemates. Some authors, indeed, do far more than this, and indulge themselves in such confidential depths of revelation as could fittingly be addressed, only and exclusively, to the one heart and mind of perfect sympathy; as if the printed book, thrown at large on the wide world, were certain to find out the divided segment of the writer's own nature, and complete his circle of existence by bringing him into communion with it. It is scarcely decorous, however, to speak all, even where we speak impersonally. But--as thoughts are frozen and utterance benumbed, unless the speaker stand in some true relation with his audience--it may be pardonable to imagine that a friend, a kind and apprehensive, though not the closest friend, is listening to our talk; and then, a native reserve being thawed by this genial consciousness, we may prate of the circumstances that lie around us, and even of ourself, but still keep the inmost Me behind its veil. To this extent and within these limits, an author, methinks, may be autobiographical, without violating either the reader's rights or his own.
Monday, September 28, 2009
The Day
"Behold a pale horse: and his name that sat on him was Death, and Hell followed with him."
Revelation
Something possessed me the past few days to reread Pat Frank's nuclear holocaust novel Alas, Babylon, which was an assigned text in 11th grade English, for me circa 1985. At the time I recall finding it both thrilling and horrifying, while this time I found it somewhat quaint; whether this says more about me or about the change in global politics or narrative trends I'm not sure.
Alas, Babylon is fifty years old this year, as is its SF cousin extraordinaire, Walter Miller, Jr.'s quasi-Catholic A Canticle for Leibowitz. There is something about such apocalyptic stories that inspires Biblical references and religious expostulations. Indeed, around the time that I read Alas, Babylon a quarter century ago at an impressionable age I also found myself in a death struggle of sorts with the Southern Baptist teachings that I had equivocally experienced until that time. The notion that the world can change in an instant--unpredictably, whether for good or for ill--is appalling and intolerable. Some forces are of too great a magnitude to be comprehended on a human scale.
Frank's novel probably appealed in part because it was set in central Florida, in the drolly fictional town of Fort Repose; somehow one does not associate radiation poisoning with palm trees. Much about the book evokes mid-century American culture: a certain smug can-do attitude even amid the horror, a preoccupation with race relations, and the way in which women anchor domestic life. The very context of a small town--beyond the expediency of a setting in which all of the potential characters haven't already been vaporized--seemed tailor-made to demonstrate the virtues of communal spirit. Barbarism occurs, but in measured forms, and off-stage.
The 1980's, as I recall the period at least, presented an ambiguous threat. The notorious height of the nuclear nightmare was a generation gone, and yet the danger was as present as ever. The age of Reagan was vaguely brisk and invigorating, but for the same reason unsettling; the Soviets seemed to be descending into torpor, but that very fact could foster unthinkable risk. Might the Soviet Union lapse into such decrepit and desperate backwardness that it would have virtually nothing to lose by unloosing its venom in a cataclysmic act of resentment against the West? The made-for-TV movie The Day After appeared in 1983.
Many apocalyptic storylines presented holocaust as arising from a long, if improbable, string of geopolitical tensions, reactions, and counter-reactions. For some reason I always envisioned The Day (as it is forever known in Alas, Babylon) as arising, if it ever did, abruptly and absurdly, not from an overcompensation for comprehensible provocations, but from a mechanical error, or from some psychotic fool pushing the wrong button at the wrong time, a la Dr. Strangelove. No doubt this dread arose from ignorance of what I'm sure are countless safeguards against the ultimate malfunction, but I saw the situation as two men prancing indefinitely on a high-wire; the greatest hazard came not from one deliberately pushing the other off, but from the recklessness of the basic setup.
Now we worry about Iran, about dirty bombs, about terrorists obliterating a city or two. This would be horrifically disastrous. And yet the menace of U.S.-Soviet mutually assured destruction is still, theoretically of course, present. I recall dreaming and daydreaming as a teenager about the second sun on the horizon, the mushroom cloud; my soon-to-be teenaged daughter will be spared these reflections, hopefully, whereas for my younger son these things are curiosities on YouTube, and the greatest thing to fear is the collapse of skyscrapers. Yet all those missiles are still out there, waiting to go, as ready now as they or their predecessors ever were.
Compared to Cormac McCarthy's The Road of a couple of years ago, Alas, Babylon is like a holiday fireworks display. The Road features a man and his son who, beyond numb with deprivation, make their way through an American landscape that has been burnt and scoured as clean as the moon, in which the only sources of nourishment are scavenged canned goods and cannibalism. It is ironic that the actual horrors of our time are borne ambiguously in Afghanistan, forgotten by perhaps the majority of Americans, whereas the fictional horrors of our time, now mainstream narrative taste, would have turned stomachs in 1959. Are we wiser, or merely more cynical?
Saturday, September 26, 2009
The Theory of Multiple Intelligences
Speak the speech, I pray you, as I pronounced it to you, trippingly on the tongue; but if you mouth it as many of our players do, I had as lief the town-crier spoke my lines.
Hamlet
Another axiom--That if poetry comes not as naturally as the leaves to a tree, it had better not come at all.
Keats
Arthur Krystal, in a Times article, writes rightly about the difficulties some writers seem to have with the spoken word. Personally I can say that when I open my mouth, it seems as though my available vocabulary drops by half, and my verbal IQ by nearly as much; it is like a microcosm of aphasia, than which I cannot personally imagine a more awful affliction.
When writing, whether on the keyboard or even longhand, it is as if words and ideas come rushing via broadband, whereas while speaking it is as if I am rustling through a sprawling card catalog, a clock ticking loudly in the background. But it is not merely a matter of a certain leisure of writing, for I write quickly. I suppose it is mainly an issue of overlearning a certain mode.
While the literature of medical documentation carries very little general wisdom, one dictum meant to keep the lawsuits at bay--"If it wasn't documented, it didn't happen"--does reflect an attitude one can acquire toward text. The spoken word is usually evanescent, the textual word potentially forever. The spoken word is all about supporting a relationship--whether personal, professional, civic, or legal--whereas the written word can be about instantiating a reality.
Friday, September 25, 2009
Seek and Ye Shall Find
And for what, except for you, do I feel love?
Do I press the extremest book of the wisest man
Close to me, hidden in me day and night?
In the uncertain light of single, certain truth,
Equal in living changingness to the light
In which I meet you, in which we sit at rest,
For a moment in the central of our being,
The vivid transparence that you bring is peace.
Wallace Stevens
On an oppressively overcast morning in North Carolina, I seek and seek again for a subject (besides dissing popular social networking sites); I am answered, in the form of the phenomena of seeking and finding themselves. Dinah in a recent post at Shrink Rap, discussing the buzz over the pending publication of Jung's Red Book, wonders about the quasi-religious devotion of certain Jungians to their master. Specifically, she marvels, acknowledging a certain envy on her part, that some people come to feel that another human being has come up with the answers that matter in life.
There is fact and there is truth. Facts are the objectively verifiable states of affairs of history and science. Truth is a mode of living, whether individual or collective, that responds to deeply felt human needs. Fact is empirical; truth is subjective and spiritual. If needs change, truth may change as well, but facts won't. However, truth bears such conviction that it has a normative dimension; unlike, say, a mere preference for root beer, it cannot by its very nature be merely relative. Truth need not be totalitarian, but it stakes its claim and naturally seeks community. If I enjoy root beer, and it turns out that no one else on earth does, this fact may puzzle but not necessarily dismay me. But if I embrace moral, aesthetic, or spiritual truth that no one else in the world shares, I am made to feel sad and isolated. In contrast, if I am convinced of a fact that no one else in the world seems to see, I am made to feel mad.
Truth is another word for sensibility, whether individual or collective. Stephen Martin, the Jungian alluded to in Dinah's post, seems to have found in Jung's life and thought a corresponding truth and sensibility. Jung speaks to him as perhaps no one else does. This is an entirely different issue from the "evidence" (i.e. facts) that folks perennially try to muster to bolster the case for psychotherapy (often with an eye toward justifying reimbursement). This is not to say that psychotherapy can't have a factual dimension (e.g. exposure and response prevention may factually, on average, reduce symptoms of obsessive-compulsive disorder), but this is far afield from psychotherapy as a mode of exploration and discovery.
The lucky ones, perhaps, are those who are so constituted as to find one Truth, or more realistically, a central Truth that towers over other truths as an effective spiritual colossus. That is what the conventionally religious achieve, as well as the figuratively religious, like Stephen Martin perhaps. Others must content themselves with myriad truths, like facets on a great gem whose center can only be imagined; undiscovered facets, and undiscovered gems, must be stipulated as well.
I have no Jung, no one human being or school of thought that organizes the world and my experience. Rather, I have many Jungs, men and women whose thoughts or creations I happen upon and think, "A century before I was born, this person knew me, anticipated me." This is truth by committee, albeit a transcendent committee of unherdable cats (truth is inescapably feline for me). I read somewhere that Bob Dylan once said that of his records Blonde on Blonde best captured a certain ineffable sound in his head; this was inseparable from his sensibility, from who he unavoidably was. I do not have primary creativity like that, so I must rely on others to evoke deeply shared experience.
So when people perpetually talk about great books or author lists, for me that really translates into a kind of personal pantheon of spiritual interlocutors, people--like me but vastly more creative and expressive--who could not help occupying at least part of the same truth. Among poets: Wallace Stevens, Emily Dickinson, Blake, Shakespeare. Among writers: Proust, Hardy, Dickens, Hawthorne, Melville, Dostoevsky, Chekhov, Tolkien, Loren Eiseley, Kafka. Among thinkers: Emerson, Thoreau, Kierkegaard, Nietzsche. Among painters: Kandinsky, Klee, Blake (again), Hopper, O'Keeffe, Brueghel. Among musicians: Beethoven, Schubert, Dylan, Neil Young. I tend to resist film, perhaps because I find that it colonizes consciousness too aggressively, but if I had to pick I would say Hitchcock, Woody Allen, and Stanley Kubrick.
And then there is the different need, for an actual, present, and living person who shares a sensibility, perhaps both reflected and refracted into a form only obliquely recognizable. Yes, there is that too.
Wednesday, September 23, 2009
Missing
You know, I make an effort to remain relevant, so again and again, despite my better judgment, I return to Twitter and Facebook to try to figure out what the appeal is. These modalities seem to me to be the Web codification of the appropriately named quantity "small talk." Well, if there is a gene for chitchat, it was spliced out of me at the beginning.
But Web small talk isn't even small talk that is directed at anyone in particular, which drains it of whatever "small" charm it may have to begin with. In "real life," small talk at least may facilitate being with another person, however minimally or casually. If email, which at least has a designated recipient, is a message in a bottle, Facebook and Twitter (like the more otiose blogs around) are confetti cast upon the waters, social and semantic froth. Is life really long enough for these things?
As usual I cast my lot with Emily:
The Missing All -- prevented Me
From missing minor Things.
If nothing larger than a World's
Departure from a Hinge --
Or Sun's extinction, be observed --
'Twas not so large that I
Could lift my Forehead from my work
For Curiosity.
Monday, September 21, 2009
The Wild
"And we've got to get ourselves
Back to the garden"
Joni Mitchell
Atavism and kids go hand in hand, so last night we set up the tent and built a fire in the back yard. But after some blankets and pillows, the first object transferred to the tent was...a portable DVD player. ("Daddy, do you have a long extension cord? NO!").
Much has been made in the environmental age of a putative core relation of Homo sapiens to the natural world, of a kind exceeding mere pragmatism. I think there is something to this, but only ambiguously, and not in the way that John Muir may have experienced it. For there are two natural ideals, that of the wilderness and that of the garden. Most people in the history of the world, when given the chance, have preferred their nature cultivated and domesticated.
The very idea of wilderness depended on the great divide that was consciousness, the terrifying realization that so much of nature is not only "not me," but also "not of my kind." So history has consisted, among other things, of a stampede away from unrelieved wilderness. Only in the past century or so has the pressure of "our own kind" become so intolerable to some that wilderness seems like a relief by comparison.
The original Garden was well-named, of course, but it was in fact cultivated, by God if not by us. Indeed, the religious impulse could be said to entail an attempt to convert wilderness to Garden, to transform an inimical landscape to one that is somehow home to humankind. Nice try. And nice try, too, when we try to appreciate wilderness on its own terms these days, for we can't help importing humanity, whether physically or conceptually, as we do so. For the ideal of wilderness is human, all-too-human. If we learned that Earth's biosphere would be annihilated altogether tomorrow, who would shed a tear? No non-humans, that's for sure.
So the history of Earth, however complicated it may be to geologists, arguably contains two great phases: pre-conscious and conscious. If, to paraphrase Wordsworth, we murder to landscape, it's hard to see how we could avoid doing so. Hamlet lamented of the world, "Fie on't, ah fie, 'tis an unweeded garden/ That grows to seed. Things rank and gross in nature/ Possess it merely." To be cast into consciousness is to be painfully aware of the existence of weeds, and of the duty to cast them out.
Are there many things in this world that are more grotesquely "unnatural" than a marshmallow, burned to a carcinogenic but tasty crisp? No, not unnatural at all, merely a "weed" to some. But I rather enjoy the dandelions in spring.
Saturday, September 19, 2009
The Referendum
He has no children. All my pretty ones?
Did you say all? O hell-kite! All?
Macduff
Tim Kreider in the Times has a whimsical but spot-on piece about how we go about appraising the choices we--and no less important, others--have made in life. We seem destined to be comparative creatures, and while one wants one's life not to turn out disastrously, one may want even more that it turn out at least slightly less disastrously than the guy next door. Kreider wryly terms this mid-life look to the side "The Referendum."
I'll go out of my way here not to luxuriate in existential exhalations. His article, though, raised the question for me of what, really, is the single most salient, consequential, and genuinely yes-or-no decision that one makes. I would suggest that in terms of classic fork-in-the-road life choices, the matter of children trumps them all.
There is, of course, much in life that we have surprisingly little control over. My temperament and personality have been what they are as far back as my memories go; I can no more change them than I can turn around and look at the back of my head. Have I "chosen" my religious inclinations or the lack thereof (in any conventional sense anyway)? No, not really. I have in fact chosen to expose myself to a wide spectrum of religious ideas and experiences to make sure that I haven't missed anything, but the fact that these either have--or more commonly have not--impressed me is not really under my control. My sensibility is what it is.
Can I choose who I'm attracted to or fall in love with? It appears not. One likes what one likes, full stop. And the fact that one is attracted to the same qualities or types over and over again only drives this point home. The fact that these attractions can be spectacularly problematic, especially for long-term coexistence, does not however suggest that they are truly or existentially chosen. But I'll grant you that whether and whom to marry are significant choices, because one hopes at least to bring the unchosen nature of attraction somewhat in line with prudential considerations of what is likely to produce happiness over time.
But bachelorhood can be reconsidered, and marriages overturned. Similarly with where to live and how to work. These are weighty matters, of course, but none of them irrevocable. Move. Go back to school. Reboot. A friend of mine of a conservative bent not long ago lamented that American culture maintains the fiction that one can be forever pluripotent, like a walking, talking, middle-aged embryonic stem cell. We know it's not that easy. One differentiates; doors close. But freedom is real, isn't it, and not just self-indulgence?
The problem with freedom, conceptually, is that while we often consider specific choices or dilemmas in isolation, they really are subtly influenced by countless other decisions already made, whether by us or for us. The decision of whether to marry is influenced by who one happens to be attracted to, by how one went about meeting people, by where one chose to live, etc. No decision enjoys a vacuum.
But to get back to the issue of procreation, I would say that child-bearing is the single greatest decision made in a life, for two reasons. It is a true either-or decision, even if one can quibble over degrees such as numbers of children, biological vs. adopted children, etc. (and of course there is the tragedy of infertility). But even more than that, becoming a parent is arguably the most irrevocable decision one makes.
I've long puzzled over why people have children. Many reasons throughout history present themselves: raising labor for the farm, populating the fatherland, generating prestige through marriage alliances. In evolutionary terms, the drive to have children is indispensable, to put it mildly. And sociologically it often just amounts to keeping up with the Joneses (although this suggests that if the Joneses ever stopped having kids--as they have in some European countries--procreation could theoretically grind to a halt). Or maybe people, pathetically and misguidedly, have kids for the same reason they may have pets, because they crave love from a dependent.
But why would a self-aware, thoughtful person in 2009, putting aside all of these automatisms, choose to have children, who after all are a massive drain of time, money, and emotional energy throughout the prime years of one's life? It seems to me that the wisest and best reason is to engage in a relationship with another human being that is like no other, really, in its intimacy and permanency. I mean this in no icky or enmeshed sense, of course, but in no other kind of relationship does one gain a closeness with a human being as one sees him/her come into the world and develop into a person.
All other relationships are reliably finite. Parents grow old and die. Friends move away or choose other paths, as do siblings. Spouses may grow estranged or, even worse, strange and unrecognizable. But children are forever. If they die, or even if they move across the world and don't call, they are not gone, but burn painfully in memory. Yes, there are people, fathers more often than mothers, who can and do forget their children, but I cannot ultimately understand or relate to them--they are beyond the pale for me.
I have nowhere implied, I hope, that having children is somehow desirable or praiseworthy in any general sense. Needs vary. And I realize that whether or not to parent, like all choices, is also never ex vacuo, but is influenced by countless social, familial, and psychological factors. But to my mind it is the decision that ends up dominating any life, and a decision in which, ironically, one freely (or as freely as it is given us to undertake anything in this world) chooses to anchor oneself--or less charitably, bind oneself--to a system of irrevocable relationships projecting into the future.
Friday, September 18, 2009
Still Kicking
The blog may be dying a slow death, but I'm not (well, no more so than we all are, really). A great deal has been going on here of the family and relationship variety, and in so overwhelming an emotional fashion as to make blogging matters seem quaint by comparison. But life requires quaintness as well as intensity, the abstract as much as the visceral, so whether I will regain the appetite for blogging that I had last year--whether I still feel the need for it--I don't know, but the final word has not been written.
Sunday, September 13, 2009
The Hidden God
Less and Less Human, O Savage Spirit
If there must be a god in the house, must be,
Saying things in the rooms and on the stair,
Let him move as the sunlight moves on the floor,
Or moonlight, silently, as Plato's ghost
Or Aristotle's skeleton. Let him hang out
His stars on the wall. He must dwell quietly.
He must be incapable of speaking, closed,
As those are: as light, for all its motion, is;
As color, even the closest to us, is;
As shapes, though they portend us, are.
It is the human that is the alien,
The human that has no cousin in the moon.
It is the human that demands his speech
From beasts or from the incommunicable mass.
If there must be a god in the house, let him be one
That will not hear us when we speak: a coolness,
A vermilioned nothingness, any stick of the mass
Of which we are too distantly a part.
Wallace Stevens
Saturday, September 12, 2009
KISS and Tell
Among my guilty pleasures are the lower echelons of several modes of popular culture, including comics (particularly vintage copies from the 40's through the 60's) and music. KISS had a lot in common with comics actually, as the quartet came up with the magnificent 70's shtick of costumed secret identities: the rock band as perverse homage to the Justice League of America. I read today that KISS is coming out with a new record, amusingly titled "Sonic Boom," due out on October 6, a date and a year than which nothing could be more fitting in my case. It's almost enough to make me go out and buy a turntable in case it's released on vinyl. Supposedly the record is being advertised as the band's best effort since their 70's heyday; if that's damning with faint praise, so be it.
I wish I could recall when I first stumbled upon KISS, but it wasn't long before I discovered theirs was a subversive aesthetic deliciously at odds with the adult world around me. I came upon them when they were still in their prime, riding high on records like Destroyer and Alive I and II. They were my first concert, which I suppose would have been in 1979 as it was the Dynasty tour. It was probably one of my father's greatest indignities as a parent that he endured that show; his tastes running as they do to Lawrence Welk, I suppose he inserted his earplugs and stared, bewildered, into the smoky darkness. I would have been ten.
Comic books, which are for kids what opera is for some adults, are over-the-top in their aesthetic and mobilize simple but powerful themes. KISS ingeniously, in a move uniting Joseph Campbell with heavy metal, utilized several potent archetypes appealing even to kids and teenagers: the demonic (Gene Simmons), the spiritual or otherworldly (Ace Frehley), the animal (Peter Criss), and of course the sexual (Paul Stanley). Their shows, employing fire-breathing, blood-spitting, and various other antics, generally outshone their studio records in both musical intensity and overall effect. It was a formula that probably shouldn't have worked, but for a few years it did. It was like a popular parody of Wagner's Gesamtkunstwerk.
The actual music of KISS is a farrago of slick guitar chords and saccharine pseudo-strings. And if 80% of popular music is about one wild thing, in KISS's case it may have been more like 90%, and with less subtlety. For a pre-teen this was just fine; I was receiving disguised messages from an adult world barely guessed at. It all segued well with a fascination for Conan the Cimmerian (in the form of the decadent Robert E. Howard books, not the Schwarzenegger movies--I still have to remind myself that this actor is governor of California). Sure, stuff like "Calling Dr. Love," "God of Thunder," and "Rock and Roll All Nite" was schlock, but it was schlock a ten-year-old could well appreciate. I just loved the iconography; although like boxed wine it may have been tawdry, it pushed physiological buttons and added meaning to the world. The over-the-top excess (check out the shoes!), so foreign to my usual identity, appealed. Even the band's old label--Casablanca--evoked vistas of virtually unattainable mystery. By my early teens I had moved on; but do we ever fully "move on"?
I suppose I loved Simmons the best. While my full defection from conventional religion wouldn't come for a few more years (courtesy of Dostoyevsky and Nietzsche), Simmons delivered a message every young boy needs to hear: you have impulses that aren't very nice, but that's natural and okay, so long as you express them this way, in a song and not in the world. Did KISS do any harm, I wonder? There have been infamous casualties at Who and Rolling Stones concerts, but KISS?
The real problem with pop phenomena like KISS is that they outlive their glory by about thirty years, recycling material into their dotage. Around 1980 they should have disbanded and, like Prospero, broken their staves and, "deeper than did ever plummet sound," plunged their books, rather than living on as a caricature of 70's culture. If they produce anything remotely worthwhile on October 6 they will command a new archetype: Lazarus.
Wednesday, September 2, 2009
The Beauty of Things
Not mine (alas):
To feel and speak the astonishing beauty of things--earth, stone and water,
Beast, man and woman, sun, moon and stars--
The blood-shot beauty of human nature, its thoughts, frenzies and passions,
And unhuman nature its towering reality--
For man's half dream; man, you might say, is nature dreaming, but rock
And water and sky are constant--to feel
Greatly, and understand greatly, and express greatly, the natural
Beauty, is the sole business of poetry.
The rest's diversion: those holy or noble sentiments, the intricate ideas,
The love, lust, longing: reasons, but not the reason.
Robinson Jeffers
Tuesday, September 1, 2009
Obama Drama
Inexplicably I have lived into middle age without having written a clerihew. My life is now complete; this changes everything. If this is not my Everest, it is my answer to one of my son's favorite questions: "What is the smallest mountain in the world?" (The ant hill in the front yard--how should I know?).
Barack Obama our President
Has learned from precedent:
Nothing worse than being defeatist
Unless it's being labeled elitist.
Sunday, August 30, 2009
Orion Rising
Like needles infinitely thin
And infinitely cold,
Relentless stars resist the dawn
And pierce the restive mind.
Too beautiful for us, those suns
Recede. Before they fade
They cast the Hunter's lying form:
I cannot not see Him.
But could we travel there, his limbs,
His helm, his mighty sword
Would shiver mockingly apart
As Betelgeuse conspired.
In all the universe Orion exists
Just here, beholden to blinding human need.
Saturday, August 29, 2009
Two More
I've never been much of a writer of poetry, and until recently I hadn't set my hand to a poem in some fifteen years. I'm not sure why I've felt inclined lately (although I can guess).
I've also never been a big fan of formal rhyme, which in unskilled hands can lapse into the sing-songy, stilted, and sentimental. Thus this attempt in sonnet form:
Daughter's Day
Girl rides her horse with childlike gravity;
Defiance thereof, perhaps, makes her rejoice
To cross this paradise of grass and trees
In summer's silence but for the crickets' noise.
It never will be simple like this again:
The girl and animal, a space to share.
She brushes Princess Star and looks within
The foreign eyes, interpreting their stare.
The horse's muteness sobers and puzzles her;
She seeks to draw her into consciousness.
About the lives of others we can infer,
Propose, conjecture, extrapolate at best.
She parses difference, intuits "same,"
Explores, enlarges, and learns the limits of "tame."
And this one indulges my weakness for the subjective sublime, and for a very conventional pun:
Carolina Beach
Boy hurls himself against the surf, as sun
And sea contend on the ribboned anvil of rock,
Or tiny rocks, the sand, the earth ground down
Into flowing stone. The boy is one
Of mine, a drop of sea-stuff bound in bone
And sinew, fired by solar elements,
Warmed to the point of restless self-regard.
We came from there, I think, correct myself--
Not we, but bits of matter did congregate
In acts of whimsy, once, until the game
Developed needs, demands, to be gratified
Or denied. The water waits with awful patience,
Inscrutably, for what I do not know.
The sea is self-estranged; its progeny
See its depths as alien, visceral.
The ocean swallows the sun, in dreams at least.
But I think otherwise: the sea will boil
Five billion years from now, when all are gone.
But now I watch my son, scorched by our star,
But cooled by salty spray from the abyss,
Stand before a towering wave, which swats
Him flat onto the river of scraping stone.
He rises, laughing: son is victor, now.
Friday, August 28, 2009
Bumps in the Road of Science
From Edward Mendelson, The Things That Matter: What Seven Classic Novels Have to Say about the Stages of Life (2006):
In the early nineteenth century the most up-to-date and modern psychological science was phrenology, the pseudoscience that identifies your emotional and moral character by mapping the bumps on your skull, in much the same way that more recent pseudoscience traces your voluntary actions back to the unchosen, involuntary workings of selfish or altruistic genes. A university chair in phrenology was established at Glasgow in 1845, and Charlotte Brontë, like most of her contemporaries, took it for granted that phrenology was valid science; Jane Eyre is conscious of her "organ of veneration" when Helen Burns recites Virgil, and observes on Rochester's forehead "an abrupt deficiency where the suave sign of benevolence should have risen," and Rochester observes in Jane "a good deal of the organ of Adhesiveness." But for Charlotte Brontë, phrenology was merely a familiar feature of her intellectual landscape. Marian Evans [George Eliot] took it far more seriously as a new instrument of knowledge that called for her active participation in it. At the time she was translating Strauss's Life of Jesus, she arranged to have a full phrenological analysis made of herself, based on a plaster cast of her cranium, and the preparations for the analysis included shaving her head.
Thursday, August 27, 2009
A Playground, Once
This is mine:
They climb the ropes with small, determined fists,
Delighting in the planetary pull.
Unknown children play in parallel;
The lives of others mean everything and nothing.
The sun, benignant, violates the dark,
Burning color into bewildered sight.
A train's whistle sounds, its anchored tracks
Forlornly straight, its body massively wrought
As it pushes past our gratuitous idyll.
The beasts are absent, but for a wheeling bird
Or vigilant squirrel; the animals are gone.
We have made this land our own, have scoured
It clean but for this empty green expanse;
Kids play in the vacuum of myriad other springs.
The creekside walnuts bear witness, their boughs aloft.
Recall it just like this, no matter what
Happens, this is the way it was this day.
They grow now in the harsh glare of change,
But storied shadows, specters of memory,
Sit silently far off, and watch and wait.
Tuesday, August 25, 2009
Total War (?)
"War is the continuation of politics by other means."
Clausewitz
Bob Herbert in today's New York Times both laments the dearth of public interest in and support for the wars in Afghanistan and Iraq and suggests that this very lack puts in question those wars' reason for being. Wars? War is one of those words (like "love" perhaps) that are misleadingly applied to a vast spectrum of human activities.
Horrific tragedies for military families continue, but I would submit that Iraq and Afghanistan elicit little more than a yawn from most of the public these days because these conflicts have become too remote and too abstract for most to fully appreciate (in this respect they may be similar to global warming and health care). I think I am no Pandora in reminding us that 9/11 was eight years ago; that's twice the duration of U. S. involvement in World War II. No further attacks have occurred on American soil. Unlike Germany or Japan seventy years ago, Al Qaeda simply does not pose a sufficiently concrete threat to American survival, whether directly or by distortion of the global order, to provoke an unequivocal response.
To be sure, the risk of further attacks has by no means been removed, but the "war on terror" is no more a true war than the "war on drugs" or the "war on poverty" were true wars; it is a failure of metaphor. The adversary is no state, but rather an enormously complex cultural system, and perhaps "police action," notoriously applied to the Korean War if memory serves, most accurately applies to Iraq and Afghanistan.
Herbert marvels that only one percent of the U. S. population is directly involved in the military effort to protect the country, at this point against Al Qaeda. But this ceases to be surprising if one views the work in Iraq and Afghanistan as analogous to police work. After all, the work of the police is really never done; there never is any end point at which the crime rate is reduced to zero. The idea is to reduce the risk to the public to an acceptable level. That is really all the military can hope to do at this point in Afghanistan. The police protect all of us, but only a tiny fraction of the public is actively involved in policing. Is this fair? Apparently so, inasmuch as police work is voluntary and rewarded with respect and honor, if also attended by substantial risk.
I wish "war" would be used only for serious conflict, that threatening the actual integrity of nation states. Some other term, "police action" if nothing else will serve, should be used for more measured responses. If we really thought that Al Qaeda posed an irrefutable risk to our national survival (by means of weapons of mass destruction presumably), would we post a few tens of thousands of soldiers in the wasteland of the Afghanistan/Pakistan border? No, we would institute a draft and flood the region with, I don't know, half a million or more soldiers, reduce the rocks there to smaller rocks, and flame out cave by hidden cave, as we did in Pacific islands on the road to Japan. The national will is not there because the perceived threat is not there.
To be sure, this could change tomorrow with an audacious new attack. But the risk of prevention in military matters, like prevention in, say, psychiatry or policing (Minority Report anyone?), is that one can make things worse in trying to make them better. I am not recommending pulling out of Afghanistan, as if I had expertise to do so, but I wish we could stop calling it a war, as if clear victory were possible. Deaths are parallel and appalling tragedies wherever they occur, but at this point the death of a soldier in Afghanistan has more in common with the death of a state trooper in the line of duty than it does with a death on the beach on D-Day.
Sunday, August 23, 2009
Novalis
It's odd that I've never gotten around to sharing some of the real Novalis (1772-1801). Talented aphorists are a rare breed, from the transcendental (Emerson) to the empirical (Samuel Johnson) to the mordant (Oscar Wilde). Novalis's fragmentary style suited his tuberculosis-shortened life. As I was looking over a recently rediscovered volume, I was struck by some of these (all from Novalis: Pollen and Fragments, translated by Arthur Versluis):
"We seek above all the Absolute, and always find only things."
"The insignificant, mundane, raw, loathsome and ill-bred becomes through witticism alone fit for companionship. It is as if these were intended only as jokes: their destined aim is to be a joke."
"Humanity is a humorous role."
"The most ingenious insight is discerning the proper employment of insight."
"Each individual is the midpoint of an emanation-system."
"Where children are, there is a golden age."
"All enchantment is an artistic madness. All passion is an enchantment. An alluring maiden is an actual sorceress, inasmuch as one believes in her."
"A character is a completed, refined Intention."
"Bias and attachmentn are for the imagination what fog, blinding light, and colored spectacles are for the eyes."
"The higher something is, the less it overturns--rather, the more it strengthens and corrects."
"Play is experimenting with chance."
"All that is visible rests upon the invisible--the audible upon the inaudible--the felt upon the unfelt. Perhaps thinking rests upon unthinking."
"Every word is a word of incantation. Whatever spirit is called, such a one appears."
"Paradise is strewn over the earth--and therein become unknown--its scattered lineaments are bound to coalesce--its skeleton is bound to become enfleshed. Regeneration of paradise."
"Completed speculation leads back to nature."
"One could call every illness an illness of the soul."
"Poetry must never be substantive, but rather always only wonderful."
"Earnestness must glimmer cheerfully; jokes must glower soberly."
"Whoever has no sense of religion, must nevertheless have something in its place, which is for him what religion is for another--and therein originates much contention."
Saturday, August 22, 2009
Enough Already
"The greatest part of a writer's time is spent in reading, in order to write: a man will turn over half a library to make one book."
Samuel Johnson
"As good almost kill a man as kill a good book: who kills a man kills a reasonable creature, God's image; but he who destroys a good book, kills reason itself, kills the image of God, as it were in the eye."
Milton, Areopagitica
The last couple of years have involved a good deal of packing and moving, into and out of homes and offices. Books do not travel particularly well. They are weighty, they are easily nicked on the corners, and their myriad shapes and sizes seem designed to fatally vex their efficient packing into boxes. The other day, gazing upon such boxes stacked in closets, their contents accessible only in theory, I was seized by an impulse to purge the library. Last year I had conducted a purge, but this would be a larger one.
How many books does one need, particularly in the age of Kindle and the Internet, where many of the classics in particular are perennially available if one doesn't mind reading from a screen (granted, a big if)? And if one does suddenly crave a particular book, it is only a click and a day or two away. In my current demesne I have roughly 50 shelves' worth of space divvied up among a number of bookcases of various sizes, and their contents leave perhaps half again as many books in boxes. Grossly estimating an average of 20 books per shelf, I gauge my library at around 1500 volumes. That is fewer than I might have thought, but due to upgrades made over the years most of these are hardbacks or substantial paperbacks.
I am no Thomas Jefferson, obviously, but he has always been a figure of fascination for me as for many others, and not least because of his famous library. His ideal was the erudite and cultured pastoral gentleman, and for most of human history, of course, if you wanted to live far from the city, you had to bring your culture with you. That he certainly did (thanks in part, I suppose, to slave labor). I read that he had between 9000 and 10000 volumes, a staggering number now, and a stupendous one then. But when the young Library of Congress was burned by the British in 1814, he sold over 6000 of his books to restock the institution. Presumably he realized that one man, no matter his genius or the flow of visitors to his doorstep, could not possibly make regular use of 10000 books.
One person can't make regular use of 1000 books either. But there is a substantial minority that I do dip into again and again, if only to revisit a chapter or look up a phrase. There are a number that I love for their sheer physical beauty; the arts books obviously fall in that category, but many others do too. And of course a number arouse various kinds of nostalgia, because they were gifts from special people, or because they bring to mind a certain phase of life or state of mind. Someday a 1000-book collection may be a veritable antique in the house--that day may come sooner rather than later--but no matter how prodigious the Internet becomes, a stocked bookcase will always mean the life of the mind to me.
These books have been amassed at a fairly regular pace over the past 25 years, with spurts here and there as cash flow permitted. For many years I couldn't get enough, and disdained the very notion of the public library. Why would I give a book my time--than which commodity nothing arguably is more precious--if I didn't want to keep it with me? It was bad enough that I couldn't hold onto periodicals. But there comes a point where even words and ideas can become clutter, and I don't have Jefferson's Monticello--or his slaves--to best store and manage this library.
So after several hours of sifting (yes, one's hands can become sore from the sheer handling of books), a dozen boxes--probably some 200-300 volumes--are going, whether to used book stores or wherever they can find a home. They range from genre fantasy from the mid-1980's to philosophy and professional books from just a couple of years ago that left me underwhelmed. A few of them, bought already well-used 20 years ago, will find no home, and of course the used-book stores won't take them all. How are dead books best disposed of? With fall coming on, we could use some extra fuel for the fire pit out back, which the kids love. But no, not that.
Wednesday, August 19, 2009
Is Humor Possible in Psychiatry?
I just have a few moments before heading off to my (perpetually solemn) work, but at the risk of seeming to protest too much, I thought I'd dash off a few thoughts on humor and its hazards in psychiatry as raised by the last post.
The theory of humor is famously unfunny, but it seems to me that amusement can arise from: our common vulnerability to physical circumstance (slapstick), the humbling of the high and mighty and the pretentious (satire), and the sheer delight of ambiguity (puns).
The general humor of medicine, such as it is, owes most to satire inasmuch as doctors are viewed as (and really are) self-important. However, psychiatry is more subject to smirking precisely because of the ambiguity of its practices. Thus the myriad on-the-couch cartoons of The New Yorker are funny precisely because psychoanalysis is an ambiguous endeavor (this humor is also safe inasmuch as the patients there are viewed as well-to-do worried well). However, from the point of view of stigma, one could argue that psychoanalysis is in desperate straits as a profession; can it afford such lampooning?
There are a number of problems that are not funny because they are both serious and unequivocal: schizophrenia, dementia, mental retardation, severe depression, etc. However, when, as yesterday, I see a new patient who has diagnosed himself with adult ADHD, I smile wryly to myself, not because ADHD is not a real and serious condition, but because it has become so faddish and so ambiguous. Senility used to be faintly amusing until it became better appreciated how devastating dementia really is. Similarly, drunkenness is becoming less amusing over time as the gravity of alcoholism is better appreciated.
Arguably bipolar disorder is in a class by itself in this respect inasmuch as, in its severe forms, it is an appalling and potentially fatal disease, but it continues to defy proper understanding, as reflected in the ongoing controversies over its diagnosis and treatment. If I sometimes roll my eyes at bipolar disorder, I am doing so not due to its sufferers, but due to my and our own incomprehension of what is really going on. I will grant that, given the epistemological quagmire, humor may be best avoided, but prudence does not always prevail.
So in my humble opinion the Onion piece was funny on multiple levels. It was a kind of behavioral pun, in which Obama's roller-coaster ride in politics and public opinion was suddenly cast in the absurd new light of a mood disorder. It was absurd, and therefore funny, precisely because we know that Obama doesn't have bipolar disorder (if he suddenly did, it would cease being funny). And given Obama's lofty status ("The One"), there is a pleasure in puncturing the pretension, even for one of his supporters.
So as a politically-interested psychiatrist, I was naturally amused not because the piece somehow made fun of bipolar patients, but because it showed the fallibility of our own diagnostic practices in a political context. I can well understand, of course, that someone with clear-cut bipolar disorder might view the Onion piece rather differently. After all this analysis, it ceases being funny, but that fact in itself is mildly amusing.