Archive for March, 2013

On Not Dying in Vain

March 31, 2013
    How does victory confer retroactive meaning on the deaths of the winning side’s soldiers?  For that matter, what is the “meaning” of a death?  Obviously, I have been nudged into this line of inquiry by the wave of second thoughts about the war in Iraq now that we have been fighting there for ten years.  Which in turn recalls to my mind the ten-year war in Vietnam.

    And (given the Lincoln film and the wave of other Lincoln material on my not-so-secret vice, the History Channel) the Civil War and so on.  “That these dead shall not have died in vain…” – aside from the elegant use of the future perfect verb, which I can’t imagine even the best speechwriter daring to use today, this requires some mental gymnastics, looking back at the past war deaths from the viewpoint of a yet-unformed future.  Did Lincoln mean that, at the time he was actually speaking, those deaths were (had been?) in vain, and would continue to have no meaning unless the United States conferred meaning on them in some way? The battle of Gettysburg was pretty close to the end of the war, but that might not have been readily apparent to Lincoln or those who heard him at the time; so was he only saying “we have to stay with this war, finish it, and win it”?  Or was he saying (as Garry Wills tells us) that the people of the United States can confer meaning  on the Civil War deaths only by committing to his vision of freedom and equality as the outcome of the war?  Or (skilled politician that Lincoln was) was he allowing his listeners to take their choice of commitments?

    At the most basic level, most of us die for nothing.  The privilege of dying for a purpose is rare, dearly bought, and usually an integral part of living for a purpose. 

    Some of us die as “martyrs” to our freedom to smoke tobacco or eat fatty foods or drink alcohol or ingest various other drugs or drive fast cars or have unprotected sex.  Most of us just die “for lack of another breath,” as the old epitaph puts it.  Why should we demand more for those who die fighting in a war?  (We rarely, by the way, insist on conferring meaning on the deaths of civilians in wartime.  There is no monument to the Unknown Civilian, though perhaps there should be.  The closest we get is G.B. Shaw’s imagined monument to Jack Falstaff*.)

    The “meaning” and “honor” of death in battle are among our oldest social/religious/political sancta.  The Iliad doesn’t quite have Achilles telling his buddies that Patroclus should not have died in vain (which is just as well, given the difficulty of using the future perfect in Homeric Greek), but it comes pretty close.  And I think it all connects to guilt.  The soldiers who died in battle died serving us (whoever “us” may be—Greece, Troy, Christians, Muslims, France, Germany…), and “we” (who in many instances are still alive because we dodged the draft or in some other unworthy way avoided military service) can’t do much to compensate them except to be appropriately grateful to them for making our lot better in some way than it would have been without their sacrifices.  It is they who have conferred meaning on our lives, rather than the reverse.

    Is there no meaning to be found in a losing battle?  The Alamo?  Masada?  Thermopylae?  Two of these three were actually losing skirmishes in ultimately victorious wars, and acquired their meaning after the victory to which they contributed.  Masada, perhaps, is sui generis.  It is part of a much longer “war” on a much more significant, well-nigh cosmic level.

    And, of course, there is always the Lost Cause.  I grew up in the American South, when Robert E. Lee was an eternal presence in our history and our geography, the 800-pound spiritual gorilla in the room.  It has, finally, lost most of its meaning, mainly by being outweighed by other, newer martyrs.  It lasted as long as it did mainly because the culture of the American South did, after all, survive to tell its story.  History is not always written by the victors, but it is almost always written, or at least preserved, by the survivors.  Sometimes it is the survivors on the other side, who demand the credit due to those who nobly defeat a noble foe (as opposed to the Crips defeating the Bloods, or maybe even the Capulets and the Montagues, or the Hatfields and the McCoys).

    But, History Channel buff that I am, I long for a different view of historical meaning.  I long for a monument to the Unknown Civilian, to those who not only died (everybody dies, after all) but who first endured, who gave us not the cheap flashy courage of the NASCAR driver or the fake courage of a John Wayne, but the deep-rooted long-lasting courage of the Middle Passage, of Leningrad, of Sarajevo, of Robben Island, of the Warsaw Ghetto.  That kind of courage, we can all attain, and it can give meaning to all of our lives.

     

    CynThesis

    * The real Falstaff, Sir John Oldcastle, actually was a martyr, burned at the stake for his commitment to his proto-Protestant Lollard heretical beliefs.  Shakespeare chose to play him for laughs mainly for political reasons.  Although his martyrdom is memorialized in Foxe’s Book of Martyrs, he seems to have no physical monument, unless you count a pub named for him in Farringdon, UK.

Does Stephen Hawking Watch “The Making of…”?

March 17, 2013

We learn from various documentaries on making movies and TV programs that directors almost always do scenes in some order other than what the viewer (and probably the screenwriter) would consider chronological.  All the scenes that take place in a particular location are shot together, for instance.  Or all the scenes featuring a particular actor, stunt person, or other crucial functionary who may have to get someplace else for some other production.  Or all the scenes that take place in a particular season, even in different years.  Then the editor gets to sort them out and put them into “chronological” order.  (Which may also not necessarily be the order originally set out in the book or play on which the screenplay is based.)

Which makes me wonder (when staying up late and watching TV movies alternating with Stephen Hawking) whether the order in which we experience events is necessarily the same as the order (or meta-order?) in which they occur Out There in Reality?

Jane Grey

A Very Hypothetical Question

March 17, 2013

Let us imagine, quite hypothetically, a Roman Catholic Pope, in his 80s, who is told by his physician that he is in the early stages of Alzheimer’s.  The Pope in question is well-read in canon law and church history, and therefore has a pretty good idea of the implications of such a diagnosis for somebody in his position, and for the Church as a whole.  The Church has dealt pretty successfully with evil popes, like the Borgias and a lot of the guys around AD 800 or so (of whom one out of three died violently, or so I heard today).  That is to say, none of them did anything that affected the soundness of church doctrine or practice beyond their own lifetimes.  But there is no recorded instance (that I can find, anyway, by googling “Papacy dementia”) of a Pope who developed dementia while in office.

 

(See “Ask a Catholic”, which tells us:

             Historically, I don’t know of any Papal cases. There probably were Popes that were removed from office by reason of health or political reasons. Nevertheless, it is important to note, the Pope doesn’t stand alone. He is surrounded by all kinds of Cardinals and Secretaries of State and people who are in high responsible positions around the Vatican, and when they notice a Pope is failing for any kind of reason, I’m sure they get together and either would ask for his resignation or vote; after all, they put him in the office. They can’t vote him out, but there can be a recognition that he is not fully himself at this time. It is so important to emphasize that, unlike our Presidency in the United States, if something happens to the President, the Vice-President assumes his office — there is no vice-Pope; there is no assistant Pope. The Holy Father MUST resign of his own free will for it to be a valid resignation. If he doesn’t want to, then all you can do is pray that God takes him soon. The First See (Rome) is judged by no one. We had a heresy called Conciliarism in which an ecumenical council thought they could depose a Pope. No one can depose the Pope even if he’s immoral or lost his marbles.

             Now God forbid, if a Pope did get that way, maybe they might lock him in the closet or something like that, but you cannot remove him from office. I’m sure there would be plenty of people that are loyal to the Church and would take care that no damage would be done.

             On the issue of Infallibility: Even if, God forbid, a Pope was demented, infallibility would still be present. The Holy Spirit would stop him from saying something like “Jesus is really Mickey Mouse.” Infallibility is a negative charism, where the Holy Spirit would prevent him from making such a statement binding in faith and morals on the faithful. Whether or not he, himself, believes he is Mickey Mouse is not a part of infallibility, so he might think he is a different character, but it is not part of infallibility.

             Infallibility only applies to the Pope’s teachings and the universal Church. This is where Catholic faith comes in. There was a Cardinal from Germany who urged John Paul II to resign, but you can never compel him to resign. This Office is so unique. On several related issues: If a priest has dementia, the bishop will take care of him fast. Some bishops have had this too, and in these situations the Holy See will step in.

             Fr. Levis and Fr. John Trigilio from EWTN)

 

This raises at least as many questions as it answers.  For instance, if those who “step in” around a demented Pope make doctrinal pronouncements, will the Holy Spirit guide them and make them infallible? One would think it’s the least She could do. 

 

And, perhaps more to the point, just because the historical record apparently contains no instance of a demented Pope, that does not necessarily mean there has never been one.  Statistically, over 2,000 years and 265 Popes, many of them of fairly advanced age, it’s hard to believe that not even one or two of them suffered from dementia, whether age-related or of some less common etiology.  We can only conclude that those around them did one helluva job of damage control and public relations, and/or that they did not “lock him in a closet,” but more likely hastened his heavenward journey.
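
(Just to put rough, admittedly invented numbers on that hunch: suppose only half of those 265 Popes, say 130 of them, lived into the age range in which dementia strikes something like one man in ten.  The probability that not a single one of them was affected would then be about 0.9^130, which works out to roughly one in a million.  The assumed figures are guesses, not historical data, but the conclusion is not very sensitive to them.)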

 

At any rate, with these issues facing him, our hypothetical Pope might very well choose to save the Church and everybody else a lot of trouble by retiring.  Is this hypothesis really off the wall?  I’m not a Catholic; I just read a lot, and have had a lot of experience with mental health issues in the civil legal system.

 

Jane Grey

Listen to the Mockingbird

March 17, 2013

I have read that “Sununu”, in Arabic, means “little bird.” Hard to resist punning on that. John Sununu, in a Fox interview on July 17, 2012, is quoted as saying about Obama, “He has no idea how the American system functions, and we shouldn’t be surprised about that, because he spent his early years in Hawaii smoking something, spent the next set of years in Indonesia, another set of years in Indonesia, and, frankly, when he came to the U.S., he worked as a community organizer, which is a socialized structure, and then got into politics in Chicago.” (Read more: http://thepage.time.com/2012/07/17/sununu-classic/#ixzz22JMAb6SD) He later went on to say, “I wish this President would learn how to be an American.” Presumably the first step would be staying away from Hawaii and Chicago, and maybe also politics?

So, let’s see, it doesn’t even matter whether Obama can produce a valid Hawaii birth certificate because Hawaii isn’t really American anyway (so why are the birthers wasting our time, and their own?) and neither is Chicago. (What? Okay, Indonesia really isn’t the US, but so what? Romney’s family spent a couple of generations in Mexico to evade prosecution for polygamy, but nobody is sideswiping him with any Frito Bandito labels.) Being a community organizer is a “socialized structure” (socialized? As in “people getting along with each other”? I think the organization he worked for was established under Section 501(c)(3) of the US Internal Revenue Code, which is about as unsocialist as you can get. Can’t imagine what else he could mean by “socialized.” Never thought I would be accusing myself of lack of imagination when examining American politics.)

Sununu did later apologize for “using those words,” though not with any degree of specificity. We don’t really know which words he was apologizing for—the slurs on the Americanness of Hawaii and Chicago, which could conceivably lose Romney a lot of votes in those places, and among people with family from those places? The evocation of bizarre images of the infant Obama smoking “something” in his cradle? The impolitic slur on Indonesia, which does after all have a history of authoritarian anti-communist government and a good deal of oil under its territory, both things “real Americans” tend to like? The mind reels.

And above all, what does it really mean to be an American, and how does one learn it? Watch this space.

Red Emma

Milgram, My Generation, My Students, and Me

March 17, 2013

Due to Mr. Wired’s health problems, the Wired Family spends a lot of time watching educational TV.  National Geographic, Smithsonian, History and History 2, PBS, and of course, the Science Channel lead the pack.  Last week, on the Science Channel, we had a chance to revisit a cultural milestone I hadn’t thought about in a while—the Stanley Milgram experiments on obedience, conducted at Yale University in the early 1960s. I wasn’t personally involved in that study, though it involved students of my age cohort while I was attending another Ivy League school.  But I had occasion to encounter it shortly after I graduated, in essay form, in a textbook I was using to teach freshman English comp at yet a third college. I used the study with my English students for several years. After that, at yet another college, I had occasion to use it again for several years, in a course, required for all psychology majors, on ethics in mental health practice.

 

Then, last week, thanks to the Science Channel, I discovered that more recent researchers have attempted to duplicate the Milgram study to the extent possible within the relatively stringent limits of current behavioral science research ethics (see http://thesituationist.wordpress.com/2007/12/22/the-milgram-experiment-today/).

 

For the benefit of those readers who have not yet encountered the Milgram study in either its original form or its more recent incarnation, it goes like this: the subject (S) is recruited for a purported study on the effect of pain on learning.  S is told to read a series of words to his co-subject (C), who is sitting on the other side of a soundproof glass window, wired up to various electrical doohickeys.  C is supposed to choose the most appropriate word match within the series.  If C gets it right, S goes on to the next list.  If C gets it wrong, S pushes a button on the electrical doohickey on his side of the glass, which purportedly administers an electrical shock to C. Then S goes on to the next series of words.  Each time C gives a wrong answer, the shock S administers is escalated to a higher voltage.  After a while, C begins to jump and writhe in response to the “shocks.” In the original study, the voltage dial on S’s electrical doohickey goes into a red zone, marked “danger,” and then to a level marked “extreme danger.”  If S hesitates to give shocks in the danger zone, the white-coated guy (WCG) who appears to be running the experiment tells him, “You must continue,” or “It is essential that you continue,” and keeps saying it until S either continues or pulls out.

 

The study involved substantial deception, fortunately.  C was actually a confederate in the experiment.  And he wasn’t getting any real electric shocks.  The electric doohickeys were all fakes.  The study had nothing to do with the effect of pain on C’s ability to learn.  Its point was how far the various S’s would go in shocking C. And the finding that shook up most of us in that just-before-the-Eichmann-trial era was that over sixty percent of the subjects in the original study went all the way to the top of the dial without refusing to continue.

 

Until last week, I had only read the report on the original study.  But the Science Channel production actually played tapes of some of the original study.  What startled me about them was that even the “refuseniks,” the subjects who had refused to administer potentially dangerous shocks, were painfully polite in refusing.  These were my agemates, my fellow Ivy League college students, just stepping into the wild and blasphemous Sixties.  I couldn’t imagine the refuseniks at my alma mater saying “I’m sorry, but I won’t continue.”  They would have said, to a person (well, to a man, anyway—we women were still a bit more polite) “This is bullsh*t.  I’m leaving.”  Okay, we saw only a few of the tapes of the original study.  Maybe the more forthright speakers got left on the cutting room floor?

 

The more recent version of the study made several significant changes.  First, the dial on S’s electric doohickey went only to 150 volts, still well within safe limits, rather than all the way into the red zone.  Second, the glass partition between S and C was no longer soundproof. S not only saw C writhing in pain, but heard him screaming and begging to be let out.  Third, the cover on the experiment was blown almost immediately after S either went all the way to 150 volts, or refused to do so.  And finally, the written agreement S signed before the experiment began explicitly stated, and the experimenter repeated orally a couple of times in introducing the experiment to S, that S would be free to stop the experiment at any time, and to leave, without forfeiting any of the money he was being paid for participating.

 

That last change may have been crucial in motivating at least some of the more recent refuseniks, many of whom cited it when stopping the experiment.  Would the original refuseniks have done the same?  Have we become more law- and lawyer-ridden since the Sixties?  It’s hard to tell.  But the current generation of refuseniks, more or less of the same cohort as my own psychology students, were just as polite about refusing.  No barnyard epithets, no colorful suggestions about where the pay for participation could be placed (viz. Hemingway’s deathless telegram advocating that his editor “upshove” an offending book “asswards”).  The numbers came out roughly the same—less than forty percent of the subjects refused to give the most painful shocks.  The white-coated experimenter gave the same admonitions in the same deep, resonating, expressionless voice (“It is essential that you continue”), and nobody appears to have called him a pompous ass.

 

I guess we can conclude from this more-or-less-duplicated study that students (the inevitable subject population of most academic studies) have become no better and no worse than their predecessors fifty years ago, and that even the best of us were and continue to be a bunch of mealy-mouthed wimps.  Do check out the videos on “The Milgram Experiment” if you get the chance.  Or wait for the show to come around on the Science Channel in your neighborhood.

 

CynThesis