Archive for June, 2008

CODE-SWITCHING

June 30, 2008

When I was doing graduate work in sociology, I took a course on “deviance.”  I did a paper for the course, on conscientious objection.  It’s a fascinating subject, about which I could go on for a long time, but won’t.  I chose the subject in the first place because the notion that having a conscience could be “deviant” struck me as marvelously ironic.  (The upshot of the paper was that the CO application process was a triumph of organic over mechanical solidarity [as Durkheim would put it].)  What really mattered about that paper was that it changed the course of my life, because I wrote it a year or so before the 1965 increase in American troop strength in Vietnam.  I chanced to mention it to a friend of mine, and the next thing I knew, lots of people were asking for a copy.  That made me nervous, because by this time it was full of probably outdated information.  So when I saw a posting from AFSC about a free course to become a draft counselor, I signed on right away.


I spent the next ten years working in the area of Selective Service and military law, and eventually went to law school. As an attorney, I’m still doing military and veterans’ benefits law, and I handle Selective Service matters whenever the issue comes up.  That work involves all kinds of legal issues, but it still occasionally raises questions of conscientious objection, and that’s still a fascinating process.


The body of statute and case law that sets out the CO requirements for draft exemption or military discharge (or exemption from the “bearing arms” oath for new US citizens) clearly started out with the “historic peace churches”—Quakers, Mennonites, and Brethren—in mind.  Those requirements have evolved to encompass other Christians, non-Christians, non-church members, agnostics, atheists, and ultimately people with no official religion at all.  But the law remains clear that a mere “personal moral code” or a set of “political beliefs” won’t qualify.


And most young people these days—even the regular church-goers—are theological illiterates.  (Among the splendid exceptions are the Jehovah’s Witnesses, whom I have occasionally represented.)  All they have, most of the time, is a personal moral code, or a set of political beliefs.  What I do is more the job of an English teacher or an editor (both jobs I have held in addition to practicing law) than of an attorney.  I work the client through the “Four Questions” that are the basis of the CO application:


  1. What do you believe, and how does it prevent you from being willing to participate in all wars? (Note: not just some wars—that would be too easy.  CO applicants go through endless grilling about whether they would have fought against Hitler, but nobody has to justify being unwilling to fight for Hitler.)
  2. Where’d you get these weird ideas? (Note: the military presumes that it is normal and natural to be willing to kill a total stranger when ordered to do so by another total stranger. Any deviation from this norm has to be explained.)
  3. What have you done to put your beliefs into action?
  4. Who can vouch for your sincerity?


This process requires a lot more introspection than most young Americans are used to.  Also a lot more writing.  (At the outset, I tell them it’s the equivalent of a long term paper, in expenditure of time and energy.) Once it’s all down on paper, the translation process begins.  Writing one’s congresscritter about the war in Iraq is rarely just a statement about that war; it is usually a statement about war in general in the context of the only war the kid knows about.  (World War II? What was that? I know there was some kind of war in the 1940s, but I forget who was in it or who won.)  Same with going to demonstrations and marches.  Working at a soup kitchen is a statement about the essential value of all human life, even the most miserable.  Running a school recycling center is a statement about the value of the earth and its resources, which war destroys big time.


None of this is fake. I don’t do fakes, nor do my clients, so far as I know.  It’s just a matter of putting the very individual and personal—which won’t get recognized as conscientious objection by the official deciders—into a broader context that the client has only started to think about when confronted with a human-shaped target and told to “kill, kill!!!”


This process is a species of what elementary school teachers call “code-switching”—expressing the same ideas in different ways depending on context, audience, and purpose. When greeting your buddy, you can high-five him and say “yo!”  When you meet the Dalai Lama, on the other hand, you do not break out singing “Hello, Dalai!” 


The process can go both ways, as Obama demonstrated in his Sojourners speech two years ago (never mind that James Dobson has, for some reason, brought it up two years later to question Obama’s theology.)  If you are going to bring your religious beliefs to bear on political issues (other than a CO application, I guess) among people who do not share those beliefs, you need to speak the language of your audience.  This is partly for symbolic purposes—we are conducting an election in a democracy composed of people who hold lots of different beliefs, and whose constitution prohibits establishment of religion as such or of any particular religion.  A candidate or advocate who does not respect that prohibition is telling at least some of that audience, “You don’t belong. You don’t count.” Which is the last thing you want to tell a voter, any voter.


But a lot of those voters may not even catch the in-groupness of standard evangelical Christian language, because they don’t know anybody who doesn’t speak it. (Whoever discovered water, it probably wasn’t a fish.)  I have long since lost count of the very nice, very earnest Christians who ask me, in utter perplexity, “You mean Jews don’t believe in Jesus?”  And telling them that their language is not the lingua franca of common discourse in their own country can amount to telling them “you don’t belong. You don’t count.”  It can carry its own political costs.


Nonetheless, I think American politicians who are running for president rather than, say, Pope or Caliph or Bishop, have to presume that their audience includes non-Christians and non-religious people who still have a right to vote, and to know where their candidates stand.  In short, translating values based in a particular religion into universally comprehensible values is not only effective politics in a pluralistic polity, but a way of honoring the founders of that polity and the universal values they were trying to establish.


Jane Grey


NO UNSOLICITED MANUSCRIPTS ACCEPTED

June 30, 2008


We hear a lot these days about the growing concentration of the media, on one hand, and the wild expansion of the blogosphere on the other.  It calls to mind A.J. Liebling’s famous dictum that “freedom of the press belongs to those who own one.”  These days, almost anybody can own the equivalent of a printing press. Like your humble servant here, for instance.


Which gives us the chance to think about what the free marketplace of ideas is really about, and what kind of metaphor it really is.  If I want to hear interesting ideas from reasonably competent writers on subjects I care about, I can go to the corner and buy a newspaper, or to the bookstore in the next block and buy a magazine or a book.  Or, if I want to make money, I can find a magazine or a newspaper that wants to buy something I write, or even line up a publisher for a book I want to write.  (In case you’re wondering, by the way, writing for a mag or a paper makes a lot more economic sense than trying to get a book published, or even having succeeded at doing so, at least judging by my daughter’s experience and my own in the literary world.)  Both sets of operations are based on the presumption that more people want to read than to write, and that therefore the money should flow from would-be readers to would-be writers rather than the other way around.


Guess what, people?  That presumption is dead solid wrong.  The blogosphere, in both its professional and amateur incarnations, is proof positive of that.  Not to mention the experience of Ted Kaczynski, the Unabomber, who spent nearly 20 years mailing bombs to commercial and non-profit techies (killing 3 recipients and seriously injuring 23 others), but promised to stop in exchange for publication of his “manifesto” in a major paper.  The New York Times and the Washington Post took the deal, mute testimony both to what it takes to get published these days, and to the determination of some unpublished authors to get into print anyway.  If Kaczynski had only waited until the invention of the blog, the Unabomber’s victims might all still be alive and unhurt.


Or maybe not.  In general, the incentives are all running in the wrong direction. 

1/3 of high school graduates never read another book for the rest of their lives.

42 percent of college graduates never read another book after college.

80 percent of U.S. families did not buy or read a book last year.

70 percent of U.S. adults have not been in a bookstore in the last five years.

57 percent of new books are not read to completion.

70 percent of books published do not earn back their advance.

70 percent of the books published do not make a profit.

(Stats from the Erma Bombeck Writers Workshop, University of Dayton: http://www.humorwriters.org/startlingstats.html)


What somebody needs to do is set up a system in which writers pay and their readers get paid.  It should start on a smallish scale, obviously.  Most writers don’t have enough money to pay lots of readers, even in small sums.  But many readers could use the money.  And perhaps many illiterates would be motivated to take up reading.  A couple of school reading programs have tried similar incentives, with a fair degree of success.  Prisoners might find the idea especially attractive. 


Or perhaps we should start with a reader/writer cooperative—or is that what the blogosphere already is?  Anyway, this needs a closer look.


CynThesis


QUOTING SCRIPTURE TO ONE’S OWN PURPOSE

June 25, 2008

Apparently there has been some heated controversy between James Dobson (of Focus on the Right Kind of Family) and Barack Obama about Obama’s speech to Sojourners back in 2006.  The speech is old news. The controversy, oddly, seems to have just arisen.  Dobson says Obama distorted the meaning of scripture when he said, “Which passages of scripture should guide our public policy? Should we go with Leviticus, which suggests slavery is OK and that eating shellfish is abomination? How about Deuteronomy, which suggests stoning your child if he strays from the faith? Or should we just stick to the Sermon on the Mount – a passage so radical that it’s doubtful that our Defense Department would survive its application?”

Oddly enough, I’m not going to dispute Dobson today (I must be having a really weird day–see today’s previous posts), but I do think Obama is distorting scripture, though certainly not in the ways Dobson talks about.  Leviticus doesn’t really “suggest slavery is OK.”  It regulates it.  In fact, as between one Israelite and another, it regulates it out of existence altogether.  What Leviticus talks about that usually gets translated “slavery” is actually time-limited indentured servitude.  The time limit is 7 years, the longest one can keep an Israelite slave.  At the end of that time, BTW, the master is supposed to not only free the servant, but set him up with the biblical equivalent of 40 acres and a mule.  The word “ebed” can legitimately be translated “worker,” “slave,” or “servant,” depending entirely on context.  Clearly what it means in the passages of Leviticus dealing with the Israelite who becomes an “ebed” (usually because of debt) is “servant.” 

[BTW, I just found out the other day that Judah Benjamin, Jefferson Davis’ Jewish Secretary of State, was the only member of the Confederate cabinet who owned no slaves during his term of service.] 

Jane Grey

RACISM MADE HIM STRONG; WE’LL MAKE HIM AN AMPUTEE

June 25, 2008

And, speaking of the impact of military service on minority communities, Professor Cindy Williams, who teaches political science at MIT and keeps careful track of military recruitment trends, says the Army is having trouble maintaining the quality of its recruits, and that it is having particular difficulty getting “high quality” [recruiter-speak for high school graduates] African-American recruits, largely because of parental lack of support for young people in that community joining the military.  Given the large proportion of African-Americans in single-parent families, that should probably be translated as lack of support among Black mothers for turning their kids—especially the ones who manage to graduate from high school—into cannon fodder.  Getting an African-American boy through high school is not an easy task these days, and one can understand why a mother who has accomplished it would be reluctant to send her son off to get blown to bits on some Middle Eastern highway or byway.  Recruiters are saying that, while signing up a young person used to require 14 hours of sweet-talking the young person and 4 hours of sweet-talking his parents, the ratio is now reversed.  The kids may like the idea of military service, but the parents are getting really skeptical.


Red Emma

A KIND WORD FOR McCAIN

June 25, 2008

This is not to be taken as an endorsement, but John McCain says he is opposed to reinstating the military draft for anything short of World War III, because the Vietnam War draft weighed most heavily on lower-income Americans. Good for him!

Rep. Charles Rangel has been calling for a draft at least once a year ever since we dropped it in the early 1970s, always on the basis that it distributes the burdens of military service more equitably, and forces a war-making government to be more accountable to the voters whose children are being drafted. On this one, McCain has it right and Rangel has it wrong.

The one draft exemption nobody would dream of getting rid of, and the one that has accounted for the majority of the people who got out of the draft (roughly half of everybody, ever since World War I) is the medical exemption. Not unreasonably, the Army does not want to function as a rehab hospital, so its own regulations (AR 40-501) exempt people from service whose medical problems would make them more trouble than they are worth as cannon fodder. The catch is that the draft physical has never been more than a cursory glance, involving counting extremities and asking a bunch of abstruse questions, often too rapidly to be understood or properly answered. Anybody whose medical issues are more complicated than such an exam would reveal has to get documentation from a private physician to present to the examiners.

And that means having a private physician. So the increasing proportion of inner-city and rural youth, whose medical documentation consists of having an emergency room doctor take three minutes out of his already crammed schedule to scrawl “Sick—no work” across a prescription pad, are not going to get medical exemptions from the draft. If they are lucky, the Army will discharge them after they get to Basic Training, when their disabilities become apparent. Most of them aren’t that lucky.

In addition, as we know from the experience of other countries with “universal” military service, such as Israel and the former USSR, the children of wealthy and well-connected families will almost always get drafted into the most prestigious and comfortable branches of service, while the offspring of civilian peons will almost inevitably become military peons.

Obama hasn’t taken any position on a reinstatement of the draft, so far as I know. It makes more sense, of course, to promise to end the war, rather than plan to get more troops into it, so he may not feel obliged to make any statement on the subject. But let’s hope he pays attention to the real history of conscription if he does.

Red Emma


THE ONE-MINUTE LAWYER

June 20, 2008

Northwestern University School of Law, here in Chicago, has just announced that it will be offering students a chance to get a law degree in two years instead of the traditional three. Let’s look at the history of this. Back in the Good Old Days, you didn’t have to go to law school at all to become a lawyer. Nor did you have to go to college first. You apprenticed as clerk to a practicing lawyer. That’s how Abe Lincoln started out. There were law schools, but they were for the upper classes. There are still places where you can get admitted to the Bar without a law degree today, though it’s pretty hard to arrange, and not many people do it. A Chicago attorney who passed away only a couple of years ago was the last person I know of personally who did it. He was pretty good at what he did, too.

Then came the American Bar Association, which in its wisdom decided about 100 years ago that the profession was too open to the riffraff. So it encouraged state bar authorities to require a law degree, or at least make that the preferred track into the profession. Back then, getting a law degree took—gasp!—two years. Back then, BTW, many law schools did not require a college degree for entrance.

Somewhere around the same time, law schools started permitting night study for people who had to work while getting their degrees. That, reasonably enough, took three years.

I’m not sure precisely when the third/fourth year of law school was added, though I think some schools were still holding out as late as the 1960s, but the point of it was to enable the student to take more specialized courses. As a practical matter, the last year of law school has, ever since, been regarded with barely-tolerant amusement by most students. The axiom among law students is, “The first year, they scare you to death. The second year, they work you to death. The third year, they bore you to death.” I’m not sure what useful purpose the last year ever served. But I’m really skeptical about the rationale now being used for dropping it.

Well, not exactly dropping it. What they’re actually doing is compressing it. Pretty much what I did with my own legal education, actually—I went to night school, while working full-time. Normally that should have taken four years. I did it in three and a half, by taking classes through the summers. But Northwestern is proposing that day students, who would normally finish in three years, take classes through the summers and “mini-classes” between semesters, thereby finishing in two years. They are also shifting the emphasis in course requirements, to teach skills such as “accounting, teamwork, and project management,” and modeling the degree program more closely on the business school pattern. Presumably that means that some other courses are being dropped, but nobody is saying which ones. All of this is explicitly designed to make the future employers of the next generation of lawyers happy with their more practical, business-oriented approach.

Which makes me really nervous. If all NU wanted was to shorten the time required to get a law degree, I wouldn’t be all that bothered by their dropping or shortening the third/fourth year. But making the students work harder and with less time between terms, and come out looking more like MBAs, scares the flippen daylights out of me.

Law has been for a long time the last refuge of the liberally educated professional, the last place where you can get props for knowing things outside your specialty. Roughly half of my law school class had advanced degrees in something else before starting law school. The point of being a lawyer is to be able to think critically and analytically, to balance competing interests and ethical imperatives, to be able to break down a situation into ponderable parts and weigh them against the applicable law. [And, not incidentally, to learn how to spell habeas corpus.]

The lawyer’s job is to be, not a Minuteman, but a Wait-a-Minuteman, somebody who can tell the client, “Not so fast. You can’t do what you want to do here.” Clients don’t much like that function of the legal professions, and would vastly prefer to hire MBA-like robots who will simply map out the shortest apparently legal distance between here and where the client wants to go, and “make it happen.” We have been witnessing, over the last ten or fifteen years, a large number of accountants and MBAs getting indicted, and often imprisoned, for taking just this approach. A few lawyers have already gone this way as well. Does NU really want to send an entire generation of new lawyers in the same direction?

What’s the rush, anyway? I can understand people who try to finish college in three years—it can actually save money, if planned properly. The NU administration claims not to have decided whether its two-year program will cost the same $42,000 total tuition as the current program, but I would bet it will. It’s hard to imagine that faculty members drafted to teach the new summer and mini-courses won’t demand some kind of extra pay for doing it. The proposed program would require entering students to have spent at least two years working in some other field, which many of them would anyway. But the fact is, we’re all living longer these days, and lawyers never retire. [One of my colleagues, who was still practicing last I heard, just celebrated his 100th birthday. Another, who I know is still practicing, was admitted to the Bar the year I was born.] So starting one’s professional career a year earlier or later is pretty insignificant in the long run. If anything, the legal profession needs to slow down a bit more, and get back into the habit of thinking things through before making a recommendation. And one of the recommendations most in need of thinking through, obviously, is Northwestern’s two-year program.

Jane Grey


Blessed is S/He Who is Not Offended

June 13, 2008

As some of you are aware, I am a regular on another blogsite, which shall remain nameless, amid posters and bloggers from left, right, and around the block. A great many of the comments from self-labelled conservatives on that site start out complaining about:

“condescension from [partisans of] the left, who are … self-defined as more “enlightened,” and

“Massachusetts, where one is likely to be labelled “stupid,” “idiotic,” “conservative,” or “Fascist,” not to mention “fundamentalist,” “evangelical,” a “redneck,” a “cracker,” “podunk” or “white trash” if one strays inadvertently one inch, one centimeter, to the right of wherever the line in the sand has been drawn on that particular day by the tolerant and open-minded people of that great state,”

before stating their own beliefs.

Every now and then, one also sees leftists proclaiming:

“The fact is that many Bush-bots bought the notion that Kerry was a commie pinko rather than a bona fide Vietnam war hero, actually confirmed by the military. Bush didn’t even show up to all his reserve unit duty. Republicans are suckers for ‘Rambo’ type talk and the rest of us pay a price,” or,

“When I discuss politics [in Texas] with Americans who may be on their way to no longer being my friends, I hear things like “stupid,” “idiotic,” “liberal,” or “Communist” in reply to what I consider simply a different set of observations and experiences which do not embody absolute truth.”

What all this comes down to is that we have all become a whole lot more touchy about politics lately. By which I don’t mean that we take politics seriously. I mean we take it personally. No, not in the sense that “the personal is the political.” In the sense of “tell me what you believe and I will tell you who you are and whether I will let my kids play with your kids.” My own vantage point is pretty much leftist, so that’s what I’m mainly talking about for the moment. Whenever I, or somebody whose opinion I share, disagrees with a conservative, the response is almost always sure to include stuff like, “I know you despise people like me, and you think we’re stupid, but…” usually ending with “…so you’re an arrogant elitist snob.”

Generally, I and many of my colleagues on the left are pretty respectful when talking to conservatives, or at least we sure feel as if we’re being respectful, and deserve all kinds of moral points for not saying, “I despise you and I think you’re stupid.” So we are especially annoyed at not even getting credit for not saying what we went out of our way not to say. I suspect that Obama’s comments about “bitter” working-class voters were made in very much that spirit.

And of course, the nastiest thing you can do to a liberal, who wishes to be a brother/sister to all of the wretched of the earth, is to accuse him/her of being an arrogant elitist snob. Compared with that, vague accusations of moral corruption and omnifutuance are small potatoes, if not an outright source of bragging rights.

So we are operating in a universe of discourse in which conservatives feel insecure about their intelligence and liberals feel insecure about their humanitarianism. Each tends to take offense from those respective vantage points at whatever anybody on the other side says about almost anything.

In the process, we tend to ignore all sorts of other dimensions of each other’s discourse. For instance, even though liberals tend to get branded as overly tolerant of immorality, we actually tolerate only specific (mostly sexual) varieties thereof. We yield to no one in our intolerance of financial hanky-panky, violence, and environmental trashiness. And on the other hand, while conservatives may talk a good game of judging other people’s sins, on the personal level they are often a lot nicer than liberals—even to those they disagree with. If I were stranded on the side of the road with a flat tire, I think I would be more likely to be assisted by somebody with a “God, Guts, and Guns” bumper sticker than by the guy driving a hybrid with a Darwin fish on it.

Part of the problem is just that we all read, or digest pre-read byproducts of, sociology. Which tells us that liberals have more education and more money and more refined tastes in food, art, and decoration, and less religiosity, than conservatives. I’m not actually sure of the validity of those data anyway, and I used to be a sociologist. But I do know for a fact that lots of liberals have less money than lots of conservatives—even after discounting Black and Hispanic voters. I also know that lots of liberals are religious, and that some are even evangelical. Maybe this is because I am broke, religious, and liberal, so a lot of my friends are too.

Aside from that, even where the generalizations are based on valid data, they are based on old data. A new age is coming. A lot of people whose parents were middle-class are working-class now. A lot of people who went to college may have trouble getting their kids through the BA. Many of today’s adults could not afford to buy the houses in which they grew up, or live in the same neighborhoods. Even those who were raised to prefer croissants to Twinkies™ often can’t afford the croissants any more. We are all finding ourselves spending more time brown-bagging peanut butter and jelly sandwiches to work and cooking rice and beans at home, no matter whether we used to prefer eating out at McDonald’s or La Maison de Quiche Raffinée. Mobility is not what it used to be. I can tell, because I can’t remember whether that accent aigu is supposed to go over the first or the second e.

So anyway, guys, can’t we all—at least those of us who frequent the same sorts of blogspace and care about the same kinds of issues—get along? Can’t we assume a certain amount of good will on the part of anyone we are discoursing with who hasn’t actually come out and said, “I despise you and I think you’re stupid,” or “You are a morally corrupt pinko”? If we did, maybe we could actually exchange ideas and come up with a few useful new ones that could help all of us out of the mess we are all, whether in the right, left, or center lane, heading straight into.

Red Emma


Biblical Illiteralism

June 7, 2008

Back when I was an English teacher, I had a list of things that I told my students would automatically get an assignment an F.  (Of course, I almost never actually carried out the flunk threat. It was purely a mechanism to get the students’ attention, and it was fairly effective.) Miscopying printed text that was in front of the student when s/he was writing (like the assigned topic) was a biggie.  So was “between you and I.”  Provable plagiarism, of course.  As time went on, the list got longer.  The last few years I taught composition, I finally put “the Bible says” on the list (unless it was followed by chapter and verse.)


Some of my students undoubtedly concluded I was some kind of firebrand infidel, and I rarely bothered to correct the misimpression.  But in fact I was reacting to the increasing number of students who cited “the Bible” (without chapter and verse) as saying things like, “to thine own self be true,” “God helps those who help themselves,” and “all men are created equal” (actually quoting, respectively, Hamlet, Ben Franklin, and the Declaration of Independence.) 


The students who could in fact provide chapter and verse were fine with my edict, and I was fine with them.  Accurately cited scripture is proof that the student can read and quote sources in a way that is useful to the reader, a valuable and increasingly rare skill in college English classes.  I may object to a particular student’s interpretation of a particular passage of scripture, but not to the point of quibbling about it in my grading of a composition assignment.  In fact, I really like students who can use biblical sources in a useful way.  I am always pleased to have a Jehovah’s Witness in my class, because they tend to have great study habits. 


But what gives me the terminal twitches is people who cite or quote or allude to the Bible without having read it thoroughly and meaningfully.  People who call themselves biblical literalists and obviously haven’t read the Book in any sort of substantial way, but just quote whatever the pastor says.  People, for instance, who claim that “the Bible” forbids abortion.  I ran into this one, oddly enough, on a Quaker e-list, and when I posted back, honestly curious whether I might have missed something, the reply I got was that “I cannot believe that a God who revealed the Scripture to us would not have made the fetus a human soul from conception.”  (So much for George Fox’s “You will say, Christ saith this, and the apostles say this, but what canst thou say?”)


Or the self-labeled Bible-believing parents in Kentucky who assailed the local school board for requiring their kids to read textbooks that depicted men and boys cooking, which the parents condemned as “unbiblical.”  Looking up “cooking” in a concordance will reveal that all but one of the first 25 references to cooking in the Bible attribute it primarily to men, mostly priests but also Abraham, Jacob, and Esau. A cynic or a feminist might suggest that in these instances, men took the credit for work actually done by women.  But no self-respecting biblical literalist would dare to engage in such a feat of sleight of interpretation.  The Bible not only depicts men cooking, it apparently endorses the practice. Apart from Rebecca’s deceptive preparation of the goat stew that tasted like venison (hardly an endorsement), I think the first depiction of women cooking may be St. Peter’s mother-in-law.


Am I nitpicking?  Not with people who call themselves biblical literalists, I’m not.  A literalist is somebody who reads a text from cover to cover and then follows it word for word.  These days, most of the people who call themselves biblical literalists follow nothing word for word except the edicts of their particular pastors, who may or may not be real biblical literalists.  I certainly don’t consider myself a literalist, but four years of divinity school and forty years of Torah study at least enable me to spot a fake when I see one.


In the spirit of full disclosure, I must admit to having passed a religion exam in high school with a fabricated quotation from a fabricated book of the Bible (Hezekiah.) But I have become more respectful of the Book since then, and wish other people were too, particularly the ones who claim to shape their lives and their thinking on it.


Jane Grey


FROG IN THE KETTLE

June 1, 2008

Time Magazine had a cover article this week on how to survive a disaster.  Most of the advice was quite useful (plan ahead, do drills, use your everyday skills.) But it didn’t cover the most important problem of all—how do you know there is a disaster happening, and that your normal behavior won’t work this time?  For instance, German Jews between Hitler’s rise to power and his invasion of Poland mostly had no idea they were in the runup to what would turn out to be the greatest catastrophe ever to befall world Jewry.  Many of them had access to good information, enough money to emigrate and resettle quite comfortably, and nothing hampering their mobility.  Why didn’t they leave, or at least set up a bolt-hole to get to in a hurry?


Similarly, on a much smaller scale, women whose husbands or boyfriends abuse them almost always underestimate the seriousness of the threat—sometimes until it has culminated in murder.  The escalating pattern of control and violence gets worse a bit at a time, always staying within the limits of what the woman sees as “normal,” while the definition of “normal” changes with imperceptible slowness.  There is rarely a bright line, except—I have heard this again and again—a threat to the children, or an episode of violence flaring out beyond the “normal” to what the victim sees as near-death.


The Time article emphasized the importance of preparedness, of being aware of what could happen. Obviously that varies from place to place and time to time.  In Florida and the Gulf Coast, it’s hurricanes. In the Midwest, it’s tornadoes.  In California, it’s darn near everything—fires, floods, earthquakes, landslides.  Fires can happen anywhere. So can plane and car crashes. 


But the writers never dealt with our problematic reactions to “normal” disasters.  I grew up in South Florida, during a period when hurricanes were regular events.  We had our standard responses to them—you fill up the bathtub with water, stock up on kerosene for the lanterns, batteries for the radio, and food that requires no refrigeration or cooking; you shutter the windows and doors, you bring in the lawn furniture, and then you hunker down until the Weather Bureau says it’s okay to go outside. That was the normal response to a normal disaster.  Would that have sufficed during Katrina?  Probably not. But by the time the locals figured that out, it would have been too late, precisely because they had done all the normal things that usually worked, and weren’t alert for any sign that something else was needed.


Similarly, the German Jews had their own normal responses to normal outbursts of anti-semitism.  Keep your passport up-to-date, keep all your important papers readily available, wear your World War I medals, stay off the streets and out of trouble, don’t make waves.  For the previous century or so, that had been enough.  By the time the locals figured out that this was an entirely different situation, a whole order of magnitude different, it was too late.


Abused women have their own normal responses.  Don’t do anything to annoy him, keep the children out of his way, be the perfect homemaker, don’t talk back, stay out of his way when he’s drinking, and so on.  Most of the time they work. Sometimes they don’t.


Paradoxically, sometimes our “normal” preparation for  “normal” disasters gets in the way of realizing that there is a real disaster out there.  You go through one fire drill after another assuming that, since this is a drill, it’s okay to go back and get your purse or your briefcase.  You do everything the Red Cross tells you to do before a hurricane, but you figure it’s not for real so it’s okay to leave one window un-boarded so the place won’t look so gloomy. 


The Time writers tell us that proper preparedness assures that, once you shift into disaster mode, you will do the right thing.  But how do you tell when you need to shift into disaster mode? Only the real professionals can figure that out.  Sometimes even they get it wrong.  The rest of us are at the mercy of luck.  (When the tsunami struck South Asia a few years ago, one vacationing schoolgirl noticed that the water was receding down the beach well below its usual level.  She had just learned in her science class that this was the beginning of a tsunami, and she got her family and all of the people around her to run to high ground before the water came back up—and up, and up.  By the luck of being in the right place at the right time, she, and her science teacher, saved lives.  But most of the other people in the area either didn’t notice, didn’t think the sudden receding water meant anything, or thought this was a great opportunity to go clamming.)


We all tend to make fun of survivalists.  In December, 1999, a friend of ours urged us to stock up on canned goods and bottled water in case Y2K turned into a real disaster. Figuring it couldn’t hurt, we did. We bought a wind-up radio and stocked up on batteries for the flashlights.  I gassed up the car. I made a point of putting our important papers, our family photographs, and the cat carrier where we could find them in a hurry.  A couple of months later, I donated most of the canned goods to a food pantry.  We drank the bottled water.  We haven’t had any real disasters since then, thank heaven.  That minor preparedness was certainly better than nothing.  Maybe it’s the best we can do—ride the fine line between ignorance and paranoia and hope we’re on the right side of it when the real disaster comes.


CynThesis