Archive for the ‘non-trivia’ Category

The Sign of the M

December 24, 2010

Many years ago, I started work on a speculative fiction short story about a dystopic world in which a sizeable portion of the population worshipped Murphy, as in Murphy’s Law (whatever can possibly go wrong, will). The public demonstration of their faith was the Sign of the M, made by holding either hand with three fingers extended downward. Usually this was an apotropaic sign, as the anthropologists say, to ward off the nastier consequences of Murphy’s Law. I never finished writing the story (which involved psychedelic experiences caused by claustrophobia in a windowless world, among other things), but the cult of Murphy stuck with me.

It revisited my head yesterday, when I heard on the radio one of numerous stories about people who are now suing various banks and mortgage agencies. About a year and a half ago, the feds came up with a program to help people get their mortgages modified and avoid foreclosure. In practice, the program, and those trying to avail themselves of its benefits, have been thwarted by a pattern of bumbling and non-responsiveness on the part of the mortgagees that could be explained only as (a) conspiracy, or (b) Murphy’s Law at its apocalyptic worst. As I understand it (and what do I know, I’m just a lawyer?), to get your mortgage modified (to lower the monthly payments, the rate of interest, the total amount owed, or more than one of the above), you have to contact whoever holds your mortgage and “work it out” with them. Naturally, they will want documentation both of the existence and terms of the current mortgage and of the problems that now lead you to want it modified. Usually that means lots of financial documentation, roughly parallel to a tax audit. The uninitiated may find this to be unnecessarily intrusive or just too damn much trouble to bother with. But we more sophisticated white-collar types figure it’s perfectly fair, and generally manage to comply with these demands, albeit with some difficulty because (a) nobody ever remembers exactly when something happened, and (b) most people don’t know exactly where anything is.

The plaintiff in one of these lawsuits explained that she kept contacting the bank to find out what documentation they needed, and sending it in. Each time, she would then find out they either needed one more piece of paper, or that they hadn’t received the last piece of paper she had sent them. Every time she called, a different person would answer, and would have to find her file and ascertain what had previously happened in her case. Eventually, it all came down to a single tax document. She sent it in, and then called to see whether they had received it. They hadn’t. Lather, rinse, repeat. And repeat. And repeat. Next thing she knew, some stranger was standing on her front lawn auctioning off her house on behalf of the bank, which had foreclosed, because, they claimed, they had never received that single last tax document. The semi-happy semi-ending in this case is that the court ultimately suspended the foreclosure, and is now graciously permitting her to live in her home and continue to make the payments on her mortgage pending resolution of the lawsuit.

But think about it, gentle reader. How different is this cycle of misfortune from what most of us deal with at least once a week, with somewhat easier and more favorable resolutions? The difference between us (so far, anyway) and the unlucky Ms. Plaintiff is all in the relative competence of the bureaucracy on the other end of the process. Eventually, they usually get it right.

Let’s leave aside, for another day’s posting, the question of whether there really is a conspiracy involved in the mortgage modification mess. It wouldn’t surprise me, but Murphy is perfectly capable of messing up transactions like this on his/its own. My brother and my daughter, who both do astrological charts, would attribute a lot of this stuff to Mercury being retrograde. I actually know realtors (mostly of Asian ancestry) who won’t schedule a closing when Mercury is retrograde.

Anyway, while listening to Ms. Plaintiff, it occurred to me to wonder whether there is anybody out there who doesn’t have to deal with this kind of administrative bumbling. See, on roughly the same subject, only with the congressional representative of a fairly important district as the protagonist.

The Wired family is actually caught in a similar bind at the moment, unable to pay for a very necessary medication until the New Year. This happens because the insurance program that provides Part D coverage has decided this is the time for the deductible to kick in, while the assistance programs offered by the pharmaceutical companies are available only to people who don’t have medication insurance coverage. And the reason for this inability to pay is that a bunch of clients who owe me roughly a year’s income between them haven’t paid, and most of them really can’t pay, for reasons very similar to my own.

All I need, perhaps all most people need, is for the system to work properly by its own standards. This would entail the people who owe me money paying on time, the people I send documents to not losing them, the people I talk to on the phone on various important matters either being available the next time I call, or at least taking good notes so the next person I talk to there will be able to find the file and then get up to speed on the case while I am on the phone with him/her. Judging by the experience of Maxine Waters, getting elected to Congress wouldn’t entitle me to this. Probably being POTUS or very rich would. Which makes it not only a privilege, but a really rare one.

In some situations, the M Factor can be literally deadly. The procedural histories of most of the death penalty cases that find their way to the Supreme Court are swamps of the M Factor. It is no coincidence that most of the defendants in these cases are poor, poorly-educated, and poorly represented. It is probably no coincidence that so much of the Supreme Court’s wisdom is expended on deciding whether a particular occurrence of the M Factor is serious enough to justify overturning a guilty verdict or a death sentence.

On the other hand, POTUS, and the very rich, are not necessarily exempt from these hassles. What their privilege generally gets them is the services of somebody else, usually somebody reasonably competent, to resolve them. There is actually a job title for this: Personal Assistant. Arguably, there may even be a career path for it: wife. President George H.W. Bush was famously enthralled by the electronic gizmos at grocery checkout counters, not because no grocery store was involved in the purchasing of his daily bread, but because somebody else had always been the one to stand in the checkout lines for it. Does this mean that the M Factor (let’s call it that for the purposes of this discussion), like matter and energy, cannot be eliminated, but only moved around, delegated, transformed?

I don’t know if anyone has actually done a systematic study of the M Factor as such, under any name. Hamlet calls it “the law’s delay,” and cites it as one good reason for suicide. It gets examined for other purposes under other names, for instance, in the study of Third World systems and why they work so badly. Corruption is merely a subspecies of the M Factor—it is the money one pays to various functionaries to keep the M Factor from totally destroying a transaction. Various hierarchies of public and domestic service exist both to create the M Factor and to keep it under control.

The M Factor raises some great politico-philosophical questions. Is it inherent in the human condition? Are those who either try to eliminate it or try to avoid dealing with it in their own lives guilty of hubris? Is a life free from the depredations of the M Factor, a life where everything works the way it is supposed to, a privilege everybody should have, or a privilege nobody should have? Most of us perceive rich people as having to deal with less of the M Factor than poor people, and people in rich countries dealing with less of it than people in poor countries. Is this reality, or appearance? Is the amount of the M Factor per person constant, subject only to being delegated or moved around? Or can affluence actually get rid of at least some of it? And if so, is affluence necessary for that purpose, or merely sufficient? Is there some other way to do the job?

Until we get a better handle on the workings of the M Factor, maybe worshipping it is the best we can do. Ommmmmm’s the word.


Not-so-legendary Urban Legends

July 10, 2008

According to a couple of sources cited in Wikipedia, Hitler may really have been monorchidic, just like the song says. The Russians allegedly did an autopsy and that was one of the things they claim to have found, or rather, not found. No word on the glandular endowment of the other Third Reich functionaries, however, except that Goebbels, whose wife bore six children, probably was better endowed than the song indicates.

The mouse-in-the-Coke-bottle story is not an urban legend. In fact, mice (and various other vermin and foreign objects) found in Coke bottles have been the subject of numerous for-real court cases, most notably Harris v. Coca-Cola Bottling Co. (35 Ill. App. 2d 406, 183 N.E.2d 56 (1962)), which in turn cites a string of cases going back nearly 40 years involving similar problems. In none of those cases was there any dispute about the foreign object in question actually being in the Coke bottle—only about how it got there, and the appropriate legal consequences of the occurrence. A cursory search of litigation history in Illinois alone reveals more than 25 cases, not counting such unreported cases as the one I handled some years ago involving a used Band-Aid in a bagel. The frequent allusion to such cases as urban legends is undoubtedly part of the vast right-wing conspiracy against tort lawyers. Don’t be fooled, either by the conspiracy or by the Coke bottles. I have not done any research into the incidence of foreign objects and vermin found in Coke cans, however.

Maybe we should start a regular feature column of urban legends that turn out to be real?

Jane Grey



June 30, 2008

When I was doing graduate work in sociology, I took a course on “deviance.”  I did a paper for the course on conscientious objection.  It’s a fascinating subject, about which I could go on for a long time, but won’t.  I chose the subject in the first place because the notion that having a conscience could be “deviant” struck me as marvelously ironic.  (The upshot of the paper was that the CO application process was a triumph of organic over mechanical solidarity [as Durkheim would put it].)  What really mattered about that paper was that it changed the course of my life, because I wrote it a year or so before the 1965 increase in American troop strength in Vietnam.  I chanced to mention it to a friend of mine, and the next thing I knew, lots of people were asking for a copy.  That made me nervous, because by this time it was full of probably outdated information.  So when I saw a posting from AFSC about a free course to become a draft counselor, I signed on right away.


I spent the next ten years working in the area of Selective Service and military law, and eventually went to law school. As an attorney, I’m still doing military and veterans’ benefits law, and have done Selective Service stuff when the issue arose.  That involves all kinds of legal issues, but it still occasionally raises questions of conscientious objection, and that’s still a fascinating process. 


The body of statutory and case law that sets out the CO requirements for draft exemption or military discharge (or exemption from the “bearing arms” oath for new US citizens) clearly started out with the “historic peace churches”—Quakers, Mennonites, and Brethren—in mind.  Those requirements have evolved to encompass other Christians, non-Christians, non-church members, agnostics, atheists, and ultimately people with no official religion at all.  But the law remains clear that a mere “personal moral code” or a set of “political beliefs” won’t qualify.


And most young people these days—even the regular church-goers—are theological illiterates.  (Among the splendid exceptions are the Jehovah’s Witnesses, whom I have occasionally represented.)  All they have, most of the time, is a personal moral code, or a set of political beliefs.  What I do is more the job of an English teacher or an editor (both jobs I have held in addition to practicing law) than an attorney.  I work the client through the “Four Questions” that are the basis of the CO application:


  1. What do you believe, and how does it prevent you from being willing to participate in all wars? (Note: not just some wars—that would be too easy.  CO applicants go through endless grilling about whether they would have fought against Hitler, but nobody has to justify being unwilling to fight for Hitler.)
  2. Where’d you get these weird ideas? (Note: the military presumes that it is normal and natural to be willing to kill a total stranger when ordered to do so by another total stranger. Any deviation from this norm has to be explained.)
  3. What have you done to put your beliefs into action?
  4. Who can vouch for your sincerity?


This process requires a lot more introspection than most young Americans are used to.  Also a lot more writing.  (At the outset, I tell them it’s the equivalent of a long term paper, in expenditure of time and energy.) Once it’s all down on paper, the translation process begins.  Writing one’s congresscritter about the war in Iraq is rarely just a statement about that war; it is usually a statement about war in general in the context of the only war the kid knows about.  World War II? What was that? I know there was some kind of war in the 1940s, but I forget who was in it or who won.  Same with going to demonstrations and marches.  Working at a soup kitchen is a statement about the essential value of all human life, even the most miserable.  Running a school recycling center is a statement about the value of the earth and its resources, which war destroys big time. 


None of this is fake. I don’t do fakes, nor do my clients, so far as I know.  It’s just a matter of putting the very individual and personal—which won’t get recognized as conscientious objection by the official deciders—into a broader context that the client has only started to think about when confronted with a human-shaped target and told to “kill, kill!!!”


This process is a species of what elementary school teachers call “code-switching”—expressing the same ideas in different ways depending on context, audience, and purpose. When greeting your buddy, you can high-five him and say “yo!”  When you meet the Dalai Lama, on the other hand, you do not break out singing “Hello, Dalai!” 


The process can go both ways, as Obama demonstrated in his Sojourners speech two years ago (never mind that James Dobson has for some reason brought it up two years later to question Obama’s theology).  If you are going to bring your religious beliefs to bear on political issues (other than a CO application, I guess) among people who do not share those beliefs, you need to speak the language of your audience.  This is partly for symbolic purposes—we are conducting an election in a democracy composed of people who hold lots of different beliefs, and whose constitution prohibits establishment of religion as such or of any particular religion.  A candidate or advocate who does not respect that prohibition is telling at least some of that audience, “You don’t belong. You don’t count.” Which is the last thing you want to tell a voter, any voter.


But a lot of those voters may not even catch the in-groupness of standard evangelical Christian language, because they don’t know anybody who doesn’t speak it. (Whoever discovered water, it probably wasn’t a fish.)  I have long since lost count of the very nice, very earnest Christians who ask me, in utter perplexity, “You mean Jews don’t believe in Jesus?”  And telling them that their language is not the lingua franca of common discourse in their own country can amount to telling them “you don’t belong. You don’t count.”  It can carry its own political costs.


Nonetheless, I think American politicians who are running for president rather than, say, Pope or Caliph or Bishop, have to presume that their audience includes non-Christians and non-religious people who still have a right to vote, and to know where their candidates stand.  In short, translating values based in a particular religion into universally comprehensible values is not only effective politics in a pluralistic polity, but a way of honoring the founders of that polity and the universal values they were trying to establish.


Jane Grey




June 20, 2008

Northwestern University School of Law, here in Chicago, has just announced that it will be offering students a chance to get a law degree in two years instead of the traditional three. Let’s look at the history of this. Back in the Good Old Days, you didn’t have to go to law school at all to become a lawyer. Nor did you have to go to college first. You apprenticed as clerk to a practicing lawyer. That’s how Abe Lincoln started out. There were law schools, but they were for the upper classes. There are still places where you can get admitted to the Bar without a law degree today, though it’s pretty hard to arrange, and not many people do it. A Chicago attorney who passed away only a couple of years ago was the last person I know of personally who did it. He was pretty good at what he did, too.

Then came the American Bar Association, which in its wisdom decided about 100 years ago that the profession was too open to the riffraff. So it encouraged state bar associations to require a law degree, or at least make that the preferred track into the profession. Back then, getting a law degree took—gasp!—two years. Back then, BTW, many law schools did not require a college degree for entrance.

Somewhere around the same time, law schools started permitting night study for people who had to work while getting their degrees. That, reasonably enough, took three years.

I’m not sure precisely when the third/fourth year of law school was added, though I think some schools were still holding out as late as the 1960s, but the point of it was to enable the student to take more specialized courses. As a practical matter, the last year of law school has, ever since, been regarded with barely-tolerant amusement by most students. The axiom among law students is, “The first year, they scare you to death. The second year, they work you to death. The third year, they bore you to death.” I’m not sure what useful purpose the last year ever served. But I’m really skeptical about the rationale now being used for dropping it.

Well, not exactly dropping it. What they’re actually doing is compressing it. Pretty much what I did with my own legal education, actually—I went to night school, while working full-time. Normally that should have taken four years. I did it in three and a half, by taking classes through the summers. But Northwestern is proposing that day students, who would normally finish in three years, take classes through the summers and “mini-classes” between semesters, thereby finishing in two years. They are also shifting the emphasis in course requirements, to teach skills such as “accounting, teamwork, and project management,” and modeling the degree program more closely on the business school pattern. Presumably that means that some other courses are being dropped, but nobody is saying which ones. All of this is explicitly designed to make the future employers of the next generation of lawyers happy with their more practical, business-oriented approach.

Which makes me really nervous. If all NU wanted was to shorten the time required to get a law degree, I wouldn’t be all that bothered by their dropping or shortening the third/fourth year. But making the students work harder and with less time between terms, and come out looking more like MBAs, scares the flippen daylights out of me.

Law has been for a long time the last refuge of the liberally educated professional, the last place where you can get props for knowing things outside your specialty. Roughly half of my law school class had advanced degrees in something else before starting law school. The point of being a lawyer is to be able to think critically and analytically, to balance competing interests and ethical imperatives, to be able to break down a situation into ponderable parts and weigh them against the applicable law. [And, not incidentally, to learn how to spell habeas corpus.]

The lawyer’s job is to be, not a Minuteman, but a Wait-a-Minuteman, somebody who can tell the client, “Not so fast. You can’t do what you want to do here.” Clients don’t much like that function of the legal professions, and would vastly prefer to hire MBA-like robots who will simply map out the shortest apparently legal distance between here and where the client wants to go, and “make it happen.” We have been witnessing, over the last ten or fifteen years, a large number of accountants and MBAs getting indicted, and often imprisoned, for taking just this approach. A few lawyers have already gone this way as well. Does NU really want to send an entire generation of new lawyers in the same direction?

What’s the rush, anyway? I can understand people who try to finish college in three years—it can actually save money, if planned properly. The NU administration claims not to have decided whether its two-year program will cost the same $42,000 total tuition as the current program, but I would bet it will. It’s hard to imagine that faculty members drafted to teach the new summer and mini-courses won’t demand some kind of extra pay for doing it. The proposed program would require entering students to have spent at least two years working in some other field, which many of them would anyway. But the fact is, we’re all living longer these days, and lawyers never retire. [One of my colleagues, who was still practicing last I heard, just celebrated his 100th birthday. Another, who I know is still practicing, was admitted to the Bar the year I was born.] So starting one’s professional career a year earlier or later is pretty insignificant in the long run. If anything, the legal profession needs to slow down a bit more, and get back into the habit of thinking things through before making a recommendation. And one of the recommendations most in need of thinking through, obviously, is Northwestern’s two-year program.

Jane Grey


Biblical Illiteralism

June 7, 2008

Back when I was an English teacher, I had a list of things that I told my students would automatically get an assignment an F.  (Of course, I almost never actually carried out the flunk threat. It was purely a mechanism to get the students’ attention, and it was fairly effective.) Miscopying printed text that was in front of the student when s/he was writing (like the assigned topic) was a biggie.  So was “between you and I.”  Provable plagiarism, of course.  As time went on, the list got longer.  The last few years I taught composition, I finally put “the Bible says” on the list (unless it was followed by chapter and verse).


Some of my students undoubtedly concluded I was some kind of firebrand infidel, and I rarely bothered to correct the misimpression.  But in fact I was reacting to the increasing number of students who cited “the Bible” (without chapter and verse) as saying things like “to thine own self be true,” “God helps those who help themselves,” and “all men are created equal” (actually quoting, respectively, Hamlet, Ben Franklin, and the Declaration of Independence).


The students who could in fact provide chapter and verse were fine with my edict, and I was fine with them.  Accurately cited scripture is proof that the student can read and quote sources in a way that is useful to the reader, a valuable and increasingly rare skill in college English classes.  I may object to a particular student’s interpretation of a particular passage of scripture, but not to the point of quibbling about it in my grading of a composition assignment.  In fact, I really like students who can use biblical sources in a useful way.  I am always pleased to have a Jehovah’s Witness in my class, because they tend to have great study habits. 


But what gives me the terminal twitches is people who cite or quote or allude to the Bible without having read it thoroughly and meaningfully.  People who call themselves biblical literalists and obviously haven’t read the Book in any sort of substantial way, but just quote whatever the pastor says.  People, for instance, who claim that “the Bible” forbids abortion.  I ran into this one, oddly enough, on a Quaker e-list, and when I posted back, honestly curious whether I might have missed something, the reply I got was that “I cannot believe that a God who revealed the Scripture to us would not have made the fetus a human soul from conception.”  (So much for George Fox’s “You will say, Christ saith this, and the apostles say this, but what canst thou say?”)


Or the self-labeled Bible-believing parents in Kentucky who assailed, as “unbiblical,” the local school board’s requirement that their kids read textbooks depicting men and boys cooking.  Looking up “cooking” in a concordance will reveal that all but one of the first 25 references to cooking in the Bible attribute it primarily to men, mostly priests but also Abraham, Jacob, and Esau. A cynic or a feminist might suggest that in these instances, men took the credit for work actually done by women.  But no self-respecting biblical literalist would dare to engage in such interpretive sleight of hand.  The Bible not only depicts men cooking, it apparently endorses the practice. Apart from Rebecca’s deceptive preparation of the goat stew that tasted like venison (hardly an endorsement), I think the first depiction of a woman cooking may be St. Peter’s mother-in-law.


Am I nitpicking?  Not with people who call themselves biblical literalists, I’m not.  A literalist is somebody who reads a text from cover to cover and then follows it word for word.  These days, most of the people who call themselves biblical literalists follow nothing word for word except the edicts of their particular pastors, who may or may not be real biblical literalists.  I certainly don’t consider myself a literalist, but four years of divinity school and forty years of Torah study at least enable me to spot a fake when I see one.


In the spirit of full disclosure, I must admit to having passed a religion exam in high school with a fabricated quotation from a fabricated book of the Bible (Hezekiah). But I have become more respectful of the Book since then, and wish other people were too, particularly the ones who claim to shape their lives and their thinking on it.


Jane Grey