Archive for the ‘culture’ Category

See What The Boys In the Basement Will Have

December 7, 2011

The “boys in the basement” are what Stephen King calls his muse, the source of his imagination. Mostly they just hang out, idly, making occasional noise, drinking beer, and every now and then sending messages upstairs. When they are napping, or when the folks upstairs are paying insufficient attention to them, the writer is stuck. “Blocked,” as some of the semi-pros like to say.

The pros often say there is no such thing as writer’s block, there is only laziness. I think that may depend on how one experiences, or defines, the state of consciousness required for writing. For me, it varies, often depending on the context. Back when I wrote regularly for publication, what I mainly required was a topic, a word count, and a deadline, and I believed I could produce just about anything on time. That belief may or may not have been justified, but it worked pretty well most of the time, for the kind of stuff I was expected or contracted to write.

Under deadline, I don’t recall ever being blocked. Often, I would wait until some siege of particularly inclement weather (snow or heat) to wall myself up and produce produce produce. Chicago can be trusted to come up with such onslaughts often enough to keep the writer at work. Sometimes, I would use a long weekend for the same purpose. But working against the clock really helped a lot.

These days, when nobody is waiting on my production to fill space or meet some third-party obligation, it’s harder for me to get going. The boys in the basement are too busy playing video games to communicate with me. When they bother, often, it’s to complain about the brand of beer I’m stocking.

Sometimes, the problem is that I feel as if I’ve already said it all, at least about some particular subjects. Newt Gingrich, for instance. Back when he was just a twinkle in the eye of his Georgia congressional district (which my brother was living in at the time), I thought he was a flake, but a smart flake. I still do. Since then, he has made his bones as a serial wife-dumper and contributed significantly to my opinion of the GOP as a large closet rather than a big tent. He has joined the collection of people I would cross the street to avoid shaking hands with (along with Clarence Thomas, but that’s another story.) (People with whom I would cross the street to avoid shaking hands? See Churchill’s “the kind of arrant pedantry up with which I will not put.”) But mostly, I think I said it all in the 1990s.

Same goes for The Bell Curve, which is now enjoying revived discussion in the Atlantic by two of my favorite writers, Coates and Sullivan. I taught a course using it, and Plato’s Republic, and Stephen Jay Gould’s Mismeasure of Man, back in the 1990s, and essentially summarized the course for a review in a small newspaper around the same time. What more could I possibly say? See for yourself (http://dissociatedpress.blogspot.com/search?q=Bell+Curve). If they’d just give me some frakkin’ new material, maybe I could think of something new to say about it.

Okay, what about the GOP primary? Is Romney “inevitable”? Maybe. Maybe even in the general election, since it looks as if both sides might actually allow him to govern if he gets elected. Not sure any other candidate, on either side, meets that qualification. That may be all the voters want, these days. Chances are, they would even accept a third-party candidate if he seemed likely to meet that bar. Cain is comic relief, although it bothers me that some commenters see the stories about his affair as being all the more damaging because it allegedly lasted 13 years. I think 13 years is a plus. It indicates that Cain is capable of focused affection, unlike the aforementioned Newt, or Rudy Giuliani, who actually managed to cheat on his wife, his official mistress, and his girlfriend within the same short span of months. Maybe that’s just the cynical perspective of a divorce lawyer. But dammit, it’s all old news.

Okay, how about: which is worse, or better, the Tea Party, or Occupy Wherever? (A recent client of mine actually got busted with Occupy Salt Lake City, a mind-boggling concept.) That’s relatively new news, right? I think they draw their passions from the same cultural spring. They’re not quite as easy to tell apart as anarchists (dionysian) versus libertarians (apollonian.) They both have a very healthy dose of localism. And they both have a large dose of dionysian energy and not a helluva lot of apollonian intellect behind them. But they are both, in fact, slightly differing ways of saying “I’m mad as hell and I’m not going to take it any more.” Which dates back to Network, in 1976. And which in turn, I think, is a loose translation of “je m’en fichisme,” a French phrase which the New Yorker dates back to 1917 or thereabouts, but which I first encountered in the late 1950s, and which apparently has stuck in my memory because I was studying high school French at the time, and got a kick out of learning a French phrase that the good sisters undoubtedly were never going to teach me.

How about sexual child abuse among college football coaches? Nothing new there, except that none of them are vowed to celibacy, and some of them may even be Protestant. The scandal seems to have erupted on a slow news day and then taken on a life of its own.

Okay, breaking news—our Chicago public radio station has just announced that tonight it is “pre-empting the world”! How’s that for nerve? Actually, it just means we don’t get to listen to the BBC world news program tonight, because the head of the Chicago public school system is coming on live to answer questions in the same time slot. Not a bad idea, but not exactly world-shaking (or even world-pre-empting) either. Maybe I’m the one with a case of je m’en fichisme.

Or maybe my real problem is that the boys in the basement, kind of like a newborn baby, sleep when I’m awake and available to write, and start jumping around when I’m getting ready for bed. Stephen King, never having been pregnant, seems not to notice the similarity.

Today, our former governor got sentenced to 14 years for corruption. His predecessor, I think, got six and a half years on similar charges. Two of their predecessors also did time, and another one was indicted but acquitted. More not-very-new-news. From now on, maybe I should plan to start writing sheer fantasy of various fictional and nonfictional varieties, just to keep the boys in the basement awake when I have time to write.

Oh, and one more bit of not-very-new-news: Kathleen Sebelius has overruled the FDA and chosen not to make the “morning after” pill more readily available. No doubt the Obama administration is choosing its battles carefully these days. But it bothers me that the religiously-affiliated lobbies that have worked so assiduously to make access to contraception more difficult have not uttered Word One about the evils of prescribing Viagra for unmarried males. So much for a consistent sexual ethic which is not to be viewed as anti-woman.

In the meantime:

Raisin Consciousness

Physicists say that time

Is what keeps everything from happening at once.

But holidays

Are what keeps everything from feeling as if it’s happening at once.

Holidays are like the raisins in rice pudding.

Without them, it turns into a glutinous untextured mass.

The raisins add texture,

And sometimes, sweetness.

A good holiday to you all. Peace and light,

The Wired Family


The Optical Illusions of the Soul

November 20, 2011

Google “optical illusions” and you will pull up a huge number of moving and stationary, black-and-white and color, geometric and random drawings that have in common the ability to look first like one very definite image, and then like a totally different one, to the normal human eye. I just finished reading a Dan Simmons novel, Flashback, which had pretty much the same effect on me, not for the first time. Reading Ayn Rand does the same thing. So do some of the postings on this blog. They make me acutely aware that my fondest dream may be your worst nightmare. Dunno whether I am the only person for whom my fondest dream may be my worst nightmare. That, of course, is why we Wired Sisters are multiple.

I’ll start with the religious version of this phenomenon, which I used as a Rosh HaShanah discussion last year under the title The Abrahamic Split. Some bibliographical and linguistic notes: Rabbi Michael Lerner is the author of Jewish Renewal. “Milchemet Mitzvah” is, in traditional Jewish thinking, a war which is commanded by the Holy One, as opposed to wars which are either optional or forbidden. The only one of these that everybody seems to agree on was the conquest of the land of Canaan by the Israelites on their way out of Egypt, the subject matter of the second through fifth books of the Bible. “Midrash” is how the various scholars explain what the biblical characters did between the installments of the text. What Woody Allen does at the end of this discussion is also midrash. Maimonides was a twelfth-century rabbi, scholar, philosopher, and physician, whose views of scripture often seem to come out of left field. So here it is. Next posting will be the political angle.

Over recent decades, we have become conscious of a double voice in the Jewish tradition, a voice on one hand of “love your neighbor as yourself” (Leviticus 19:18), and on the other of “remember Amalek” (Deuteronomy 25:17.) Those of us who follow political and religious controversies are all too aware that this double voice is duplicated in Islam: on one side, [Qur’an, Sura 2:256] “There shall be no compulsion in religion…” and [Sura 18:29] “Proclaim: ‘This is the truth from your Lord,’ then whoever wills let him believe, and whoever wills let him disbelieve”; on the other side, [Sura 47:4] “When you encounter the unbelievers, strike off their heads until you have made a wide slaughter among them…” Similarly, Christians can quote the Gospel of Matthew: “Love your enemies and pray for those who persecute you, that you may be sons of your Father in heaven. He causes his sun to rise on the evil and the good, and sends rain on the righteous and the unrighteous” (Matthew 5:44-45). Or they can call down the wrath of Heaven in the form of Crusades, new and old. Before the Crusades, after all, was the Jihad. And before the Jihad was the Milchemet Mitzvah. After a while, some of us find the lazy way out. We decide that the Holy Blessed One was speaking only in the words of love and mercy. We who hear that voice, among all the Abrahamic faiths, can talk to each other. But we need pay no attention to those, in all three faiths, who hear the voice of cruelty and revenge. Rabbi Michael Lerner, in Jewish Renewal, even suggests that that voice is not the voice of God at all.

Let us leave aside for the moment Whose voices those are, on both sides of the Abrahamic split. Let’s look at where, historically, they are first heard. I think the Jewish tradition first hears them both, side by side, in the story of the Akedah, the binding of Isaac, which is the root narrative of all three of the Abrahamic faiths (Genesis 22:1-19.) Abraham hears the voice of the Holy Blessed One, at night, tell him to take his son up to the Mount and “offer” him. Michael Lerner will tell us that that was not the Holy One’s voice. Maimonides will tell us that it was a prophetic vision meant to show us how far one may be expected to go in obedience to Heaven—but that the actual Akedah may never have happened at all. The Muslims tell us that the son designated for offering was Ishmael, not Isaac. The Christians tell us that the whole thing prefigured the sacrifice of Christ. But let us assume for the moment that what Abraham heard was really the voice of Heaven. He certainly behaved as if he believed that. He took the two “lads” (midrash tells us they were Ishmael and Eliezer) and Isaac, and all of the paraphernalia of sacrifice except the sacrificial animal, and walked three days toward a place “to be announced.” When they got there, Abraham apparently knew this was the place, even though he did not hear any divine voice saying “Okay, here you are. Up on that mountain there.” He looked up and there it was, without so much as a “You Are Here” sign.

But Abraham also never quite comes out and says that Isaac is to be the victim. Is this because the voice in his vision told him only to “offer” his son, and not to kill him? Some of the midrash points in that direction. Other midrash, coming from the time of the Rhineland massacres a thousand years later (when the Crusaders stopped off on their way to the Holy Land to kill enormous numbers of Jews), will not accept that lawyer-like parsing of words. That midrash depicts Isaac preparing himself to be killed, and asking his father’s help to be a worthy victim. Indeed, in some of that body of midrash, Isaac is actually killed, and then revived.

At any rate, Abraham sends the “lads” away, binds his son on the altar, and raises his knife. And then he hears another voice. The text says it is the voice of an angel or a messenger, but we are familiar by now with the ever-shifting line between the Holy One and the angels, between Principal and Agent. “Lay not your hand on the child,” that voice says, “nor do anything to hurt him.” Abraham, confused, stops, frozen, his arm raised. He is here to do what he has been commanded. Now he is commanded to stop. What does he do now?

Completely distracted from what he has so painfully nerved himself to do, he looks around, and sees a ram. The Ram. Sees him “after,” “behind,” in Hebrew “achar.” The Hebrew in such a construction would normally have been “acharav”—behind him. Midrash makes much of the oddness of the locution here, based on its axiom that the Holy One does not waste words. Does “achar” mean, as it often does, “in the future”? Maimonides thinks so. Is Abraham seeing the generations after, looking back on the story as we are doing now, and asking himself, not only “what does the Holy One want me to do?” but “what does the Holy One want all of us to do, for generations to come”? Abraham, after all, is a prophet. Prophets have visions. They see the future. Or futures.

The Ram is caught in the bushes—“basbaq,” a locution that, in modern Hebrew, means something like “in the turmoils of everyday life.” Which is something more likely to happen to us than to rams. Or is Abraham the one who is caught in turmoil? At any rate he resolves the turmoil by taking the ram from the bushes and substituting him for Isaac on the altar, where he completes the offering.

The midrash makes this ram the raw material of Jewish ritual for centuries afterward. “The ashes of the parts burnt upon the altar formed the foundation of the inner altar, whereon the expiatory sacrifice was brought once a year, on the Day of Atonement, the day on which the offering of Isaac took place. Of the sinews of the ram, David made ten strings for his harp upon which he played. The skin served Elijah for his girdle, and of his two horns, the one was blown at the end of the revelation on Mount Sinai, and the other will be used to proclaim the end of the Exile, when the ‘great horn shall be blown, and they shall come which were ready to perish in the land of Assyria, and they that were outcasts in the land of Egypt, and they shall worship the Lord in the holy mountain at Jerusalem.’” And of course we have been blowing the ram’s horn, the shofar, during the Days of Awe, for the same purpose. For the purpose, in fact, of making us hear once again, and again and again, that other voice of Heaven, holding us back from the ultimate violence.

The only midrash I have been able to find that brings these two voices into simultaneity, if not harmony, comes from, of all people, Woody Allen.

“And so he took Isaac to a certain place and prepared to sacrifice him but at the last minute the Lord stayed Abraham’s hand and said, “How could thou doest such a thing?”
And Abraham said, “But thou said —”
“Never mind what I said,” the Lord spake. “Doth thou listen to every crazy idea that comes thy way?” And Abraham grew ashamed. “Er – not really … no.”
“I jokingly suggest thou sacrifice Isaac and thou immediately runs out to do it.”
And Abraham fell to his knees, “See, I never know when you’re kidding.”
And the Lord thundered, “No sense of humor. I can’t believe it.”
“But doth this not prove I love thee, that I was willing to donate mine only son on thy whim?”
And the Lord said, “It proves that some men will follow any order no matter how asinine as long as it comes from a resonant, well-modulated voice.”

But the Tradition is not comfortable with that view either. We do not discard one of the divine voices in the Torah because it is cruel, or cast doubt on its reality because it was “only” a prophetic vision, nor because the Holy One was only joking. All of those are tempting, and we are honest with ourselves about the temptation. But in the end we side with the Sanhedrin, as it ruled between the strictness of the rabbinic school of Shammai and the humility and humanity of the school of Hillel: Elu v’elu divrei elohim hayyim—These and those are both the words of the Living God—but the law—our law, because we are only human—must follow the merciful school of Hillel.

CynThesis

Old Age Should Burn and Rave at Close of Day

October 7, 2011

Back when I was an English teacher, one of the best writing tips I gave my students was to write the last paragraph of an essay first, and the first paragraph last. Remember Benjamin Button (recently played by Brad Pitt)? The guy who was born old, grew younger every year, and finally faded into infancy and unborn-ness? Maybe that’s how most of us live. The teenage brain has no sense of the long term. With most of his three-score and ten years still ahead, the teenager lives as if there were no tomorrow. Developing a sense of the future, and then the ability to plan for it, is the project of young adulthood. Some of us do it better than others. But by the time we reach the post-householder age (as the Hindus define it), there really are very few tomorrows left, but we live as if our future were completely fixed and defined.

Admittedly, the post-householder age is not what it used to be. There’s a lot more of it. When German Chancellor Bismarck, back in the 1880s, introduced government-funded old age pensions, he set them up to begin at age 70 (lowered to 65 only in 1916)—because by that age, the average German citizen was dead. Today, Social Security coverage would have to begin at 80 to accomplish the same goals. And most of us are still in pretty good health until shortly before death. The average American spends half of his total lifetime medical expenditure in the last six months of life. For the 14.5 or so years before that, most of us are in pretty good shape.

So here we are, we older Americans, with 14.5 years of able-bodied life ahead of us, free of workplace obligations, educated by experience to know which way the wind is blowing without the aid of a meteorologist, and, often, more economically secure than we have ever been before. “The last of life for which the first was made,” as Robert Browning presciently called it. We are the natural revolutionary cadre. We can no longer leave it to the college kids, who are overworked and economically terrorized, desperate to build a future they cannot imagine. We have the security. We have the education and experience. Above all, we have the time.

What we don’t have is a romance of revolution. The Arab Spring is rooted in societies where the median age is 30 or under—pretty much like “the ‘60s” in the US and Europe. In 1966, Time Magazine named the youth of the ‘60s its “Man of the Year.” We still think of revolution as the task of youth. That’s a luxury our country as a whole can no longer afford. The median age of the American population is now close to 40. And everybody under 60 or thereabouts is expected to be either working for pay, or trying to find work. We geezers and crones are the only people allowed to do anything else useful with our time, and even availed of the necessities of life while we do it.

My brief perusal of the coverage of Occupy Wall Street in New York and its parallel protests in other cities tells me the media see the protesters as “students” and “youth.” I’m a bit skeptical of this depiction. Back in The Day, I spent a fair amount of time in protests and rallies myself. I was at the time an English teacher, respectably married, and generally went to such events wearing skirt, blouse, and jacket, hose and shoes. Most of the people I hung out with were similarly employed and attired. But we never showed up in the coverage. Had there been one single long-haired scraggly hippie among us, he was, invariably, the one who would turn up on the evening news. So if the media want to define this round of protests as completely youth-oriented, mere facts won’t stop them. And if the Raging Grannies and the Gray Panthers and our other age-mates happen to be turning out in respectable numbers, we will probably be operating under cover of media-generated ignorance for at least the first year or two, and that may be just as well. Invisibility is a useful tool and an excellent weapon. Let’s hold off on public Dodder-Ins for a while yet.

In the meantime, I have just managed to get some assistance with taking care of Mr. Wired, so I will be able to spend more time practicing law and getting to Shabbat services. This is going to be an interesting year. Peace and light to you all.

Red Emma

A Limited Defense of Affirmative Action

May 29, 2011

I am a beneficiary of affirmative action. These days, so they say, I should be ashamed to admit it. It implies, after all, that I was not otherwise qualified for some benefit I obtained only because of being some kind of “minority.”

I have actually benefited from affirmative action on two different counts–as a woman, and as a Hispanic. Every now and then that gives me a slight edge on the competition. That doesn’t bother me particularly. I’ve been discriminated against as a woman more times than I can remember (or, probably, than I have ever known) beginning at least with my first permanent job, which I obtained only by giving the right answer to the employment agency’s question about what method of birth control I used. (For those too young to remember that era, the right answer was not “none of your damn business.” It was “the Pill.”) On another job, I was sexually harassed before there was even a word for it, much less a cause of action. So I figure any benefit I get from the double x chromosome is just a matter of restitution.

I have also been discriminated against, I’m pretty sure, for being Jewish. This, of course, gets me no affirmative action points, but that kind of makes up for the fact that I do get points for being a Hispanic (both my parents were born in Cuba, and my family is essentially bicultural) even though I have never been discriminated against for that fact. (As a matter of fact, since I am a natural blonde and speak English without an accent, nobody knows I am Hispanic unless I choose to tell them, and I normally do that only where I will get extra points for it. Which is generally in jobs where my ability to speak Spanish really is a plus.) And most recently, I have probably been discriminated against for my age, which is illegal, but for which I get no affirmative action points. So I will take those points where I can get them, without embarrassment and without feeling that my competence is in any way in question.

I went to a good college and made Dean’s List my last two years. I scored in the 98th percentile on my LSATs. But when I applied to law school, I was admitted to a school in which 45% of my class was female, in the mid-’70s, and rejected by another school which had a far lower percentage of female students in that year. The evidence seems clear; I was almost certainly admitted to the former because of my gender, and rejected by the latter for the same reason. My objective qualifications were equally irrelevant to both schools. Probably all those qualifications got me in the second school was a rejection further along in the process than some of my less-qualified sisters (and my totally-unqualified brothers.)

Realistically, of course, nobody ever challenged my academic competence, or that of any other woman I know who has been accepted into any academic program under an affirmative action program. Even the most neanderthal of male supremacists will grant that women on the average do better in school, except in mathematics and the hard sciences, than men. The reason women have historically been discriminated against in academic admissions is that we are not expected to be able to do much of anything useful with our knowledge and academic credentials after we get them.

So the affirmative action issue really only gets raised, where women are concerned, when one of us is promoted to a position of power, beyond the glass ceiling. Then the innuendoes fly–quotas, sleeping with the boss, the supervisor is a leg man, somebody’s sister, somebody’s daughter, somebody’s wife. Most of us, however, would still rather live with the humiliation of possibly having been promoted because of our gender than with the equally potent and much less remunerative humiliation of not having been promoted for the same reason.

Stephen L. Carter’s misgivings

Which is why I have trouble with people like Stephen L. Carter. His Reflections of an Affirmative Action Baby is a thoughtful and well-written book with a good sense of the complexities of inter-ethnic relations in the United States of the 1990s. But I have a few problems with its basic premises. Don’t expect the Establishment to make special standards for you, he tells young African Americans. It’s humiliating that we should think we need that. Meet their standards, beat their standards, and demand to be accepted on their terms. For Blacks and Hispanics, who are popularly expected to be less competent in academic achievement, it may actually be a source of humiliation to be admitted to a respectable school under an affirmative action program because of their ethnicity. However, most of the “affirmative action babies” I know would say that it is no more humiliating than being rejected because of that same ethnicity, and pays a lot better.

Carter’s advice takes the Establishment’s claims of devotion to meritocratic standards at face value. Which gives a lot more credit than it deserves to an Establishment that has never really believed in those standards, and has espoused them only when doing so would serve the purpose of keeping a particular group of outsiders outside.

The reason Carter has not seen this hypocrisy is that he is looking at the experience of only one group of outsiders. If he were to consider that of three others–women, Asians, and Jews–whose ability to meet meritocratic standards has rarely been questioned by anybody, he would discover that the Establishment has never had any difficulty excluding them, or severely limiting their upward mobility, on some other grounds.

The merit system: now you see it, now you don’t

For instance, in the 1930s, Harvard Medical School discovered that, if academic qualifications were to be the only criteria for admission, its entire entering class would be Jewish. Indeed, they would have had to double the size of the entering class to get in more than a few token gentiles. So they suddenly discovered that there was more to being a physician than “mere” academic excellence. Arbitrarily, they set a quota of 22% for Jewish applicants, a quota which remained in effect until the ’60s, when, like the Jewish quotas in many other educational institutions, it was replaced with a larger and slightly less transparent quota on students from large cities, especially New York City, under the rubric of “geographical distribution.” Those quotas still exist today in many schools.

The experience of women is in some ways even more blatant. When my classmates and I graduated from college in the early ’60s, we frequently looked for jobs, before and between graduate school, in the public sector. We took the civil service exams, scored at or near the top, and were repeatedly beaten out for the actual jobs by men who had scored a good deal lower, even before their veterans’ preference points were added.

When I was at college, in the late ’50s and early ’60s, it was a truism, repeated to us regularly by faculty and admissions honchos, that men scored higher than women on the math section of the SAT, but women scored higher than men on the verbal section. It didn’t, of course, get us much. There were fewer places available for women at good colleges (or any other colleges, actually) than for men, and less scholarship money available for us. So nobody thought much about it. But twenty years later, when the various controversies about the biases of the SAT arose, I was startled to hear everybody, on all sides of the dispute, saying that women scored lower than men on both sections of the SAT. Even the American Association of University Women, in its otherwise beautifully researched study of discrimination against women in education, could only conjecture about what happened, by the end of high school, to the clear lead in reading and verbal skills that girls have over boys in elementary school. What had happened—a couple of very well-hidden and quickly forgotten news stories revealed—was that in the middle ’60s, ETS changed the verbal section of the SAT, substituting scientific essays for one or two of the fiction selections in the reading comprehension test. Female scores promptly dropped to their “proper” place—visibly below those of their male classmates—and have stayed there ever since.

Asians are the most recent victims of similar policies. Several West Coast schools, most notably the University of California at Berkeley, have experimented with ceilings on the number of Asian students within the last 10 years. A university, the administration proclaims, has the right to put “diversity” above “mere” academic excellence.

In short, the history of other groups of outsiders suggests strongly that if an entire generation of African American young people followed Carter’s advice to meet meritocratic standards and beat them, the Establishment would have no trouble finding some other pretext to exclude all but the most presentable tokens among them from the precincts and perquisites of power–either by changing those standards, or suddenly discovering the greater importance of some other factor.

That does not, of course, invalidate Carter’s advice. It does make one wish Carter were a little more careful about truth in advertising, however. I tend to prefer Malcolm X’s more honest approach, when he advised his followers to read anything they could get their hands on and get all the education they could, even if all it got them was the proud position of best-educated person in the unemployment line.

Was there ever a merit system?

Before the phrase “affirmative action” ever found its way into our vocabulary, the reality of affirmative action was already as American as apple pie. After all, what else is veterans’ preference, if not an affirmative action program for (in the post-World War II era in which it was born) men? What else is seniority, if not an affirmative action program for older workers? I have never known a veteran, or an experienced union man, who was in the least ashamed to have benefited by those affirmative action programs.

Nor should they be. Before the rise of the meritocratic mythology of the ’70s, any American old enough to have held a job at all knew that nobody gets a job solely by virtue of being the most qualified candidate for it. In an economy which has never even aspired to full employment, most available jobs have several well-qualified candidates on hand. Most employment discrimination does not involve hiring an unqualified person in preference to a qualified one, but rather choosing between more-or-less equally qualified candidates on the basis of factors unrelated to the job.

The Jewish Establishment’s position

Many established Jewish community organizations, like many other high-minded, principled opponents of affirmative action, really believe that they are espousing a pure meritocracy as against a system of arbitrary choice. To take that position, they have to presume that, before the 1964 Civil Rights Act, all male Jews had the jobs they were entitled to, by reason of their meritocratic qualifications. They also have to presume that all Jews are male, white, Anglo, and middle-class and have nothing whatever to gain from affirmative action. They have to, in fact, ignore the experience of considerably more than 53% of the Jewish community. They even have to advocate giving back to the same academic and professional Establishment that subjected Jewish males to explicit, exclusive quotas until the early ’60s, the power to do it again.

Two cheers for affirmative action

Most supporters of affirmative action see it as a lesser evil. But, unlike its opponents, they recognize the realistic alternative as a greater evil. Affirmative action is not a matter of substituting for a pure meritocracy a system of choices among qualified candidates according to standards unrelated to job or scholastic requirements. It is a substitution of one set of arbitrary choices for another.

The alternative to affirmative action in real life is the divinely-ordained and legally-protected right of the employer or supervisor to hire people who remind him [sic] of his best friend, or people who fit his stereotyped image of the “proper” telephone operator or waitress or whatever. We know that most people who get jobs get them for reasons only distantly related to their ability to perform. In fact, the most serious downside of affirmative action, so far as I can tell, is that it denies future generations a really useful index of professional excellence. When I meet a doctor, or a lawyer, or a CPA, who is female or non-white (or better still, both) and who got his or her professional credential before 1970, I know I am dealing with a superlatively qualified professional, because only the best women and non-whites were able to survive the discriminatory professional screening processes in those days. For professional women and non-whites with more recent qualifications, alas, I have to take my chances, just as I would with a white male of any age.

So we sincerely hope that the people into whose hands we put our lives, our fortunes, and our sacred honor are in fact qualified to do their jobs. But as a practical matter, we know that we are at least as much at risk from incompetents who were hired or promoted for being the boss’s brother, or being tall, or not being Hispanic, or having an officious-sounding British accent, as from those hired or promoted for being female, Black, or Hispanic–quite possibly more, since the latter are usually watched more closely. In fact, these days I am beginning to suspect that American-born doctors can no longer be presumed to be as competent as doctors with foreign accents, since the latter are subjected to much tougher screening standards.

Well, maybe two and a half

We may see ourselves as winners or losers, and we may attribute our situation to other people or to our own deserts. Human beings generally have never had any trouble taking credit for their own good fortune or blaming others for their misfortunes. More recently, “new age” thinking has led many of us to take the rap for our own misfortunes, often in truly preposterous ways (“How have I created this reality?” the cancer patient asks.) But it is difficult for any of us to admit that our good fortune may be the result of some totally unearned “break” from the outside world–being white, for instance, or male. That is the real threat of affirmative action–that it requires us to consider the possibility that (even if, as is likely, we aren’t as well off as we would like to be) we haven’t “earned” even the few goodies we have. For those of us raised in the Jewish tradition, which teaches us that the Land promised to us by the Holy One is ours only on loan, and that we were not chosen to receive it because of any particular merit on our part, that shouldn’t be too much of a leap. It should make us more willing to grant similar unearned goodies to other people. “Use each man according to his deserts,” says Hamlet, “and who should ‘scape whipping?” Or unemployment, as the case may be. Even us, the few, the proud, the overqualified.

Red Emma

Forethoughts

May 29, 2011

Recommended Reading

I have a client who now resides in a nursing home and is in the early-to-middle phases of dementia. She is also a sci-fi fan, so whenever I clean out my bookshelves, I take the proceeds to her. I am discovering that, while that improves the quality of my life, it doesn’t necessarily change hers all that much. Because one of the few so-far-unheralded upsides of dementia, at least in its early phases, is that you get what I have always wanted—multiple opportunities to read the same book for the first time.

Among the books I have especially wanted multiple shots at in this way are John Brunner’s line of speculative novels: Stand on Zanzibar (1968), Jagged Orbit (1970), The Sheep Look Up (1972), and The Shockwave Rider (1975.) And I spent a fair amount of time wishing there was somebody around right now who writes that kind of stuff, preferably in batches rather than an occasional one-off like Orson Scott Card’s Empire and Hidden Empire (okay, that makes them a two-off, I guess.) I think I’ve found one—John Barnes, author of Mother of Storms, Directive 51, and The Man Who Pulled Down the Sky. Unlike Brunner and Card, he does dabble in the Irwin Allen school of writing (one damn disaster after another), but in the process he takes a serious look at the trajectories of current social, technological, economic, and political phenomena. Consider this a recommendation.

CynThesis
***************************************************************
The Unknowing God

For a period that lapped over into my college years, the existentialists told us that the human race is engaged in a frantic effort to become god. As I think about it these days, I am increasingly convinced that many of us already are god, and we are failing to notice it (and falling down on the job) to a dangerous extent. Me, for instance. Most of my days I spend working, on the phone, on the computer, at the office, in court, at home running around finding things (and of course losing things and not realizing it till later), shopping, and so on. If in the middle of all this, I sit down and call the Wired Cat, and she comes over to me, sits down at my feet, and reaches out her front paw to pat my leg, to which I respond by reaching down to rub her head between her ears and down to her neck, for her this is a religious experience. Her divinity has taken time out from managing the universe to communicate with, relate to, and pleasure her. Sometimes, like most divinities, I do things she really doesn’t like, such as taking her to the vet. She seems to accept this as good for her in some way that I understand and she doesn’t. She’s lucky enough to have a divinity who doesn’t do any of the awful things to her that one hears about on Animal Planet (Mr. Wired is an Animal Cops junkie and a hard-core groupie of Anne-Marie Lucas.) But if I did, she’d probably accept that too, as most domestic animals seem to. The ones who have been too utterly traumatized retreat into the animal counterpart of atheism—the feral life. (Atheism is not actually the right word—I am not the first to wonder if there is a word for somebody who believes in the Holy One but just doesn’t like H* very much.)

And of course, to our children, and to most of the children we come into extended contact with (as teachers, for instance, and maybe as pediatric health professionals), we are also god. (Note the lower-case initial, used—as Grace Slick explained when she named her kid “god”—so we won’t get stuck-up about it.) So far as the kids can tell, we (especially parents but adults in general to a considerable extent) run the universe, and occasionally take time out from doing that to interact with the kids, for better and for worse.

The Bible actually plays with this idea. For instance there are two or three references to judges as gods. (One suspects some of the human authors of these passages spent some time on the bench themselves—certainly ordinary human judges have always tended to see themselves as some kind of deity.) Moses is told that he is going to be “in the place of G-d” to Pharaoh, and that his smoother-talking brother Aaron will be his “prophet.”

And there is a story about a rabbi (Hasidic, I think) who, upon being told that somebody he knew was an atheist, said something like “Well, that’s good. It means that if he sees somebody who is poor or in trouble, he won’t just say ‘G-d will help him,’ he’ll get up and actually do something for the guy.” Even professionally religious people may have a kind thought for people who, not believing in a divinity, feel obliged to fill in for H*.

Which, if you accept the hard-core deterministic schema of the behavior of all non-human entities, means that human beings and their actions are the only preserve of free will in the universe, and thus also the only rational place for the divine to operate, by inspiration and impulse. Many rational religious people have trouble believing that the Holy One has ever made the sun stand still or water run uphill, but will accept a divine push toward extraordinarily decent human behavior—in other words, that we are not exactly in the hands of G-d, sometimes we are the hands of G-d.

Jane Grey
*********************************************************************
War is the End, part II

Does anybody else remember the study that told us we could have won the hearts and minds of the Vietnamese people by giving $10,000.00 to every man, woman, and child in that country, and still have spent less than the $686 billion we actually spent on the war? (Another sourcing problem, obviously.) Anyway, Cecil Adams, of “The Straight Dope,” has heard from a history scholar who says the North could have bought and freed all the slaves in the then-US for something like $72 billion in present-day dollars, which was also considerably less than the overall cost of the Civil War, especially if you reckon costs and damages on both sides, which of course all ultimately came out of US GNP. This once more tells us that wars are almost never “about” their official causes and purposes, which could almost always be implemented a lot more cheaply, easily, and with less violence. War itself, or some so far unknown concomitant of war, makes it an irreplaceable element of human polity.

Red Emma

*********************************************************************
Life Among the Condonauts

I just opened a mysterious envelope from a fellow resident of our condominium building, to discover that, as a member of the condo association, the Wired Household is being sued by another member of the association and by our really heroically estimable janitor, for the alleged misconduct of the erstwhile chair of the association, our upstairs neighbor. This is a peculiarity of condo law—in order to obtain a remedy for some misbehavior by condo association officers, you have to sue the association, even if you are a member of it. Which means that you are, in a sense, suing yourself. You are certainly costing yourself money. All the costs of defending the suit come out of the pockets of the residents. We could even wind up paying the costs of the other side. This damn thing has got to be mediated, ASAP.

I am the only attorney I know who lives in a condo (for 31 years now) and has never served on the board. I really want to keep it that way. Lawyers are easy marks for pleas of communal obligation. But condo boards are a time sink. I just sent a frantic email to the plaintiffs asking them to please consider mediation. Yikes!

CynThesis

A New Look at Child Labor

May 12, 2011

I googled “child labor” recently, and all I could find was stuff on how much of it there still is in the world, and why it is so bad. Nobody seems to be looking at, or even for, an upside. Okay, maybe this is kind of like looking for the upside of the Third Reich (the Volkswagen?) or the reign of Caligula (no bright ideas at all here.) But I think there are actually a few good things to be said about child labor, at least within proper limits.

Depending on what you mean by “labor.” If what you mean by “labor” is doing something that will wear you out and use you up within 20 years or less, no matter what age you start doing it at, then, no matter when you start, it’s a bad idea. Like coal mining, for instance. It was bad when 9-year-old kids were pulling coal carts in 19th-century England, and it’s just about as bad today when 45-year-old men die of Black Lung after 20 years of it, in Kentucky. The use of child labor instead of adult labor has all kinds of nasty side effects, such as lowering the general wage rate (under the odd misimpression that it’s okay to pay for the same work at lower rates when a smaller person does the work), and increasing the unemployment rate among adults. And working employees too hard and too long to allow them any kind of personal life or education is bad, whether you do it to kids or adults. Paying them so little that they have to supplement their wages with the only kind of “moonlighting” they have the time and energy for, namely prostitution—whether you do it to women or children—is as immoral as it gets. For further information, read Dickens.

In short, I’m not sure there is any way in which the bad side of child labor for the child is any worse than the bad side of adult labor for the adult worker. Since adults make the laws, and since one of the bad sides of child labor for adult workers is lowering wages and increasing unemployment, that didn’t really matter much once the groundswell against child labor started to grow. Progressivism and New Deal trade unionism both leaned strongly in the direction of getting people other than prime-working-age adult white males out of the workforce, using whatever rationale happened to be handy at the moment. Which was often good for families, good for adult workers, and good for The Economy.

But societies that have banned child labor (not to be confused with societies that have actually eliminated it) have created problems of their own. The most notable is that, in such societies, children are an economic liability to their parents, and may suffer abuse or neglect from them as a result. In places like China, if you can’t sell your child’s labor, you may end up selling the child instead. In places where nobody’s buying, you may simply abandon the child, either in some exposed place or in some “orphanage.” Either way, the child may die young or never develop its full mental and physical potential.

But as long as poverty persists among families, banning child labor is unlikely to completely eliminate it. Child labor persists in the US in fast food joints, on farms, and most notably in criminal enterprises, where the fact that a juvenile will get no more than a nominal punishment for conduct that could put an adult away for a long time makes “shorties” really desirable employees for look-out and courier duty.

Oddly enough, most families affluent enough not to need to put their children into the legal, semi-legal, or illegal workforce, tend not to expect much labor from them at home either. My mother, who was #5 of 8 children, once told me that her mother told her, “Once your oldest daughter is 8 years old, it doesn’t matter how many more you have; one of the older ones will always be able to take care of the younger ones.” Both because such large families are rare today, and because middle-class Americans disapprove of anybody under 12 doing any kind of child care or major domestic chores, this doesn’t happen any more. Child development “experts” generally believe that children should be expected to help out around the house and clean up after themselves, and should not get their allowance as “wages” for these tasks, but they get listened to only slightly more on this subject than on the topic of corporal punishment, which isn’t much.

Okay, that’s pretty much the adult side of the issue. What about the kids? The advantage of writing about children, of course, is that even if you’ve never raised one, you and every other person on the planet have been one. (Original sin consists of having been born with parents, which is why Adam and Eve escaped it.) Do y’all remember the time in your early teens and the years just before that when you really really really wanted to do something real and significant and useful and necessary? There are long stages of child development in which the child’s play consists of nothing but imitating (to the best of her knowledge and ability) the adult’s work. Sometimes that knowledge and ability can be pretty impressive. The computer skills of people we usually regard as “kids” can be downright amazing, and sometimes even remunerative. The Wired Daughter, between ages 15 and 18, got herself a job in a social service agency working with runaway youth, doing all kinds of statistical correlation and record-keeping, much more skillfully and assiduously than most adults I have known doing the same kind of work. Because it was a non-profit, nobody worried much about child labor laws, least of all our daughter, who was having the time of her life. Once she turned 18 she turned (temporarily, thank heaven) into a slacker. But not letting her do the work of her choice before that would have been a real injustice to her. When my nephew was the same age, he worked until well after the official closing time in a local restaurant, and found it both enjoyable and liberating. When I was the same age, I was learning to sew, and type, and cook, and write. All of us, of course, were also going to school and doing pretty well at it. None of us were dependent on earnings from such work. Which gets rid of most of the downside of child labor. I think that’s just a stage of development kids go through, with or without compensation, and it’s a good thing for all of us that they do.

On the other hand…

As more and more “middle-class” families in the US find themselves sliding out of the bourgeoisie, the role of child labor in such families will become more and more difficult. Most middle-class and even working-class families today do not expect their children to contribute to the household income, even by paying rent when they are working full-time and living with their parents. Most middle-class parents are really uncomfortable sharing the financial realities of their lives with their children (often, even after the said “children” have long since reached adulthood.) The whole point of being “middle-class” in this culture’s families is that the parents never have to admit to their children that they can’t “make it” in this economy, or even seriously discuss what it would mean not to “make it.” No doubt it’s comfortable for a child to believe that the parents will always be able to “manage,” just as it’s comfortable for the child to believe that Daddy can beat up any other guy on the block. Until recently, the majority of American kids had no reason to disbelieve either proposition. Now, child development “experts” are taking on these issues, with varying degrees of success. It would probably help them, and the parents they advise, and the children who do or don’t benefit from that advice, if we could start talking more explicitly about what children can do to help their families in a bad economy, and why letting them do it isn’t unthinkable.

Red Emma

War is the End; the State is the Means

April 27, 2011

Just finished reading Nicholson Baker’s piece on pacifism in the latest Harper’s. It dovetails nicely with some other thinking I’ve been doing lately. Specifically, I’m remembering the ten years of the Vietnam War, and what it felt like at the time, and trying to figure out why Americans, even those most opposed to the current ten-year wars in Iraq and Afghanistan, are so much less passionate in their opposition than we were to the Vietnam War. One of the major differences, of course, is that we have no military draft today.

I was very active in the struggle against the draft during the Vietnam War, and got the ultimate rush, 20 years later, when one of my students, in a discussion of relatively recent history, literally could not remember the words “draft” or “conscription.” By George, I thought. We really did it! Many of my more radical friends and colleagues, at the time, predicted that ending the draft would take a lot of the juice out of opposition to any future wars. I allowed that they were probably right, but that even so the unspeakably hard choice to kill or not kill ought not to be forced on any unwilling person. I still believe that. But it’s obvious that, without a draft, this war, or the next or the next after that, could conceivably go on forever. That’s how all those European wars—the Seven Years’ War, the Thirty Years’ War, the Hundred Years’ War—got to be so interminable. They were not fought with conscript armies. Neither were the conquests that built and maintained the Roman Empire over 400 years, or the British Empire for 200+ years.

Baker takes up the issue based on what most of us have seen as the ultimate hard case against pacifism, the Second World War/the Holocaust. If you assume, as most of us have after the fact, that the war was necessary to save what was left of the Jews in Europe, then how could one argue against it? What originally disabused me of that notion was reading Arthur Morse’s While Six Million Died, published in 1967. Subsequent research only strengthens the premise of that book—the Second World War may or may not have put an end to the slaughter of the Jews of Europe, but it clearly was not fought for that purpose. We need to disentangle the war from the Holocaust to make sense of either of them.

In point of fact, to be sure, the Holocaust and World War II went on at more or less the same time, and were instigated by a lot of the same people. But they were very different phenomena. They gave rise to very different responses (even from the same people.) And, while they were causally inextricably related to each other, that relationship was almost unimaginably complex. The war provided a pretext for the Holocaust, as war almost always provides a pretext for oppression (up to and including murder) of noncombatant minorities, viz. the Armenians. And the Holocaust, ultimately, obstructed the Nazi conduct of the war, probably fatally. Hitler wasted resources on killing Jews and other “inferior” races that he could have devoted to beating the Allies. (Which may partially account for the reluctance of the Allies to do anything that might have impeded the Holocaust.) He expelled from Germany the Jewish and anti-Nazi scientists who might have given Germany the nuclear bomb. The Six Million, arguably, were martyrs to the Allied victory. Without their deaths, that victory might never have happened.

Those who opposed the Nazis at the time, both in Germany and elsewhere, opposed them, not because of their treatment of Jews and other minorities, but for pretty much the same reasons the Allies had opposed Germany in World War I and the democratic forces in Germany had opposed the Kaiser. Hitler was well on the way to conquering the world. In the course of doing so, he had eliminated most of the hard-won democratic rights enshrined into law in the Weimar Republic. Which is what happens to the civil liberties of citizens in almost any war. Good enough reasons, to be sure, and by no means to be sneered at. But even the staunchest anti-Nazis, at home and abroad, at best had little concern for the Jews, and at worst viewed the racist Nuremberg Laws as one of Hitler’s few good moves. This was as true of anti-Nazi resistance in occupied countries as in Germany itself. Indeed, there were anti-Nazi partisan units in Eastern Europe that killed Jews in their spare time, when Nazi-fighting got slow.

The British found it inconvenient to notice the plight of the Jews, because they were being called on to respond by opening up Mandate Palestine to Jewish refugees, at the expense of British relations with the Arabs. The Americans stayed out of the war until Pearl Harbor was bombed, fortuitously, by the Japanese–because American public opinion tended to side with the Germans against the Jews, but could easily enough be swayed against non-whites who had had the nerve to bomb American territory. The French had no choice but to respond to the invasion of their territory–but their struggle against the Nazis stopped short of any serious effort to protect French citizens of Jewish ancestry, much less alien Jewish refugees from further east. Indeed, rounding up Jews was one of the few activities in which many of the French cooperated willingly or even enthusiastically with the Germans.

The allied War Crimes Trials in Nuremberg made clear what the Allies considered to be the real offenses of the Nazis: violation of treaties, making of aggressive warfare, and torture and murder of Allied prisoners of war. The Nuremberg trials had virtually nothing to say about Nazi treatment of enemy civilians, and nothing whatever about Nazi mistreatment and murder of German and Austrian citizens. It was left to the Israelis and the successor governments of the formerly occupied countries to prosecute those crimes. Obviously none of them were in any shape to do so until at least the 1950s. By then many of the major war criminals were safely hidden away on other continents.

The switching of gears came in the 1960s. It was partly precipitated by the capture and trial of Adolf Eichmann (and Hannah Arendt’s in-depth coverage of it) between 1961 and 1963, and partly by the intensification of the Vietnam War. At that point, hawks, especially liberal hawks like the Henry Jackson faction of the Democratic Party, were holding up World War II as a shining example of a just war fought to protect a helpless minority against a marauding dictator, and a model for U.S. participation in the Vietnam War. It was the American war machine which, in the words of Herman Wouk, “kept my grandmother from being turned into soap.” Draft boards and congressional hawks stated over and over again that opposition to the Vietnam War was equivalent to the America Firsters’ opposition to American participation in World War II, which in turn was tantamount to endorsing the Holocaust. The “war crimes” actually tried at Nuremberg were hardly ever mentioned–except occasionally by anti-war advocates. Pro-war forces gave up their use of the Holocaust analogy only after the My Lai massacre, when it became fairly obvious that the U.S. military was killing at least as many civilians as the Viet Cong.

In the Vietnam and post-Vietnam rationale, the reason the Nazis were Bad People was their murder of helpless civilians, especially Jews. American World War II movies made in the ’60s and after often portrayed German soldiers who weren’t in the SS as “good Germans”, tragically honorable men doing what any patriotic citizen would do (including, presumably, aiding and abetting all the crimes prosecuted at Nuremberg), as opposed to the “bad Germans” who ran concentration camps. It might be inhumane to put civilians into concentration camps and gas them, but strafing, shelling, or dropping bombs on them from overhead was just a normal exercise of warrior morality, i.e., the same sort of thing our warriors were doing.

Getting back to Baker, he goes into considerably more detail than Morse about pacifist opposition, and the reasoning behind it, to American participation in World War II. Many of the pacifists of that era, including important Jewish spokesmen, accepted well before our time the premise that the purpose of any such participation was to save the European Jews, and by extension the Jews in the rest of the world not yet directly threatened by Hitler. But why not find some way to save the Jews that did not involve widening the war? they asked. “The Jews needed immigration visas, not Flying Fortresses. And who was doing their best to get them visas, as well as food, money, and hiding places? Pacifists were,” Baker points out. Moreover, if the purpose of the war was to stop Hitler, war might be precisely contraindicated. “…what fighting Hitlerism meant in practice was…the five-year-long Churchillian experiment of undermining German ‘morale’ by dropping magnesium fire-bombs and 2,000-pound blockbusters on various city centers. The firebombing killed and displaced a great many innocent people—including Jews in hiding—and obliterated entire neighborhoods. It was supposed to cause an anti-Nazi revolution, but it didn’t….If you drop things on people’s heads, they get angry and unite behind their leader. This was, after all, just what happened during the Blitz in London.”

Baker takes a perspective on the Holocaust that I found startling: that it was “the biggest hostage crisis of all time.” Hitler’s threats against the Jews of Europe were largely unfulfilled before the US entered the war. Many anti-war activists proposed negotiating at that point, when the US still had something to offer in exchange for the lives of Europe’s Jews. Holocaust historians Saul Friedländer and Roderick Stackelberg suggest that, although Hitler had long planned the killing of all Jews under German control, “its full implementation may have been delayed until the US entered the war. Now the Jews under German control had lost their potential value as hostages.” The first extermination camp, Chelmno, began operations, coincidentally (?), on December 8, 1941. Pacifist and near-pacifist advocates continued to call for “peace without victory”, an end to military operations in Europe on condition that the Jews be allowed safe passage out of Europe. It was not a popular suggestion among Allied politicians. Among the excuses for not even considering this possibility were Churchill’s statement that “[e]ven were we to obtain permission to withdraw all Jews, transport alone presents a problem which will be difficult of solution.” Anthony Eden, his foreign secretary, told the American Secretary of State that “Hitler might well take us up on any such offer, and there simply are not enough ships and means of transportation in the world to handle them.” This from the engineers of the Dunkirk evacuation two years earlier, who had gotten nearly 340,000 men from the French beaches to England in a mere nine days!

Baker is either a nicer person than I, or just more cautious. These lame obfuscations make it obvious to most modern readers that the Brits—and the US State Department—would not have wanted a massive influx of Jewish refugees even if all of them had somehow grown wings to fly themselves out of Europe. The real point was that both countries had a lingering substrate of anti-Semitism to deal with, both in the general population and among their diplomatic apparatchiks in particular. Many of their citizens were likely to be lukewarm in their support of the war if they thought its purpose had anything to do with saving Jews. The diplomatic establishments were nice enough to consider acknowledging this in official communications to be a breach of etiquette, but not decent enough to overcome it with an offer to save Jewish lives. If the Jews were to be saved, the Anglo-Saxon alliance was declaring, it would have to be as an incidental—or perhaps even accidental—by-product of a war being fought for utterly different reasons.

If even World War II, for which the most noble and humanitarian purposes have since been adduced, was not in fact fought for those purposes, what does that say about the rest of the wars which have bloodied the world since humans first agglomerated into groups large enough to have wars? What are the real reasons for war?

The first and most obvious one is They hit Us first. Beginning with the first blood feud, this becomes problematic, because each “first blow” from Them always turns out to be a response to a pre-first blow from Us, and so on. So let’s abandon that game, or at least recognize it for the fraud it is.

The next most popular reason is They might hit Us first, if We don’t hit Them first, which is vulnerable to the same realities.

Then there’s if We don’t hit Them, Those Other Guys Over There might think We’re weenies and start hitting Us. In this age of universal publicity, it should be fairly easy to deal with this proposition without actually hitting anybody.

The fact that both sides, in any war, can come up with some reason for their behavior makes it pretty clear that those reasons are really nothing but excuses.

So if there are no bona fide purposes for war, why do we do it?

I suspect this hypothesis isn’t even original, but here it is: war is not a means to achieve an end. If it were, many of those ends might be achievable by other means. Somehow, that never happens. Because war isn’t a means, it’s an end. Clausewitz to the contrary notwithstanding, war is not the continuation of politics by other means. It is the purpose of politics. It is the purpose of the nation-state (and the street gang, and the clan, and arguably the religion, and maybe even the family.) Domestic politics, and government, and the arts of peace are merely things to do in the interval between wars, to give the crew time and resources to break down the set, get the audience out, build up the new sets, find a new script and get all the lines learned, and then get the new audience in. In the American political context, the Republican party is more honest about this. The Democrats are willing to help us fool ourselves that we don’t choose war. Like Michael Corleone in his declining years, we just get pulled into it against our will because we’re such nice guys. The post-Vietnam series of wars and incursions—Panama, the Balkans, Lebanon, Kuwait, Iraq, Afghanistan—aren’t an aberration. Vietnam was the blip. Vietnam was the play to which we reacted as if it involved real people dying real deaths. Abolishing the draft has revived the concept of the “theater of war.” Vesti la giubba.

Red Emma

Surviving in Third-World America

April 12, 2011

Do you ever get tired of hearing that the U.S. is the only western industrialized country that (doesn’t have handgun control/doesn’t have a national health care program/has an infant mortality rate over __%/imprisons more than __% of its citizens/pick one)? After hearing so many of the pronouncements indicating that we trail the industrial West in good stuff and lead them in bad stuff, are you starting to wonder whether the U.S. really is a western industrialized nation any more? Is it possible that we’ve become, or are at least well on the way to becoming, a Third World country? After all, we are no longer the world’s wealthiest nation, nor its healthiest, nor its best educated. Now that the Soviet Union is no longer marking the boundary of the First World, maybe we are. And how long will it be before we mark that boundary from the wrong side?

I’m willing to leave the geopolitical and macroeconomic implications of all that to the politicians. What concerns me is what concerns just about any ordinary person–how to make it from day to day in a Third World, or nearly Third World, country. Obviously, the best way to research this question is to ask people who’ve done it, more or less successfully, all their lives–the ordinary, would-be middle-class people from various Third World countries. Or at least to learn as much as possible about them.

So, based on what we know about real life from Third Worlders, here are some basic suggestions:

* In unity there is strength. Extend your family as far as you can. Begin with real relatives, by blood or marriage, and then quasi-relatives (exes and steps and their families) and then what anthropologists call “fictive kin”–godparents and foster siblings and so on. Cultivate these relationships and use them for the benefit of all concerned.

* One of the most important ways to do this, of course, is to share living space, especially if somebody in the family has a large, fully-paid-for house. This gets everybody economies of scale in housing, utilities, and food. It also puts people who have both jobs and small children within easy reach of potential baby-sitters with neither.

* If you can’t extend your family, you can at least create one. Get married. Form close friendships. Join cooperatives of all kinds. Join the church/synagogue/mosque/coven/whatever of your choice. Making it will be hard enough in the company of others. Alone, you’re probably a dead duck.

The only possible exception to this rule is children. Third Worlders typically have them–lots of them, if possible–for retirement insurance. But Third Worlders generally are required to expend fewer resources up front on their kids than American child labor and compulsory education laws allow. Give this one some thought.

* Stay out of the official dollar economy as much as possible. The IRS, of course, frowns on “off the books” income and untaxed barter. But even they have not figured out how to tax you on the value your do-it-yourself activities add to your assets. The official money economy in Third World countries is rigged to underpay the non-rich to the point of starvation while extracting from them in prices and taxes more than they can possibly afford. The only way to survive in such an economy is to stay out of it, both for production and for consumption (including credit–borrow from family, borrow from friends, borrow from your business colleagues, and then let all of them borrow from you, but stay out of the official credit market if you possibly can. Likewise, don’t lend in that market–that is, keep your savings out of banks.)

* Play the lottery–but not very much. It is true, of course, that your chances of winning are slightly less than your chances of getting struck by lightning. But they are also only slightly less than your chances of attaining the American Dream in any of the official legal jobs likely to be open to you. Buying a ticket nearly doubles your chances. It’s hard to beat odds like that. But buying more than a couple of tickets a week is a bad investment of money you should be using elsewhere. By the way, if the prior history of American lotteries is any indication, this batch will be around only another fifteen years or so, so take advantage of them while you can.

* Use public amenities creatively, while there still are any. Their days too are numbered, but while they last, public schools, public libraries, public parks, public hospitals, and similar amenities are usually perks of living in a particular locality. Therefore, given your limited stock of housing dollars, you are usually better off spending them on cramped, shabby accommodations in an affluent town or neighborhood than on a commodious, well-appointed place in the slums. Besides, your well-off neighbors are more likely to have jobs for you–both long-term and free-lance–than slum-dwellers are. And they are usually canny shoppers, so the assortment of merchandise available to you in local stores will be higher quality at lower prices. You and your family will have a much better chance of making good business contacts too. In short, unless you have ambitions in local politics, it is better to be the poorest person in a rich neighborhood than the richest person in a poor neighborhood.

* Education will still pay off, but will be a lot harder to get, and won’t necessarily bring your income above the poverty line. Nevertheless, get as much of it as you can, and try to keep as much of it as possible in four-year colleges, which produce slightly more respectable credentials than community colleges. You may want to consider going outside the U.S., at least for your bachelor’s degree–it could be cheaper.

* Don’t plan to retire. You will probably never get a private pension, and the value of your Social Security grant will almost certainly diminish rapidly. So be prepared to look for the odd jobs you are still physically capable of doing, most notably childcare and other home help for employed family members.

* Stay healthy. If you can’t stay healthy, at least try to stay out of the official health-care system, which you probably can’t afford, and which probably can’t do much for you anyway. Better you should spend your health care dollars on (a) studying self-care; (b) alternative practitioners recommended by people you trust who have not yet died of their own health problems; or (c) if you must use “official” practitioners, use the lowest professional level available to you–that is, better a Physician’s Assistant than a physician; better a Nurse Practitioner than a PA; better a Registered Nurse than an NP; better a Licensed Practical Nurse than an RN. The lower down on the professional scale you go, the more personal attention you are likely to get. Whenever possible, stick with practitioners you pay out of your own pocket–they’re cheaper, they are accountable directly to you rather than some insurance company, and they still realize you have the option of not coming back next time if they screw up this time.

* Stay morally connected. Be active in religious, neighborhood, civic, and volunteer organizations. They will remind you–when it is very easy to forget–that there is more to life than survival, and that, even if the big corporations that control the few remaining permanent full-time secure jobs consider you less than the dust beneath their big wheels, there are plenty of people around you to whom you are not merely valuable but essential.

* Similarly, if you have some sort of artistic or intellectual talent and can’t get the official purveyors of culture to take notice of it, don’t let that stop you from putting it to work in blogs, local newsletters, murals, amateur theatricals or whatever, which are probably the only art your friends can afford. Who knows–someday it may get noticed by the official critics. But even if it doesn’t, you have given and received pleasure.

* You may have to do a lot more groveling than you are used to. It is possible to be marketably obsequious and still keep your self-respect, simply by maintaining your objectivity behind your mask (see W.E.B. Du Bois). We Americans have long believed that people who have attained wealth and prominence must be more deserving than the rest of us. As long as ordinary people had a reasonably decent chance of achieving some wealth and prominence of their own, that was a harmless delusion. Now, however, it is dysfunctional and can even be deadly. The only way to survive psychologically and morally in a Third World United States is to be absolutely certain that, as a human being and a citizen, you have the same ultimate value as any other human being and citizen. Do not allow yourself to become part of any institution that undermines that conviction, unless it pays you relatively well. And withdraw your attention and your allegiance from the artifacts of commercial culture that undermine your sense of your own value as a human being. Nobody, after all, is paying you to watch television, so your family loses nothing if you stop watching.

* Do your homework–speculative fiction is rich in models for the world we are moving into, from the novels of John Brunner (The Sheep Look Up, Stand on Zanzibar, The Shockwave Rider) to Philip José Farmer’s “Riders of the Purple Wage” to Robert Heinlein’s future histories. Not to mention, of course, Orwell’s 1984 and Aldous Huxley’s Brave New World. See also Strieber and Kunetka’s Warday and Nature’s End. These are just my particular favorites–there are lots more where they came from. If there is one thing we have learned in the past fifty years, it is that if the warped mind of a speculative fiction writer can imagine a shape for a future dystopia, the grasping hands of a political or economic establishment can implement it.

* Furthermore, there are plenty of ways to learn more about how today’s real-life Third-Worlders are managing. Among the goodies available in public libraries are magazines and newspapers from such places, many of them in English (which is, after all, one of the official languages of India, the Philippines, and many African countries.) Much of the fiction of modern India, the West Indies, and Africa was written in English, and much of it is richly informative.

* And note, by the way, that used books are probably one of the cheapest forms of recreation available. The only things cheaper are the public domain books available for free on your computer or iPhone.

* Learn to like rice and beans. Together they make up the complete protein necessary for good nutrition, as well as containing lots of fiber. With a little celery, onion, and garlic, they can provide most of your nutritional needs for literally pennies a day. They’re probably healthier than whatever you’re eating now.

* Don’t drink the water. Not unfiltered, anyway, and not bottled—that’s just a waste of money and of valuable natural water imported from many places that need it badly themselves, like Florida. Pick up a used scouting handbook and find out all the cheap and quick methods to purify questionable drinking water. Note that, if you live in the country, the air may smell better, but your drinking water may already be dangerously contaminated with pesticide and chemical fertilizer runoff. Urban problems will be different, but just as serious.

I am not suggesting that the Third-Worldization of the United States is a good thing, or only trivially harmful. On the contrary, for most ordinary people, it can mean perpetually living on the edge of catastrophe and occasionally slipping over it. But it is time we started getting prepared for it, while we still can.

Red Emma

Sam Harris and the Fundamental Things: Do They Apply?

March 4, 2011

Well, not exactly. Sam Harris, the affable atheist who claims that a system of morality can be established by scientific thinking, leaves a hole in his system big enough to drive a juggernaut through. He starts with the age-old utilitarian presumption that pain is bad and pleasure is good, and the even more preposterous presumption that everybody agrees on those two truisms.

Let’s look at the purely material facts. It took medical science, in the guise of the National Institutes of Health, until the late 1960s to discover that physical pain is bad for you. Duuuh. Until then, it was regarded as, at best, merely a symptom, an indicator of some other problem. It was useful, and interesting, only to the extent that it was a valid and scalable indicator (that is, that severe pain indicated a serious problem, while minor pain correlated with a minor problem. Which in fact ain’t necessarily so.) The use of anesthesia for surgery goes back to the ancient Greeks and their contemporaries in India and China. But it was used, not because it made the patient feel better, but because it’s easier to operate on a patient who can’t fight back. Medical science (such as it was) was perfectly fine with pain in situations that did not inconvenience the physician or, more especially, the surgeon. (Docs here, please feel free to argue this point.) Which is why anesthesia for childbirth was not widely used until the late 19th century, and faced strong opposition from both the medical profession and religious authorities then.

Religious authorities. Ah yes, there’s the rub. There’s where Sam Harris meets his unacknowledged opposition. Genesis 3:16 portrays the Holy One telling Eve, “I will sharpen the pain of your pregnancy, and in pain you will give birth.” So the Victorian divines told their medical opposite numbers, who are you to mess with the divine plan? Women are supposed to have pain in childbirth. It took Queen Victoria herself to overwhelm these pronouncements by having her 8th and 9th children delivered with the assistance of chloroform. (Her Majesty was in many ways not all that Victorian. She was also one of the first people in England to have a telephone in her home.)

Well, okay: long before Victoria made pain relief in childbirth socially acceptable, Jeremy Bentham had propounded the philosophy of utilitarianism, whose goal was the greatest good (which he pretty much equated with pleasure, or at any rate the absence of pain) of the greatest number. But the church authorities didn’t like him much better than they liked anesthesia. From their point of view, Bentham was barking up the wrong tree. Material well-being was irrelevant to them. And that point of view did not die out with the Victorians. It is still with us today. Innumerable religious thinkers even today tell us that suffering is not merely inevitable but, in many instances, good for us.

The most intelligent and graceful defense of this position is probably that of C. S. Lewis, in his two masterful books (separated by 20 years and the death of his beloved wife), The Problem of Pain, and A Grief Observed. Suffering, he tells us, is the Holy One’s tool for helping us become better and ultimately perfected.

The Roman Catholic view of suffering was that the sufferer could “offer up” her suffering as a form of prayer, or more accurately a form of sacrifice, to help redeem the world. Dunno whether this is still current. There is something to be said for this approach to unavoidable pain—it gives it meaning, and may thereby make it more endurable. But, at least in the Middle Ages, and even today in some monastic and ascetic communities (such as, famously, Opus Dei), people have been encouraged to deliberately seek out pain, and even inflict it on themselves, in order to be able to use it, either for one’s own spiritual improvement or for the redemption of the world, or both. Orthodox Muslims seem to follow these same paths, up to and including self-inflicted suffering.

The Jewish tradition, while it does not encourage voluntary suffering, is realistic about the prevalence of unavoidable pain (as one would expect from its history.) We believe in relieving pain and suffering to the extent possible given the science and technology of the day, but we also try to confer meaning on unavoidable pain. That’s the whole point of the Book of Job.

The Buddha teaches that suffering is intrinsic to normal human existence (that’s the First Noble Truth), and that most of the ways we use to avoid or lessen it don’t work (that’s the Second one), but that enlightenment as to the true nature of human existence can enable us to transcend it (that’s #3.)

The Stoics did a lot of thinking about suffering too. They were, so far as I can tell, the first to stand the inevitable why me? on its head and ask why not me? Who am I to be exempt from the normal costs of human existence? Why should I find my own suffering any more problematic than the much greater suffering of enormous numbers of other beings, past, present, and future? They did expect this contemplation to make suffering more endurable, which is a little hard for us moderns to accept, but it’s still an approach worth taking.

Sam Harris is, of course, a neo-utilitarian who doesn’t even give credit where credit is due (thereby, according to the Talmud, postponing the redemption of the world. But I digress.) For his fellow neo-utilitarians, his argument is perfectly sound. But he’s ignoring a large proportion of the human race, which is downright dangerous, and for sure isn’t good science, since it skews the rest of his sample. Sure, it is possible to establish a utilitarian morality which is scientifically valid, if you start with utilitarian premises and are addressing only other people who accept those premises. That’s not science, that’s just technology.

Jane Grey

In Praise of Folly

February 20, 2011

Item 1: So far, there’s one consequence of the 1995 government shutdown that I haven’t seen anybody mention. It was the direct cause of President Clinton meeting Monica Lewinsky. Ordinarily, she would never have spent any substantial amount of time in the Oval Office, as a mere intern. But when Congress shut off the money supply to the federal government, Monica, like all the other interns, suddenly became essential. She could continue to report to work, and even get enlisted to do things interns normally never did, because, unlike the civil servants who normally did them, she didn’t get paid. The pundits now discussing the possibility of another shutdown generally think the Republicans lost that round. If you factor in Monica, that’s not so clear.

Item 2: new entries on the Bennigan’s Index—of the two new eateries advertising their plans to open up within one block of my office, only one has actually done so. I’m getting really skeptical about the other, given that it’s been six months now. And in the meantime, two other cheap eateries in the next block have closed down. This is not encouraging. And of course, Giordano’s Pizza has just filed for bankruptcy.

Item 3: which leads one to wonder. Last year, several economists mentioned the second round of the Great Depression that started in 1937 as a direct result of Roosevelt cutting spending and raising taxes to reduce the deficit. This year, nobody’s talking about it at all. Instead, the GOP is suddenly utterly panicked about the deficit, which of course bothered them not at all when Bush was running it up in the first place.

Item 4: speaking of which, Mr. Wired is watching the SyFy [sic] Channel marathon of disaster movies, which this week is mostly about snakes gone wrong. Roger Ebert once characterized a certain genre of films as “idiot movies,” in which every time a character had to make a choice, it was always the stupidest choice possible. Most of the SyFy disaster films are more like the Ten Little Idiot genre, in which we watch a whole series of characters make such choices, and we get to bet on which one is still standing at the end. Not unlike presidential primary season, except that even the meanest monster snake is still kind of pretty, compared with many politicians.

Item 5: And Republican Congresscritter Mike Beard (see http://www.huffingtonpost.com/2011/02/16/mike-beard-natural-resources-god_n_824312.html?ir=Religion) seems to think that if he eats all the pie, G-d will put another one in front of him. Didn’t the Greeks have myths about this?

Red Emma