Archive for the ‘can’t we all just get along?’ Category

The Broccoli Reflex

August 14, 2012

[I originally wrote this in 1992 for a newspaper I contributed to at the time. I understand that this is now construed as auto-plagiarism unless properly confessed, but it still seems relevant to current realities, and is hard to find anyplace else, so I am, modestly, reprinting it here as a public service.]

Quick, what do broccoli, tofu, fruitcake, and Democratic presidential candidates have in common? The first answer is probably most people’s first reaction to all of these: “Eeeeeeuuuww!” The second answer is that I suspect very strongly that this reaction, in all four instances, has been conditioned by, if not a Sinister Media Conspiracy, something at least as effective.

Kids are raised, from the first time they set eyes on televised food commercials, to dislike vegetables, and especially broccoli. Sometimes it is the purveyors of some veggie delicacy themselves who teach this lesson. “You may think broccoli is yucky, but we do something to it that you’ll like!” Tofu is the butt of everybody’s jokes about Japanese cuisine, nouvelle cuisine, vegetarian cuisine, and healthy New Age living. And fruitcake, over the last few years, has become a staple of Christmas jokes. Nobody eats fruitcake, the joke goes; they just wrap it up and pass it around the family from generation to generation, using it as a doorstop between holiday seasons.

In point of fact, broccoli, like any other green vegetable, can be quite tasty if not overcooked. Tofu takes on the taste of whatever it’s cooked with, for better or for worse. Which means that, cooked with decent seasonings, it can be a tasty, no-fat substitute for meat or cheese. And fruitcake–well, I may be prejudiced by the fact that my family recipe for fruitcake starts out with soaking a bunch of dried fruit in rum for 24 hours or so, but I like fruitcake, quite a lot actually, and so do about half the people I know.

Still, the bad press given to these laudable foods is really harmless in the greater scheme of things. What happens to Democratic candidates is more serious. For instance, a poll done shortly after some spectacularly bad economic news last fall indicated that 56% of the population would vote for an unnamed Democratic candidate (sort of like a first draft choice, I suppose) against Bush. But the figure dived to well below 50% for any specific Democrat.

In 1984, Ronald Reagan not only succeeded in carrying the popular and electoral college vote against Walter Mondale, but in completely destroying Mondale’s personal credibility. A monogamous churchgoer who had spent twenty-odd years getting regularly re-elected to the Senate from a conservative, sobersided state somehow became perceived overnight as a weak-kneed defender of sexual promiscuity and financial profligacy. He has essentially not been heard from since, and could probably not be elected to a local school board.

Then, in 1988, a wide array of well-educated, experienced candidates with a variety of interesting positions on important issues got shot down, one by one, for a spectrum of personal failings ranging through all the Seven Deadly Sins. The lone survivor of the process, Michael Dukakis, was the successful governor of a then-thriving state. But before the campaign was over, he had managed to blow a 17-point lead and came out looking like the patron saint of wimps and rapists. His credibility was destroyed, and he would have a hard time getting a credit card these days.

After these depressing examples, one can hardly blame the electable Democrats for not getting into the race until about two months after the start of the usual season, or Mario Cuomo for being unwilling to get into it at all. In most elections, even the loser gains something, be it only name recognition for a business or professional practice, or a good shot in the next election. When a Republican loses in the presidential primaries or the election, he can live to campaign another day, possibly for another office. Look at Reagan. And Bush. And Goldwater. Even Nixon is surprisingly lively. But when a Democrat does it–at least since McGovern–he’s out of the picture, and out of almost any picture, forever. Clearly, he has very little to gain and almost everything to lose by running. Now that any Democrat with the IQ necessary to sign his own nominating petition has figured this out, we have to assume that those still willing to run are either crazy or very, very gutsy and dedicated.

Maybe that really is how we progressives want our candidates selected–the survival of the craziest. But if it isn’t, we need to bring to our own awareness, and then to the public’s, the insidious mechanism that clicks into action against any Democrat the instant he becomes known as a possible presidential candidate–the conditioning program to trigger the Broccoli Reflex. Face it, folks, nobody could be simultaneously as vapid and wimpy and corrupt and stupid and insubstantial and dangerous and dull as we always end up believing all of the Democratic candidates are, and still tie his shoes and stay out of jail, let alone get elected to state or federal office and perform even the most minimally ceremonial duties of that office. A Democrat could have the charisma of Franklin Roosevelt, the vision of Eleanor Roosevelt, the devotion and integrity of Mother Teresa, the brains of Albert Einstein, and the good looks of Robert Redford, and the Sinister Media Conspiracy would still find a way to trigger the “Eeeeeuuuww!” reflex at the mere mention of his name or party affiliation.

When we hear the current batch of candidates called the six-pack (“all lite, no head”), we need to recognize that this isn’t political satire, it’s operant conditioning. The GOP mastered the trick by accident in 1972 (actually coining the neologism “ultraliberal” for the occasion because even they knew nobody would swallow the idea of McGovern as a radical), and lost by accident to the same mechanism in 1976. (Everybody thinks it was the Nixon pardon that cost Ford the White House. In fact it was Chevy Chase’s persistent portrayal of Ford, on “Saturday Night Live,” as a maladroit malapropist.) Since then, they haven’t faltered once, and nobody has spotted the wires under their levitation act.

I don’t mean to imply that all Democratic candidates do in fact combine all the better traits of a Roosevelt/Teresa/Einstein/Redford hybrid. Obviously, Hart and Biden really did adulterate and plagiarize, respectively, and some of the others made minor but genuine goofs that year. But a Republican can lie, cheat, steal, fornicate, adulterate, and sell out the entire American economy to the Japanese–can have the brains of Dan Quayle, the family life and war record of Ronald Reagan, and the ethics and looks of Richard Nixon–and still be perceived by just about everybody, including most Democratic voters, as “presidential caliber.”

There is more going on here than meets the eye. Broccoli, fruitcake, and tofu were only trial runs. The way things are going now, in 1996, the Republicans could run a Big Mac for President–a Big Mac over 35 years old!–and win. Next time somebody says “six-pack” (except, of course, when referring to beer), STOP the conversation right there. And don’t let it proceed until you have forced all participants to ask themselves “Where did I hear that? Do I really want to say it or endorse it? What do I really know about any of these guys?” And let’s be really conscious that there is a real difference between satire and sabotage.

Red Emma

The Optical Illusions of the Soul

November 20, 2011

Google “optical illusions” and you will pull up a huge number of moving and stationary, black-and-white and color, geometric and random drawings that have in common the ability to look first like one very definite image, and then like a totally different one, to the normal human eye. I just finished reading a Dan Simmons novel, Flashback, which had pretty much the same effect on me, not for the first time. Reading Ayn Rand does the same thing. So do some of the postings on this blog. They make me acutely aware that my fondest dream may be your worst nightmare. Dunno whether I am the only person for whom my fondest dream may be my worst nightmare. That, of course, is why we Wired Sisters are multiple.

I’ll start with the religious version of this phenomenon, which I used as a Rosh HaShanah discussion last year under the title The Abrahamic Split. Some bibliographical and linguistic notes: Rabbi Michael Lerner is the author of Jewish Renewal. “Milchemet Mitzvah” is, in traditional Jewish thinking, a war which is commanded by the Holy One, as opposed to wars which are either optional or forbidden. The only one of these that everybody seems to agree on was the conquest of the land of Canaan by the Israelites on their way out of Egypt, the subject matter of the second through fifth books of the Bible. “Midrash” is how the various scholars explain what the biblical characters did between the installments of the text. What Woody Allen does at the end of this discussion is also midrash. Maimonides was a twelfth-century rabbi, scholar, philosopher, and physician, whose views of scripture often seem to come out of left field. So here it is. Next posting will be the political angle.

Over recent decades, we have become conscious of a double voice in the Jewish tradition, a voice on one hand of “love your neighbor as yourself” (Leviticus 19:18), and on the other of “remember Amalek” (Deuteronomy 25:17.) Those of us who follow political and religious controversies are all too aware that this double voice is duplicated in Islam ([Qur’an, Sura 2:256] “There shall be no compulsion in religion…” and [Sura 18:29] “Proclaim: ‘This is the truth from your Lord,’ then whoever wills let him believe, and whoever wills let him disbelieve,” but on the other side [Sura 47:4] “When you encounter the unbelievers, strike off their heads until you have made a wide slaughter among them…”). Similarly, Christians can quote the Gospel of Matthew: “Love your enemies and pray for those who persecute you, that you may be sons of your Father in heaven. He causes his sun to rise on the evil and the good, and sends rain on the righteous and the unrighteous” (Matthew 5:44-45). Or they can call down the wrath of Heaven in the form of Crusades, new and old. Before the Crusades, after all, was the Jihad. And before the Jihad was the Milchemet Mitzvah. After a while, some of us find the lazy way out. We decide that the Holy Blessed One was speaking only in the words of love and mercy. We who hear that voice, among all the Abrahamic faiths, can talk to each other. But we need pay no attention to those, in all three faiths, who hear the voice of cruelty and revenge. Rabbi Michael Lerner, in Jewish Renewal, even suggests that that voice is not the voice of God at all.

Let us leave aside for the moment Whose voices those are, on both sides of the Abrahamic split. Let’s look at where, historically, they are first heard. I think the Jewish tradition first hears them both, side by side, in the story of the Akedah, the binding of Isaac, which is the root narrative of all three of the Abrahamic faiths (Genesis 22:1-19.) Abraham hears the voice of the Holy Blessed One, at night, tell him to take his son up to the Mount and “offer” him. Michael Lerner will tell us that that was not the Holy One’s voice. Maimonides will tell us that it was a prophetic vision meant to show us how far one may be expected to go in obedience to Heaven—but that the actual Akedah may never have happened at all. The Muslims tell us that the son designated for offering was Ishmael, not Isaac. The Christians tell us that the whole thing prefigured the sacrifice of Christ. But let us assume for the moment that what Abraham heard was really the voice of Heaven. He certainly behaved as if he believed that. He took the two “lads” (midrash tells us they were Ishmael and Eliezer) and Isaac, and all of the paraphernalia of sacrifice except the sacrificial animal, and walked three days toward a place “to be announced.” When they got there, Abraham apparently knew this was the place, even though he did not hear any divine voice saying “Okay, here you are. Up on that mountain there.” He looked up and there it was, without so much as a “You Are Here” sign.

But Abraham also never quite comes out and says that Isaac is to be the victim. Is this because the voice in his vision told him only to “offer” his son, and not to kill him? Some of the midrash points in that direction. Other midrash, coming from the time of the Rhineland massacres a thousand years later (when the Crusaders stopped off on their way to the Holy Land to kill enormous numbers of Jews), will not accept that lawyer-like parsing of words. That midrash depicts Isaac preparing himself to be killed, and asking his father’s help to be a worthy victim. Indeed, in some of that body of midrash, Isaac is actually killed, and then revived.

At any rate, Abraham sends the “lads” away, binds his son on the altar, and raises his knife. And then he hears another voice. The text says it is the voice of an angel or a messenger, but we are familiar by now with the ever-shifting line between the Holy One and the angels, between Principal and Agent. “Lay not your hand on the child,” that voice says, “nor do anything to hurt him.” Abraham, confused, stops, frozen, his arm raised. He is here to do what he has been commanded. Now he is commanded to stop. What does he do now?

Completely distracted from what he has so painfully nerved himself to do, he looks around, and sees a ram. The Ram. Sees him “after,” “behind,” in Hebrew “achar.” The Hebrew in such a construction would normally have been “acharav”—behind him. Midrash makes much of the oddness of the locution here, based on its axiom that the Holy One does not waste words. Does “achar” mean, as it often does, “in the future”? Maimonides thinks so. Is Abraham seeing the generations after, looking back on the story as we are doing now, and asking himself, not only “what does the Holy One want me to do?” but “what does the Holy One want all of us to do, for generations to come”? Abraham, after all, is a prophet. Prophets have visions. They see the future. Or futures.

The Ram is caught in the bushes—“basbaq,” a locution that, in modern Hebrew, means something like “in the turmoils of everyday life.” Which is something more likely to happen to us than to rams. Or is Abraham the one who is caught in turmoil? At any rate he resolves the turmoil by taking the ram from the bushes and substituting him for Isaac on the altar, where he completes the offering.

The midrash makes this ram the raw material of Jewish ritual for centuries afterward. “The ashes of the parts burnt upon the altar formed the foundation of the inner altar, whereon the expiatory sacrifice was brought once a year, on the Day of Atonement, the day on which the offering of Isaac took place. Of the sinews of the ram, David made ten strings for his harp upon which he played. The skin served Elijah for his girdle, and of his two horns, the one was blown at the end of the revelation on Mount Sinai, and the other will be used to proclaim the end of the Exile, when the ‘great horn shall be blown, and they shall come which were ready to perish in the land of Assyria, and they that were outcasts in the land of Egypt, and they shall worship the Lord in the holy mountain at Jerusalem.’” And of course we have been blowing the ram’s horn, the shofar, during the Days of Awe, for the same purpose. For the purpose, in fact, of making us hear once again, and again and again, that other voice of Heaven, holding us back from the ultimate violence.

The only midrash I have been able to find that brings these two voices into simultaneity, if not harmony, comes from, of all people, Woody Allen.

“And so he took Isaac to a certain place and prepared to sacrifice him but at the last minute the Lord stayed Abraham’s hand and said, “How could thou doest such a thing?”
And Abraham said, “But thou said —”
“Never mind what I said,” the Lord spake. “Doth thou listen to every crazy idea that comes thy way?” And Abraham grew ashamed. “Er – not really … no.”
“I jokingly suggest thou sacrifice Isaac and thou immediately runs out to do it.”
And Abraham fell to his knees, “See, I never know when you’re kidding.”
And the Lord thundered, “No sense of humor. I can’t believe it.”
“But doth this not prove I love thee, that I was willing to donate mine only son on thy whim?”
And the Lord said, “It proves that some men will follow any order no matter how asinine as long as it comes from a resonant, well-modulated voice.”

But the Tradition is not comfortable with that view either. We do not discard one of the divine voices in the Torah because it is cruel, nor cast doubt on its reality because it was “only” a prophetic vision, nor because the Holy One was only joking. All of those are tempting, and we are honest with ourselves about the temptation. But in the end we side with the Sanhedrin, as it ruled between the strictness of the rabbinic school of Shammai and the humility and humanity of the school of Hillel: Elu v’elu divrei elohim hayyim–These and those are both the words of the Living God–but the law–our law, because we are only human–must follow the merciful school of Hillel.


What Should Vargas Have Done?

September 23, 2011

I spent nearly twenty years teaching a course on “professional standards for mental health workers,” which was essentially a course on professional ethics. While we spent a lot of time talking about medical ethics, because historically all professional ethics start with the Hippocratic Oath, we also looked at the ethics of other professions or quasi-professions. For some reason, we never got around to journalism. Which is just as well, because I would have been tempted to inquire about the ethical status of the age-old question, “How did you feel, Mrs. Jones, when you saw your baby eaten by the tiger?” (I did, at one point, ask a couple of faculty members at the local school of journalism, who said they always tell their students that no good reporter would ask such a question. Yeah. Right.)

But now we’re hearing a lot about the professional ethics of journalism, in the context of Jose Antonio Vargas, a respected reporter whose “coming out” as an undocumented alien was recently published in the NYT magazine. His parents sent him here when he was 12 years old; he was raised in California by his grandparents. He didn’t know he was illegal until he was 16, and used fake documents to survive after that. He published his story to contribute to the current debate about immigration. But in the course of doing that, he has raised some issues in a heretofore mostly dormant debate about journalistic ethics.

First, let’s talk about the basic ethical issues. “Vargas has been living a lie at least since he was 16. What does that do to his credibility, including his credibility as a journalist? How can we believe anything he says?” people are asking. Not unlike the people who asked how anybody could believe anything Bill Clinton said after he lied about having sex with Ms. Lewinsky. Falsus in uno, falsus in omnibus, as the Romans said. The Romans were pretty good liars themselves, probably including whoever coined that adage. (Not to be confused with my father, a strong believer in the professional ethics of his profession, accounting, who once told me “Schlock in uno, schlock in omnibus,” by which he meant that somebody who screws around with IRS is probably also violating OSHA, the Clean Air Act, the child labor laws, and the Ten Commandments. Which my own professional experience finds quite credible.)

“Living a lie.” Until recently, most homosexuals lived a lie, too. Before them, during the McCarthy era, many American leftists lived a lie. And before them, back into the earliest beginnings of recorded history, so did adulterers. At various times, the religiously heterodox have had to live as liars, in the face of the Inquisition or similar organizations. During the American Revolution, many patriotic colonials, on both sides, “lived a lie.” Possibly including many of our now-revered founding fathers. I doubt that we have become any more honest since then. What has changed are the penalties for homosexuality, leftist politics, unpopular religious and political beliefs, and adultery. At one time or another, all of them have been capital offenses. More recently, they have been grounds for being deprived of employability, social respect, love, friendship, and companionship. Whoever formulated these penalties probably wasn’t hoping to be able to eliminate homosexuality, adultery, wrong-headed religions, or leftist political thinking. They just wanted to make sure that anybody who engaged in them had to lie about it. Which made such people vulnerable to all kinds of blackmail, some of it quite lucrative for people “in the know.”

For an adolescent who has just discovered that his presence in the country he has lived in for most of his conscious life is illegal, the issues are more complicated. What should the kid have done? Turned himself in at the nearest INS office (as it was then designated)? I have no idea what its functionaries would have done, back then. Probably the local migra was as clueless as Vargas himself. They might just have sent him back to his grandparents, rather than deal with all the paperwork. Or they might have locked him up and sent him back to his native country (the Philippines, I think) without a word to his grandparents or anyone else who knew where he had been living. Either way, I have real trouble believing he had any ethical obligation to submit himself to the dubious attentions of INS or any other government agency.

But at some point, apparently, he did make a deliberate choice to remain in the US, go to college, and adopt a profession without first attempting to regularize his situation. I am willing to accept for the sake of argument the proposition that that particular profession required special attention to truthfulness, although you may imagine my skepticism. I am not willing to accept that the professional journalist is obliged as such to be more open about his personal life and circumstances than any of the rest of us. I assume that the profession includes as many adulterers, tax cheats, and guys who tell girls they meet in bars that they will call them in the morning, as any other occupation. I don’t recall hearing any of these nefarious propensities being punished or even deplored any more among journalists than among cops, trash collectors, bartenders, or janitors. I don’t recall any article in the Columbia Journalism Review advocating that they should be. Am I missing something?

Falsus in uno, falsus in omnibus is nonsense. We all know this. Every one of us has situationally relative standards of truthfulness. Most of us will lie about trivia, and about the details of our own personal lives and those of our near and dear. Some of us will embellish our resumes, and most of us in positions of responsibility in the business world will embellish the prospects of applicants for employment (“this job requires some typing, but you won’t just be a clerk…”) and virtually all of us will inflate our esteem for people we have just met. But we all know the difference between the level of veracity prevalent among ordinary reasonable persons and what we are likely to hear from real liars. We also know the difference between deliberate knowing falsehood and mere “reckless disregard” for the truthfulness of a particular statement (the difference, let us say, between Oliver North and Michele Bachmann.)

And most of us also know that society will cut some slack for the reformed sinner, or liar, who eventually comes clean, if only to encourage others to do the same. It’s sound social and moral policy. Some of us even realize that the immigration policy of the United States is more openly subject to the application of clout (in the form of private acts of Congress) than almost any other area of our government, and that Vargas, given his professional eminence, is very likely to benefit from such clout and be safely legalized by the time ICE (as it is now known) gets around to dealing with him. It would be nice if more of us were also aware that being in the United States illegally is not the same kind of violation as, say, mother-stabbing or father-raping, and should not subject those who commit it to the full penalties of outlawry in the original sense of the word. Enough already, let’s concentrate on real lying and real crime, especially among those who have had the benefit of being born in the USA.

Red Emma

A Limited Defense of Affirmative Action

May 29, 2011

I am a beneficiary of affirmative action. These days, so they say, I should be ashamed to admit it. It implies, after all, that I was not otherwise qualified for some benefit I obtained only because of being some kind of “minority.”

I have actually benefited from affirmative action on two different counts–as a woman, and as a Hispanic. Every now and then that gives me a slight edge on the competition. That doesn’t bother me particularly. I’ve been discriminated against as a woman more times than I can remember (or, probably, than I have ever known) beginning at least with my first permanent job, which I obtained only by giving the right answer to the employment agency’s question about what method of birth control I used. (For those too young to remember that era, the right answer was not “none of your damn business.” It was “the Pill.”) On another job, I was sexually harassed before there was even a word for it, much less a cause of action. So I figure any benefit I get from the double x chromosome is just a matter of restitution.

I have also been discriminated against, I’m pretty sure, for being Jewish. This, of course, gets me no affirmative action points, but that kind of makes up for the fact that I do get points for being a Hispanic (both my parents were born in Cuba, and my family is essentially bicultural) even though I have never been discriminated against for that fact. (As a matter of fact, since I am a natural blonde and speak English without an accent, nobody knows I am Hispanic unless I choose to tell them, and I normally do that only where I will get extra points for it. Which is generally in jobs where my ability to speak Spanish really is a plus.) And most recently, I have probably been discriminated against for my age, which is illegal, but for which I get no affirmative action points. So I will take those points where I can get them, without embarrassment and without feeling that my competence is in any way in question.

I went to a good college and made Dean’s List my last two years. I scored in the 98th percentile on my LSATs. But when I applied to law school, I was admitted to a school in which 45% of my class was female, in the mid-’70s, and rejected by another school which had a far lower percentage of female students in that year. The evidence seems clear; I was almost certainly admitted to the former because of my gender, and rejected by the latter for the same reason. My objective qualifications were equally irrelevant to both schools. Probably all those qualifications got me in the second school was a rejection further along in the process than some of my less-qualified sisters (and my totally-unqualified brothers.)

Realistically, of course, nobody ever challenged my academic competence, or that of any other woman I know who has been accepted into any academic program under an affirmative action program. Even the most neanderthal of male supremacists will grant that women on the average do better in school, except in mathematics and the hard sciences, than men. The reason women have historically been discriminated against in academic admissions is that we are not expected to be able to do much of anything useful with our knowledge and academic credentials after we get them.

So the affirmative action issue really only gets raised, where women are concerned, when one of us is promoted to a position of power, beyond the glass ceiling. Then the innuendoes fly–quotas, sleeping with the boss, the supervisor is a leg man, somebody’s sister, somebody’s daughter, somebody’s wife. Most of us, however, would still rather live with the humiliation of possibly having been promoted because of our gender than with the equally potent and much less remunerative humiliation of not having been promoted for the same reason.

Stephen L. Carter’s misgivings

Which is why I have trouble with people like Stephen L. Carter. His Reflections of an Affirmative Action Baby is a thoughtful and well-written book with a good sense of the complexities of inter-ethnic relations in the United States of the 1990s. But I have a few problems with its basic premises. Don’t expect the Establishment to make special standards for you, he tells young African Americans. It’s humiliating that we should think we need that. Meet their standards, beat their standards, and demand to be accepted on their terms. For Blacks and Hispanics, who are popularly expected to be less competent in academic achievement, it may actually be a source of humiliation to be admitted to a respectable school under an affirmative action program because of their ethnicity. However, most of the “affirmative action babies” I know would say that it is no more humiliating than being rejected because of that same ethnicity, and pays a lot better.

Carter’s advice takes the Establishment’s claims of devotion to meritocratic standards at face value. Which gives a lot more credit than it deserves to an Establishment that has never really believed in those standards, and has espoused them only when doing so would serve the purpose of keeping a particular group of outsiders outside.

The reason Carter has not seen this hypocrisy is that he is looking at the experience of only one group of outsiders. If he were to consider that of three others–women, Asians, and Jews–whose ability to meet meritocratic standards has rarely been questioned by anybody, he would discover that the Establishment has never had any difficulty excluding them, or severely limiting their upward mobility, on some other grounds.

The merit system: now you see it, now you don’t

For instance, in the 1930s, Harvard Medical School discovered that, if academic qualifications were to be the only criteria for admission, its entire entering class would be Jewish. Indeed, they would have had to double the size of the entering class to get in more than a few token gentiles. So they suddenly discovered that there was more to being a physician than “mere” academic excellence. Arbitrarily, they set a quota of 22% for Jewish applicants, a quota which remained in effect until the ’60s, when, like the Jewish quotas in many other educational institutions, it was replaced with a larger and slightly less transparent quota on students from large cities, especially New York City, under the rubric of “geographical distribution.” Those quotas still exist today in many schools.

The experience of women is in some ways even more blatant. When my classmates and I graduated from college in the early ’60s, we frequently looked for jobs in the public sector, before and between stints in graduate school. We took the civil service exams, scored at or near the top, and were repeatedly beaten out for the actual jobs by men who had scored a good deal lower, even before adding their veterans’ preference points.

When I was at college, in the late ’50s and early ’60s, it was a truism, repeated to us regularly by faculty and admissions honchos, that men scored higher than women on the math section of the SAT, but women scored higher than men on the verbal section. It didn’t, of course, get us much. There were fewer places available for women at good colleges (or any other colleges, actually) than for men, and less scholarship money available for us. So nobody thought much about it. But twenty years later, when the various controversies about the biases of the SAT arose, I was startled to hear everybody, on all sides of the dispute, saying that women scored lower than men on both sections of the SAT. Even the American Association of University Women, in its otherwise beautifully researched study of discrimination against women in education, could only conjecture about what happened, by the end of high school, to the clear lead in reading and verbal skills that girls have over boys in elementary school. What had happened–a couple of very well-hidden and quickly forgotten news stories revealed–was that in the mid-’60s, ETS changed the verbal section of the SAT, substituting scientific essays for one or two of the fiction selections in the reading comprehension test. Female scores promptly dropped to their “proper” place–visibly below those of their male classmates–and have stayed there ever since.

Asians are the most recent victims of similar policies. Several West Coast schools, most notably the University of California at Berkeley, have experimented with ceilings on the number of Asian students within the last 10 years. A university, the administration proclaims, has the right to put “diversity” above “mere” academic excellence.

In short, the history of other groups of outsiders suggests strongly that if an entire generation of African American young people followed Carter’s advice to meet meritocratic standards and beat them, the Establishment would have no trouble finding some other pretext to exclude all but the most presentable tokens among them from the precincts and perquisites of power–either by changing those standards, or suddenly discovering the greater importance of some other factor.

That does not, of course, invalidate Carter’s advice. It does make one wish Carter were a little more careful about truth in advertising, however. I tend to prefer Malcolm X’s more honest approach, when he advised his followers to read anything they could get their hands on and get all the education they could, even if all it got them was the proud position of best-educated person in the unemployment line.

Was there ever a merit system?

Before the phrase “affirmative action” ever found its way into our vocabulary, the reality of affirmative action was already as American as apple pie. After all, what else is veterans’ preference, if not an affirmative action program for (in the post-World War II era in which it was born) men? What else is seniority, if not an affirmative action program for older workers? I have never known a veteran, or an experienced union man, who was in the least ashamed to have benefited by those affirmative action programs.

Nor should they be. Before the rise of the meritocratic mythology of the ’70s, any American old enough to have held a job at all knew that nobody gets a job solely by virtue of being the most qualified candidate for it. In an economy which has never even aspired to full employment, most available jobs have several well-qualified candidates on hand. Most employment discrimination does not involve hiring an unqualified person in preference to a qualified one, but rather choosing between more-or-less equally qualified candidates on the basis of factors unrelated to the job.

The Jewish Establishment’s position

Many established Jewish community organizations, like many other high-minded, principled opponents of affirmative action, really believe that they are espousing a pure meritocracy as against a system of arbitrary choice. To take that position, they have to presume that, before the 1964 Civil Rights Act, all male Jews had the jobs they were entitled to, by reason of their meritocratic qualifications. They also have to presume that all Jews are male, white, Anglo, and middle-class and have nothing whatever to gain from affirmative action. They have to, in fact, ignore the experience of considerably more than 53% of the Jewish community. They even have to advocate giving back to the same academic and professional Establishment that subjected Jewish males to explicit, exclusive quotas until the early ’60s, the power to do it again.

Two cheers for affirmative action

Most supporters of affirmative action see it as a lesser evil. But, unlike its opponents, they recognize the realistic alternative as a greater evil. Affirmative action does not replace a pure meritocracy with a system of choices among qualified candidates made according to standards unrelated to job or scholastic requirements; it substitutes one set of arbitrary choices for another.

The alternative to affirmative action in real life is the divinely-ordained and legally-protected right of the employer or supervisor to hire people who remind him [sic] of his best friend, or people who fit his stereotyped image of the “proper” telephone operator or waitress or whatever. We know that most people who get jobs get them for reasons only distantly related to their ability to perform. In fact, the most serious downside of affirmative action, so far as I can tell, is that it denies future generations a really useful index of professional excellence. When I meet a doctor, or a lawyer, or a CPA, who is female or non-white (or better still, both) and who got his or her professional credential before 1970, I know I am dealing with a superlatively qualified professional, because only the best women and non-whites were able to survive the discriminatory professional screening processes in those days. For professional women and non-whites with more recent qualifications, alas, I have to take my chances, just as I would with a white male of any age.

So we sincerely hope that the people into whose hands we put our lives, our fortunes, and our sacred honor are in fact qualified to do their jobs. But as a practical matter, we know that we are at least as much at risk from incompetents who were hired or promoted for being the boss’s brother, or being tall, or not being Hispanic, or having an officious-sounding British accent, as from those hired or promoted for being female, Black, or Hispanic–quite possibly more, since the latter are usually watched more closely. In fact, these days I am beginning to suspect that American-born doctors can no longer be presumed to be as competent as doctors with foreign accents, since the latter are subjected to much tougher screening standards.

Well, maybe two and a half

We may see ourselves as winners or losers, and we may attribute our situation to other people or to our own deserts. Human beings generally have never had any trouble taking credit for their own good fortune or blaming others for their misfortunes. More recently, “new age” thinking has led many of us to take the rap for our own misfortunes, often in truly preposterous ways (“How have I created this reality?” the cancer patient asks.) But it is difficult for any of us to admit that our good fortune may be the result of some totally unearned “break” from the outside world–being white, for instance, or male. That is the real threat of affirmative action–that it requires us to consider the possibility that (even if, as is likely, we aren’t as well off as we would like to be) we haven’t “earned” even the few goodies we have. For those of us raised in the Jewish tradition, which teaches us that the Land promised to us by the Holy One is ours only on loan, and that we were not chosen to receive it because of any particular merit on our part, that shouldn’t be too much of a leap. It should make us more willing to grant similar unearned goodies to other people. “Use each man according to his deserts,” says Hamlet, “and who should ‘scape whipping?” Or unemployment, as the case may be. Even us, the few, the proud, the overqualified.

Red Emma


Recommended Reading

May 29, 2011

I have a client who now resides in a nursing home and is in the early-to-middle phases of dementia. She is also a sci-fi fan, so whenever I clean out my bookshelves, I take the proceeds to her. I am discovering that, while that improves the quality of my life, it doesn’t necessarily change hers all that much. Because one of the few so-far-unheralded upsides of dementia, at least in its early phases, is that you get what I have always wanted—multiple opportunities to read the same book for the first time.

Among the books I have especially wanted multiple shots at in this way are John Brunner’s line of speculative novels: Stand on Zanzibar (1968), Jagged Orbit (1970), The Sheep Look Up (1972), and The Shockwave Rider (1975). And I spent a fair amount of time wishing there were somebody around right now who writes that kind of stuff, preferably in batches rather than an occasional one-off like Orson Scott Card’s Empire and Hidden Empire (okay, that makes them a two-off, I guess.) I think I’ve found one—John Barnes, author of Mother of Storms, Directive 51, and The Man Who Pulled Down the Sky. Unlike Brunner and Card, he does dabble in the Irwin Allen school of writing (one damn disaster after another), but in the process he takes a serious look at the trajectories of current social, technological, economic, and political phenomena. Consider this a recommendation.

The Unknowing God

For a period that lapped over into my college years, the existentialists told us that the human race is engaged in a frantic effort to become god. As I think about it these days, I am increasingly convinced that many of us already are god, and we are failing to notice it (and falling down on the job) to a dangerous extent. Me, for instance. Most of my days I spend working, on the phone, on the computer, at the office, in court, at home running around finding things (and of course losing things and not realizing it till later), shopping, and so on. If in the middle of all this, I sit down and call the Wired Cat, and she comes over to me, sits down at my feet, and reaches out her front paw to pat my leg, to which I respond by reaching down to rub her head between her ears and down to her neck, for her this is a religious experience. Her divinity has taken time out from managing the universe to communicate with, relate to, and pleasure her. Sometimes, like most divinities, I do things she really doesn’t like, such as taking her to the vet. She seems to accept this as good for her in some way that I understand and she doesn’t. She’s lucky enough to have a divinity who doesn’t do any of the awful things to her that one hears about on Animal Planet (Mr. Wired is an Animal Cops junkie and a hard-core groupie of Anne-Marie Lucas.) But if it did, she’d probably accept that too, as most domestic animals seem to. The ones who have been too utterly traumatized retreat into the animal counterpart of atheism—the feral life. (Atheism is not actually the right word—I am not the first to wonder if there is a word for somebody who believes in the Holy One but just doesn’t like H* very much.)

And of course, to our children, and to most of the children we come into extended contact with (as teachers, for instance, and maybe as pediatric health professionals), we are also god. (Note the lower-case initial, used—as Grace Slick explained when she named her kid “god”—so we won’t get stuck-up about it.) So far as the kids can tell, we (especially parents but adults in general to a considerable extent) run the universe, and occasionally take time out from doing that to interact with the kids, for better and for worse.

The Bible actually plays with this idea. For instance there are two or three references to judges as gods. (One suspects some of the human authors of these passages spent some time on the bench themselves—certainly ordinary human judges have always tended to see themselves as some kind of deity.) Moses is told that he is going to be “in the place of G-d” to Pharaoh, and that his smoother-talking brother Aaron will be his “prophet.”

And there is a story about a rabbi (Hasidic, I think) who, upon being told that somebody he knew was an atheist, said something like “Well, that’s good. It means that if he sees somebody who is poor or in trouble, he won’t just say ‘G-d will help him,’ he’ll get up and actually do something for the guy.” Even professionally religious people may have a kind thought for people who, not believing in a divinity, feel obliged to fill in for H*.

Which, if you accept the hard-core deterministic schema of the behavior of all non-human entities, means that human beings and their actions are the only preserve of free will in the universe, and thus also the only rational place for the divine to operate, by inspiration and impulse. Many rational religious people have trouble believing that the Holy One has ever made the sun stand still or water run uphill, but will accept a divine push toward extraordinarily decent human behavior—in other words, that we are not exactly in the hands of G-d; sometimes we are the hands of G-d.

Jane Grey
War is the End, part II

Does anybody else remember the study that told us we could have won the hearts and minds of the Vietnamese people by giving $10,000.00 to every man, woman, and child in that country, and still have spent less than the $686 billion we actually spent on the war? (Another sourcing problem, obviously.) Anyway, Cecil Adams, of “The Straight Dope,” has heard from a history scholar who says the North could have bought and freed all the slaves in the then-US for something like $72 billion in present-day dollars, which was also considerably less than the overall cost of the Civil War, especially if you reckon costs and damages on both sides, which of course all ultimately came out of US GNP. This once more tells us that wars are almost never “about” their official causes and purposes, which could almost always be implemented a lot more cheaply, easily, and with less violence. War itself, or some so far unknown concomitant of war, makes it an irreplaceable element of human polity.
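For the numerically inclined, the arithmetic behind both comparisons checks out, give or take the inevitable estimation. A quick sanity check follows; note that the population figures (roughly 40 million Vietnamese circa 1970, roughly 4 million enslaved people in the 1860 US) are my own assumptions, not figures from the study or from Cecil Adams:

```python
# Sanity-checking the two "cheaper than the war" comparisons above.
# Population figures are assumptions, not from the cited sources.

VIETNAM_POPULATION = 40_000_000     # assumed, circa 1970
PAYMENT_PER_PERSON = 10_000         # dollars, per the study cited
VIETNAM_WAR_COST = 686_000_000_000  # dollars, per the study cited

buyout = VIETNAM_POPULATION * PAYMENT_PER_PERSON
print(f"Vietnam buyout: ${buyout:,} vs. war cost ${VIETNAM_WAR_COST:,}")
# $400 billion -- indeed well under $686 billion

ENSLAVED_1860 = 4_000_000            # assumed
EMANCIPATION_COST = 72_000_000_000   # present-day dollars, per Adams' correspondent

per_person = EMANCIPATION_COST / ENSLAVED_1860
print(f"Compensated emancipation: about ${per_person:,.0f} per person")
# roughly $18,000 per person in present-day dollars -- plausible as a purchase price
```

Nothing here proves the historical claims, of course; it just confirms that the quoted totals are internally consistent.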

Red Emma

Life Among the Condonauts

I just opened a mysterious envelope from a fellow resident of our condominium building, to discover that, as a member of the condo association, the Wired Household is being sued by another member of the association and by our really heroically estimable janitor, for the alleged misconduct of the erstwhile chair of the association, our upstairs neighbor. This is a peculiarity of condo law—in order to obtain a remedy for some misbehavior by condo association officers, you have to sue the association, even if you are a member of it. Which means that you are, in a sense, suing yourself. You are certainly costing yourself money. All the costs of defending the suit come out of the pockets of the residents. We could even wind up paying the costs of the other side. This damn thing has got to be mediated, ASAP.

I am the only attorney I know who lives in a condo (for 31 years now) and has never served on the board. I really want to keep it that way. Lawyers are easy marks for pleas of communal obligation. But condo boards are a time sink. I just sent a frantic email to the plaintiffs asking them to please consider mediation. Yikes!


The Blockhead’s Market

May 1, 2011

“No man but a blockhead ever wrote, except for money.” Samuel Johnson

Arianna Huffington is being sued by some of her former unpaid bloggers. Jonathan Tasini and the other members of his class action against her complain that they created the value of the Huffington Post with their unpaid writing, and she then sold it to AOL for $315M. The bloggers, of course, got none of that money. The plaintiffs want a cut, at least $105M.

War is the End; the State is the Means

April 27, 2011

Just finished reading Nicholson Baker’s piece on pacifism in the latest Harper’s. It dovetails nicely with some other thinking I’ve been doing lately. Specifically, I’m remembering the ten years of the Vietnam War, and what it felt like at the time, and trying to figure out why Americans, even those most opposed to the current ten-year wars in Iraq and Afghanistan, are so much less passionate in their opposition than we were to the Vietnam War. One of the major differences, of course, is that we have no military draft today.

I was very active in the struggle against the draft during the Vietnam War, and got the ultimate rush, 20 years later, when one of my students, in a discussion of relatively recent history, literally could not remember the words “draft” or “conscription.” By George, I thought. We really did it! Many of my more radical friends and colleagues, at the time, predicted that ending the draft would take a lot of the juice out of opposition to any future wars. I allowed that they were probably right, but that even so the unspeakably hard choice to kill or not kill ought not to be forced on any unwilling person. I still believe that. But it’s obvious that, without a draft, this war, or the next or the next after that, could conceivably go on forever. That’s how all those European wars—the Seven Years’ War, the Thirty Years’ War, the Hundred Years’ War—got to be so interminable. They were not fought with conscript armies. Neither were the conquests that built and maintained the Roman Empire over 400 years, or the British Empire for 200+ years.

Baker takes up the issue based on what most of us have seen as the ultimate hard case against pacifism, the Second World War/the Holocaust. If you assume, as most of us have after the fact, that the war was necessary to save what was left of the Jews in Europe, then how could one argue against it? What originally disabused me of that notion was reading Arthur Morse’s While Six Million Died, published in 1967. Subsequent research only strengthens the premise of that book—the Second World War may or may not have put an end to the slaughter of the Jews of Europe, but it clearly was not fought for that purpose. We need to disentangle the war from the Holocaust to make sense of either of them.

In point of fact, to be sure, the Holocaust and World War II went on at more or less the same time, and were instigated by a lot of the same people. But they were very different phenomena. They gave rise to very different responses (even from the same people.) And, while they were causally inextricably related to each other, that relationship was almost unimaginably complex. The war provided a pretext for the Holocaust, as war almost always provides a pretext for oppression (up to and including murder) of noncombatant minorities, viz. the Armenians. And the Holocaust, ultimately, obstructed the Nazi conduct of the war, probably fatally. Hitler wasted resources on killing Jews and other “inferior” races that he could have devoted to beating the Allies. (Which may partially account for the reluctance of the Allies to do anything that might have impeded the Holocaust.) He expelled from Germany the Jewish and anti-Nazi scientists who might have given Germany the nuclear bomb. The Six Million, arguably, were martyrs to the Allied victory. Without their deaths, that victory might never have happened.

Those who opposed the Nazis at the time, both in Germany and elsewhere, opposed them, not because of their treatment of Jews and other minorities, but for pretty much the same reasons the Allies had opposed Germany in World War I and the democratic forces in Germany had opposed the Kaiser. Hitler was well on the way to conquering the world. In the course of doing so, he had eliminated most of the hard-won democratic rights enshrined into law in the Weimar Republic. Which is what happens to the civil liberties of citizens in almost any war. Good enough reasons, to be sure, and by no means to be sneered at. But even the staunchest anti-Nazis, at home and abroad, at best had little concern for the Jews, and at worst viewed the racist Nuremberg Laws as one of Hitler’s few good moves. This was as true of anti-Nazi resistance in occupied countries as in Germany itself. Indeed, there were anti-Nazi partisan units in Eastern Europe that killed Jews in their spare time, when Nazi-fighting got slow.

The British found it inconvenient to notice the plight of the Jews, because they were being called on to respond by opening up Mandate Palestine to Jewish refugees, at the expense of British relations with the Arabs. The Americans stayed out of the war until Pearl Harbor was bombed, fortuitously, by the Japanese–because American public opinion tended to side with the Germans against the Jews, but could easily enough be swayed against non-whites who had had the nerve to bomb American territory. The French had no choice but to respond to the invasion of their territory–but their struggle against the Nazis stopped short of any serious effort to protect French citizens of Jewish ancestry, much less alien Jewish refugees from further east. Indeed, rounding up Jews was one of the few activities in which many of the French cooperated willingly or even enthusiastically with the Germans.

The allied War Crimes Trials in Nuremberg made clear what the Allies considered to be the real offenses of the Nazis: violation of treaties, making of aggressive warfare, and torture and murder of Allied prisoners of war. The Nuremberg trials had virtually nothing to say about Nazi treatment of enemy civilians, and nothing whatever about Nazi mistreatment and murder of German and Austrian citizens. It was left to the Israelis and the successor governments of the formerly occupied countries to prosecute those crimes. Obviously none of them were in any shape to do so until at least the 1950s. By then many of the major war criminals were safely hidden away on other continents.

The switching of gears came in the 1960s. It was partly precipitated by the capture and trial of Adolf Eichmann (and Hannah Arendt’s in-depth coverage of it) between 1961 and 1963, and partly by the intensification of the Vietnam War. At that point, hawks, especially liberal hawks like the Henry Jackson faction of the Democratic Party, were holding up World War II as a shining example of a just war fought to protect a helpless minority against a marauding dictator, and a model for U.S. participation in the Vietnam War. It was the American war machine which, in the words of Herman Wouk, “kept my grandmother from being turned into soap.” Draft boards and congressional hawks stated over and over again that opposition to the Vietnam War was equivalent to the America Firsters’ opposition to American participation in World War II, which in turn was tantamount to endorsing the Holocaust. The “war crimes” actually tried at Nuremberg were hardly ever mentioned–except occasionally by anti-war advocates. Pro-war forces gave up their use of the Holocaust analogy only after the My Lai massacre, when it became fairly obvious that the U.S. military was killing at least as many civilians as the Viet Cong.

In the Vietnam and post-Vietnam rationale, the reason the Nazis were Bad People was their murder of helpless civilians, especially Jews. American World War II movies made in the ’60s and after often portrayed German soldiers who weren’t in the SS as “good Germans”, tragically honorable men doing what any patriotic citizen would do (including, presumably, aiding and abetting all the crimes prosecuted at Nuremberg), as opposed to the “bad Germans” who ran concentration camps. It might be inhumane to put civilians into concentration camps and gas them, but strafing, shelling, or dropping bombs on them from overhead was just a normal exercise of warrior morality, i.e., the same sort of thing our warriors were doing.

Getting back to Baker, he goes into considerably more detail than Morse about pacifist opposition, and the reasoning behind it, to American participation in World War II. Many of the pacifists of that era, including important Jewish spokesmen, accepted well before our time the premise that the purpose of any such participation was to save the European Jews, and by extension the Jews in the rest of the world not yet directly threatened by Hitler. But why not find some way to save the Jews that did not involve widening the war? they asked. “The Jews needed immigration visas, not Flying Fortresses. And who was doing their best to get them visas, as well as food, money, and hiding places? Pacifists were,” Baker points out. Moreover, if the purpose of the war was to stop Hitler, war might be precisely contraindicated. “…what fighting Hitlerism meant in practice was…the five-year-long Churchillian experiment of undermining German ‘morale’ by dropping magnesium fire-bombs and 2,000-pound blockbusters on various city centers. The firebombing killed and displaced a great many innocent people—including Jews in hiding—and obliterated entire neighborhoods. It was supposed to cause an anti-Nazi revolution, but it didn’t….If you drop things on people’s heads, they get angry and unite behind their leader. This was, after all, just what happened during the Blitz in London.”

Baker takes a perspective on the Holocaust that I found startling: that it was “the biggest hostage crisis of all time.” Hitler’s threats against the Jews of Europe were largely unfulfilled before the US entered the war. Many anti-war activists proposed negotiating at that point, when the US still had something to offer in exchange for the lives of Europe’s Jews. Holocaust historians Saul Friedländer and Roderick Stackelberg suggest that, although Hitler had long planned the killing of all Jews under German control, “its full implementation may have been delayed until the US entered the war. Now the Jews under German control had lost their potential value as hostages.” The first extermination camp, Chelmno, began operations, coincidentally (?), on December 8, 1941. Pacifist and near-pacifist advocates continued to call for “peace without victory”, an end to military operations in Europe on condition that the Jews be allowed safe passage out of Europe. It was not a popular suggestion among Allied politicians. Among the excuses for not even considering this possibility were Churchill’s statement that “[e]ven were we to obtain permission to withdraw all Jews, transport alone presents a problem which will be difficult of solution.” Anthony Eden, his foreign secretary, told the American Secretary of State that “Hitler might well take us up on any such offer, and there simply are not enough ships and means of transportation in the world to handle them.” This from the engineers of the Dunkirk evacuation two years earlier, who had gotten nearly 340,000 men from the French beaches to England in a mere nine days!

Baker is either a nicer person than I, or just more cautious. These lame obfuscations make it obvious to most modern readers that the Brits—and the US State Department—would not have wanted a massive influx of Jewish refugees even if all of them had somehow grown wings to fly themselves out of Europe. The real point was that both countries had a lingering substrate of anti-semitism to deal with, both in the general population and among their diplomatic apparatchiks in particular. Many of their citizens were likely to be lukewarm in their support of the war if they thought its purpose had anything to do with saving Jews. The diplomatic establishments were nice enough to consider acknowledging this in official communications to be a breach of etiquette, but not decent enough to overcome it with an offer to save Jewish lives. If the Jews were to be saved, the Anglo-Saxon alliance was declaring, it would have to be as an incidental—or perhaps even accidental—by-product of a war being fought for utterly different reasons.

If even World War II, for which the most noble and humanitarian purposes have since been adduced, was not in fact fought for those purposes, what does that say about the rest of the wars which have bloodied the world since humans first agglomerated into groups large enough to have wars? What are the real reasons for war?

The first and most obvious one is They hit Us first. Beginning with the first blood feud, this becomes problematic, because each “first blow” from Them always turns out to be a response to a pre-first blow from Us, and so on. So let’s abandon that game, or at least recognize it for the fraud it is.

The next most popular reason is They might hit Us first, if We don’t hit Them first, which is vulnerable to the same realities.

Then there’s if We don’t hit Them, Those Other Guys Over There might think We’re weenies and start hitting Us. In this age of universal publicity, it should be fairly easy to deal with this proposition without actually hitting anybody.

The fact that both sides, in any war, can come up with some reason for their behavior makes it pretty clear that those reasons are really nothing but excuses.

So if there are no bona fide purposes for war, why do we do it?

I suspect this hypothesis isn’t even original: war is not a means to achieve an end. If it were, many of those ends might be achievable by other means. Somehow, that never happens. Because war isn’t a means, it’s an end. Clausewitz to the contrary notwithstanding, war is not the continuation of politics by other means. It is the purpose of politics. It is the purpose of the nation-state (and the street gang, and the clan, and arguably the religion, and maybe even the family.) Domestic politics, and government, and the arts of peace are merely things to do in the interval between wars, to give the crew time and resources to break down the set, get the audience out, build up the new sets, find a new script and get all the lines learned, and then get the new audience in. In the American political context, the Republican party is more honest about this. The Democrats are willing to help us fool ourselves that we don’t choose war. Like Michael Corleone in his declining years, we just get pulled into it against our will because we’re such nice guys. The post-Vietnam series of wars and incursions—Panama, the Balkans, Lebanon, Kuwait, Iraq, Afghanistan—aren’t an aberration. Vietnam was the blip. Vietnam was the play to which we reacted as if it involved real people dying real deaths. Abolishing the draft has revived the concept of the “theater of war.” Vesti la giubba.

Red Emma

Surviving in Third-World America

April 12, 2011

Do you ever get tired of hearing that the U.S. is the only western industrialized country that (doesn’t have handgun control/doesn’t have a national health care program/has an infant mortality rate over __%/imprisons more than __% of its citizens/pick one)? After hearing so many of the pronouncements indicating that we trail the industrial West in good stuff and lead them in bad stuff, are you starting to wonder whether the U.S. really is a western industrialized nation any more? Is it possible that we’ve become, or are at least well on the way to becoming, a Third World country? After all, we are no longer the world’s wealthiest nation, nor its healthiest, nor its best educated. Now that the Soviet Union is no longer marking the boundary of the First World, maybe we are. And how long will it be before we mark that boundary from the wrong side?

I’m willing to leave the geopolitical and macroeconomic implications of all that to the politicians. What concerns me is what concerns just about any ordinary person–how to make it from day to day in a Third World, or nearly Third World, country. Obviously, the best way to research this question is to ask people who’ve done it, more or less successfully, all their lives–the ordinary, would-be middle-class people from various Third World countries. Or at least to learn as much as possible about them.

So, based on what we know about real life from Third Worlders, here are some basic suggestions:

* In unity there is strength. Extend your family as far as you can. Begin with real relatives, by blood or marriage, and then quasi-relatives (exes and steps and their families) and then what anthropologists call “fictive kin”–godparents and foster siblings and so on. Cultivate these relationships and use them for the benefit of all concerned.

* One of the most important ways to do this, of course, is to share living space, especially if somebody in the family has a large, fully-paid-for house. This gets everybody economies of scale in housing, utilities, and food. It also puts people who have both jobs and small children within easy reach of potential baby-sitters with neither.

* If you can’t extend your family, you can at least create one. Get married. Form close friendships. Join cooperatives of all kinds. Join the church/synagogue/mosque/coven/whatever of your choice. Making it will be hard enough in the company of others. Alone, you’re probably a dead duck.

The only possible exception to this rule is children. Third Worlders typically have them–lots of them, if possible–for retirement insurance. But Third Worlders generally are required to expend fewer resources up front on their kids than American child labor and compulsory education laws allow. Give this one some thought.

* Stay out of the official dollar economy as much as possible. The IRS, of course, frowns on “off the books” income and untaxed barter. But even they have not figured out how to tax you on the value your do-it-yourself activities add to your assets. The official money economy in Third World countries is rigged to underpay the non-rich to the point of starvation while extracting from them in prices and taxes more than they can possibly afford. The only way to survive in such an economy is to stay out of it, both for production and for consumption (including credit–borrow from family, borrow from friends, borrow from your business colleagues, and then let all of them borrow from you, but stay out of the official credit market if you possibly can. Likewise, don’t lend in that market–that is, keep your savings out of banks.)

* Play the lottery–but not very much. It is true, of course, that your chances of winning are slightly less than your chances of getting struck by lightning. But they are also only slightly less than your chances of attaining the American Dream in any of the official legal jobs likely to be open to you. Buying a ticket nearly doubles your chances. It’s hard to beat odds like that. But buying more than a couple of tickets a week is a bad investment of money you should be using elsewhere. By the way, if the prior history of American lotteries is any indication, this batch will be around only another fifteen years or so, so take advantage of them while you can.

* Use public amenities creatively, while there still are any. Their days too are numbered, but while they last, public schools, public libraries, public parks, public hospitals, and similar amenities are usually perks of living in a particular locality. Therefore, given your limited stock of housing dollars, you are usually better off spending them on cramped, shabby accommodations in an affluent town or neighborhood than on a commodious, well-appointed place in the slums. Besides, your well-off neighbors are more likely to have jobs for you–both long-term and free-lance–than slum-dwellers are. And they are usually canny shoppers, so the assortment of merchandise available to you in local stores will be of higher quality at lower prices. You and your family will have a much better chance of making good business contacts too. In short, unless you have ambitions in local politics, it is better to be the poorest person in a rich neighborhood than the richest person in a poor neighborhood.

* Education will still pay off, but will be a lot harder to get, and won’t necessarily bring your income above the poverty line. Nevertheless, get as much of it as you can, and try to keep as much of it as possible in four-year colleges, which produce slightly more respectable credentials than community colleges. You may want to consider going outside the U.S., at least for your bachelor’s degree–it could be cheaper.

* Don’t plan to retire. You will probably never get a private pension, and the value of your Social Security grant will almost certainly diminish rapidly. So be prepared to look for the odd jobs you are still physically capable of doing, most notably childcare and other home help for employed family members.

* Stay healthy. If you can’t stay healthy, at least try to stay out of the official health-care system, which you probably can’t afford, and which probably can’t do much for you anyway. Better you should spend your health care dollars on (a) studying self-care; (b) alternative practitioners recommended by people you trust who have not yet died of their own health problems; or (c) if you must use “official” practitioners, use the lowest professional level available to you–that is, better a Physician’s Assistant than a physician; better a Nurse Practitioner than a PA; better a Registered Nurse than an NP; better a Licensed Practical Nurse than an RN. The lower down on the professional scale you go, the more personal attention you are likely to get. Whenever possible, stick with practitioners you pay out of your own pocket–they’re cheaper, they are accountable directly to you rather than some insurance company, and they still realize you have the option of not coming back next time if they screw up this time.

* Stay morally connected. Be active in religious, neighborhood, civic, and volunteer organizations. They will remind you–when it is very easy to forget–that there is more to life than survival, and that, even if the big corporations that control the few remaining permanent full-time secure jobs consider you less than the dust beneath their big wheels, there are plenty of people around you to whom you are not merely valuable but essential.

* Similarly, if you have some sort of artistic or intellectual talent and can’t get the official purveyors of culture to take notice of it, don’t let that stop you from putting it to work in blogs, local newsletters, murals, amateur theatricals or whatever, which are probably the only art your friends can afford. Who knows–someday it may get noticed by the official critics. But even if it doesn’t, you have given and received pleasure.

* You may have to do a lot more groveling than you are used to. It is possible to be marketably obsequious and still keep your self-respect, simply by maintaining your objectivity behind your mask (see W.E.B. Du Bois). We Americans have long believed that people who have attained wealth and prominence must be more deserving than the rest of us. As long as ordinary people had a reasonably decent chance of achieving some wealth and prominence of their own, that was a harmless delusion. Now, however, it is dysfunctional and can even be deadly. The only way to survive psychologically and morally in a Third World United States is to be absolutely certain that, as a human being and a citizen, you have the same ultimate value as any other human being and citizen. Do not allow yourself to become part of any institution that undermines that conviction, unless it pays you relatively well. And withdraw your attention and your allegiance from the artifacts of commercial culture that undermine your sense of your own value as a human being. Nobody, after all, is paying you to watch television, so your family loses nothing if you stop watching.

* Do your homework–speculative fiction is rich in models for the world we are moving into, from the novels of John Brunner (The Sheep Look Up, Stand on Zanzibar, The Shockwave Rider) to Philip Jose Farmer’s “Riders of the Purple Wage” to Robert Heinlein’s future histories. Not to mention, of course, Orwell’s 1984 and Aldous Huxley’s Brave New World. See also Strieber and Kunetka’s Warday and Nature’s End. These are just my particular favorites–there are lots more where they came from. If there is one thing we have learned in the past fifty years, it is that if the warped mind of a speculative fiction writer can imagine a shape for a future dystopia, the grasping hands of a political or economic establishment can implement it.

* Furthermore, there are plenty of ways to learn more about how today’s real-life Third-Worlders are managing. Among the goodies available in public libraries are magazines and newspapers from such places, many of them in English (which is, after all, one of the official languages of India, the Philippines, and many African countries.) Much of the fiction of modern India, the West Indies, and Africa was written in English, and much of it is richly informative.

* And note, by the way, that used books are probably one of the cheapest forms of recreation available. The only things cheaper are the public-domain books available for free on your computer or iPhone.

* Learn to like rice and beans. Together they make up the complete protein necessary for good nutrition, as well as containing lots of fiber. With a little celery, onion, and garlic, they can provide most of your nutritional needs for literally pennies a day. They’re probably healthier than whatever you’re eating now.

* Don’t drink the water. Not unfiltered, anyway, and not bottled—that’s just a waste of money and of valuable natural water imported from many places that need it badly themselves, like Florida. Pick up a used scouting handbook and find out all the cheap and quick methods to purify questionable drinking water. Note that, if you live in the country, the air may smell better, but your drinking water may already be dangerously contaminated with pesticide and chemical fertilizer runoff. Urban problems will be different, but just as serious.

I am not suggesting that the Third-Worldization of the United States is a good thing, or only trivially harmful. On the contrary, for most ordinary people, it can mean perpetually living on the edge of catastrophe and occasionally slipping over it. But it is time we started getting prepared for it, while we still can.

Red Emma

In Praise of Folly

February 20, 2011

Item 1: So far, there’s one consequence of the 1995 government shutdown that I haven’t seen anybody mention. It was the direct cause of President Clinton meeting Monica Lewinsky. Ordinarily, she would never have spent any substantial amount of time in the Oval Office, as a mere intern. But when Congress shut off the money supply to the federal government, Monica, like all the other interns, suddenly became essential. She could continue to report to work, and even get enlisted to do things interns normally never did, because, unlike the civil servants who normally did them, she didn’t get paid. The pundits now discussing the possibility of another shutdown generally think the Republicans lost that round. If you factor in Monica, that’s not so clear.

Item 2: New entries on the Bennigan’s Index—of the two new eateries advertising their plans to open up within one block of my office, only one has actually done so. I’m getting really skeptical about the other, given that it’s been six months now. And in the meantime, two other cheap eateries in the next block have closed down. This is not encouraging. And of course, Giordano’s Pizza has just filed for bankruptcy.

Item 3: Which leads one to wonder. Last year, several economists mentioned the second round of the Great Depression that started in 1937 as a direct result of Roosevelt cutting spending and raising taxes to reduce the deficit. This year, nobody’s talking about it at all. Instead, the GOP is suddenly utterly panicked about the deficit, which of course bothered them not at all when Bush was running it up in the first place.

Item 4: Speaking of which, Mr. Wired is watching the SyFy [sic] Channel marathon of disaster movies, which this week is mostly about snakes gone wrong. Roger Ebert once characterized a certain genre of films as “idiot movies,” in which every time a character had to make a choice, it was always the stupidest choice possible. Most of the SyFy disaster films are more like the Ten Little Idiots genre, in which we watch a whole series of characters make such choices, and we get to bet on which one is still standing at the end. Not unlike presidential primary season, except that even the meanest monster snake is still kind of pretty, compared with many politicians.

Item 5: And Republican Congresscritter Mike Beard seems to think that if he eats all the pie, G-d will put another one in front of him. Didn’t the Greeks have myths about this?

Red Emma

The Politics of Politics

February 16, 2011

You’ve probably noticed the phenomenon yourself. Any discussion can be completely derailed, any subject can be avoided. All you have to do is say “Well, that’s just politics.” End of discussion. On to the weather and organized sports. Amazingly enough, even elected representatives can blacken one another’s reputations simply by accusing each other of “playing politics” with some important issue. Politics is a dirty word among Americans. Calling someone a politician borders on libel.

It was not always thus. Aristotle said politics is the main thing that distinguishes human beings from lower animals. (Which tells you how little Aristotle knew about cats, for instance. But I digress.) Politics, after all, is the way people make collective decisions, usually about our various visions of The Good, or about distributing scarce resources, without resorting to violence. In most other cultures, politics (a/k/a public service) is still an honored profession. In Central Europe, post-communist politics has achieved a new birth of respectability. What makes American attitudes about politics different?

Politics has been defined as the “manipulation of power,” and as “war by other means.” Usually, when we talk about “playing politics”, we are referring to something else, to what we call “party politics” and James Madison would have called “faction”–putting the success of one’s own group ahead of the merits of the issue in question. It is this sense of the word which we usually have in mind when we talk about certain things being “above politics”–for instance, that “politics stops at the water’s edge,” i.e. that foreign and military policy are “above politics.” Similarly, we appoint government functionaries through civil service, and appoint federal judges for life, to keep them “above politics”–that is, not beholden to or under the control of any particular “faction.”

But, like Madison, we tend to think “faction” is a bad thing because we see it as based on nothing but personal or group advantage. “Viva Yo,” as the Spanish put it. If a faction takes an ideological position of any substance at all, we assume that position is somehow conducive to the personal advantage of faction members, or they wouldn’t be adopting it. There is some basis for this, of course. Very few people who do any serious thinking about public policy issues arrive at positions that are likely to work against their personal advantage and survival. Most of us figure that what’s good for me is also good for just about everybody else, everybody who matters, anyway. But the real purpose of politics is not merely to allow factions to compete for advantage, but to allow divergent visions of The Good to compete for public support and power.

The other aspect of politics which most disturbs ordinary Americans is the necessity of compromise, splitting the difference, making sure everybody leaves the table still a bit hungry. To decide any issue this way, we think, is to start by presuming it can’t be very important. If it were, we would fight to the last drop of blood. Once a question transcends politics in this sense, war cannot be very far away. Once slavery stopped being a normal part of life, like breathing air, and became a moral issue for both sides, politics failed and war became inevitable.

Which puts an entirely different slant on placing anything “above politics.” That which is above politics is also beyond civil dispute. If “politics stops at the water’s edge,” then foreign and military policy lie outside the operation of democracy. Somebody–who may or may not have been popularly elected–decides what that policy should be, and our elected representatives then buckle down to supporting and implementing it. Even if circumstances change, so that a workable policy becomes unworkable, or a morally neutral policy becomes an abomination, the people and their representatives must continue to implement it to the bitter end. Any attempt to call a halt, for instance by cutting financial support, would be “playing politics” with national security, or so the supporters of the status quo insist.

Similarly, to say that education, or the environment, or other matters of public policy, are “outside politics” is to say either that we are prepared to “go to the mattresses” for them, or that we have unanimous agreement on The Good in those areas. No doubt there have been periods in our history when the latter was true. But, more often than not, this is simply wishful thinking among partisans of one or another vision who desperately want everybody to stop all this arguing and let them get on with their work. Merely wishing, however passionately, will not make it so.

We have to accept the fact that most communities and nations–and particularly ours–are host to numerous factions competing both for material advantage and for their visions of The Good. If we downplay the political realm as a place to play out this competition, we do not thereby eliminate competition. We merely force it to happen in other arenas and by other means. The most common alternatives are violence and money. If you cannot get a hearing for your vision of The Good within the political forum, you can always assassinate one of the more legitimate contenders, or buy off his supporters. Both of these alternatives to politics are popular in Third World countries, and both have achieved some currency even in the U.S. and industrialized Europe. The political realm, because its participants can so easily (and often deservedly) be accused of using public funds and facilities for personal advantage, has a hard time protecting itself against infringement by money or violence, and an even harder time distinguishing, in practice and in theory, between personal advantage and ideology.

In countries where, as here, the political realm still exists in a more or less healthy condition, it needs a few things to ensure its preservation:
(1) better mechanisms for drawing more people into political dispute, especially people whose opinions are not normally solicited or listened to;
(2) a clear message that dispute is legitimate, and nothing is “above politics,” including ongoing military conflict, national security, and data and principles agreed on by scientifically-educated people; and
(3) mechanisms for public education about issues currently under public dispute, in structures accessible to any interested citizen, and encouragement of a strong ethos requiring those who take part in public debate to educate themselves first. What the “public square” did in a rather rudimentary but thoroughly personalized way in ancient Athens or revolutionary Philadelphia, the Internet is equipped to do in a somewhat shallower but far broader way. For the first time ever, we are technologically equipped to exercise democracy in cities larger than the Aristotelian fifty thousand families.

The questions that so far have been adjudged to “transcend politics” are all, of course, “controversial,” which is what we call any topic when we don’t want to discuss it. What the word actually means is that people disagree about it, and feel strongly about their opinions on all sides, but cannot imagine allowing their minds to be changed by rational argument.

So far, the U.S. has managed to form and preserve a relatively healthy political forum by keeping the really hot “controversial” topics out of it, or allowing discussion within the political realm only by properly licensed “special interest groups.” Such groups are likely to explore an issue more thoroughly and extensively, but they are not necessarily more knowledgeable than the average person on the street. On the contrary, they may just be better organized and more enthusiastic in spreading ignorance and misinformation (and sometimes even disinformation.) Which would be okay if all sides had an equal chance to be heard. But that kind of opportunity depends on all kinds of often unpredictable variables. Money helps a lot. Enough of it can guarantee a hearing. Being perceived as controlling a lot of votes or a lot of publicity is the next best thing. Absent these advantages, the best an interest group can do is try to get a lot of money or a lot of votes, and then parlay them into access. Merely having strong, well-researched, carefully-thought-out, well-expressed opinions will not do the job. Maybe we need a more open political realm where it would.

Part of our problem is not merely that we distrust politicians (although, heaven knows, we do!) but that we distrust the political art, even (perhaps especially) when practiced by sincere advocates who are not pursuing their own material advantage. “Rhetoric”, which originally meant the art of persuasion, is now a synonym for the barnyard epithet. Most of us resent anyone who merely states a position without prefacing it modestly with “It’s only my opinion, but…” Anybody who has the nerve to try to change other people’s opinions–except, of course, in the mode of commercial advertising–is somehow infringing on our right to believe whatever we want. The converted are now the only people it is acceptable to preach to. Indeed, most advocacy activity these days is specifically directed only toward inactive sympathizers, and its purpose is not to change their opinions, but to persuade them to act on the opinions they already hold. The only non-sympathizers who can legitimately be confronted with one’s opinion are legislators and other public officials. The purpose of such confrontation is still not to change their opinions, but to change their official actions. We don’t really expect politicians to have opinions of their own, but only to weigh the vote-power behind the opinions of their constituents and act accordingly.

The blogosphere itself, the virtual ground on which we here confront one another, is one of the political arenas with the most potential for civil discourse among widely divergent constituencies. It can easily break down into either a commercial forum for sale to the biggest advertiser or a batch of mutually inaudible echo chambers for the narrowest possible ideologies. But the fact that nobody is paying us to be here, and that we have so far managed to refrain from both real and symbolic threats against each other, is a good augury. This may be the ground on which the American polity revitalizes itself, and we—with all our flaws, crotchets, and idiosyncrasies—may be among those who can make it happen.