Archive for March, 2010

A Match Made In California, and Other Ephemera

March 29, 2010

I’m sure that somewhere, there must be a personals section of some publication which starts out with an ad from a “middle-aged, pudgy, balding man with good sense of humor and strong sex drive” seeking a “real woman with curves, PhD, and black belt for good times,” and halfway down the page shows another ad from a “real woman with curves, PhD, and black belt” seeking a “middle-aged, pudgy, balding man with good sense of humor and strong sex drive for good times.” It probably happens fairly often with slightly less precise duplication. So was I the only person noticing the other day, when NPR broadcast at the beginning of All Things Considered a piece on the dire finances of the California state government, which is cutting all sorts of essential services for lack of funds, and maybe 20 minutes later, another piece on a legislative proposal in the same state to decriminalize recreational use of marijuana by adults and permit local governments to regulate and tax it? This would greatly alleviate the financial mess in the Golden State by simultaneously reducing expenditures on police, courts, and prisons while providing a new source of revenue. Duh!!

And then there’s the likely new prime minister of Iraq, Ayad Allawi, who is routinely referred to in the news as a “secular Shi’ite.” My Jewish affiliations make it easy for me to recognize this locution—it identifies the politician in terms of the mosque at which he does not pray. Reminds me of the want ad allegedly posted in a Belfast paper twenty-odd years ago, which specified that the job was to be filled by “Protestant—need not be Christian.” Now that we in this country are seeing a lot of “cultural Catholics,” we need to get used to this stuff. John Steinbeck, in “The Short Reign of Pippin IV,” depicted a French Chamber of Deputies in which one of the major parties was the “Christian Atheists.” It’s only a matter of time.

And then there’s the current government of Burma, which is asking the world to allow it to make “a gentle transition to democracy.” Hard to know how to take that.

Red Emma

Selective Tax Resistance

March 17, 2010

As I follow the blow-by-blow narrative of the Battle for Health Care Reform, I am overwhelmed by nostalgia sometimes. The spectre of “socialized medicine” and “government takeover,” for instance, has been around since well before Harry Truman made his deal with John L. Lewis that first set up the link between health insurance and employment. OTOH, the AMA’s position has shifted in interesting ways. And the roles of the Catholic Church and the Right-to-Birth movement are brand new. But they are raising an issue that actually goes back (within living memory) to the Vietnam War, and arguably to the origins of our nation—conscientious objection to tax payments for certain purposes.

For the benefit of those of you who were too young, or too politically uninvolved, at the time, a lot of people objected to the Vietnam War. The presence of the military draft may have been a catalyst for those objections, but ultimately a lot of people who were not subject to it found other ways to put their objections into action. Many of them refused to pay federal taxes, or that part of their federal tax burden that they deemed payable for the expenses of the Vietnam War.

Various friends of mine refused to pay their phone tax, or deliberately worked for wages below the taxable level, or refused to file their tax returns, or filed but did not pay, or paid some specified amount less, which they called the war tax deduction, or made out their tax checks to some non-military arm of government such as the Department of Health, Education, and Welfare (as it was then). [I tried several of these methods at one time or another. But as the daughter of a super-ethical CPA, I could not bring myself not to file.] I also did a fair amount of legal work for tax resisters later on in my career.

Many tax resisters and their sympathizers also supported things like an alternative war tax fund—a way for tax resisters to contribute their tax money to non-military purposes. Sometimes they proposed to make this alternative available to people who could demonstrate their opposition to war in more or less the same way conscientious objectors to actual service in the military could demonstrate their opposition to the Selective Service System.

All these varied branches of the tax resistance movement had two things in common: they were a noble effort based on serious thinking about the role of taxpaying citizens vis-à-vis the military activities of their government; and they didn’t have a chance in hell of succeeding.

Or at least that was what we all thought then. Now I think maybe we should take a new look at tax resistance. Because the Religious Right, the Catholic Church and its various agencies, the GOP, and many of the Tea Partiers, have not only raised the issue of refusal to allow their taxes to fund what they view as the taking of innocent life, but have actually succeeded in legalizing it. They have rammed it through Congress, first in the form of the Hyde Amendment (first attached in 1976 to appropriations bills funding what was then the Department of Health, Education, and Welfare and, in particular, Medicaid, and routinely attached to those appropriations bills every year since then), and more recently in the form of the Stupak Amendment to Obama’s Health Care Reform proposals.

The parallels to war tax resistance are compelling. Like war, abortion is a legal activity. Like war, it is essentially destructive. Like war, nobody is really comfortable with it, but most people reconcile themselves to it in certain limited instances. And, like war, abortion can be, and in many other countries is, financed by the taxpayer.

But, unlike war, abortion can legally be conducted without federal financing. (When a war is run entirely on private money, it ceases to be a war, and becomes privateering or criminal gang activity.)

So the American polity has essentially accepted the legitimacy of refusing to pay taxes to support certain legal activities which are morally offensive to some but not all of the citizenry. Indeed, we have extended it well beyond the boundaries respected by Vietnam War opponents, who merely asked that their own particular tax monies be kept out of the war chest. Hyde and Stupak have demanded, and gained, the right to keep anybody’s tax money from paying for abortions, even the money contributed by pro-choice taxpayers. Why do we apply that approach only to abortion? Are the civilians of Iraq and Afghanistan any less innocent than American unborn children? It is a statistical certainty that some of the “collateral damage” casualties in those countries are pregnant women and their unborn children. Why are we willing to legalize tax resistance to protect American fetuses and not Iraqi and Afghan embryos? (BTW, I can’t even take credit for originating this idea. Philip Roth, not ordinarily one of my favorite authors, does a wonderful riff on it in “Our Gang,” published in 1971(!), far beyond my poor power to add or detract.)

Now is the time, obviously, for opponents of the war in Iraq and Afghanistan to demand for ourselves the rights won by the Pro-Birth movement. There is nothing so powerful as an idea whose time has come again.

Red Emma

The New Scarlet Letter is an O

March 17, 2010

And it doesn’t stand for Oprah, except during her fat periods. We who have learned to accept, non-judgmentally, ax murderers, prostitutes, and crooked politicians, still feel free to discriminate against and humiliate people whose weight is higher than their IQ.

The use of physical fitness and health as metaphors for moral and spiritual soundness dates back at least to classical Greece. Socrates, for instance, repeatedly compares the struggle for moral self-discipline to athletic training. St. Paul says the same thing, on several occasions. But there is a substantial difference between saying that moral self-discipline is like the efforts required to maintain physical health and fitness, and our current ethos which holds that physical health and fitness are all there is to moral uprightness.

Diet and exercise are the only context in which ordinary people today ever use words like “sin”, “vice”, and “virtue.” While we are reluctant to appear “judgmental” about real moral faults–even those of child-rapers and mass murderers–we rarely hesitate to express our scorn for smokers and fat people. The moral energy we used to focus on the bedroom now centers on the dining room and the exercise room. Recent medical research, indicating that most fat people eat the same kinds and amounts of food as thin people, but metabolize it differently, seems to have made no dent in our disapproval. Neither do the medical findings that many hard-core smokers are genuinely addicted, and have no more control over their behavior than the alcoholics and junkies we are so reluctant to condemn.

So far, no one in our currently burgeoning ethics industry has analyzed this anomaly. My own armchair assessment is that we have set our moral standards extremely low, which makes us feel free to apply them extremely rigidly. A person can have the family life of Caligula, the political probity of Papa Doc Duvalier, and the disposition of Jack the Ripper, and we will at least tolerate his deviations from the moral norm, so long as he is trim, fit and healthy. But someone with the altruism of Mother Teresa, the intellect of Albert Einstein, the political uprightness of Abe Lincoln, the family life of June Cleaver, and the professional devotion of Marcus Welby–if she also has a weight problem and a smoking habit–will be barely accepted in polite society.

One of the more unnerving results of this superficial morality is the ease with which many child-rapers and mass murderers convince some board of state officials to parole them. Most long-term male prison inmates under 50 are fanatic in their pursuit of physical fitness. It’s easy to keep fit if you have half-decent exercise facilities and nothing else to do with the next twenty years of your life. My guess is that parole board members, like the rest of us, equate physical fitness with moral probity, at least subconsciously. The nastier an inmate’s crimes are, the longer his sentence will be, and the more time he will have to achieve the body beautiful. Which may even suggest that a prisoner’s likelihood of parole increases with the viciousness of his original offense. If this fearsome prospect isn’t enough to refocus our moral energies onto real moral issues, we are probably too far gone to be salvaged.

Jane Grey

The Zapping of the American Mind

March 15, 2010

In Chicago some years ago, a controversy arose within the police department about its normally energetic and successful drive for employee contributions to the United Way. That year, many police officers decided not to make their usual generous donations. The cause of this mass defection was a United-Way-funded program making an attorney available by telephone to consult with people who have just been arrested for any misdemeanor or felony. The time frame in question is the interval between arrest (and the concomitant, mandatory recitation of the Miranda warnings in the suspect’s primary language) and the arraignment/bail hearing. And the purpose, according to the proponents of the program, was to make sure the suspect really understands that he has the right to remain silent.

Many police officers were outraged at a program which virtually guaranteed they would get no information from criminal suspects. Since that is precisely the point of the Miranda warnings–with which the police have coexisted more or less comfortably for over forty years–I find it hard to sympathize with them (although, of course, they have every right to choose not to fund such a program.)

But as an English teacher, I am deeply concerned by the necessity of such a program. Face it, by the time an offender is old enough to be tried as an adult (even in Illinois, which sometimes allows such arrangements for 12-year-olds), he has probably heard the Miranda warnings recited at least fifteen hundred times on prime-time cop shows and in movies, quite aside from any occasions on which he may have heard them “live” in the course of personal encounters with the law. If he still needs a lawyer to tell him that “you have the right to remain silent” means “shut up until your lawyer gets there,” that says something very disturbing about the way the average American processes information.

This problem arises in plenty of places other than police stations, and is not confined to poorly-educated people with scanty knowledge of standard English. In one of my college classes, an intelligent, literate student who speaks standard English with no accent whatever barely escaped disaster on my one-hour midterm exam. The instructions indicated clearly that questions 1 and 2 were 15 minutes each, and question 3 was 30 minutes. The student told me afterward that she read the instructions carefully (which I always remind them to do), including the time limits–and then spent more than 50 minutes on the first question.

In my other incarnation, as an attorney, I once had a client who promised to make a substantial payment on fees he already owed me, if I would represent him in a pending hearing. After the hearing, I asked him when I could expect the payment. He said something like “I won’t be paying that.” I asked him whether he recalled making the promise–he did–and then I asked him what he had meant by it. “I don’t know what I meant,” he replied. Now, that may have been only his rather ungraceful way of avoiding admitting that he had made that promise solely for the purpose of inducing me to represent him when he really needed it, and had never had any intention of paying me. Which, however disturbing it may be, is a problem in ethics, rather than in processing information.

But I have seen too many parallel cases to believe that. The problem–which probably has some Greek-root neurological name known only to Oliver “The Man Who Mistook His Wife for a Hat” Sacks–is not a perceptual disability, but the inability to allow information to influence behavior, even in the most crucial situations (like that of the recent arrestee in the first example.) Socrates, who held that all evil results from ignorance, would be dumbfounded.

Or is it inability? Is it perhaps a habit, long hidden from consciousness and therefore almost impossible to break, a habit of resisting the impact of information on our behavior? Is it, perhaps, a necessary but overused defense mechanism arising out of a culture in which we are bombarded with constant demands that we stop, go, don’t smoke, see, hear, visit, buy, above all buybuybuy? Is ignorance the last refuge of the free mind? That would certainly explain the fact that most Americans–even highly educated intelligent people in intellectual occupations–will not admit to having learned anything in high school. And indeed, most of the first two years of college in this country (unlike the rest of the world), for all but the smallest elite, consist of a hasty review of what the students were taught in high school, precisely because they either did not learn it or felt obliged to forget it immediately upon receiving their diplomas (in rather the same way the soul, as the Pythagoreans postulated, was required to forget everything it had learned in its previous lives before passing from one incarnation into the next).

There is similar resistance to allowing oneself to be influenced by religious objurgations and political speeches. (Indeed, the willingness to actually pay attention to “preaching” and change one’s behavior as a result is often regarded as proof positive of having joined a “cult.”) Jurors regularly ignore judicial instructions (though studies indicate that may really be a problem of comprehension), and many of them also ignore evidence.

Unfortunately, of all of the sources of information in our universe, advertisers actually seem to have done the best job of circumventing customer resistance, apparently by casting their message as much as possible in terms other than informational. They try to provide either a non-cognitive esthetic experience which leaves the customer with a good feeling about the product, or a non-cognitive bonding experience which–especially among young male customers–builds loyalty. Information is not only irrelevant to those kinds of messages, it actually gets in their way.

Another source of such resistance may be the American legal system, with its proliferation of unenforced and often unenforceable laws. “It’s none of anybody’s business how fast I want to drive!” one respondent was quoted as saying in a newspaper article on people who drive 55 mph in the left lane (Chicago Tribune, Section 2, p. 1, 5/1/95). “It is…judgmental to decide how fast another driver should or should not travel. If and when I am stopped for speeding, I have no quarrel with receiving a ticket….However, I don’t appreciate another citizen justifying traveling just the speed limit in the passing lane in order to keep my speed in check.” Another respondent said “If I choose to risk a ticket by traveling at a more efficient rate of speed, the only people who need be concerned are myself and the local state trooper. To those who mistakenly believe that they are in danger simply because I am going faster than the arbitrarily-set speed limit, all they need to do is move over. To those who feel it is their job to keep me within the limits of the law, butt out!”

If, for instance, Chicago’s ban on downtown street parking (in effect for the last fifteen years) had been enforced, there would be no need for the various physical barriers erected outside the federal building after the Oklahoma City bombing to prevent car bombings. Similarly, there have been numerous cases of a legislator proposing a criminal statute, only to find out (usually from his embarrassed research staff) that the conduct it would penalize was already forbidden by another law currently on the books but long forgotten. The NC-17 movie rating (not a law, but a voluntary regulation of the film industry) essentially means “R–but we really mean it this time!” If they had really meant it last time, it would be unnecessary. We try over and over again to command changes in behavior, and the only result is the piling of one ineffective prohibition on top of another (something the behaviorally savvy Jewish tradition specifically forbids, by the way. If you are going to eat bacon, you don’t have to have the pig ritually slaughtered.)

“Preaching,” “scolding,” and “nagging” are the words we use for any kind of discourse intended to change our behavior when we don’t want to change it. But ultimately, all information gets treated like “nagging” by most people most of the time.

And, as noted earlier in the case of the Delinquent Defendant (the Cashless Client?), this resistance to information affects not only how we deal with what we hear, but also what we say and how (if at all) it relates to what we mean. If I will not change my behavior because of what I hear or read, I also won’t change it because of what I say, nor will I expect you to pay any serious attention to what I say. (In the words of the old song, “How could you believe me when I told you that I loved you, when you know I’ve been a liar all my life?”) Probably the most outrageous example of this phenomenon in recent legal history was a case in Juvenile Court in Cook County, Illinois about twenty years ago. The state’s child welfare agency sent two neglected children who were in its wardship to an out-of-state foster home, and then made virtually no attempt to oversee their care. They kept filing reports, though–based on absolutely no information–that the children were doing well. A couple of years later, one of the children died as a result of abuse by the foster family, and the other was hospitalized with severe injuries. The office of the Public Guardian sued the child welfare agency for gross negligence in failing to check on the children regularly and report accurately. The child welfare agency responded by challenging the right of the Public Guardian’s office to bring the case, on the basis of a conflict of interest, because the Guardian’s office had believed the reports! (The Court didn’t buy the argument, fortunately.) We seem to have accepted all too readily the oriental maxim “Fool me once, shame on you; fool me twice, shame on me.” Anybody fool enough to believe anybody about anything (even once) deserves to be deceived, exploited, and railroaded. The deceivers are merely doing business as usual and cannot be held responsible for the consequences of the behavior of others who choose to believe them.

I have spent a fair amount of time in class explaining the legal consequences of various kinds of mendacity in the media, and my students have no trouble grasping that what Rush Limbaugh says about Hillary Clinton is probably libelous and what Ronco says about the Veg-O-Matic is probably fraudulent. But most of them have real trouble grasping why it matters. “Of course people lie on television,” they tell me. “That’s what television is for.” Which is a close relative of the old joke about how you can tell when (name your favorite crooked politician) is lying–“His lips are moving.” The medium is the utter absence of any reliable message. Orwell’s prediction of Newspeak–the total corruption of language to make it a vehicle of political control–turns out to have overshot the mark. In our efforts to avoid Newspeak, we have turned American English into Nospeak–a vehicle of nothing at all. To choose to believe any communication, and modify our behavior in accord with it, is a tremendous and terrifying leap of faith, which most of us make once or twice in a lifetime, at most.

What, if anything, can a teacher do to breach the barrier between perception and behavior? How do we get across to students hardened against “nagging” the antiquated notion that letters have sounds, words–and combinations of words–have meanings, and it really matters whether you say “uninterested” or “disinterested”, “lie” or “lay”? Does repetition do it? (Probably not, or they’d come out of high school knowing a lot more than they do.) Is there a way to slip under the barrier, using the techniques of advertising? This is precisely what “Sesame Street” has done, with pretty good results. Can an individual teacher, without the high-powered special effects and resources of a national television show, do nearly as well? Or can a more subtle esthetic approach quietly dissolve the barrier, without the student even realizing it? I have known this to happen, often under the influence of poetry (either reading/hearing it or writing it.) “To every door,” says the Talmud, “there are many keys, but the greatest key of all is the ax.” Or do we just keep throwing out bits of information and hope that some of them stick? The barrier is almost never completely impermeable (that way lies autism), but the things that penetrate it are likely to be an oddly-assorted and not especially useful conglomeration. Merely throwing out as much high-quality information as possible in hope of raising the quality of the total mix is too haphazard to be satisfying to most teachers (or for that matter, preachers, writers, poets, and politicians.)

Most commentators who predict the demise of literature, or of a particular literature or language, have done so largely in hope of getting credit for single-handedly reviving it. The current generation of English teachers and professional “naggers” has no such high-flown expectations. Most of us would be happy if high-school graduates would remember their sixth-grade grammar long enough to complete a sentence whose subjects and verbs come out even. We pity the high school teacher whose job seems to amount to painting the Mona Lisa on sand. We try to talk faster, so as to keep our instructions within the limits of our students’ current attention span, but we are not yet capable of being the “One-Minute Teacher,” and it’s probably just as well. Long ago, an unnamed talmudic wise guy asked the scholar Hillel to teach him the whole law “while I stand on one foot.” Hillel actually had an answer: “What is hateful to you, do not do to your neighbor. That is the whole law; everything else is commentary.” But he could not resist adding, “Go and study it.” Presumably sitting down.

Jane Grey

International Politics As If People Mattered

March 11, 2010

There’s an old story about a town in the area of Eastern Europe that changed hands between Russia and Poland several times between 1850 and 1940. A couple of men met on the street there, and one of them told the other, “Ivan, I hear we’re about to become part of Poland again.” “Thank heaven, Boris,” said the other. “I don’t think I could have stood another one of those Russian winters.”

Which is a good starting point for any examination of how the nation-state, and the relations between nation-states, affect ordinary people who happen to live there. Another good starting-point was contributed by my former teacher Marshall Hodgson, who in discussing the wave of decolonization that swept the Muslim world after WWII, pointed out that freedom for a nation does not necessarily mean freedom for its people, at least not all of them.

The model we have been operating under since the end of feudalism is that a nation-state consists of a specific piece of territory and the people on it. The government of each state promulgates and enforces the laws by which the people live. Ideally (that is, in a democratic state), the government is chosen by the people (or at any rate, by a majority of the people.) But we accept any government as the legitimate sovereign head of a sovereign state, as long as it gets to be the government by a procedure consistent with the law in effect in that state at the time of its accession, whether that procedure involves majority vote by universal suffrage, lottery, single combat, or a hot game of spin the bottle.

“‘Sovereignty’,” as the science fiction writer Robert Heinlein points out, “lies between ‘sober’ and ‘sozzled’ in the dictionary.” Normally, it includes control of relations with other sovereign states, and complete control over “internal affairs” within the state’s territory.

Or does it? The Nuremberg trials and the various international conventions and treaties opposing genocide and supporting human rights have somewhat eroded the legitimate power of even a legitimate government over its subjects. Can a sovereign state simply place an entire class of its subjects in a state of outlawry, strip them of citizenship, the right of residence, property, liberty, and life, as long as it obeys its own laws in doing so?

It depends.

Such behavior is certainly violative of several different treaties and conventions. Many but not all of the nations whose governments engage in such behavior are signatories of some or all of these treaties. So, if there were some uniformly dependable enforcement mechanism for such treaties, it could be invoked at least against those signatories.

That’s a major ‘if.’ In a discussion of events in the Balkans in the 1990s, a friend of mine analogized the position of the United States to that of the biggest guy in a bar, in which a gang of bikers is beating up on some little guy. It is our job as the biggest guy in the bar, said my friend, to defend the poor helpless victim. My immediate response was “the biggest guy in a bar has exactly the same obligation as everybody else in that bar–to call the cops.” Which is what a uniformly dependable enforcement mechanism for human rights, with jurisdiction over all violations of international law, would be. At the moment, of course, in the forum of nations, there are no cops.

Before we start wishing for the establishment of such a police force, let us remind ourselves that the real police (even in societies where the police force is impeccably honest and efficient, and has the full support of the surrounding culture and most of the citizens) still have–and use–the discretion not to act. No police force is required, or willing, or (probably) able to act against every violation of every law. The best police forces exercise their discretion based on such criteria as: can we be spending our time and resources enforcing some more important law? Preventing more serious harm? Are other social mechanisms available to solve this problem as well as the police can, or better? The less admirable will ask such questions as: can we be protecting more important people (or their property)? Can we be arresting less important people? Can we protect the people most likely to vote us a raise? Any global police force would probably have to retain the same discretion not to act, perhaps subject to some more explicit criteria. So any proposal to set up such a police force ought to include the criteria by which they may choose to act or not to act, or the mechanism by which their involvement will be triggered.

Which brings us back to the present. The United Nations as presently constituted can’t function as the global police force, because it can act only through the Security Council, which can be immobilized by the veto of any one of its members. (Imagine your city council unable to mobilize the cops except by the consensus of every local ethnic and religious pressure group, plus the NRA, the local street gangs, and the Chamber of Commerce.) Aside from the obvious–no nation will cooperate in calling the cops on itself–there are also more complex relationships: no nation will call the cops on its historic allies. Which means the cops will be called, under current conditions, only on relatively weak and friendless nations. (Come to think of it, this is not too different from the way the real cops function in many localities.) So if you are a citizen of a Big Nation (or a Little Nation with a Big Ally), and your government is depriving you of liberty, property, or cultural autonomy, or even threatening your health and life, you are strictly out of luck if you expect any help from the UN or its various agencies.

That’s Problem Number One: while we now at least have international agreements forbidding genocide and other human rights violations, we have no uniform and reliable mechanisms for enforcing them.

What we do have, instead, (here comes Problem Number Two) are the various ways nation-states interact with each other, for better and for worse. At their most civilized, nations can make deals. Contracts. Leases, even. And can fulfill those deals peacefully. One of the more visible such arrangements reached its culmination in 1997, when the Territory of Hong Kong, the bulk of which had been leased from the Chinese emperor in Queen Victoria’s day for 99 years, reverted to Chinese rule at the end of that period. The principals in the deal–the governments of China and Great Britain–behaved with scrupulous regard for each other’s rights. But the people who happened to live on the piece of real estate in question had virtually nothing to say about it. Many voted with their feet, to become citizens of more congenial countries. They had to do this using their own resources, after finding their own destinations and getting official acceptance there, an arrangement likely to be unattainable by Hong Kong’s least affluent citizens. Neither their previous nor their prospective landlord offered them any help in moving out. They had not been parties to the original lease (a contract between an absolute monarchy on one side and a constitutional monarchy on the other, neither of whom felt any obligation to the local residents) and they were not parties to its termination.

Yet another way sovereign nations can interact is by war and the threat of war. Consider the dispute, culminating in war, in the South Atlantic in the early ’80s. The subject was an inhospitable bunch of islands called the Falklands by the Brits (who had settled them) and the Malvinas by the Argentineans (who were geographically adjacent and claimed to own them). The islands in question are distinguished by some of the world’s worst weather, and a population of human residents greatly outnumbered by sheep and penguins. That human population, to a man/woman, wished to remain British. Their wishes were as irrelevant to the Argentineans and their allies as the wishes of the penguins and the sheep. The matter was decided, not by majority vote of the people most directly affected, but by the superior military force of the Brits–to whom, again, the wishes of the local residents were irrelevant, although in this instance they won out.

Of course, disagreements between nation-states can also be fought out by such relatively non-violent methods as trade sanctions and embargoes. The apartheid government of South Africa was undermined and ultimately overthrown with considerable help from 20 years of such sanctions. The Sandinista government of Nicaragua (a much smaller country with far fewer resources of its own) suffered the same fate after a much shorter period. The longest-lasting embargo–by the US against Castro’s Cuba–has crippled the latter country, but seems to have had little effect on the power of its government. The same could be said, until 2003 anyway, for the government of Saddam Hussein in Iraq. What is clear in all these cases is that the ordinary people on the street felt the impact of such sanctions before the government, and felt it more severely. You have to do a lot of damage to the people of a nation before its government will even notice. The less democratic a nation is, the more damage you have to do to the people to get rid of the government or change its behavior or even attract its attention. That, of course, is the whole point of being a member of the ruling class–relative immunity to the problems of the lower orders. If being Head Honcho doesn’t get you that, whatever else it does get you isn’t worth the trouble of showing up at the Oval Office every day.

Which is even truer of outright ongoing war. The only way to battle a country into submission is to break stuff and kill people. Most of the stuff, and the people, simply by the law of averages, will not belong to the ruling class. The odd American law specifically forbidding attempts to kill foreign heads of state as such is not even necessary to achieve this result (although it seems especially strange that dropping a bomb on Muammar Khaddafy’s limousine would be illegal, but wiping out the entire town in which he resides would be legitimate warfare, precisely because more people would be killed. Got that? It’s okay to aim at tens of thousands of people, but not at any particular one of them, especially not the one you actually want to get rid of.)

There have been instances in which the ordinary people most at risk from attack on the government nonetheless welcomed it. The ANC supported the trade sanctions against the apartheid government of South Africa from the beginning. Certainly Jews and other victims of the Nazis welcomed the victories of the Allies. Our information from Kosovo during the recent unpleasantness there indicates that the Albanians still residing there welcomed the NATO bombing. This is a heroic posture, analogous to calling in a bombing strike on one’s own position to wipe out the surrounding enemy. One cannot expect it, still less demand it, from ordinary people in every situation that might require it (any more than the police can expect the enthusiastic cooperation of local civilians in the “War on Drugs” in American cities.)

Problem Number Three: Aside from economic sanctions and military action, what else can be done to protect people from their own governments? Accepting and aiding refugees is the most directly useful response. People who are willing and able to leave their homes and their countries may be able to find sanctuary elsewhere. This is scarcely a universal solution, however. It not only does not prevent “ethnic cleansing” and similar human rights violations, it encourages them. A ruler who wants to get rid of a particular group of people can steal their property and throw them out without even having to worry much about his reputation in world opinion. Once the “untermenschen” are gone, their plight ceases to be an issue. People who insist on reviving old grudges from safe haven elsewhere–the Armenian nationalist groups, for instance–are viewed as irredentists, revanchists, and crazies. Even the Jewish demand for economic reparations from the beneficiaries of the Holocaust is viewed as greedy and irrelevant. If I take your wallet, I’m a thief and may go to jail. If I take all your property and throw you out of your house, I’m a home invader and will almost certainly go to jail. But if I take all the property of hundreds of thousands of people and throw them out of their country, I will probably never see the inside of a prison, and I may even die a respected head of state.

That, of course, assumes that safe haven exists for each group of refugees. There is at least some evidence that Hitler’s original intention was simply to evict the Jews from the German homeland, and that the more murderous side of the Holocaust developed only when it became apparent that most European Jews had nowhere else to go. Since then, the UN has made itself useful by establishing refugee camps and facilities wherever people fleeing from (or evicted by) the depredations of their own government can get to. Many of these camps have become permanent fixtures in other countries, with highly visible effects on local politics and economies. The residents of such long-term camps cannot reasonably be expected to forgive, forget, and get on with their lives. They don’t have to be crazy to go on carrying old grudges and demanding return to homes that no longer exist. And the countries that host such encampments cannot be blamed for being unhappy about it. The UN may subsidize the camps and their operation, but it can do little about the social and economic impact on the vicinity.

Permanent resettlement of refugees usually works well for the refugees themselves and most of the nations that receive them. But, as previously noted, it is also an ideal solution for the original oppressor, and can only encourage those who seek to emulate him.

Short-term resettlement of refugees in camps has historically tended either to return the refugees to the same murderous circumstances they left in the first place (as in several African genocides in the last 30 years) or to shade into long-term arrangements with all the drawbacks previously noted.

The real problem (Number Four, if you’ve been keeping track), of course, is: what do the nations of the world really want to do about governmental human rights violations? Protect the victims in their own homes? Get them out of harm’s way? Temporarily or permanently? Force the perpetrating government to behave decently? Or get rid of that government? And what can actually be done by the organizations available and willing to do anything?

What the UN does–when the Security Council will allow it to do anything–is (1) care for refugees, and (2) provide peacekeeping forces where a peace agreement of some sort has already been reached. What various individual nations and alliances (such as NATO and the ad hoc grouping that carried out the Gulf War) do is fight wars–break stuff and kill people. If all you have is a hammer, everything looks like a nail. So the Kosovo problem, like innumerable others before it, was “solved” by refugee camps and bombs. The people on the ground in Kosovo could protest as the bombs fell around them, or they could cheer. But their attitude toward NATO would not affect their chances of being hurt or killed by those bombs. A sword makes a lousy shield. Even the best offense is useless as a defense.

What, if anything, do the techniques of nonviolent action and resistance have to contribute to the situation? Gandhian activist Ibrahim Rugova, who was first detained by the Yugoslav government and then released to a gilded exile in Italy, was apparently thoroughly discredited among his compatriots back home, who view him as naïve at best and a Milosevic tool at worst. But Quaker and other peace groups are, as always, widely respected for their work with the refugees. Some such groups attempted to aid the people of Kosovo on their home ground with food and medical care. This is obviously work worth doing, but it does nothing to stop ethnic cleansing.

From the other side, from the point of view of nations that tried to stop or punish Milosevic in the 1990s, the debate raises yet another set of questions. Did any nation outside Yugoslavia have any legitimate national interest in Kosovo? There is no oil or other valuable resource in Kosovo. The other nations in the area may have had a strong interest in preventing or reversing the flow of refugees from Yugoslavia to neighboring countries, where they could upset the fragile ethno-political balance in Macedonia, or the marginally functional economy of Albania. But nobody further away than Greece and Turkey would likely be affected by even the farthest ripple of repercussion (unless, of course, somebody dropped a bomb on their embassy).

The debate in the US Congress is the same one we have heard over and over since the end of the Cold War: if the US has no “legitimate national interest” in Kosovo, or Somalia, or Haiti, or Cambodia, or any other part of the world where there is no oil and in which none of the warring parties is communist, is there any other valid reason for us to take any kind of action–military or otherwise–against local genocide or human rights violations? The Gulf War was a no-brainer. There is oil in Iraq, and Kuwait, and Saudi Arabia, all either directly involved or directly threatened. But the other places hit us squarely in our most ambivalent nerve. Does a government have the right to commit its resources to defend or assist the citizens of some other country merely because they are being attacked or threatened by a tyrant?

The negative answer we often reflexively give to this question comes from two different places. The first, of course, is the ghost of Vietnam, which in turn is the ghost of “plucky little Belgium” in World War I. After the end of World War I, citizens of countries on both sides became aware that many of the “atrocities” alleged to have been committed by Germany and its allies in Belgium and France had been either highly exaggerated or actually manufactured from whole cloth. Public opinion became understandably skeptical about “atrocity stories” after that. Which is one of the reasons the nature and extent of Nazi atrocities against the Jews and other “untermenschen” beginning in the 1930s received so little credence in the Allied nations even after the information was widely available.

By the 1960s, we had no trouble believing atrocity stories. But the transparent lie on which the Gulf of Tonkin Resolution was based, and the badly crafted communist atrocity stories, were still hard to buy. The atrocity stories we were most likely to believe were the ones in which American and South Vietnamese troops were the villains. And we had the same bitter taste in our mouths from having been fooled into an expensive, stupid war by a deceptive government’s propaganda after Vietnam as most Europeans had had after World War I. When we said “never again,” we meant that we would never again allow ourselves to be made fools of.

The second source of our discomfort at committing American resources to rescuing foreign victims of tyranny is a more abstract one, which has an analog in our view of the responsibilities of business corporations. We used to expect corporations to be “good citizens” of wherever they were located–to support local charities and civic activities and the arts. Increasingly, corporate boards take the position that a corporation’s primary, or even sole, responsibility is to make money for its shareholders. If corporate “good citizenship” can be subsumed into the public relations budget as one more way to increase sales, the board will accept it. But only as one more way to make money for the stockholders. If the stockholders want to make charitable contributions, they can and should do so individually out of the dividends the corporation provides them.

Charity, our business philosophers increasingly believe, not only begins in the individual home, but should end there. Only the individual has the duty, or the right, to give away his own resources without recompense. Any aggregate of individuals can legitimately act only for its own–their own–selfish interest. If John or Jane Doe is concerned about the plight of the Albanians in Kosovo, s/he can contribute to the Red Cross or UNICEF or, presumably, the KLA. A country as large as the US, with citizens from so many different backgrounds, cannot (in this worldview) properly have a foreign policy at all, except for the purpose of making America safer or richer, a goal we can presumably all agree on. As a result, many of the debates in Congress seem almost perversely directed toward disguising altruistic motivation as some kind of more broadly defined self-interest.

But all too often, the alternative is to do the opposite–to disguise self-interest as altruism. We are always more willing to go to war for the protection of people who have large numbers of compatriots and relatives living–and voting–in this country, or for people who look like us, or live like us, than for the starving dark-skinned strangers in Somalia and Sierra Leone. Does that mean we should decline to fight for or contribute to the Kosovar Albanians or the Bosnian Muslims because our motives are insufficiently pure? Or does it mean that we should take the claims of the Somalis, the Sierra Leoneans, and the Haitians more seriously? Should we demand consistency, insist on defending everybody or nobody? Or can we continue to make ad hoc judgments for the flimsiest of reasons, because defending somebody is still better than defending nobody?

That’s a relatively brief (honest!) statement of the problem. Is there a reasonable and feasible solution? Ultimately, I think the only possible solution is a real, impartial, effective global police force, whose members and commanders would give up their citizenship in any individual nation, presumably in exchange for some really good employee benefits. We have been edging closer to such an apparatus throughout this century. It took World War I to create the League of Nations and World War II to create the UN. Will it take another world war to create a law enforcement system with compulsory jurisdiction over all governments? How about an invasion from Mars, against which all the nations of the world could unite and really mean it? Could some home-grown threat do the same job (an epidemic, for instance)? How about an ecological crisis, like global warming?

And, once we have a global police force, what methods should it use to do its job? The tactics of local police are being seriously questioned in this country these days, largely because of some glaring incidents involving brutality and apparently unjustified shootings of unarmed civilians. If we cannot train our local police to do their jobs with a decent respect for the rights of the people who pay their salaries, what can we expect of a larger force with more powerful weaponry? There have already been incidents of assault and rape committed by “peacekeeping forces” in many parts of the world. The core of the problem is Acton’s old axiom: all power corrupts. If the only counterweight to the abuse of power within a nation by its government is more power applied from a supra-government, who is to keep that power from being abused?

Marx, of course, would be amused but unsurprised at this situation, in which every solution seems to generate a new problem. Gandhi would view the problems as purely short range; nonviolent techniques, properly applied, he would aver, will eventually prevail. Any deaths suffered in the meantime should be regarded as “acceptable casualties”, just as civilian and military casualties in a war would be, except that the casualties of nonviolent action are likely to be considerably fewer. And building a community among the conflicting parties after the end of the conflict is likely to be considerably easier.

I personally find the Gandhian approach attractive. But whatever techniques the “community of nations” decides to apply to solve these problems, it is absolutely clear that the ad hoc reactions now in use are at best a waste of resources and lives. A general conversation needs to begin, among nations and within nations and among and within ethnic and religious groups and other communal organizations, about how intercommunal violence and governmental abuses of human rights can best be controlled. Every candidate for political office or communal responsibility should be expected to take a serious part in this conversation, and to be answerable to those s/he represents for that participation. It is up to us as the represented parties to hold them responsible, beginning with the 2012 election in this country.

Red Emma

By Any Other Name

March 9, 2010

Back in the year 2000, Preston King returned to the United States. Sorry, that’s Mr. Preston King. Actually, it’s certainly Professor King, who is head of the Political Science Department of Lancaster University in England. It’s probably Doctor King, which is usually how a person gets to be “professor.” And how he got to be Professor King of Lancaster University in England (rather than Professor King of some other university in his native land, the United States) was by insisting on being called “Mr.” by his draft board in Albany, Georgia, in the late 1950s. The draft board felt that “Preston” would do just fine, thank you, for a draft registrant of the “Negro” persuasion (they had called him “Mr. King” for a while under the mistaken impression that he was white). King was unwilling to comply with any orders issued by an administrative agency which could not be bothered to address him as it would address a white registrant in the same situation. So he refused to comply with his induction order, and was convicted of draft evasion and sentenced to 18 months in prison. Instead, he headed for England, and established his career, his life, and his family there (where his daughter is now a Member of Parliament).

King came back because President Clinton pardoned him in time for him to be able to attend the funeral of his brother. His family simultaneously mourned his brother and rejoiced over his pardon. Even the judge who originally sentenced him supported the pardon.

Many people under 45 may be just barely aware that there was ever a draft, or that people ever refused to comply with it. Some really erudite types may know that people resisted the draft during the Vietnam War. But Preston King’s act of resistance happened before Vietnam was a twinkle in Robert McNamara’s eye, and it was resistance, not to war, but to a particular form of racist rudeness. Some years later, another African-American, a woman from Alabama named Mary Hamilton, was cited for contempt of court and sentenced to jail for refusing to give testimony in a criminal proceeding unless the prosecutor addressed her as “Miss Hamilton.” The Supreme Court reversed her conviction within a couple of years*–a lot faster than Preston King got his pardon.

Both these stories may seem downright quaint to younger people today, even young African-Americans. So far as I can tell, nobody under 45, regardless of race or national origin, is willing, under ordinary circumstances, to admit to having a last name, much less insist on being called by it. I had occasion, some months ago, to deal with the corporate bureaucracy of some company that had warrantied a consumer gadget that was giving me trouble. I dealt most of the time with a young woman who gave her name as “Sue.” I made the mistake of calling outside the outfit’s business hours once, and got a voice mail that offered to connect me to the directory. The directory began by telling me, “If you know your party’s last name….” and I realized that “Sue” had never entrusted me with that information. Then I realized that, probably, the only people who did know her last name were her fellow workers, her personal friends, and her family. That was a shocking revelation, for one raised on Emily Post–the last name has now replaced the first name as the index of intimacy. I fleetingly entertained the fantasy of the lovestruck swain going down on his knees and telling his inamorata, “Mary, I love you. May I call you Miss Jones?”

Since that incident, I have made a practice of asking for last names when dealing with telephone voices and live functionaries. Most of them reply that their employer has a policy forbidding them to give their last names to customers. Which makes sense, sort of, because it is obviously against their religion to use the customer’s last name more than is absolutely necessary (i.e., the first time they call, to make sure they don’t have a wrong number). The only way to tell the difference between legitimate callers and telemarketers is that the latter not only call you by your first name, they use it as often as can be grammatically justified, as a way to forge fake intimacy with a possible customer.

A closely related counter-phenomenon has turned up in the law governing the enforcement of child support laws. A woman claiming government assistance either in collecting child support from the father of her child or in getting any of the mingier substitutes for what was formerly known as “welfare” in the absence of such support, is expected to supply the appropriate government agency with not only the first and last name of the alleged father, but his Social Security number and date of birth. It is hard to imagine any of that information being part of the sweet nothings people whisper in each other’s ears in intimate moments. But apparently we expect the ardent male to provide it to the object of his momentary passion, even when we no longer believe he has any obligation to give his last name to the person to whom he is trying to sell aluminum siding. Last names are intimate. Social Security numbers and dates of birth are even more intimate. First names are for strangers.

Apparently this indiscriminate use of first names is viewed, by those who indulge in it, as “friendly.” Fine. I like my friends to call me by my first name. But strangers are not my friends. Not yet, anyway. By definition. The way a stranger stops being a stranger (without necessarily becoming a friend yet, as opposed to an acquaintance) is by introducing himself or herself to me, by both names. Depending on the situation, this may be the time to say, “But you can call me First-name, if you like.” Or I may introduce myself first, by both names, and possibly invite first-naming.

Most of the telephone voices and live functionaries I deal with either don’t introduce themselves at all, or introduce themselves only by first name. Either way they still insist on first-naming me repeatedly without ever being invited to do so. Even my bank’s ATMs call me by my first name. This behavior does not impress me as friendly. It impresses me as rude and presumptuous, and gives me great fellow-feeling for Mr. Preston King and Miss Mary Hamilton. Would I be willing, like them, to give up my native land or my freedom rather than suffer rudeness gladly? So far, I have not even been offered the choice.

So here’s a revolutionary suggestion to those whose business brings them into regular contact with the public, especially that part of the public whose members are over 50 or were reared in some other culture: don’t call people by their first names unless you are invited to do so. When making contact, introduce yourself by first and last names, and wait to be told how the other person wishes to be addressed. Dealing with the public is difficult enough without raising unnecessary hostilities at the outset. Remember that some of the people you deal with may still be willing to suffer exile or jail rather than put up with rudeness.

* 376 U.S. 650 (1964)

Jane Grey (that’s Ms. Grey to you)

Eliminate the Senate?

March 2, 2010

The strongest argument I have seen on the subject of eliminating the Senate is the statistical one: a voter (or for that matter a sheep) in Wyoming has the congressional power of 70 Californians. Senators representing 40% of the states and no more than 18% of the total population (not including DC and Puerto Rico) have the power to prevent the Senate from doing anything, and legislation can be passed by 60 senators who may represent as few as one-third of the total electorate. The small states probably have enough interests in common to warrant setting up a small-state caucus. But the small-state caucus should not have the power to run the country.

The original intent of the Framers may or may not have been the protection of wealth and privilege, but it was clearly born out of a compromise among states that perceived themselves as sovereign nations and wanted above all to protect their sovereignty. Absent such protection, the Constitution probably would never have happened at all, and the United States of America would probably have long since gone the way of the United Arab Republic and the Union of Soviet Socialist Republics.

But the intent of the Framers doesn’t have much to do with the way Congress looks today. For instance, the House of Representatives was assigned two-year terms to keep its members accountable to the will of their local constituents. Today, however, many representatives spend more time in Congress than most senators, and they keep getting re-elected with little difficulty.

The Framers also originally intended the House of Representatives to represent the interests of their local constituencies, and the Senate to represent the interests of the country as a whole. That system fell apart well before the Civil War, mostly over issues of tariffs and trade with their differential impact on the North and the South. Today we take it for granted that senators will represent the interests of the primary industry or business of their respective states.

And, of course, the cities do not figure in the equation at all. There are 38 states with smaller populations than New York City, and 26 smaller than Los Angeles. Each of those states gets two senators and at least one congresscritter. The cities get precisely none. Most states are cobbled together from one or two large cities and their environs plus a “downstate” or “upstate.” Senators, generally, get to be senators by juggling the interests of cities and downstates, and only the congresscritters from the cities get to represent interests of those cities, in which the largest number of Americans live.

The depopulation of the “heartland” is a problem worthy of national attention in its own right. We may want to reverse it, or, given the shrinking water supply in those states, to encourage it. But we should at the very least be talking about it in the national legislative forum. That discussion is not going to happen any time soon, because it is in the best interest of those who represent the Plains states to keep their constituencies small. It makes campaigning cheaper and re-election easier.

While the size of the senate is constitutionally limited to two senators per state, the size of the House of Representatives has no constitutional limit at all. It is supposed to fluctuate with the decennial census. But it was capped at 435 by the Reapportionment Act of 1929 (!!!), and has not been increased since. This means that the House is now almost as mis-proportioned as the Senate, and completely out of proportion to what the Framers had in mind. The original argument on the subject in the 1780s was between 30,000 and 40,000 citizens per representative. Today, the average number of constituents per congresscritter is upwards of 600,000, though of course it varies somewhat from state to state.

Of course, the size of congressional delegations dictates the distribution of the Electoral College. Which means that a voter from a small state has more say in electing our president than, say, a New Yorker. Again, I do not mean to imply that the denizens of Wyoming and Vermont are elitists and oligarchs bent on protecting their riches and privileges. But they have been handed a dominant role in the governing of their country on a silver platter, and they cannot be blamed for accepting it.

So okay, the current structure of the legislative branch of the United States government needs revision. Doing it would require a constitutional convention during which all kinds of other mischief could happen, up to and including a civil war. Such mischief may be a bit less likely if we talk about the issues in terms of statistical fairness rather than the privileges of oligarchical elites. The alternative would be continuing the current system until both houses maneuver themselves into total paralysis, and fall into totally ceremonial roles not unlike those of the Roman Senate after the end of the Republic. At the moment, that’s starting to look like a real possibility.

Red Emma

Where is Your Next Miracle Coming From?

March 1, 2010

This Saturday, I was part of a fascinating conversation over lunch at our synagogue. It all started with trying to get our Hebrew-English grace after meals leaflet retyped to eliminate some glitches. We chatted about whose computer had Hebrew typeface, and I mentioned a mutual friend (let’s call her Lee) who used to be into self-publishing children’s books and sounded like the sort of person who ought to have Hebrew. Oh no, said our companion. She’s got too much to handle right now for that.

And then she proceeded to tell us a remarkable story. Lee (who has always been a remarkable person anyway, having at one time or another worked as a professional clown, an improv actress, a playwright and poet, a self-published children’s book author, a police officer, and a lawyer) has been carrying on various of those professions over the last few years with a serious cystic disease which has been (unbeknownst to me) shutting down her kidneys. A couple of months ago, Lee stopped in at the synagogue during the week to run some errands, and stuck her head into the office to say hello to whoever might be there. She was greeted by the congregational secretary (let’s call her Anna), who was having a slow afternoon and was more than glad to chat. In the course of the chat, Lee chanced to mention that she would probably be going on dialysis in the next few weeks. Why’s that? asked Anna, and Lee explained about the kidney problem. Anna thought this over for a few days afterward. I can do something about this, she decided. She realized (she told somebody later) that she had never spent a day in bed for illness since she had her tonsils out at age 6. That kind of good health ought to be good for something, she reflected, and went to get tested for every parameter known to man. Sure enough, she turned out to be a really good match for Lee. The transplant was done two weeks ago. Anna is doing just fine, and Lee just got out of intensive care.

It’s nice to be part of a congregation in which miracles happen.

Jane Grey