Archive for April, 2009

What’s the Good Word?

April 30, 2009

A couple of months ago, I noticed, it was “FORE-closure,” accent on the first syllable.  For a couple of days, every newscaster I heard seemed to be saying it that way.

Then I noticed that print media were doing a new hack job on “lie” and “lay.” They used to use “lay” to mean both “place” and “recline.”  Now they seem to be using “lie” in both transitive and intransitive forms—“lie that cushion down here, okay?”

Today I heard a BBC newsreader talk about unemployment in “Ko-KO-mo,” accent on the second syllable.  His colleague on the ground in Indiana went out of her way to say “KO-ko-mo” several times in her report, and by the end of the piece, the guy back at the home office was saying it too, and apologizing for having gotten it wrong before.  Nice to know some people can actually learn on the go.

It all reminded me of a story I heard from a local newscaster when I took my students on a tour of the Museum of Broadcast Communications.  Back in 1994, when he was doing the late news, somebody handed him a printout that had just come through, about the death of Rwandan president Juvénal Habyarimana.  The reporter, who was due to go on the air in 30 seconds, knew there was no way he was going to get the pronunciation right on such short notice. So instead, he claims, he said “The President of Rwanda has just been killed in a plane crash. His name is being withheld pending notification of his next of kin.”

Jane Grey

The Ghost of Future Past

April 30, 2009

Last week, my toaster oven caught fire once too often, so I dumped it.  After careful study of Consumer Reports and a walk through various stores, I finally went to Sears and bought a small, reasonably-priced item that had been rated fairly high by CR.  Just to be on the safe side, I got the extended warranty with it, which I almost never do (virtually every time I have, the store I got it from has gone out of business. When clerks try to sell me on it, I tell them I’m protecting their jobs by turning it down.)  When I got it home and set it up, the first thing I saw, embossed on the glass door, was “in case of fire, keep door closed and unplug.”

Ummmmmm, I thought.

I’m an old hand at small appliances. I’ve been keeping house for 46 years.  The first toaster oven I got lasted more than ten years and cost something like $7.95.  Only the last two I’ve purchased ever caught fire.  And now they not only provide instructions in the manual for what to do in case of fire, they print them on the front of the appliance itself.  Like now the purchaser should expect them to catch fire?  Yikes!!!

OTOH, we have an Amana microwave that is at least 29 years old and still works pretty well.  It’s built like a tank, and cost a fair amount when we bought it.  If  I could find a toaster oven built the same way, I’d buy it.

Somehow all of this calls to mind the future we were envisioning in the early 1960s, the “Kennedy ‘60s” as opposed to the “Vietnam ‘60s.”  I kept reading stuff with “Post Scarcity” in the title.  Some colleges were setting up departments of Leisure Studies.  We’d need them, we were told, because pretty soon, most of our jobs would be automated away.  And the result would be, not unemployment, but leisure.  We wouldn’t need to buy stuff so often, because it would all last forever (hence the connection with toaster ovens.)  People were setting up intentional communities, to improve the quality of life by sharing major purchases, activities, and domestic labor (and, sometimes, sex.)

I can do without the personal jetpacks and the videophone (well, okay, we actually sort of have videophones.) But I want my 15-hour work week and a toaster oven that lasts till I get tired of looking at it. When do I get my old future back?

Red Emma

Sin? Yes. Original? No.

April 28, 2009

Like most people these days, I’ve been hearing a lot about this being the 10th anniversary of the Columbine killings, and the latest book to explore the tragedy. In various places on the Other Blog, people talk about it as being the result of either original sin, or psychopathy, on the part of the killers, or at least one of them.

Original sin is not a Jewish doctrine. And my own view of it was best summed up by a Jesuit I used to work with, who told me that after his first month of hearing confessions, he had become absolutely convinced that there is no such thing as original sin—“it’s all the same damn thing over and over, nothing original about it.” And from the point of view of moral theology, it makes no sense. Those who believe in original sin believe it is universal. So why should it manifest in such terrible ways in one person and so utterly trivially in another?

The Jewish view on this stuff has to do with the Yetzer Ha Ra, the Evil Impulse. Everybody has one, and in fact some Jewish thinkers believe that it is stronger in seriously holy people. Presumably, like playing basketball wearing ankle weights, this trains the soul to greater strength. But it is never insuperable. It’s just something one has to take into account in working out one’s decisions. Moral competence involves being able to manage one’s Evil Impulse, or at least compensate for failing to do so.

Psychopathy presents a lot more problems for me. The current party line among shrinks is that it is a congenital mental defect, and we have no tools for curing or even alleviating it. It amounts to being born without empathy or conscience. Does that mean that a person born with this defect is not morally responsible for the crimes he commits? I have real trouble with this. Most courts most of the time do not consider this to fit the legal definition of insanity, even though it sounds as if it should (Illinois law says: “A person is not criminally responsible for conduct if at the time of such conduct, as a result of mental disease or mental defect, he lacks substantial capacity to appreciate the criminality of his conduct.”) And theologically, what follows logically from the existence and incurability of psychopathy is that some people are the moral equivalent of Yersinia pestis, the bubonic plague bacterium—just “born to be bad”--original sin in single instances rather than as a general human property.

I am generally suspicious of any analysis of human character that traces its traits back to heredity (whether genetic or historical), or prenatal conditioning, or even early infant treatment. The further back we push the formation of character, the more immutable it becomes, and the more we feel we are justified in merely incapacitating or destroying people with bad character traits. That way lies eugenics and euthanasia.

So I guess I am doing the scientifically unpardonable—refusing to accept a scientific hypothesis, while being mostly ignorant of its validity, because I object to the social and political consequences of accepting it. In this case, of course, everybody is ignorant of its validity. This set of issues goes through styles, fashions, trends, and fads quite regularly, and is likely to continue doing so during our lifetimes anyway. In reality, our choice is between accepting the premise of early and immutable character formation, or rejecting it, with no solid understanding of its validity in either case. I am voting to reject it, because I do not want to live in a society that accepts it.

Red Emma

Sentence First, Verdict Afterward

April 24, 2009

Or, Crime Doesn’t Pay

(at least not like it used to)

Having a lawyer on your blog means occasionally being subjected to Long Lawyer Stories, or Long Lawyer Rants on the state of the system. Today, you get a rant on the state of the criminal law system.

First of all, real life is not Perry Mason. Back when I was an academic for a couple of years, I did a paper on Perry Mason. The research was fascinating. For one thing, the sun never sets on Perry Mason. Somewhere in the world, in some language, at any given moment, Perry Mason is being broadcast or cablecast. Among the results are some profound changes in other people’s legal systems, most notably that of the Italians.

The Italians, like most of continental Europe and Latin America and Japan, have what is known as a civil law system, rather than the common law system that rules in the English-speaking countries. In a civil law system, criminal cases are judge-driven: the prosecutor frames the charges, the judge does his own investigating, and the defense attorney is a minor player. In a common law system, cases are adversarial: the prosecutor and the defense do their own investigating and use the results to try to persuade a largely neutral judge (or jury) of their respective accuracy. What Perry Mason did for the Italians was push the defense attorney into a position of greater prominence and activity.

But what reality, over the same period, has done to the American legal system is turn it into a marketplace, in which the leading role is taken by the prosecutor. You guys may already have known this, but most Americans don’t—90+% of all criminal cases are resolved by plea bargain. No trials, no Perry Mason. When a criminal case does go to trial, that’s usually a sign that the plea bargain market has broken down, often due to inadequate information on one or both sides.

The market metaphor is not meant to imply that anybody is taking money under the table. So far as I know, that’s rare, and certainly not a necessary component of the system. But plea bargaining is just that—a bargaining process. One side (usually the prosecutor) makes an offer, the other side makes a counter-offer. Each side may introduce bits and pieces of information to weight the bargain to one side or the other. Eventually, the parties meet somewhere in the middle.

Another parallel to the economic marketplace is the defendant’s position on all this. If the defendant is in jail pending trial (because of inability to post bail, either because the bail is too high, or because he simply has no money at all), he is not only enduring the discomforts and dangers of the jail system, he is also making metaphorical deposits into a virtual account. That is, from any sentence he ends up being given, the time he has already served in pretrial confinement will be subtracted. (The old hands know this from the start; the newbies find it out quickly from the old hands.)

Which brings us to the major insanity of the system, which is taken for granted by both defendants and attorneys on both sides: if you plead guilty, you go free (immediately or pretty soon.) If you maintain your innocence, you can stay locked up for a very long time. Got that? If you’re guilty, you get out. If you’re innocent, you stay in.

Apparently the movie “American Violet” (“now playing in selected theaters,” according to its website) talks about this issue in some detail, and is also supposed to be a good movie.

This insanity in turn is based on an assumption which is pretty much universal in the law enforcement system, among cops and lawyers alike: if a suspect didn’t do whatever he has been arrested for doing, he has probably gotten away with something as bad or worse. This is a statistical approach to law enforcement, which is still in its developing phases. Eventually, I suspect, it will turn into a calculation that, since 75% (or whatever) of all criminal suspects are in fact guilty of the offense in question or something just as bad, a Bayesian system of justice will convict all suspects and cut all maximum sentences by 25%. As I think I’ve said in previous comments, what matters to cops and prosecutors is not necessarily that the right person be punished, but that the right kind of person be locked up.
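For what it’s worth, the arithmetic behind that satirical scheme is easy to sketch. The snippet below is a minimal illustration, assuming a hypothetical 75% guilt rate and a hypothetical ten-year maximum sentence (neither figure comes from any real jurisdiction); it only shows that the expected punishment per suspect comes out the same, which is exactly what makes the scheme monstrous.

```python
# A minimal sketch of the satirical "Bayesian justice" arithmetic above.
# Both numbers are hypothetical placeholders, not statistics from any real system.

p_guilty = 0.75        # assumed fraction of suspects who are actually guilty
max_sentence = 10.0    # hypothetical maximum sentence, in years

# Idealized current system: only the guilty are convicted, and they serve the full term.
expected_current = p_guilty * max_sentence

# Satirical scheme: convict every suspect, but cut all sentences by 25%.
expected_bayesian = 1.0 * (max_sentence * (1 - 0.25))

print(expected_current, expected_bayesian)  # 7.5 and 7.5 years
# The averages match; the difference is that the innocent 25% now serve 7.5 years apiece.
```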

Anyway, the impact of this system on habitual offenders is probably not what those who set it up had in mind. The average defendant is overjoyed to plead guilty to something slightly less heinous than the original charge, and “get off” with “time served” plus a few months. He considers this a “good deal.” His lawyer has told him it was a good deal. So has the prosecutor.

The judge’s role in this scenario is minimal, but in some ways the most heinous element of the whole thing. Because the judge is the one who asks the defendant to assert on the record that “no one has promised you anything in exchange for this plea.”

Mr. Wired once went through a legal proceeding in another state in which he was advised by his attorney to assert something in the trial, on the record, under oath, which everybody involved knew to be untrue. He was barely 20 at the time, and under a lot of pressure, so he did it. He still feels like a perjurer for having done so. Okay, maybe his conscience is more acute than most people’s, but isn’t that what the legal system is supposed to want?

In fact, most people (certainly including most of my clients) at one time or another get advised by various functionaries of The System to lie about something significant, in order not to gum up the works. The criminal courts are not even the most egregious instance. What these coerced falsehoods accomplish, aside from smoothing out the administrative process, is to leave The System in a position to renege on any deal based on such lies, just by blowing the whistle. “Uh, gee, we didn’t know that Recruiter Sergeant Slick was going to tell the recruit to lie about his flat feet. We’ll court-martial both of them for recruiter fraud.” In short, infinite possibilities for blackmail.

A couple of peripheral notes about Perry Mason: good trial lawyers don’t do Perry Mason, they do Columbo (“help me out here, I’m confused. How is it that….?”) A trial lawyer who did Perry Mason would probably get successfully sued for malpractice.

CynThesis

In the Kingdom of Darkness, the Blind Rule

April 19, 2009

(or, Recession?  What Recession?)

As I perused the Sunday paper business and career section this afternoon, I encountered a familiar-looking article.  It was addressed to novice job-hunters, and the subject was negotiating salary in the course of a job interview.  There were a lot of cautions about how employers probably don’t have as much wiggle room in setting salaries these days, so the job applicant shouldn’t make unrealistic demands, and shouldn’t talk about pay and benefits at all until everything else has been dealt with.

Nothing surprising, I suppose.  Similar to stuff I’ve read in the business and career sections before, except for the recession-linked cautions.

And I kept blinking and asking myself, “What planet does this writer live on?  Negotiate for salary?”

I found myself remembering stories my friends told me about their jobhunts.  One woman, who was switching career fields in her 30s or so, was asked by a prospective employer how much she had in mind for compensation.  She named a figure slightly lower than what she had been getting in her previous field.  The employer said quickly, “Oh no, we never pay our girls that much.”  Young feminist that she was, she was unable to keep herself from asking, “Yes, but what do you pay your women?”  Needless to say, she didn’t get the job.

Another friend of mine, who, before becoming the stay-at-home mother of a child with a disability, had been a crackerjack executive secretary, interviewed for a part-time secretarial job with a nonprofit. The ad she was responding to offered a pay range from $5.00 to $7.50 an hour (this was twenty-some years ago.) After laying out her experience and skills, she asked about compensation, and the interviewer told her it would be six dollars an hour. My friend reiterated her experience and skills, and pointed out that surely that ought to be worth $7.50. (If it wasn’t, I told her later, then the $7.50 was reserved for Katharine Gibbs herself.) The interviewer drew herself up, in a huff, and said, “You can’t tell me how much money I’ll pay you.” So much for negotiation. My friend actually managed not to point out that she could, at least, decide how much it was worth her while to accept, when working would involve paying for commuting, work clothes, and child care, and that Lincoln had, after all, freed the slaves a while back. She simply walked out and took her skills elsewhere.

Some years later, I worked in a law office that handled employment discrimination cases.  I interviewed at least three prospective clients with precisely the same story, all unknown to each other and working in entirely different fields, so these are independent cases.  All three of them had been fired after asking for a raise.  The employer’s rationale, in all three cases, was:
1. Asking for a raise = refusing to continue working at the current salary
2. Refusing to continue at the current salary = insubordination
3. Insubordination = valid grounds for firing.
Unfortunately, the law provided no remedy for any of these people, and I had to send two of them on their way. (The third had a reasonably decent case for discrimination aside from the immediate cause of her firing.)

In short, the job market in which my friends, my clients, and I have been operating since back in the ‘70s anyway was one in which the average job applicant or employee had no leverage and no ability whatever to negotiate salary, either before or after being hired—well before the current recession was a twinkle in George Bush’s eye.  And we are all well-educated, competent women with no serious self-esteem problems.

Well, okay, some things have changed. I got my first job in Chicago only after the nice lady from the employment agency I was working with was asked by the employer what method of birth control I used, and she told him I was on the Pill.  (Which in turn was only a couple of months after a phone call with the landlord of the apartment we were applying to rent at long distance, while we were still in Boston, in which he asked us what our “ethnic background” was, and we actually felt obliged to tell him.  And we weren’t even tempted to show up in blackface.  So yes, Virginia, the Civil Rights Act really did make a difference.)

Similarly, all the career articles one sees in business sections these days tell people it’s okay to have been fired or laid off, it’s happening to everybody, and nobody thinks it says anything about your competence or industriousness.  This is apparently something new, due to the Recession.

Mr. Wired was fired, sometimes very traumatically, at least 8 or 9 times in a total working life of 20 years.  I managed to get traumatically fired only 3 times over a somewhat longer period, but I suspect that’s because I was usually making a lot less money than he was.  Both of us have been characterized by most of our employers and co-workers as competent and energetic, but we were both a bit tone-deaf to office politics, I think.  If either of us had taken it personally or let it jam up our jobhunts, we would probably be living on the street by now.  But back then, nobody even talked about what to do after getting fired.  It just didn’t happen to Our Kind of People, apparently.  So we operated in the dark and managed to remain among the employed without any advice from the experts.

In short, we have apparently lived in our own private recession for the last 40 years, while the rest of the world was flourishing like the green bay tree.  Which no doubt gives us an advantage over a lot of other people who are just now getting dumped into the cold tub we have been swimming in for years.

Either that, or until very recently, everybody else has been lying.

Care to guess which?

Red Emma

Two Rubies Plus Ten Percent

April 13, 2009

A Thought Experiment

Rosewood. Tulsa.  Native American treaty rights cash-outs.  The American internment camps for Japanese-Americans.  The German slave labor camps.  Four hundred years of African-American slavery.  Reparations are once again in the headlines.  Reparations have even been, or are now in the process of being, paid, to the victims of all but the last-listed category of injustice.  Randall Robinson has been trying to rouse public awareness of and interest in reparations to the descendants of African-American slaves for years, and he is once again gaining notice.

All of these campaigns that have resulted in payment plans have ultimately been worked out by lawyers, for better and for worse.  What lawyers typically worry about in such cases are two problems: calculating the damages, and identifying and locating the persons to be compensated.  The smaller the number of original victims, and the shorter and more recent the period over which damages were incurred, the easier the solutions.  So the consequences of the race riots in Rosewood, Florida, and Tulsa, Oklahoma, have been highly manageable.  The Japanese-American internees were not only a small group, but easily identified from records still available to the federal government.  The German slave labor camp inmates and their families have been a lot harder both to identify and to locate, and the calculation of damages has likewise been difficult, but a settlement has nevertheless been worked out.

That leaves the really hard cases. Reparations for four hundred years of slavery, for instance. How would the damages be calculated? The easy way–the current value of forty acres and a mule, times the number of persons to be compensated? Or the hard way–the surplus value of 400 years of agricultural labor in an economy almost unimaginably different from our own, which has left only incomplete records, times some arbitrary rate of compound interest? And to whom should these funds be paid? All provable descendants of African slaves in North America, regardless of their current racial identification? Most of the known descendants of Thomas Jefferson’s slave and alleged mistress Sally Hemings today consider themselves “white.” In all likelihood, that is also true of a significant proportion of the descendants of other slaves. Are they still entitled to be compensated for the suffering of their ancestors in the generations before their family succeeded in “passing” as white? Perhaps on a pro-rated basis in comparison to descendants of slaves who are still classified as “African-American”? For that matter, what about the large proportion of today’s African-American community who have significant white ancestry? Would this too require some pro-rating formula? Or should compensation be paid only to those who are currently living as African-Americans, since they are the ones who bear the burdens created by slavery in the first place? That depends on exactly what is being compensated in the first place–the labor stolen from generations of slaves, or the damage done by the residuals of slavery to the African-Americans now living. Obviously, this is the kind of problem the lawyers will have to tackle after agreement is reached on the principle of paying any kind of compensation at all, to anyone.

Assuming we ever reach that point, that brings us to what may be the hardest case of all, one which has never been discussed anywhere by anyone, so far as I know–reparations for the stolen labor and lives of women.  Think about it–not a week of race rioting, not eight years of internment, not four hundred years of slavery, but at least the ten thousand years of known human history.  Not isolated ethnic or racial minorities, but 53% of all the human beings who have ever lived on this planet.

From the lawyer’s point of view, of course, that is precisely the problem.  The larger the projected sum of reparations, or the number of people eligible to receive it, the less likely it is that it will ever be paid, or even recognized as a moral obligation.  Reparations for ten thousand years of unpaid women’s labor could only take the form of a drastic transfer of wealth from men to women.

And, unless some very careful planning and preparation were done beforehand, much of that wealth would be transferred back to fathers, husbands, and pimps within the year (much the way many Native Americans who were paid for the transfer of treaty lands away from their tribe found themselves back in poverty again within a few months or years after the payment was made.)  The use of debt peonage, sharp dealing, and just plain intimidation would play the same role between men and women as they did between ex-slaveowners and ex-slaves in the post-bellum South.   Poverty often involves not only lack of resources, but lack of the power necessary to hold onto the resources one acquires.

The problem of designating recipients would be the easiest part of the settlement, for a change.  Everybody with the double-X chromosome.  Since everybody’s ancestry is 50% female, ancestry becomes useless as a criterion.  What about transsexuals? some would ask. On one hand, they have chosen their status, but on the other hand, once having chosen it, they are now subject to the same disadvantages as their double-X sisters.  Last week, an Iranian who had undergone gender reassignment from male to female some years before proclaimed that she wanted to change back, because of the social, political, and legal disabilities imposed on Iranian women by the religious establishment.  Clearly, she would be eligible for reparations.  Transgendered women are a small enough proportion of the population that their inclusion in the settlement would have little or no impact on anyone else’s benefits.

As a socioeconomic “thought experiment”, the idea is useful. It could certainly be used to impress upon the human race generally the magnitude of its debt to women.  It might also be useful as a way to figure out the point at which the effort to redress past injustices becomes the futile task of unscrambling eggs or putting the toothpaste back into the tube.   That point probably lies just beyond calculating and paying reparations for  four hundred years of African-American slavery.

Am I saying this is a futile enterprise?  On the contrary, I would love to see some serious number-crunching.  The short way?  That would be based on the biblical statement that a valiant woman’s worth is above rubies.  That means at least two rubies, plus.  Hence the title of this essay–the current value of two rubies plus an arbitrarily-chosen ten per cent, for each woman on the planet today.  The long way would involve something like the following:  figure out the total value in modern dollars for a thirty-hour week of domestic labor and child-care.  Multiply that by the average lifespan of human beings over the last ten thousand years, minus 6 years (the average length of unproductive childhood), times 53% of the number of human beings who have ever lived on the planet.  That takes care of housework and child care.  Then calculate an average figure for the uncompensated labor of women in agriculture, pastoral work, and other “family businesses,” and perform the same set of calculations with it.  Maybe figure in an additional 25% for “pain and suffering” resulting from domestic abuse (something, by the way, that could also reasonably be done in calculating reparations for the descendants of African-American slaves as well.)  Then divide the resulting total by the number of women currently alive.  The final figure could be used for all kinds of things, beginning with calculating alimony and personal injury awards for the incapacitation or wrongful death of women in Western countries, and calculating bride prices elsewhere.  Forget the “price above rubies”–let’s crunch some real numbers!
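Purely as an illustration of the “long way,” here is a back-of-the-envelope sketch of that calculation. Every number in it is a hypothetical placeholder (the hourly value, the average lifespan, the count of humans who have ever lived, the current female population), and the fifty-two-weeks-per-year step is an assumption added to keep the units consistent; the point is only to show the shape of the arithmetic, not to claim a result.

```python
# A minimal sketch of the "long way" calculation described above. Every input is a
# hypothetical placeholder chosen only to show the shape of the arithmetic.

hourly_value = 15.0            # assumed modern-dollar value of an hour of domestic labor
hours_per_week = 30            # the 30-hour week named in the essay
weeks_per_year = 52            # added assumption, to convert a weekly value to a yearly one
avg_lifespan = 40              # assumed average lifespan over the last ten thousand years
unproductive_childhood = 6     # the essay's figure for unproductive childhood, in years
humans_ever_lived = 108e9      # assumed total humans ever born (rough commonly cited estimate)
share_female = 0.53            # the essay's figure
pain_and_suffering = 0.25      # the essay's optional 25% uplift
women_alive_today = 3.9e9      # assumed current female population

productive_years = avg_lifespan - unproductive_childhood
per_person_lifetime = hourly_value * hours_per_week * weeks_per_year * productive_years
total_domestic = per_person_lifetime * humans_ever_lived * share_female

# (A parallel term for uncompensated agricultural and "family business" labor
# would be computed the same way and added here.)
total_owed = total_domestic * (1 + pain_and_suffering)

per_recipient = total_owed / women_alive_today
print(f"Hypothetical reparations per woman alive today: ${per_recipient:,.0f}")
```

Whatever placeholders one chooses, the total comes out at a size that makes the point about a “drastic transfer of wealth” self-evident.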

CynThesis

“EEUUWW” is Not an Argument

April 13, 2009

That Other Blog Over There seems to be getting a lot of posts from a guy who thinks “Ick” is an argument against certain sexual acts, and thinks he can best make his point by referring to those acts as graphically as possible, so as to provoke the “ick” response in as many readers as possible.

I’ve already read Henry Miller, thank you very much.  And Norman Mailer, and the rest of those boys.  More to the point, I’ve spent a great deal of time with college-age females talking about why they don’t eat liver, and sashimi, and brains, and giblets, and okra (“EEEEUUWW, slimy”), etc.  It’s the same response.  Little kids do it even more vociferously, about anything they’ve never experienced before that isn’t on the Disney Channel.

Nonetheless, traditionalists, going back at least as far as C.S. Lewis, consider the disgust response to be a valid moral guide.  Many of my relatives consider seafood and pork products disgusting, and probably also consider this a valid moral guide.  (BTW, Maimonides says that isn’t a valid approach to keeping kosher.  We are supposed to recognize that bacon and lobster may well be delicious, and give them up joyfully in spite of that, because we are so commanded.  What merit is there in giving up something inherently disgusting?)

Well, never mind lobster and escargots and spinach and broccoli and the various other morally neutral foods that many people find disgusting. Autopsies are disgusting, but may be medically and forensically valuable. Surgery is disgusting, unless you do it for a living. A cesarean delivery is disgusting, to lay people. And perhaps more to the point, a normal vaginal birth is likely to be perceived as disgusting, even by the father of the child in question. Normal, loving, marital sex is disgusting to children. You get the idea.

The disgust reflex tells us absolutely nothing about the moral value of its object. It is utterly useless as a moral guide.  Can we turn our attention to how people treat each other?

Jane Grey

Panhandlers and the Jewish Tradition

April 13, 2009

These are hard days for street beggars.  More and more municipalities have tried to ban their activities by law (the courts have held that merely asking somebody for money is free speech, protected by the First Amendment, but that “coercive” panhandling can legitimately be barred. So far there has been no binding precedent on the constitutionality of laws prohibiting camping in public parks, sleeping in public spaces, and searching for food in dumpsters and garbage cans.) Police conduct regular “sweeps” of places ordinarily frequented by homeless people, seizing and destroying their property, and packing the residents off to shelters or jails, or just “away.”  Dumpsters behind restaurants and multi-residential buildings are locked, or noxious or even poisonous substances sprayed on their contents.  Citizens approached by panhandlers are becoming less and less likely to be generous or even polite.  Municipal governments tend to see panhandlers as, at best, a blight on the landscape, and at worst, potential or actual criminals.

The liberal response to panhandling isn’t much better.  Community organizations which have traditionally thought of themselves as “do-gooders” are almost universally taking the position that panhandlers are either people with mental disabilities, who belong in some proper care facility, or substance abusers, who will use cash contributions only to degrade themselves further by feeding their habit.  The best response they can come up with from this perspective is voucher systems–would-be contributors buy vouchers for fifty or twenty-five cents each, and then hand them out to panhandlers.  The latter can redeem the vouchers at local food pantries, soup kitchens, grocery stores, restaurants, and sundry shops, but only for “legitimate” purchases, such as food, non-alcoholic beverages, and personal hygiene articles.  Not surprisingly, in some cities, a thriving black market for vouchers has already developed; the panhandlers sell them to “fences” at a discount, for cash, which they then presumably use for whatever anti-social purpose the vouchers were supposed to defeat, and the fences then use them for purchases of food etc. for which they would have otherwise paid cash, often twice as much. To the extent that they see any ethical quandary in this situation, the “do-gooders” define it as “How can I keep this person from starvation without helping him/her feed an addiction?”  The voucher system in fact does this job more or less adequately.  But it ignores a much older system of ethical priorities.

We Americans of the 1990s did not invent street beggars, or programs to repress and eliminate them, or voucher systems for that matter.  Third World countries take for granted the presence of swarms of beggars in any place likely to generate any surplus food or other resources.  Police may try to keep them from being too much of a nuisance to tourists and honest working people, but they rarely define the mere presence of any beggars as a “problem.”  Nor do the tourists and working people in question; “this is a Third World country,” they presume.  “The poor we have always with us.”  People give to them, or not, based on purely individual decisions.  Those decisions may in turn be motivated by religious tradition, emotion, or political commitment.  This is pretty much the traditional approach to beggary in such countries, dating back to the beginning of money economies.  It was also the accepted attitude in pre-industrial Europe.  Several religious orders, in their early years, supported themselves by begging.  The implicit bargain was that those who donated were freeing the friars to devote their time to prayer and good works, and in return would receive some of the merit therefrom.  This arrangement also existed on an individual basis–”give me a penny, kind sir, and I will pray for you.”  In addition, the medieval Catholic Church defined “feeding the hungry, clothing the naked, and sheltering the homeless” as “corporal works of mercy”, meritorious in and of themselves, regardless of the deserts of the beneficiary. Many aristocrats and wealthy burghers retained “almoners” to take charge of giving out alms to people in need.

Islam and Judaism have similar traditions. The giving of alms is one of the five pillars of Muslim practice. “Tzedakah”–usually translated “charity”–is one of the obligations to which religious Jews are commanded. In the Buddhist tradition, the monastic orders supported themselves by begging, and the laypeople who donated to them gained spiritual merit by doing so.

But the coming of industrialization to Europe, and especially the dissolution of the monasteries and religious orders in Britain and Northern (Protestant) Europe, changed this picture.  The presumption took hold that any able-bodied adult (and both terms were defined very loosely) could work, and anyone who could work should work, and deserved no support from anyone if s/he was not working.  “Sturdy [able-bodied] beggars” could be driven out of wherever they appeared, often with corporal punishment, or rounded up and imprisoned in “workhouses.” Substance abuse was not the issue–until quite late in the 19th century, almost everybody drank prodigious amounts of alcohol, and anyone who could afford it could legally obtain various kinds of opiates and cocaine derivatives over the counter for self-medication of various real and imagined ailments. Probably at least half of these people would today be classified as addicted to something.  That was not a major social concern, except among a few minority religious groups like the Methodists and their offshoot, the Salvation Army.

In industrialized Eastern Europe, especially in what is now called Poland, the Jewish community was called on to respond to several waves of increased indigency, often connected with the movement of refugees from the various wars which infested the area over the 17th through the early 20th century.  Community organizations, soup kitchens, food pantries, and other large-scale charitable organizations were set up, drawing on the financial support of wealthy Jewish entrepreneurs and the moral support of the rabbinic establishment.   And some of these organizations set up voucher programs, to discourage beggars from “bothering” working citizens and make sure that only “deserving” poor people received alms.  There is a story about one such campaign in Poland a hundred years or so ago; a meeting of the umbrella community organization was held, and one of the major items of “new business” on the agenda was a proposal for a voucher program, which would prohibit beggars from approaching individuals directly, and require them to apply to some authorized agency for help.  The Hafetz Hayyim, a very holy rabbi, who had been asked to attend the meeting to lend his moral support to its decisions, raised his hand. “Point of order,” he called out. “What is your point of order, rabbi?” asked the chair.  “You have called this proposal ‘new business.’  But in point of fact this is old business.  It is as old as Sodom and Gomorrah.”

This requires an explanation. In Genesis 18:20, the Holy One tells Abraham “The sins of Sodom and Gomorrah cry out to Me.  I am going to destroy them.”  After a charming interlude in which Abraham tries unsuccessfully to talk the Holy One out of His plan, the latter sends a couple of angels into Sodom to tell Abraham’s nephew Lot and his family to get out of town before the sulfur and brimstone hit.  The local citizens, on seeing these personable strangers enter Lot’s house, gather outside Lot’s door and demand “Send out these men, that we may know them.”  As we all learned in Sunday school, when the Bible says “know”, it means “have relations with.”  The angels and Lot’s family get out safely, and Sodom and Gomorrah get what’s coming to them.

Christians read this story as an indictment of the evils of deviant sexuality.   They presume that the sins that “cried out to heaven” and got the Holy One’s attention in the first place were also sexual sins.  But that is not how the Jewish tradition reads it.

To elaborate on this tradition, we need to understand the Jewish concept of midrash. Loosely translated, it means “story- telling.” In practice, it means that, unlike the Saturday Evening Post in its heyday, the rabbis have taken responsibility for what the characters in the biblical narrative do between installments. While the stories woven to account for how the dramatis personae got from here to there do not have the authority of scripture, and may often wildly contradict each other, they provide us with a world-view and a way of looking at scripture that is part and parcel of the Jewish mindset. Jews do not read the Bible raw and unaccompanied; it is always filtered through commentary and midrash.

So what does the midrash say about Sodom and Gomorrah? (My source, for the sake of convenience, is Louis Ginzberg’s  Legends of the Jews.) “If a stranger merchant passed through their territory, he was besieged by them all, big and little alike, and robbed of whatever he possessed. Each one appropriated a bagatelle, until the traveler was stripped bare. If the victim ventured to remonstrate with one or another, he would show him that he had taken a mere trifle, not worth talking about. [Anyone involved in consumer protection work knows that it is vastly easier to steal one dollar each from a million people than a million dollars from one person–and far less likely to be prosecuted and punished.] And the end was that they hounded him from the city….After a while travelers avoided these cities, but if some poor devil was betrayed occasionally into entering them, they would give him gold and silver, but never any bread, so that he was bound to die of starvation.  Once he was dead, the residents of the city came and took back the marked gold and silver which they had given him, and they would quarrel about the distribution of his clothes, for they would bury him naked….

“The cause of their cruelty was their exceeding great wealth. Their soil was gold, and in their miserliness and their greed for more and more gold, they wanted to prevent strangers from enjoying aught of their riches. Accordingly, they flooded the highways with streams of water, so that the roads to their city were obliterated, and none could find the way thither. They were as heartless towards beasts as towards men. They begrudged the birds what they ate, and therefore extirpated them. [Nowadays, in some places, people get arrested for feeding pigeons.] They behaved impiously towards one another, too, not shrinking back from murder to gain possession of more gold….

“Their laws were calculated to do injury to the poor. The richer a man was, the more was he favored before the law. The owner of two oxen was obliged to render one day’s shepherd service, but if he had but one ox, he had to give two days’ service….For the use of the ferry, a traveler had to pay four zuz, but if he waded through the water, he had to pay eight zuz [one of the earliest examples of the now well-known fact that the poor pay more].” Ginzberg follows with a story of an outsider woman who had married a man of Sodom. “Once a beggar came to town, and the court issued a proclamation that none should give him anything to eat, in order that he might die of starvation. But [the woman] had pity upon the unfortunate wretch, and every day when she went to the well to draw water, she supplied him with a piece of bread, which she hid in her water pitcher. The inhabitants of the two sinful cities, Sodom and Gomorrah, could not understand why the beggar did not perish, and they suspected that someone was giving him food in secret. Three men concealed themselves near the beggar, and caught [the woman] in the act of giving him something to eat. She had to pay for her humanity with death; she was burnt upon a pyre….

“The people of Admah [one of the other “cities of the plain” destroyed with Sodom and Gomorrah] were no better than those of Sodom.  Once a stranger came to Admah, intending to stay overnight and continue his journey the next morning.  The daughter of a rich man met the stranger, and gave him water to drink and bread to eat at his request.  When the people of Admah heard of this infraction of the law of the land, they seized the girl and arraigned her before the judges, who condemned her to death.  The people smeared her with honey from top to toe, and exposed her where bees would be attracted to her. The insects stung her to death, and the callous people paid no attention to her heart-rending cries.  Then it was that God resolved upon the destruction of these sinners.”

Which brings us to the visit of the angels to Sodom, and the locals’ demand to gang-rape them, that actually appears in the narrative in Genesis. Midrash explains it thus: “It was not the first time that the inhabitants of Sodom wanted to perpetrate a crime of this sort. They had made a law some time before that all strangers were to be treated in this horrible way.” In short, the midrashic tradition is that the cities of the plain were punished for their inhospitality to the poor and the stranger. Their proposed attack on the angels, like most rapes, was not a sexual act, but an act of violence. It was especially evil because it was directed against victims especially protected by Heaven–strangers and travelers, people with no other source of protection among the locals.

For the origin of this idea, we need to look at the book of Deuteronomy, for instance 24: 19, where “the widow, the orphan, the stranger, and the poor” are repeatedly described as being under the special protection of Heaven (one might even say that Heaven has placed them under an affirmative action program.)  This protection is necessary, in an agrarian society, because these people in particular have no link to the means of survival–ownership or share-cropping rights on agricultural land.  The Book of Ruth–in which the title character is a widow, an orphan, and a stranger, and is given what she needs to survive and feed her family–is the paradigm of the proper treatment of this protected class.  The Sodom and Gomorrah story, in the Jewish tradition, is the paradigm of the improper treatment of this same class.

So when a rabbi talks about Sodom and Gomorrah, chances are he is talking about mistreatment of  poor and helpless people.  Clearly, that was what the Hafetz Hayyim meant.  And that is the lens through which religious Jews have to decide what to do about panhandlers.  We have a divinely-imposed obligation to help those who cannot support themselves.  Anyone who has any resources whatever to spare (over and above the support of their own family) is subject to that obligation, even poor people–even, in fact, a beggar who has had a good day, vis-a-vis one who has not been so lucky.

Among the most important and best-known pronouncements on the obligation to give to the poor are those of Moses ben Maimon–Maimonides.  Most Jews and many non-Jews are familiar with his classification of the eight grades of charity, of which the highest is enabling a poor person to become self-supporting, and the next highest is the gift in total anonymity on both sides.  It should be noted that he did not intend the higher levels to replace the lower (the person-to-person face-to-face gift, with varying degrees of good grace on the part of the giver.) In his less well-known works, Maimonides, who is writing in the 1200s in highly-urbanized Spain and Northern Africa, is realistic, and perfectly willing to admit that there are phony beggars out there, people who claim needs they do not in fact have.  The Holy One has allowed these fakers to exist, he tells us, to create a benefit of the doubt for people who refuse to give to beggars (Maimonides was realistic about those people, too.)  If all the beggars out there were really destitute, he says, anyone who failed to give to one of them when s/he could afford to would be committing a grave sin.  Since some of them are fakes, those who refuse to give are guilty only in proportion to the ratio of real beggars to phonies.  Maimonides also says that anyone who cannot respond to a beggar’s request for alms by giving money or other physical goods has an obligation at least to give him/her a cheerful greeting.

“Tzedakah” is usually translated “charity.” But in fact, its meaning is much closer to “justice” or “righteousness”–closely related to what Plato and Aristotle mean by “justice” as “giving to each person what s/he deserves.” It is an individual obligation, and creates individual rights. The passages in Exodus and Deuteronomy which impose it are mostly written in the second person singular (for example, Ex. 22:20ff.; Deut. 15:7ff.) The beggar who approaches a passerby is not asking for a favor, s/he is invoking a right. “Fiddler on the Roof” retells an old Jewish beggar joke–”Alms for the poor,” cries the beggar. “Not this week, business hasn’t been so good,” replies the passerby. “So because you had a bad week, I should suffer?” says the beggar. Maimonides would utterly concur with him.

So we have three possible views of undeserving beggars: Maimonides in the 13th century says that Heaven allows them to exist to keep stingy people from incurring grave sin–and if we refuse to give to them, we are taking a calculated risk; the “Poor Law” administrators of newly-industrialized England say that these are people who are refusing to work despite being able to do so, and giving to them merely encourages their idleness; and today’s “liberal” organizations say they are crazy or addicted, and giving to them only encourages them in not getting the care they need to become sane, sober, self-supporting human beings.  The Poor Law administrators had no trouble distinguishing between deserving and undeserving beggars–any adult (loosely defined) with the usual number of limbs and organs who wasn’t working was undeserving. Period.  But both Maimonides and today’s do-gooders are willing to grant at least some ambiguity in this category–we can’t always tell by looking at or talking to a beggar whether s/he is for real.  Maimonides took the position that, morally, the safest way to deal with this ambiguity was to give; today’s liberals take precisely the opposite position, perhaps because they believe the proportion of really destitute people to fakes has shifted, and possibly even reversed, since the 13th century.  The various surveys and studies of homeless people, street people, and beggars have produced conflicting and admittedly inconclusive results.  We generally don’t know, and probably can’t know, which of the people who approaches us on the street is for real.

And that’s assuming that the currently respectable definitions of “deserving” and “undeserving” are valid.  Suppose every beggar, every homeless person, were to clean up, sober up, and apply for every job currently available.  Suppose–even less likely–that each of those jobs were actually filled with a homeless person or a beggar.  How many people would that leave?  Given the incomplete state of our statistics on homelessness and indigency, we can’t know–but it is hard to believe there really are jobs out there for every one of the people who “ought” to be seeking them.

Certainly it is morally, spiritually, emotionally, intellectually, and physically better to be unemployed and sober than to be unemployed and drunk–but who are we to begrudge the unemployed alcoholic his daily ration of rotgut if we are not willing to help him sober up?

At any rate, the point of voucher programs is to benefit people who, although probably addicted or crazy, are at least genuinely in need and should not be ignored altogether.  There are, I think, three possible ways to deal with such people:
(1) You can assume the panhandler is a person like yourself, capable of making rational decisions about allocating his/her resources and just in need of some help in acquiring resources.  In which case, the appropriate course of action is to give her/him some money. [In this society, the way to acquire full human dignity is to earn or inherit a lot of money.  But one can have at least some dignity simply by virtue of having some money, regardless of how it was obtained, and being free to decide how to spend it.  We grant that much dignity to children over the age of 5 or so, to whom most parents give cash allowances.  Arguably, even the panhandler deserves no less.]

(2) Or you can assume that the panhandler is for some reason not capable of spending money on what s/he needs without degrading him/herself even more.  In that case, if you are genuinely concerned about the panhandler as a person, you will do what I have known both my husband and my father to do on various occasions–take him/her to a restaurant–or what I have, more timidly, done–bring him/her a sandwich from the nearest fast food joint.

(3) But if all you really want is to get the panhandler out of your face so you can go on about your business, you can give him/her a voucher.  This is not a symbol or an instrument of personal concern; it is a substitute for it.  Unlike real charity, it is not a means of bringing giver and recipient closer together, but of setting a distance between them which the giver considers appropriate.

Don’t get me wrong; vouchers aren’t as bad as using the police to sweep the “riffraff” off the “nicer” streets.  And they’re a lot better than shooting street people, as is done in Rio these days. But the program should not be dignified with the name of charity, when it is really nothing but a relatively humane method of crowd control.

A final question:  should our decisions on what to do about street beggars be guided by a general policy–that is, should we always give, or never give, based on some general principle?  Or should we make our decisions one day at a time and one beggar at a time, based on our circumstances and theirs at that particular moment?  I lean toward the latter position.  Can I afford it today?  Maybe my finances fluctuate more than most people’s, but that is often a valid question for me.  Assuming I can, am I obligated to give to every beggar who approaches me, or may I pick and choose?  Assuming I may, on what basis do I make those choices?  As a practical matter, I don’t give to people who scare me, or try to intimidate me, and I have no scruples about that.  And I don’t give to people who really do strike me as phonies.  (But I do try to remember the cheerful greeting when I’m not in a position to give.)

Ultimately I have to use my own judgment, bearing in mind what Maimonides says on the subject.  I believe that part of the obligation to give to those in need is an obligation to look at each person who asks as an individual, rather than merely a vehicle for my virtue.  The Hafetz Hayyim and many orthodox Jews today would probably disagree with that position, and might say instead that if your path today intersects a beggar’s, Heaven has made that happen, for the beggar’s benefit or yours, or both.  Some people find that a really appealing spiritual path. It is the source of many legends–not only in the Jewish tradition, by the way–about beggars turning out to be Elijah the prophet, or some other spiritual VIP, who rewards those who treat him kindly, with either spiritual blessings or, sometimes, material ones.  This may be another way of saying that a beggar who seems phony may turn out to be the real thing, for all we know–an important reminder even to those of us trained in the social sciences, about where we should be applying the benefit of the doubt.

In the abstract, of course, Maimonides is right–the best thing we could do for street beggars is put them in a position to support themselves. But when clean, sober middle managers with MBAs are being “downsized,” that solution is obviously a long way off. An economy in which even an unskilled person with episodic mental problems can get a job that pays enough to put a roof over his or her head is what we should be working for, in the long run. The long run ought not to be coterminous with the messianic age. “In the long run,” as John Maynard Keynes said, “we are all dead.” Street people, if not cared for decently, are likely to be dead in the short run–some statistics indicate that the average street person will last, at most, ten years or until age sixty, whichever comes sooner. We now know that changing the political party in power, or its philosophy, does nothing whatever to reduce the number of beggars on the street. Until we can figure out what will, we have an obligation to tend to the short-run welfare of the poor we have with us.

Jane Grey

The Argument From Abstract Authority

April 7, 2009

Many years ago, my husband and I spent a lovely summer afternoon at a cookout with a friend of his from work, and her mother and brother. They were all of Slavic ancestry, self-made and reasonably well-educated. And I had two very strange experiences in the space of that single afternoon.

The first was while I was in the kitchen with our friend’s mother, cutting up tomatoes for a salad. And the mother asked me, more or less out of the blue, what I thought of the Equal Rights Amendment, which was at that time still a fairly hot topic. I told her I favored it, and tried to explain, as un-complicatedly and un-aggressively as possible, why. She replied, “Well, I’m against it. I don’t remember why right now.” I thought she should have waited till I was through using the knife, but I managed not to respond.

Then, later, when we were all outside eating the aforementioned salad and other goodies, the brother asked me what I thought about the military draft (also a hot topic at the time.) Unlike the ERA, the draft was a subject on which I was at the time a certified expert. There were maybe ten people in the country who knew as much about it as I did, and two or three who knew more. (Most of them, BTW, did not work for the Selective Service System.) So I explained why I thought it was a bad idea, again without trying to challenge him, but at fairly great length. He had asked for my opinion, okay, I would give it. When I finished, he said, “But that’s only your opinion, isn’t it?”

I thought of telling him, Well, if you really wanted to know what G-d told me on Mt. Sinai, you should have said so. Being at that time a much nicer person than I am now, I didn’t.

The other, closely parallel phenomenon, was a scene in the movie Close Encounters of the Third Kind. Richard Dreyfuss is at the dinner table, building out of mashed potatoes a model of a mountain in Wyoming which he has never seen, for reasons he does not understand, and his little boys are baffled and frightened to see their father losing control. A conversation ensues, most of which I don’t remember, but somewhere in the middle of it, one of the children asks their mother, “Mom, do we believe in UFOs?”

In all three instances, opinions were being treated, not as conclusions arrived at from observed phenomena, but as components of group identity or affiliation. The boy in Close Encounters was asking a question kind of like “are we Catholic or Lutheran?” My friend’s mother and brother were asking, “which side are you on?” Kind of like many of the small-town characters in George Eliot’s novels, who have their opinions on religion, politics, and business, pretty much the way they have their hats. Since our friends at the picnic didn’t know me very well, they were checking me out, rather than inviting me into a rational discussion.

The woman who was against the ERA but couldn’t remember why had obviously heard some authoritative argument on the subject which told her it was okay to believe what all her friends believed. That was all she needed to know about it, so she immediately forgot the rest. The man who asked for my opinion on the draft wasn’t interested in the merits of my argument; he just wanted to make sure I didn’t expect him to change his mind.

Which tells us some scary things about our culture these days. “Rhetoric,” which used to mean the art of persuasion, is now used mostly as a synonym for the barnyard epithet. But, used in its classic sense, it would be an even dirtier word for most people, because trying to persuade someone to change his/her opinion is now seen as fundamentally immoral. We don’t talk in order to learn and be changed by learning. We talk in the same way and for the same reason that we wear political buttons or put bumper stickers on our cars—to identify ourselves. “Here I am. I can do no other, and you’d better not try to make me.”

Martin Buber says that entering into any kind of serious relationship—with The Holy, with another person, with a tradition—requires the willingness to be changed in and by that relationship. Clearly, most of us most of the time don’t do that. When we do, our friends are likely to treat us as if we had joined a cult. Indeed, that’s part of the operational definition of a cult—an organization that deliberately attempts to make people change their religious opinions. It’s okay to try to recruit the unaffiliated, who started out with no serious opinions on the subject. But groups that evangelize members of other groups are beyond the spectrum of respectability.

I try to imagine Socrates in Chicago, hanging out at the court buildings where it is not unusual to be between obligations and willing to make conversation with semi-strangers. But who would talk to a man who starts out by saying, “I don’t know much of anything. What do you know?” Inviting me to try to change his opinions. What kind of kook would do that? Would he take his hemlock with cream or lemon? Or would he ask me which one tasted better?

Jane Grey