Archive for July, 2008

Why Does My Cat Get Better Medical Care Than I Do?

July 27, 2008

Vets are really interesting people.  In the first place, their education is generally more selective and more rigorous than that of human physicians.  They have to know all the same stuff, but about multiple species.  And it’s a lot harder to get into vet school than into med school, partly because there are fewer vet schools.

The vets I know have all seriously considered the practice of human medicine and specifically rejected it (even though they obviously could have qualified.) Usually this is because they see human patients as deserving a lot of their medical problems, whereas animals don’t.  I’m not sure I agree with this.  Plenty of dogs and some cats end up in the vet’s office as a result of getting into fights that they could and should have avoided, after all.  And obesity is an increasing problem among critters for most of the same reasons it’s a problem among people, except that it should be easier for a critter to stay slim if somebody else is entirely in charge of feeding it.

And vets have a separate code of professional ethics, sort of like the one human doctors have, except:

1) there is, obviously, not much of a confidentiality issue between vet and critter, and

2) the code specifically requires a vet to refrain from causing the patient pain AND fear.  Which would be a pretty good idea for human doctors too, wouldn’t you think?

Anyway, I raise these issues because I just got a postcard indicating that my cat is due for her shots next month.  How come I never get postcards about mammograms and endoscopies for myself?  Not to mention that vet care is almost always a lot cheaper than human medical care, and so are the meds.  This is partly because vet care is (so far) not expected to be covered by third-party payment, and partly because most meds and many procedures are tested on dogs and cats before being approved for people, so the R&D has been long since amortized. 

One of the first child-protection activists, a Brit whose name I can’t remember and am too lazy to look up right now, began his career by bringing an abused child to the local SPCA.  “She’s an animal too,” he said, essentially, “and deserves the same protection as the carter’s horse.”  Maybe we could try a similar approach with medical care?

Jane Grey

Feminist Rant (brief)

July 23, 2008

If you haven’t read Margaret Mead’s Male and Female, it’s worth reading. Yes, I know her anthropological research has been discredited, but her sociological theorizing is still pretty good. What M&F tells us is:

  1. In every culture, men and women do different, and non-overlapping, tasks.

  2. Tasks that are assigned to women in some cultures may be assigned to men in others, and vice versa.

  3. But, in every culture, those tasks assigned to men—whatever they are—are considered more important than those assigned to women—whatever they may be.

She doesn’t get to what I consider the crucial corollaries to those rules:

  4. In situations where men are not available to perform their traditional tasks, women will be expected to do them, and will often do them very well.

  5. But no matter how well the women perform these male tasks, they will never get the recognition or the compensation that the men got for them, and as soon as the emergency is over, the tasks will revert to traditional male dominance.

  6. And on the other hand, in situations where traditional male tasks are no longer necessary or possible, the men will generally do nothing at all, and expect (and often get) the same amount of recognition and compensation as consolation for the loss of their social roles that they used to get for performing them.

  7. So far, there has never been a situation in which traditional female tasks are no longer necessary or possible. The closest to it was the “empty” life of the middle-class stay-at-home wife of the late ‘40s and ‘50s described by Betty Friedan, and work rapidly rushed in to fill that void, as Parkinson’s Law dictates it must.

Red Emma

While It’s Still On Our Minds (April 3, 2001)

July 20, 2008

Six months ago, before the 2000 election, most Americans had just barely heard of the Electoral College.  Those who knew anything at all about it knew that it was a vestigial organ, a harmless anachronism right up there with saying “Oyez” at the opening of a court session.  It was, the more sophisticated of us believed, the way the popular vote gets translated into the official tally.

Some really sophisticated students of history knew that, on at least two prior occasions, the Electoral College vote had failed to reflect the popular vote, and had in fact elected a president who had the smaller share of the popular vote.  But that could never happen now, we told ourselves.  The last time it caused a real crisis was in 1876.  And somehow or other, they worked that one out.

Because most Americans use the word “history” to mean “gone and forgotten,” we mostly didn’t know just how they worked it out.  The Hayes-Tilden Compromise, a mere footnote in most widely-read books about U.S. history, involved a commission which worked out the Electoral College stalemate by giving the presidency to the Republican who had received fewer votes, in exchange for giving the white Southern Democratic politicians a free hand to scrap Reconstruction and impose Jim Crow legislation.  Almost everything the Civil War had been fought for, everything the Civil Rights movement would have to fight for again a century later, was lost in the Hayes-Tilden Compromise.  Quite aside from the fact that more than half the American voters had voted against the official winner. Some footnote.

Somehow, between our ignorance of summer 2000 and Election Day, many of us realized that, this year, it was really possible that the winner of the popular vote might not win the Electoral College vote.  And somewhere in the weeks after Election Day, we all found ourselves accepting, without much objection, the fact that the candidate who had, obviously, won the popular vote, in fact probably would and ultimately did lose the Electoral College vote.  By that time, the specifics of how he lost the Electoral College vote were so outrageous that the revisitation of Hayes-Tilden seemed trivial by comparison.  At least that was legal.

Well, let’s look at the Electoral College more carefully.  Under the Constitution, every state, regardless of its population, gets two senators and at least one congressional representative.  This gives the smaller states disproportionate weight in the Senate.  And, since each state’s share of votes in the Electoral College equals the number of its total Congressional delegation (senators + representatives), smaller states get disproportionate representation in the presidential election too.  The vote of an individual citizen of Rhode Island counts far more than the vote of an individual New Yorker.

But it gets worse.  Because all but two states have adopted the “winner-takes-all” rule (this is not in the Constitution; the states that have it have it by state statute), a state with ten Electoral College delegates, whose voters went for Candidate A by a mere 51%, will cast the same number of Electoral College votes for that candidate as a state the same size, in which 99% of the vote went for that candidate. So the votes of 48% of the voters in the second state do not count at all.  (This reality gave rise, in the 2000 election, to the practice of “trading votes.”  A voter in a state in which the vote was expected to be close, who strongly supported third-party candidate Nader but was unwilling to risk a Bush victory, would contact a voter of similar sympathies in a state expected to vote overwhelmingly for Bush [or Gore].  The latter would then cast his vote for Nader.  The idea was that this might improve Nader’s chances of getting his 5% and qualifying for federal matching funds in 2004, without jeopardizing Gore’s chances in the tight races.)
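
To make the arithmetic above concrete, here is a minimal sketch in Python, using a hypothetical ten-delegate state. The numbers are illustrative only; the point is that under winner-takes-all, nothing changes once a candidate crosses 50%.

```python
# A minimal sketch of winner-takes-all, with hypothetical numbers.
def ec_votes_for_a(delegates: int, share_for_a: float) -> int:
    """A bare majority claims the state's whole delegation."""
    return delegates if share_for_a > 0.5 else 0

for share in (0.51, 0.99):
    print(f"{share:.0%} of the popular vote -> "
          f"{ec_votes_for_a(10, share)} of 10 Electoral College votes")
# Both cases print "10 of 10": the extra 48 points of support in the
# landslide state move nothing, which is exactly the essay's point.
```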

Then, on top of these generalized, nationwide problems, we have a more-or-less random and heretofore not-very-significant level of electoral mismanagement, vote count error, voter error, and voting machine malfunction that happens in every election.  Experts now tell us it runs well into six figures nationwide, but that up to now we haven’t paid much attention to it because the margin of victory in previous elections within living memory has always been large enough not to be affected by it, and because we have generally assumed that such errors were no more likely to happen to voters for one candidate than to voters for any other, so that, statistically, they would cancel each other out.  But in fact, recent events suggest strongly that electoral error of all kinds is more likely to happen in poor neighborhoods than elsewhere, because public expenditures on electoral machinery and staff, like public expenditures on just about everything else, vary directly with the income level of the people who live in the area in question.  To the extent that lower-income voters are more likely to vote for one party than for the other, the impact of electoral error will not be random.  And this year, of course, the margin of victory was well within the statistical margin of error.

On top of all this, there is a whole other set of electoral problems we had been aware of–mostly having to do with the low level of voting and registration among lower-income, non-white, and poorly-educated citizens.  This included people who were unable to get time off from work to vote on a weekday, people who couldn’t find their polling place, people with disabilities who couldn’t get into the polling place, and people who simply didn’t understand the political process.  Ironically, the 2000 election overcame some of these problems, and got a lot of people out to vote who had never done it before, but under circumstances in which their votes either counted less or were not counted at all.

And, finally, there is the localized election fraud factor–votes being deliberately discouraged or manipulated or disappeared by over-zealous or corrupt officials, presumably acting independently and without the knowledge of their candidates.  This problem arises from the fact that, in most localities, elections are managed by elected officials, political appointees, or partisan volunteers–people who cannot possibly be relied on to be impartial.  While we are well aware of the potential for fraud and intimidation in the Third World countries to whose elections we send observers, it seems not to have occurred to us that our own election judges and boards of election commissioners are no less partisan than those in South Africa or Haiti.  While it was a little nervy of Cuba’s Castro and Russia’s Putin to offer to send observers to the Florida recount, we probably could have benefited from some kind of impartial outside oversight of the process.

On the other hand, the things that bothered the American public most–the length of time elapsed between Election Day and the declaration of The Winner, and the involvement of lawyers and courts–bothered me not at all, and should not have bothered a properly educated citizenry.  The people who complained, “When are we going to have a president?” had apparently never been taught, or at any rate never learned, that (a) we had a president through January 20, 2001–the man we elected in 1996, and (b) under the Twelfth Amendment of the Constitution, we would have an identifiable, active, plenipotentiary president even if the crisis stretched on into March (which, under that same Amendment, it conceivably could have.)  And the people who proclaimed with Dick Cheney, “The rule of law must prevail. The lawyers should go home,” apparently have the same notion of the “rule of law” as the old Soviet Union–a rule of the executive unhampered by courts or lawyers.  We Americans have allowed ourselves to be embarrassed in the international forum by our readiness to resort to litigation to resolve disputes.  The countries that find this bizarre are either authoritarian regimes in which no one would dream of challenging the powerful, or well-regulated social democracies in which many disputes are pre-empted by thorough governmental regulation.  I would prefer the latter arrangement, but not having chosen either of those alternatives should not embarrass us.  Litigation is better than violence or intimidation or outright submission, and no worse than democratic governmental regulation.

Red Emma

Bringing Democracy Home (2004)

July 20, 2008

In Iraq, the Shi’a leader Ayatollah Sistani is refusing to cooperate with the American occupation in its plans for setting up a government. The Americans want to do it with caucuses. The Ayatollah wants direct elections. Normally, in situations like this, I figure the political arrangements of other countries are their own business. But this time, I’m with the Ayatollah, one hundred percent.

In fact, I don’t think he goes far enough. Iraq isn’t the only country that needs direct elections. So does the USA. Especially if we take it on ourselves to liberate other countries and set up their governments. The USA needs a complete reform of our electoral system, and our elections should be monitored by the UN or some other impartial international body until we’ve accomplished it.

The Republic for which it stands

Most of us didn’t pay much attention to the Electoral College before 2000. We were vaguely aware that our quadrennial exercise in civic responsibility was not a direct election, that we were really voting for delegates to some mythical beast called the Electoral College rather than for our chosen candidate. The more politically sophisticated among us knew that the Electoral College vote could theoretically elect the candidate who did not get the largest number of popular votes. Hard-core history buffs knew that there was actually at least one occasion when it happened. For the rest of us, the Hayes-Tilden compromise was so esoteric that it never even turned up on quiz shows. We thought of the Electoral College kind of the way one thinks of the old stuffed moose head Grandpa keeps in the attic—someday we really ought to get rid of it, but it’s just harmless clutter. All it does is take up space. We’ve lived with it this long, and it never caused a real problem. Why borrow trouble? Cross that bridge when we come to it.

More about the Hayes-Tilden compromise later. First let’s talk about the 2000 election. Please. Yes, I know we aren’t supposed to talk about it. That’s just beating a dead horse. It shows we’re sore losers, not willing to let bygones be bygones and get on with our political lives. None of the candidates this year are willing to raise the issue of the validity of the 2000 election, or even the question of how we make sure it doesn’t happen again. Before 2000, we didn’t talk about the Electoral College because it couldn’t possibly cause any problems. Now that we know it can and it has, we don’t talk about it because only losers complain.

The 2000 election had several sets of problems. One was the flawed mechanism for voting in many states. Not just Florida. Indeed, Illinois had more miscounted and uncounted votes than Florida, probably more than twice as many. In Illinois it didn’t matter, because an accurate count of the popular vote would not have changed the Electoral College vote from Illinois. The same is probably true of New York. The hanging chads of Florida mattered because the popular vote was so narrowly split that 500+ votes could change the Electoral College outcome.

And that narrow split mattered because of the “winner-takes-all” rule, which prevails in all but a few states. The rule says that the winner of the popular vote in a state, even if s/he wins by a single vote, gets the entire Electoral College delegation of that state.

The “winner-takes-all” rule is not part of the U.S. Constitution.  It is not part of the constitutions of most states that have it.  It is a mere statute, a creature of the legislature.  It can be repealed at the state level, by the same process used to rename an airport or raise a cigarette tax.

Even without the “winner-takes-all” rule, the Electoral College system gives disproportionate power to smaller states, since the size of a state’s E.C. delegation is determined by the size of its total congressional delegation—both its senators and all of its representatives. Since even Wyoming (with a population smaller than that of most counties in New York, Illinois, or Florida) has two senators and one congressional representative, a citizen of Wyoming has a lot more voting power than a citizen of Illinois (which also has only two senators, along with 19 representatives.)
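
A back-of-the-envelope check of the Wyoming/Illinois comparison, using rounded 2000-census populations (approximate figures; the exact numbers don’t change the conclusion):

```python
# EC votes = senators + representatives; populations are rounded
# 2000-census figures (approximate).
states = {
    "Wyoming":  {"population":    494_000, "ec_votes":  3},   # 2 + 1
    "Illinois": {"population": 12_419_000, "ec_votes": 21},   # 2 + 19
}

power = {name: s["ec_votes"] / s["population"] for name, s in states.items()}
for name, p in power.items():
    print(f"{name}: {p * 1_000_000:.2f} EC votes per million residents")

print(f"A Wyoming ballot carries roughly "
      f"{power['Wyoming'] / power['Illinois']:.1f}x the Electoral College "
      f"weight of an Illinois ballot.")
```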

The 2000 debacle was not the worst possible outcome of the Electoral College system. At least all of the delegates accepted the mandate of the voters who elected them, rather than allowing themselves to be bribed or intimidated into voting for somebody else. At least the election was not thrown into Congress. At least no special commission was formed to sort things out. (That, by the way, is what happened as a result of the 1876 election, the Hayes-Tilden compromise. The electoral vote split close to evenly, and the commission created to resolve the problem worked out a deal in which the Republicans got the White House in exchange for giving the Democrats a free hand to abolish the reforms of Reconstruction in the South. That was the beginning of Jim Crow—hardly the answer to a trivia question.) It is cold comfort to know how much worse it could have been. How much worse does it have to get before we change it? What Faustian bargain is the next commission, or the next Supreme Court, going to make?

An eye to the future

There will be a next time. And probably, it will come a lot sooner than 2124. The Electoral College virtually guarantees the recurrence of a close vote. It’s a lot easier for 538 votes to split 50/50 than for 105 million to do so.
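
A toy model shows why (an illustration only, treating each vote as an independent coin flip, which real votes are not): the typical margin between the two sides in N such votes is on the order of the square root of N, so the margin as a share of the total shrinks as N grows.

```python
import math

# Toy model: each of N votes is an independent 50/50 coin flip.
# The typical (one standard deviation) margin is about sqrt(N) votes,
# i.e. a 1/sqrt(N) fraction of the total -- so a small body like the
# 538-member Electoral College sits far closer to a tie, proportionally,
# than 105 million popular votes do.
for n in (538, 105_000_000):
    margin = math.sqrt(n)
    print(f"N = {n:>11,}: typical margin ~{margin:,.0f} votes "
          f"({margin / n:.4%} of all votes cast)")
```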

But there have been some reasonable objections raised to direct popular election of the president. Most notably, if every one of those 105 million votes really mattered, we would have to put a lot more attention and money into making sure each of them was properly counted. As noted earlier, in most states, the winner-takes-all rule means that the size of the landslide doesn’t matter, as long as there really is one. We can afford to be sloppy. We can afford to put our oldest, least functional voting machines in poor neighborhoods and rural areas. We can afford to make polling places inaccessible to people with disabilities. We can afford to mess around with the absentee votes of people in the military. Only the “battleground” states really have to worry about running accurate and honest elections.

Oops. That’s an argument that cuts both ways, isn’t it? What we’re really saying is that democracy is expensive. Freedom isn’t free. There’s an original thought. You get what you pay for. How much do we value our republican form of government? Why don’t we put our money where our mouth is? Or is democracy only something we want to ram down the throats of ignorant foreigners, whether they want it or not?

UN monitoring

I wasn’t joking about that. I’m not joking about any of this. To a considerable extent, the mechanical flaws in the 2000 election were caused by outdated technology. But the up-to-date technology we’re looking at now may be even scarier. The wave of the future is on-line voting. Experts in computer security tell us such a system can be made secure from hacking and fraud. But that would be expensive, and would have to be monitored by an impartial agency immune to bribery and intimidation. One of the corporations reportedly interested in setting up computerized voting systems this year is—guess who?—Halliburton. Need I say more?

The shrinking electorate

A smaller proportion of voting-age citizens vote in each successive election. That’s actually a double problem. First, fewer people are eligible to vote, especially in minority communities. We are told that one-third of all African-American males of voting age are in jail, prison, or on probation or parole at any given moment. That means a similar percentage of African-American males are likely to have felony convictions on their records. In many states (including of course Florida) ex-cons are permanently barred from voting or holding office.

And second, a smaller proportion of people eligible to vote actually bother to register and vote. The 2000 debacle itself may have an effect on voter interest, though we can’t predict which way it will cut. My brother in Georgia believes more strongly than ever that every vote counts; but many of my students in Chicago know that their votes (above the 50% line) made absolutely no difference.

Whatever happened to the secret ballot?

Finally, the whole purpose of voting has been transformed by polling and fundraising into a sophisticated version of the Prisoner’s Game (game theorists know it as the Prisoner’s Dilemma). The police arrest two men for some crime or other. They offer each man the same deal: the man who confesses and implicates his accomplice gets minimal time (in some versions of the game, no time at all), while his silent partner gets the max. If both men confess, they both serve serious time. If neither of them does, they both get off with a token charge. This requires the participants to put most of their energy into reading each other rather than dealing with the actual merits of the case (like, who actually committed the crime, if anyone; what kind of evidence do the police have, if any; which of the arrestees is most guilty?)
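
For readers who want the payoffs laid out, here is a sketch of the standard version of the game; the exact numbers (years of jail time) are illustrative, and only their ordering matters.

```python
# Payoff table for the standard Prisoner's Dilemma described above.
PAYOFFS = {  # (you, accomplice) -> (your sentence, their sentence)
    ("confess", "confess"):         (5, 5),   # both implicate each other
    ("confess", "stay silent"):     (0, 10),  # you walk, they get the max
    ("stay silent", "confess"):     (10, 0),  # the reverse
    ("stay silent", "stay silent"): (1, 1),   # both get a token charge
}

for (you, them), (yours, theirs) in PAYOFFS.items():
    print(f"you {you:>11} / accomplice {them:>11}: "
          f"you serve {yours} year(s), they serve {theirs}")
# Whatever the accomplice does, confessing shortens your own sentence --
# which is why each player ends up reading the other player instead of
# the facts of the case.
```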

This year’s Democratic primary is nothing but an electoral variant of the Prisoner’s Game. The voter’s job is not to figure out who is the best candidate (however the voter may define that); the voter’s job is to figure out which of the candidates is most “electable.” Which means the voter’s job is not to read the candidates (which of these people is most likely to act in my interest or consistently with my principles? which of them is most competent to govern? which of them do I like best? trust? which one would I want marrying my daughter? which would I want to have lunch with?) The voter’s job is to read the other voters.

And that’s a tricky and deceptive business. Sociologists studying race relations in the ‘60s and ‘70s kept running into what they called the “neighbor problem.” People asked about their willingness to work or live near a member of another race were very likely to say, “I wouldn’t mind, but my neighbor (or the guy next to me on the assembly line) wouldn’t like it at all.” Some of them were projecting their own unacceptable attitudes onto the neighbor. Others were guessing at other people’s attitudes, but were (judging from the polling results of those very neighbors) as likely as not to be wrong.

Election pollsters have actually tried asking two sets of questions: who do you plan to vote for? and who would you vote for if you thought he could win? They invariably get two different results. The Prisoner’s Game corrupts the voting process at its very root. We waive our own best judgment in picking candidates, to improve our chances of voting for the winner. And we’re not all that good at picking the winner.

The Prisoner’s Game has become even dirtier, now that the forces of mass communication, backed by the power of special interest dollars, have been put to work shaping our perceptions. Advertising shapes what we want and expect from a candidate, what we believe about a particular candidate, and what we want and expect from the electoral process. Advertising tells us who the mythical neighbor is and who he will vote for.

Some immodest proposals

First, I’m not at this point suggesting amending the Constitution (unless the Republicans are so intent upon running this governor of California for president that they want to eliminate the requirement that a president be born a US citizen—then we can bargain.) We don’t really need to. All we really need to do is abolish the winner-takes-all rule on the state level, and require that the Electoral College vote proportionally reflect the popular vote in each state.

Second, we need to abolish all permanent bars on voting. Many states allow convicted felons to vote after they have completed their sentence; some require a waiting period after that. Either provision would be acceptable.

Third, we really seriously need some public-sector, impartial, externally accountable agency monitoring the electoral process, especially if we go to on-line voting. Under no circumstances should the process be “privatized.” Merely barring US corporations wouldn’t help, in this era of global conglomeration.

And finally, here’s something the individual voter can, and must, do: free yourself from the Prisoner’s Game. Vote for the candidate you honestly believe to be the best, without regard to his or her “electability.” Tell your friends about it, too. Announce it as you enter the polling place. Maybe we need some buttons made up: “Don’t vote for the electable. Vote for the best.”

Only if we do all these things can we create a democracy worth exporting.

Red Emma

Joining Nebraska, Maine, and Afghanistan (November 2004)

July 20, 2008

Colorado will have a referendum on the ballot this November, asking voters whether they want to divide the state’s Electoral College votes in proportion to the popular vote.  If it passes, Colorado will join Nebraska and Maine as the only states in the US without the “winner-takes-all” provision for their Electoral College votes. 

In the meantime, President Bush regularly proclaims his intention to ensure direct elections in Afghanistan and Iraq.  Not just “democracy,” direct elections! 

Most of us didn’t pay much attention to the Electoral College before 2000.  We were vaguely aware that our quadrennial exercise in civic responsibility was not a direct election, that we were really voting for delegates to some mythical beast called the Electoral College rather than for our chosen candidate.  The more politically sophisticated among us knew that the Electoral College vote could theoretically elect the candidate who did not get the largest number of popular votes. Hard-core history buffs knew that there was actually at least one occasion when it happened.  For the rest of us, the Hayes-Tilden compromise was so esoteric that it never even turned up on quiz shows. (Okay, it was the end of Reconstruction and the start of Jim Crow in the American South, but who cares about that?) We thought of the Electoral College kind of the way one thinks of the old stuffed moose head Grandpa keeps in the attic—someday we really ought to get rid of it, but it’s just harmless clutter. All it does is take up space. We’ve lived with it this long, and it never caused a real problem. Why borrow trouble? Cross that bridge when we come to it. 

So let’s talk about the 2000 election, when the Electoral College did cause a real problem.  Please. Yes, I know we aren’t supposed to talk about it.  That’s just beating a dead horse.  It shows we’re sore losers, not willing to let bygones be bygones and get on with our political lives.  None of the candidates this year are willing to raise the issue of the validity of the 2000 election, or even the question of how we make sure it doesn’t happen again. Before 2000, we didn’t talk about the Electoral College because it couldn’t possibly cause any problems.  Now we don’t talk about it because only losers complain.

The major problem of the 2000 election was the flawed mechanism for voting in many states.  Not just Florida.  Indeed, Illinois had more miscounted and uncounted votes than Florida, probably more than twice as many.  In Illinois it didn’t matter, because an accurate count of the popular vote would not have changed the Electoral College vote from Illinois.  The same is probably true of New York.  The hanging chads of Florida mattered because the popular vote was so narrowly split that 500+ votes could change the Electoral College outcome. 

And that narrow split mattered because of the “winner-takes-all” rule, which prevails everywhere but in Nebraska and Maine (and maybe after November, in Colorado.) The rule says that the winner of the popular vote in a state, even if s/he wins by a single vote, gets the entire Electoral College delegation of that state. 

The “winner-takes-all” rule is not part of the U.S. Constitution.  It is not part of the constitutions of most states that have it.  It is a mere statute, a creature of the legislature.  It can be repealed at the state level, by the same process used to rename an airport or raise a cigarette tax.

The “winner-takes-all” rule, as we have seen so far this year, means that only states with closely divided votes matter in the campaign.  Only states with closely divided votes need to worry about the accuracy of their vote counts.  The rest of us (Illinois voters, for instance) could all just stay home, for all the difference it makes to the electoral process. 

Some reasonable objections have been raised to direct popular election of the president. If every one of our 105 million votes really mattered, voter by voter rather than state by state, we would have to put a lot more attention and money into making sure each of those votes was properly counted. In most states, the winner-takes-all rule means that the size of the landslide doesn’t matter, as long as there really is one. We can afford to be sloppy. We can afford to put our oldest, least functional voting machines in poor neighborhoods and rural areas.  We can afford to make polling places inaccessible to people with disabilities. We can afford to mess around with the absentee votes of people in the military.  Only the “battleground” states really have to worry about running accurate and honest elections.

Oops.  That’s an argument that cuts both ways, isn’t it?  What we’re really saying is that democracy is expensive.  Freedom isn’t free. There’s an original thought.  You get what you pay for.  How much do we value our republican form of government? Why don’t we put our money where our mouth is?  Or is democracy only something we want to ram down the throats of ignorant foreigners, whether they want it or not?

So it’s time for Illinois to follow Afghanistan and Iraq and Nebraska and Maine and maybe Colorado down the path our politicians think is for external use only—let’s repeal the “winner-takes-all” rule and vote as free individual citizens for the most important politician in the world.

Red Emma

Anything for a Laugh?

July 16, 2008

Okay, I finally managed to read the New Yorker article. Talk about unintended consequences: after he read it, Mr. Wired went and ordered the last twenty years’ worth of the New Yorker on CD-ROM from Amazon. Anyway, a few comments here:

  1. Was the cover offensive? I was uncomfortable with it. Does this mean liberals have no sense of humor except when laughing at conservatives? Dunno. After all, the New Yorker says the cover is laughing at conservatives, or at least poking fun at what they say about the Obamas. It just doesn’t work for me. Is that because it isn’t “politically correct,” as several people have remarked? No, I think the whole “political correctness” thing is a con, closely related to why feminists allegedly have no sense of humor. Conservatives require liberals to laugh when attacked. If we don’t, it proves we have no sense of humor and aren’t good sports. I do not aspire to be a good sport.

  2. The cover had virtually nothing to do with the article, except that it depicted the Obamas. If it had been a cartoon of the Obamas in 1890s swim suits at the beach, it would have been equally relevant to the article. It’s unfortunate that most people, whether pro- or anti-Obama, probably never got past the cover.

  3. The article was absolutely fascinating. That’s partly because it’s mostly set in my neighborhood, and quotes several people I know personally, so it’s kind of fun following their tracks. I ran for office myself a while back, seeking backing from some of the same people Obama contacted. And David Axelrod is my godson’s ex-girlfriend’s father, for what that’s worth. (It does put me well within six degrees of several ex-presidents and ex-candidates. My daughter’s late father-in-law, an esteemed character actor, puts me in the same position to just about every actor in the world.)

    The article depicts Obama as a really shrewd political operator, who has followed a detailed plan to work his way up the political ladder. Unlike the ancient Romans, we contemporary Americans have nothing against ambition, so this should not be a problem for him. One is left with the distinct impression that Obama can do anything he puts his mind to. Let’s hope that includes doing a good job as the Leader of the Free World.

Jane Grey

Notes from Underground and Other Curiosities

July 14, 2008

Does anybody else here remember sleeping in the basement in hot weather, because it was the coolest place in the house? Or enjoying the cool breezes upon entering the subway or tunnel? More to the point, does anybody remember when that stopped? These days the basement is as warm and sticky as the rest of the house, only worse because it has less ventilation and no air-conditioning. Same goes for subways and tunnels. This has perturbed me for a while, so I finally googled “underground temperatures” and found several references to this phenomenon, such as http://www.cosis.net/abstracts/EGS02/02897/EGS02-A-02897.pdf

No, it’s not my imagination (or yours, as the case may be.) Yikes.

Other interesting developments, not necessarily related to climate change: the use of “climate change” instead of “global warming,” which seems to have originated with conservatives, on the theory that it would sound less scary. But in fact, it doesn’t. If global warming gets bad enough, one of the results could be a new Ice Age. (Look it up. It involves shutting down the Gulf Stream.) So “climate change” is both more accurate and more scary.

The moribundity of Starbucks. I rarely visit any of them, though there are something like 7 in a 3-block radius of my office, and another one a block from my home. But, like Bud, I will miss not drinking their coffee. (One of the few correct uses of that phrase I’ve seen in a long time.)

And a bright idea for this month’s crisis—house-sitting services for foreclosed properties. One hears about abandoned homes being vandalized or turned into drug havens, at serious expense to the mortgage holder and the neighborhood. Why not rent the places out, at nominal cost, to otherwise homeless senior citizens? Solve two problems at once. (I will always admire Wendell Berry, BTW, for characterizing livestock feedlots as “turning a solution into two problems.” I wish I’d said that.)

CynThesis

“Poor” is a Four-letter Word

July 13, 2008

Remember the War on Poverty?  Probably not.  For most Americans, the concept is more remote than the war on the Spanish in 1898.  After all, it is probably easier to remember the attitude behind the Spanish-American War (“Cuba ought to be part of the United States”) than the attitude behind the War on Poverty (“the richest country in the world should not have citizens without indoor plumbing.”)

Demographically speaking, the number of people who can remember an era when there were no beggars on our streets is shrinking daily.  So is the number of people who ever believed that some of our adult citizens should not be expected or required to hold paying jobs, and that the people who did hold paying jobs should be able to make enough money from them to support, not merely the individual worker, but an entire family.

These days, we expect to see homeless people anywhere we go that does not charge admission.  We expect “stay-at-home moms” to be a tiny minority of all mothers, and to have attained their status only by virtue of being married to rich guys.  We expect most of the people who wait on us in checkout lines and lunch counters to have no health insurance.  We expect our children and our elderly and disabled relatives to be cared for by people who get their own food from church pantries and occasional soup kitchens. We carefully don’t ask other people’s incomes and don’t tell our own.  We don’t ask, don’t tell, and don’t care.  But in the depths of our uncaring hearts, we know.

We do not even identify these people as “poor”, much less find their numbers and ubiquity disturbing.  We certainly cannot imagine starting even the most metaphorical of wars to end this state of affairs.

Part of the problem is that Americans are not willing to acknowledge the wide spectrum of sub-economies and subcultures that exist in the lower half of this country’s income scale.  We skew the sample by defining poverty to exclude what, in earlier generations, would have been called the “deserving poor”–the people who, in President Clinton’s words, “work hard and play by the rules” and still cannot achieve the American Dream of home ownership, secure retirement, health insurance, and college education for their children.  The people John Kerry characterizes as “struggling to get into the middle class.” 

In any other industrialized country, that would constitute poverty. In ours, it is a problem that dares not speak its name.  We may call the population in question “blue-collar” or “working-class,” but both those terms still carry the now-antiquated flavor of the era when a man [sic] who had not gone to college and who worked with his hands could support an entire family single-handed and provide them with a home and a car, college for his children, 2 weeks a year on a vacation away from home, health care, and a comfortable retirement. 

Today, let’s face it, a person with the same education and skills and the same work ethic is poor.  Indeed, people with considerably better education, skills and work ethic are poor.  It takes the incomes of at least two of them to support a family at a level of minimal survival: at best, a home and 2 cars, all in poor condition, but no health insurance, no retirement savings except Social Security, and no way to put their children through college.  These are today’s deserving poor–except that we have decided they deserve nothing more than they already have, not even the dignity of visibility.

Many of us think of poverty as merely constant, routine deprivation, a life without new cars, new clothes, nice furniture, gourmet food, private schooling, and foreign travel.  It certainly is that.  But far more significant to actual poor people (rather than those who observe them, or claim to) is the non-routine aspect of life, the lurching from one emergency to another, with no margin for error.  A physician (Stephen Bezruchka, in “Health and Poverty in the US,” Znet, December 9, 2003) says that poor people are less healthy than the rest of us because of stress caused by shame and anger, because they feel diminished by not having “made it.”  I think he’s on the wrong track.  Yes, poverty causes stress, which causes bad health.  But the stress comes much more from fear and anxiety than from anger and shame.  Poor people are always “catching up” on one bill by putting off another, which then has to be caught up with the same way later, and so on.  Sooner or later, the fragile structure falls apart, with consequences that can range from embarrassing to fatal.  Poverty means waiting for the next emergency in constant fear of not getting through it.

Some of us can remember the old days of “welfare”, when poor people, who were mostly not working, could at least make up in time for what they lacked in money.  Today, poverty, except for those physically unable to work, means having no money and no time.  Poor people’s jobs provide no paid time off, even for such necessities as illness, childcare failures, medical appointments, repair appointments, funerals, jury duty, school conferences, and voting.  And a lot of poor people have to have two or more of these jobs.  Unavoidably, taking time off always means losing money, and usually means losing a job.  Not taking time off can mean guaranteeing that the next generation will be poorly educated, in poor health, and—well—poor.

Additionally, poor people have less control over their time than is presumed by employers, schools, and government agencies they deal with.  If, as Woody Allen says, 80% of life is just showing up, the underclass is disproportionately likely to miss out on life.  Sometimes this is purely a matter of flakiness–people don’t allow enough time to get places by the means of transportation available to them, or oversleep, or forget an appointment altogether.

But more often, they don’t show up because of unavoidable problems that are disproportionately likely to affect poor people.  The two biggies are illness (the subject’s own or that of a family member) and vehicular breakdown (if a poor person can afford a car at all, it is likely to be in precarious condition.)  Poor people and their families are more likely than the rest of us to get sick (either because poor health contributes to poverty by diminishing earning and increasing required expenditures, or because poor people can’t afford the health care–especially preventive care–necessary to stay healthy–or, most likely, both.) And illness in a poor family is likely to be more time-consuming (for the patient and his/her family) than among more affluent people.  Most of us feel abused if we have to wait an hour for a scheduled doctor appointment.  But poor people get much of their health care at emergency rooms and public clinics where the wait may take all morning, all afternoon, or even all day or all night.   Most of us pick up our prescriptions by popping into the pharmacy and popping out again; poor people often have to wait in another long line after seeing the doctor, to get the prescription filled.  And most such medical crises, for poor people, involve not merely the patient, but someone to wait with him/her and someone else to take care of the kids.  The presumption on the part of the poor people’s health care system that “these people don’t have anything else to do with their time” turns into a self-fulfilling prophecy as these problems cause poor people to lose their jobs.

Vehicular breakdown is inevitable, given the kinds of cars poor people can afford.  Like health problems, it not only happens more often to poor people, but is more time-consuming and requires more help from friends and family when it happens.  Most of us just call the auto club and go on about our business. But, even when poor people can afford to join an auto club, they are likely to frequent places auto club tow truck drivers cannot or will not go.  So friends and family must take up the slack, thereby increasing both the amount of time the driver has to spend waiting for help and the number of poor people involved in the enterprise for that length of time, all of them risking their own jobs for the good of the family as a whole.

Housing problems may have similar impact. A person’s residence may become instantly uninhabitable from fire, flood, or loss of utilities, or the person may be evicted for any number of just and unjust reasons.  Some of these problems may result from the person’s utterly avoidable failure to pay rent or maintain the premises, or similar culpable behavior by people s/he lives with; but many of them result from the neglect and misfeasance of the landlord, other tenants, or other people over whose behavior the subject has absolutely no control.

Similarly, poor people are disproportionately likely to be crime victims–another unavoidably time-consuming situation.  Their children are more likely to have problems in school requiring parental presence and response.  All of these issues result in the frequent failure of poor people to show up as required, at work, in school, at doctors’ appointments, in court, and at required appointments at government agencies.  Being poor, in short, is at least a part-time job, and sometimes a full-time job, in itself, on top of any other responsibilities the poor person may have.

This perception of unreliability is one of the major sources of discrimination against poor people.  Reasonably enough, anybody who has to depend on others to show up–for work, for appointments with doctors or lawyers or teachers, for court, or whatever–really doesn’t like people who can’t be relied on to show up. 

Another reason we don’t like poor people is personal appearance.  Poor people are more likely than the rest of us to be fat, and to have ugly teeth.  They tend to age faster, and lose the graceful mobility of youth earlier.  They are more likely to suffer disfiguring injuries, and less likely to be able to get them repaired. 

As a result of all of these issues, many employers–even those that pay poverty-level wages–don’t like to hire poor people.  There was a brief period during the late ’60s and early ’70s when they could get away with hiring young people and married women from middle-class families, and not having to pay middle-class wages to support their middle-class lifestyles.  Essentially, the employers were being subsidized by the parents and husbands of their workers.  That still works, up to a point, for those who hire young people, but not so well for the employers of married women.  According to the Bureau of Labor Statistics, a married woman in the paid work force is actually more likely than a married man to be the sole support of other non-working family members.  If she is paid poverty-level wages, she will be a poor person, with all the disadvantages that entails.

And, like the rest of us Americans, employers have spent a long time imagining that nobody in this country is poor except those who want or deserve to be. We attribute poverty to laziness, or to wrong choices (usually those made early in life, but not generally including a bad choice of parents), generally having to do with education and parenthood.

So we are certainly not prepared to consider the possibility that poverty can sometimes improve the character.  When we think about poor people at all, we think of them primarily as lazy and stupid, or at best with a certain native shrewdness that enables them to take advantage of other poor people even lazier and more stupid than themselves.  We believe they do not marry, they do not love their sexual partners or their children, and often they do not even love their mothers.  Liberals may excuse such behavior as a natural consequence of poverty.  Conservatives deplore it openly, and pat themselves on the back for not holding the poor to a lower standard than the rest of us.  But we all see poverty and bad character as closely correlated, whichever direction the causation may run.  Any suggestion that poverty might improve the moral character is derided as “romanticizing.”

So let’s look at the virtues of poor people.  First, of course, they work really hard, the cardinal virtue of today’s culture.  They may or may not be married or stay married (a behavior pattern they share with most of the middle class these days), but typically they love their children, their parents, and their siblings, and go to a considerable amount of trouble to help them when needed.  They are likely to be religious and active in their congregations.  They often donate to charity in far larger proportion to their incomes than the affluent. They are used to making hard choices, so they pay attention to the costs and consequences of what they do.  Every choice is a hard one, so every choice is a mindful one.

As noted earlier, poverty in this country is not just a state of constant deprivation, but a state of lurching from one crisis to another, always sure there will be a next one, but never sure of getting through it.  Which means that poverty in this country requires, above all, courage.  Not the flashy, well-rewarded courage of the NASCAR driver or the fake courage of the “action” movie star, but the routinely renewed courage of getting up every morning to face the current crisis and prepare as well as possible for the next one. 

A few people give up–mostly men who leave their families rather than go on with the struggle to support them, and a much smaller number of women who escape into drugs or madness.  But the rest hang in there until they are physically unable to go on.  This is the courage of Leningrad or Sarajevo under siege, the courage of the Middle Passage, the courage of the Warsaw Ghetto. 

In short, if we no longer have the resolve and the wisdom to try to end poverty, can’t we at least have the decency to honor it? If we can’t allow poor people a way out of poverty, can’t we at least allow them a reasonable measure of self-esteem?  Can’t we give Purple Hearts for the walking wounded among us, and Bronze Stars for the extraordinarily brave? And even an occasional Medal of Honor for persistence and bravery above and beyond the call of duty?  Money, food, and housing may be finite resources, but there is always enough honor to go around.

 Red Emma

Defining Tackiness Upward

July 12, 2008

When I was a kid, growing up in South Florida, we always dried our laundry on the clothesline.  It dried within a day, and always smelled wonderful.  Hanging it out was a chore, but taking it down was delicious. 

Then I moved north, to where, three or four months a year, there was no sunshine to hang the laundry out in.  Most of the places I lived had washers and driers in the basement, and that made perfectly good sense.  Drying stuff on clotheslines indoors, especially in unheated basements, left the laundry feeling and looking really icky. 

But it never occurred to me that there was anything wrong with clotheslines as such, in sunny weather.  Until I started reading about the cosmic battle between eco-homemakers and neighborhood or condo associations.  The former say that clotheslines, at least in sunny weather, are earth-friendly and economical.  The latter say they’re eyesores and make the place look like a slum. A slum?  The neighborhood I grew up in was a slum? No way! (BTW, whenever the Chinese want to throw some sort of international bash, like the Olympics, the government bans clotheslines, too. So this is not just an American thing.)

Well, the latest thing to fall under the eagle eye of neighborhood and condo associations is window-unit air conditioners.  Got that?  Things that (back when I was growing up in South Florida) were considered the height of luxury and the badge of affluence.  Not that my family had one.  But the doctor and the lawyer who lived down the street from us did. I think the lawyer had two of them.  Wow!

Window units have been a hassle for a while, at least in our part of the world.  In the condo building we live in, owners are expected to take them out of the windows in winter and whenever repairs or tuckpointing are being done on our section of the building.  Mr. Wired and I are not as young as we used to be, and we have lived in our unit for forty years.  So we have trouble both with physically moving the damn things and with finding any place for them to sit while out of the windows.  And we don’t really understand why we’re expected to take them out anyway.  We cover them tightly during the winter, so there’s no heat loss.  And anyway, we’re on the first floor of a steam-heated building, so if there were any leakage, it wouldn’t matter much.  Dutifully, we pay our extra assessment for not taking them out, which is about what it would cost to hire somebody else to do the job, and at least saves us the floor space where they would be sitting if we took them out of the windows.  But so far, at least, our condo association has not attacked the concept of window units as such, thank heaven, unlike some developments in the suburbs. 

Installing wall units or central air is bloody expensive.  But now, apparently, it is on the way to becoming the next required sumptuary expenditure after a clothes drier, in an era when we are all supposed to be pinching our pennies.  Whose idea was this, anyway?

And BTW, has anybody done any research on the point at which the heat emitted outdoors by other people’s air conditioners makes it impossible for people in the neighborhood to survive without one?  There has to be a tipping point. Maybe it doesn’t matter, if we already require such homogeneity that nobody with an air conditioner would dream of living next door to a household without one?

CynThesis

The Political is the Personal?

July 11, 2008

 Americans have been “voting for the man [sic] and not the party” for at least my entire conscious lifetime.  In fact, we do it somewhat less often now than we used to in the ‘50s and ‘60s, when we were mostly convinced, with George Wallace, that there wasn’t “a dime’s worth of difference” between Democrats and Republicans as such.  I have voted for, I think, three Republicans myself, over a longish lifetime.  They were all good guys, they all won, and I don’t regret any of those votes.  But that was back in the good old days, when there often really wasn’t a dime’s worth of difference. 

More recently, two things have happened roughly in parallel.  The political parties have become more different, and the political candidates’ personas have become more manufactured.  Unfortunately, neither of those developments has much to do with the real political issues confronting us right now.  The parties differ mainly over what we like to call “cultural” or “values” issues, most of them outside the scope of federal government control.  And the politicians, or rather, their handlers, plug into the handiest Jungian or Freudian or Frazerian archetype.  McCain is the old warrior king; Obama is the young challenger.  Hillary’s main problem was that the closest thing to an archetype she could find was a schoolmarm.  Female archetypes are scarce, and mostly ambivalent:  maiden (sexpot, airhead, virgin, whore)/mother (smothering, rejecting, controlling, Madonna)/crone (witch, grandmother, bag lady).  Anyway, neither reversing Roe v. Wade and stopping gay marriage nor overthrowing the grizzled old warrior will help us revive the American economy, save the middle class, end the war in Iraq, reinstate the Fourth Amendment, or stop climate change. 

I think we got into this mess because we Americans don’t trust our own judgment about abstractions.  It’s easier to decide we like or don’t like a particular person, even if that requires us to ignore everything we know about the marketing of political candidates and the fabrication of persona. “I don’t know much about…but I know what I like,” is a lazy thinker’s approach to just about everything.  We have been encouraged to fall into it by an advertising-saturated culture that does its marketing by making us like the product, rather than giving us any useful information about it.  That may be okay when selecting a toothpaste.  Unfortunately, the system that has given us at least 150 different varieties of toothpaste can provide us with only two major political parties, and at most only one-and-a-half political ideologies.

Red Emma