November,
2004
Hello to my friends and
family.
I
received in mid-December last year the following tongue-in-cheek email message
from a friend, after she had received my annual end-of-the-year letter: “I must say though that I was disappointed in
the length. Surely Gary
you can do better than 12 pages next year!! I had scheduled reading your
letter as my ‘bedtime’ reading and had to supplement it with something
else. Shameful.” I promised her I would do better this year,
and decided to make it a Thanksgiving letter because I had written all I had to
say by the middle of November.
The
letter this year is divided into two parts:
the first 19 pages or so are the usual family ventures and news along
with a few excerpts from a book that was great fun to read, and second, more of
the chapters and a compilation of notes and commentary on various issues and
articles of interest I read over the last year.
The second half also includes a longer section on income/wealth
inequality in the U.S.
(a few of you saw an earlier draft of this piece, but it has been slightly
abbreviated and some knowledgeable faculty comments and criticism added). Some of what I have written will, I hope,
amuse a little or be of some interest; some will perhaps inform and some probably
annoy.
* * *
For those who want the family news and want to dispense
with the rest: Krystin is in her second
year at UM-Morris and doing well.
Elliott is in 8th grade and doing well. My dad remains at the Kenwood and is doing
well. Pat and I have the same jobs, live
in the same house, and are doing well.
We are all another year older.
* * *
In pursuit of my extremely modest, largely unsuccessful,
and for most of you unnecessary quest to further the spread of rationality,
thoughtfulness, and understanding of the world, I am scattering through this
letter short excerpts from Bryson’s book A Short History of Nearly
Everything, which I mentioned in my letter last year. I do so because I think most people would
find them interesting, if not downright fun, and at the same time you may find
them enlightening. I find this stuff
fascinating and so does Elliott; we read the book aloud together this
year. So I can’t resist retelling a few
of Bryson’s stories. (The Bryson
excerpts will be in this type font in
bold, so if you want to skip them, you can easily do so.) Let’s start with the vastness of space, which
most of us understand intellectually but which Bryson puts in terms that I, at
least, can understand at a human level.
Let’s imagine [to
get an idea of how big space is], for purposes of edification, that we are
about to go on a journey by rocket ship.
We won’t go terribly far—just to the edge of our own solar system—but we
need to get a fix on how big a place space is and what a small part of it we
occupy.
Now the bad news, I’m afraid, is that we
won’t be home for supper. Even at the
speed of light, it would take seven hours to get to Pluto. But of course we can’t travel at anything
like that speed. We’ll have to go at the speed of a spaceship, and these are
rather more lumbering. The best speeds
yet achieved by any human object are those of the Voyager 1 and 2
spacecraft, which are now flying away from us at about thirty-five thousand
miles an hour [and it took them 12 years to cross the orbit of Pluto].
Now the first thing you are likely to
realize is that space is extremely well named and rather dismayingly
uneventful. Our solar system may be the
liveliest thing for trillions of miles, but all the visible stuff in it—the
Sun, the planets and their moons, the billion or so tumbling rocks of the asteroid
belt, the comets, and other miscellaneous drifting detritus—fills less than a
trillionth of the available space. You
also quickly realize that none of the maps you have ever seen of the solar
system were remotely drawn to scale.
Most schoolroom charts show the planets coming one after the other at
neighborly intervals—the outer giants actually cast shadows over each other in
many illustrations—but this is a necessary deceit to get them all on the same
piece of paper. Neptune in reality isn’t
just a little bit beyond Jupiter, it's way beyond Jupiter—five times farther
from Jupiter than Jupiter is from us, so far out that it receives only 3
percent as much sunlight as Jupiter.
Such are the distances, in fact, that it
isn’t possible, in any practical terms, to draw the solar system to scale. Even if you added lots of fold-out pages to
your textbooks or used a really long sheet of paper, you wouldn’t come close. On a diagram of the solar system to scale,
with Earth reduced to about the diameter of a pea, Jupiter would be over a
thousand feet away and Pluto would be a mile and a half distant (and about the
size of a bacterium, so you wouldn’t be able to see it anyway). . . .
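(Out of curiosity, I checked Bryson's pea-sized-Earth scale model myself; none of the figures below are his beyond what's quoted above, and the 7 mm pea is my own assumption. A few lines of Python put Jupiter and Pluto right about where he says:)

```python
# A back-of-envelope check of Bryson's pea-sized-Earth scale model.
# The 7 mm pea is an assumption; the distances are standard figures.
earth_diameter_m = 1.2742e7        # Earth's real diameter in meters
pea_m = 0.007                      # assumed diameter of a pea
scale = earth_diameter_m / pea_m   # roughly 1.8 billion to one

au_m = 1.496e11                    # one astronomical unit in meters
jupiter_m = 4.2 * au_m / scale     # Earth to Jupiter at closest approach
pluto_m = 39.5 * au_m / scale      # Sun to Pluto, average distance

print(f"Jupiter: about {jupiter_m / 0.3048:.0f} feet away")
print(f"Pluto:   about {pluto_m / 1609.344:.1f} miles away")
```

With those assumptions Jupiter lands a bit over a thousand feet out and Pluto about two miles out, which squares with Bryson's "over a thousand feet" and "a mile and a half"; the difference is just the choice of pea.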
Now the other thing you will notice as
we speed past Pluto is that we are speeding past Pluto. If you check your itinerary, you will see
that this is a trip to the edge of our solar system, and I’m afraid that we’re
not there yet. Pluto may be the last
object marked on schoolroom charts, but the system doesn’t end there. In fact, it isn’t even close to ending
there. We won’t get to the solar
system’s edge until we have passed through the Oort cloud, a vast celestial realm of
drifting comets, and we won’t reach the Oort cloud for another—I’m sorry about
this—ten thousand years. Far from
marking the outer edge of the solar system, as those schoolroom maps so
cavalierly imply, Pluto is barely one-fifty-thousandth of the way.
Of course we have no prospect of such a
journey. A trip of 240,000 miles to the
moon still represents a very big undertaking for us. A manned mission to Mars, called for by
the first President Bush in a moment of passing giddiness, was quietly dropped
when someone worked out that it would cost $450 billion and probably result in
the deaths of all the crew (their DNA torn to tatters by high-energy solar
particles from which they cannot be shielded).
Based on what we know now and can
reasonably imagine, there is absolutely no prospect that any human being will
ever visit the edge of our solar system—ever.
It is just too far. [This
disappointed Elliott a great deal, and unless someone devises a way to eclipse
the speed of light as the ultimate speed limit, all the major science fiction
stories will remain more fiction than science.]
As it is, even with the Hubble telescope, we can’t see even into the
Oort cloud, so we don’t actually know that it is there. Its existence is probable but entirely
hypothetical. . . .
[Once you get beyond the solar system,
there is space—nothing.] And there is a
great deal of this nothingness until you get to the next bit of something. Our nearest neighbor, . . . Alpha Centauri,
is 4.3 light-years away, a sissy skip in galactic terms, but that is still a
hundred million times farther than a trip to the moon. To reach it by spaceship would take at least
twenty-five thousand years, and even if you made the trip you wouldn’t be
anywhere except at a lonely clutch of stars in the middle of a vast
nowhere. To reach the next landmark of
consequence, Sirius, would involve another 4.6 light-years of travel. And so it would go if you tried to star-hop
across the cosmos. Just reaching the
center of our own galaxy would take far longer than we have existed as beings.
Space, let me repeat, is enormous. The average distance between stars out there
is 20 million million miles. Even at
speeds approaching the speed of light, these are fantastically challenging distances
for any traveling individual. Of course,
it is possible that alien beings
travel billions of miles to amuse themselves by planting crop circles in
Wiltshire or frightening the daylights out of some poor guy in a pickup truck
on a lonely road in Arizona (they must have teenagers, after all), but it does
seem unlikely.
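(Bryson's comparisons here check out, too. This quick arithmetic is mine, not his: 4.3 light-years really is about a hundred million moon trips, and 20 million million miles works out to a few light-years.)

```python
# Sanity-checking two of Bryson's comparisons (my arithmetic, not his).
light_year_miles = 5.879e12        # miles light travels in one year
moon_trip_miles = 240_000          # Earth-to-moon distance from the text

alpha_centauri_miles = 4.3 * light_year_miles
moon_trips = alpha_centauri_miles / moon_trip_miles
print(f"Alpha Centauri = {moon_trips:.2e} moon trips")   # about 10**8

star_gap_miles = 20e12             # "20 million million miles"
print(f"Average star gap = {star_gap_miles / light_year_miles:.1f} light-years")
```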
* * *
Late last year Pat played emcee for part of an office
holiday gathering. She asked the
assembled group (about two dozen people ranging in age from 20 to 60, with
educational backgrounds ranging from high school to advanced graduate
education) to compose a limerick with an apple theme. She was the only one in the room who knew
what a limerick was. She tried to
explain that it was a little poem, usually risqué if not downright obscene,
with five lines that rhymed in a certain pattern. They didn’t understand. At this point, as Pat was regaling me with
this story, we called Elliott into the room and asked him if he knew what a
limerick was. Not surprisingly (for
someone age 13), he did not. So we told
him. I even picked a few off the web and
read them to him. But I must say that
Pat, and then I when she told me, was dumbfounded that a group of reasonably
well-educated people did not know what a limerick was.
* * *
We took the kids to Cancun,
Mexico, over
Christmas week 2003. This fulfilled a
promise I made to Krystin when she began taking Spanish in 7th
grade: We would take them to Mexico
some winter. That was six years ago, and
I figured I better make good on the promise.
(She, of course, never forgot, and reminded me regularly through the years. It is amazing what kids remember—and what
they forget when you want them to remember something.)
At a dinner party a few years ago, before we went to Australia, one of the people at the table said
that if you have been to Bloomington, you have
been to Sydney. Even without having seen Sydney,
we thought at the time the remark was arrogant; after having seen Sydney, we thought it both
arrogant and stupid because it was so demonstrably incorrect. I am, however, prepared to make a similar
assertion that is intended not to be arrogant but accurate: If you have been to Bloomington, you have been to the hotel zone
of Cancun. True, there are some
differences; the 8-lane divided freeway is the 4-lane divided Kukulcan
Boulevard, there is no ocean nearby, the climates in January are not at all
alike, and the trees are palms, but the buildings are not dissimilar, some of
the stores and restaurants are the same, and (I wager, with regret) the
hotel staffs are largely Mexican/Latino while the guests are largely Caucasian
American/European. Something I read
while there described Cancun as “a banker’s
idea” and I believe it. There really was
no town of any significance in the area, I learned, until someone had the
bright idea of turning it into a resort area.
We stayed in the middle of the hotel zone, a long strip
of land that runs parallel to the coast and is connected at top and bottom by
bridges, leaving a very large lagoon between the hotel zone and the coast. That banker, whoever he or she was, was not
dumb; the ocean beach of several miles is powdery nearly-white sand, the water
is crystal clear, strikingly blue, and warm, and the climate appears to be
temperate most of the time. Building a
lot of hotels for Americans and Europeans getting away from winter was a stroke
of capitalist genius.
Cancun’s
hotel zone is only barely Mexico. I suppose, however, that it doesn’t advertise
itself as anything other than a resort town.
We could, however, walk across the street to a covered shopping mall and
find some of the same stores that exist in any American city—and some of the
same restaurants. I did let the kids get
food from McDonald’s one day, out of desperation—all the other restaurants in
the mall were packed and they were starving.
(Of course, they were starving most of the trip, to listen to them, so
this was nothing new.) We ate twice at a
place right on the bay, called “Come ‘n Eat.”
When I examined the menu, it said the place was owned/run by “the Anderson group.” I doubt they are Mexican. But it was pleasant because we sat outside
looking out at the water. Not all
the stores were American chains; some were individual and apparently
Mexican-owned or operated.
We
traveled on an MLT package, which advertises worry-free vacations. The more accurate slogan, in our experience,
is worry-filled vacations, since the plane was two hours late leaving, they had
no buses at the airport in Cancun to pick us up, they put us in a tiny hotel
room (with sealed windows looking out at rocks) instead of one we had paid for
(an ocean view with a balcony), and the staff guy at the hotel gave us bad
advice so we were unable to go out to the island of Cozumel to go
snorkeling. (We did manage to get the
hotel room changed half way through the week.)
I know from friends that there are indeed very good tours in the
world. This wasn’t a tour and the
arrangements were second-rate.
Despite these annoyances, however, we had a great
time. Among the best travel advice we
have ever received came from my friend of 30 years, retired Regents’ Professor
of Political Science Frank Sorauf. One
of the premier sites in the Yucatan is the
ruins of Chichen Itza,
the Mayan capital. Frank advised us to
go the night before (Chichen Itza is about 125 miles from Cancun), after the
tour buses have departed, have dinner, see the light show at night, and then
get up early and see the ruins before the tour buses arrive. We followed his advice to the letter, stayed
in the hotel he recommended, and were delighted with the venture.
The main hotel building (no rooms, just the lobby and
restaurant) was built in the 17th century, and part of it with rocks
taken from the ruins of Chichen Itza
(at that time unexcavated, of course).
It sits on lovely grounds that I decided are best described as manicured
jungle. It was reminiscent of the rain
forests of Australia
in terms of the exotic (to us) flora and fauna.
The rooms are in cottages that were originally built for the
archaeological workers at Chichen Itza
in the 1920s, I think. There is no TV
and no telephone—and we (well, at least Pat and I) didn’t miss either of them
at all.
The ruins themselves cannot adequately be described in
words. Suffice it to say that with a
good walking tour, they are fascinating.
We did climb the large pyramid, 91 steps to the top, with no railings
and only a rope draped up the middle of the steps on one side. I climbed up and down very carefully,
with my fingers firmly curled around the rope.
The kids took the more risky approach of walking down nowhere near the
rope. One of the guys on our walking tour
asked how many people they lose each year from falls; our guide said one or
two. I believe it. (One thing we learned, with the guide’s
narrative about the pyramid, which was a watchtower, is how advanced the Mayans
were in the fields of mathematics, astronomy, engineering, and physics—the
construction of the building, and its orientation with respect to the
solstices, equinoxes, and lunar and solar eclipses, is extraordinary. Alas, the first Spanish explorers to find the
Mayans promptly burned all of the accumulated Mayan records because they
believed them to be blasphemous. One
result is that to this day only about 5% of Mayan hieroglyphics have been
deciphered.)
There are two principal ways to drive from Cancun to Chichen Itza. One is the tollway, a freeway similar to
those in the US
(but with far fewer exits and entrances); the other is the two-lane highway. We took the two-lane highway there, driving
through numerous small towns en route.
The kids got at least a mild flavor of what life in real Mexico looks
like for those who are not well off, as opposed to the antiseptic and
Americanized version on the hotel zone, because they saw the thin children, the
broken-down huts with dirt floors, the even-thinner dogs, the junk and debris
all over, and the people running up beside vehicles to sell things as the cars
slowed down for the speed bumps. Some of
the towns were more prosperous than others, not surprisingly. (We drove back on the tollway, on Christmas
Eve day; I think in the two and one-half hour drive back we saw about 6 cars.)
I have finally figured out, at my advanced age, that I am
not a sun-and-beach-and-water person. We
went out on a boat one day to go snorkeling (in what they say is the second-largest
coral reef in the world, second only to the Great Barrier Reef in
Australia). We were out on the water for
about 4 hours, in the middle of a cloudless, hot day. The snorkeling was great, and as an added
entertainment we all got to “fly the spinnaker.” I knew what a spinnaker was—the front,
triangular sail on the boat—but I had no idea what flying it meant. They detach the sail at the bottom and put on
a rope with a “seat” on it and lower it to the surface of the ocean. One swims out to it, gets the rump adjusted
on the seat, and then the crew raises the sail again so it blows in the
wind. All four of us flew the spinnaker,
back and forth, from water surface to perhaps 20 feet or more up in the air. Then one lets go and falls into the water, at
whatever height is comfortable (for me, when my legs were already dragging in
the water; for the others, from various heights).
I did not intend to fly the spinnaker, but after I saw
two gentlemen in their 60s or 70s do it, I figured I could, too. And the crew leader kept on urging “the
cowboy” to do it. (I was the “cowboy”
because I was wearing a broad-brimmed hat to keep the sun off my face. It is difficult for me to imagine anyone less
like a cowboy than me, but it was all in good fun.)
I
had problems swimming back to the boat.
It wasn’t that far, and even though I am not a great swimmer, it was not
difficult—but I was struggling. One crew
member threw me a life preserver, which I used to get around to the back of the
boat to the ladder. Once up on the boat,
I slipped, whacked my hip bone on the edge of the boat on the way down, and
fell back into the ocean—and gulped down a large mouthful of ocean water. A crew member jumped in immediately to help
me, although I was OK getting back up the ladder—bruised, and gagging from the
water, but OK. I realized, however, that
I was suffering from too much sun. My
grandmother used to say that she would get “sunsick” if she spent too much time
in the sun; I think I am vulnerable to the same thing. I was woozy, slightly nauseated, and had a
headache—even though I had plenty of sunscreen on and a hat to shield my head,
face, and neck. Fortunately, at that
point everyone had flown the spinnaker so it was time to return to shore. I promptly went and sat under the palm trees
while everyone had their complimentary beer/soda. (I felt better once we got back into the
air-conditioned hotel and had a good meal.
Pat and Krystin wanted to go across the street to the mall, which had
several restaurants to choose from; I refused to go outside the hotel back into
the sun. Elliott didn’t want to go back
in the sun, either.) So I need to tell
myself that I cannot stay out in the sun for long periods.
On
the other hand, we did go swimming in the ocean a few times. Short periods in the sun were no
problem. The water, as I said, was
crystal clear and warm, and the waves were fun to ride in. When we were back home I asked Elliott what
his favorite part of the trip had been; he said it was playing in the
ocean. I was astonished—this is a kid
who, for all the years we would gather with Pat’s family on the east coast at a
beach house on the Atlantic, would have
nothing to do with going in the water.
He disliked the ocean.
We
took the bus into Cancun (which really isn’t
much of a town), to eat brunch and go to a large bazaar. The brunch was one of the two times I was
quite worried about the food. I knew it
wasn’t a good sign when one of the waiters was drying the silverware by
hand. But we didn’t get sick. After browsing through many of the shops in
the “flea market,” seeing the same items over and over, I concluded that there
is some large Mexican wholesaler who distributes souvenir items to small-time
vendors. And the same items appeared in
the shopping mall across from the hotel as well as in the little vendors on
Isla Mujeres (Island of Women) that we went
out to one day. One is clearly not
purchasing handmade, locally produced items. The concierge at the hotel told us to pick up
the return bus outside the Wal-Mart.
One
afternoon we sat by the hotel pool and had margaritas while we wrote a couple
of post cards. Why is it I thought I
liked margaritas?
Cancun was a great place
to go to be warm and play in the water and on the beach. Since I prefer education with my travels, I
found it somewhat lacking in that respect, but for what it holds itself out to
be, it is extremely good.
* * *
We
have all seen from time to time those lists of words that do not exist but
should. We received such a list this year
from our good friend Rowland Evans in Australia, and several of the
“words” caught our eye. Or, to be more
accurate, several of them made both of the kids hoot. I particularly liked “aquakinesis: the ability of a spoon to move itself
directly under the tap in the sink, thus ensuring a spray of water over anybody
who turns on the tap,” because it happens to me all the time. I also appreciated “billge: the pile of unpaid bills, school excursion
permission forms and leaflets that accumulate on the corner of the kitchen
bench or somewhere in the hallway.” In
our case, it’s the dining room table, which we seem unable to keep from piling
up with junk. Krystin self-identified
with “dresstitute: descriptive of a
woman when she looks into her crowded wardrobe and realizes she has nothing,
absolutely nothing, to wear” (sorry about the sexist implications, but Krystin
picked it out). One entirely appropriate
one for many of us is “eespondent: the
disappointment that follows when you discover all 20 emails have the subject
line ‘do you want a bigger penis?’” We
thought, and Elliott agreed, that “feng shoey:
the ancient male practice of abandoning at least one pair of shoes in
every room in the house” applied to him.
We all agreed that “horrorgami:
the vain attempt to refold a map back the way it came, especially in
high wind or while still driving” was a talent Pat demonstrated at a high
level. And then there was one that was
uniquely Australian:
“oodumboodumboodia: the sense of
panic that strikes many Sydneysiders when trying to decide how many “o’s” to
include when spelling Woolloomooloo.”
* * *
We saw “Rent” in early spring with Krystin at the
Ordway. I didn’t know what I was getting
into, although Krystin did; she told me it was about sex, gays, and dying from
AIDS. Oh, I thought, this sounds
great. But it was quite good; we enjoyed
it. I am glad we didn’t take Elliott;
even though kids now know a lot more “stuff” than I did when I was 13, there
was even “stuff” in “Rent” that I think Elliott doesn’t know—and doesn’t need
to know, at least not yet.
I am quite surprised that “Rent” has been the big hit
that it has. A guy behind me in the
theater, before the event started, said that this isn’t a play, it’s a rock
concert, and people should yell and scream and participate, rather than sit
quietly as they would in a play. Not
only is it a rock concert, however, it is also about a group of young people
(early 20s) who mix and match in ways that the Christian right wing would
definitely NOT approve (lesbian couples, gay/cross-dressing couples, biracial
couples, some of whom use cocaine, others of whom have AIDS). It treats them all sympathetically and
joyfully (I guess). I would think that
anyone on the religious right would have gotten up and walked out midway
through (or earlier).
I wonder if the popularity of “Rent” is related to an
interesting hypothesis advanced by Samuel Huntington about the
denationalization of elites in the United States (about which more,
later).
* * *
[The chapter is “The Rise of Life.” Proteins are essential for life.] Proteins are what you get when you string
amino acids together, and we need a lot of them. No one really knows, but there may be as many
as a million types of proteins in the human body, and each one is a little
miracle. By all the laws of probability
proteins shouldn’t exist. To make a
protein you need to assemble amino acids . . . in a particular order, in much
the same way that you assemble letters in a particular order to spell a
word. The problem is that the words in
the amino acid alphabet are often exceedingly long. To spell collagen, the name of a common type
of protein, you need to arrange eight letters in the right order. But to make
collagen you need to arrange 1,055 amino acids in precisely the right
sequence. But—and here’s an obvious but
crucial point—you don’t make it. It makes itself, spontaneously, without
direction, and this is where the unlikelihood comes in.
The
chances of a 1,055-sequence molecule like collagen spontaneously
self-assembling are, frankly, nil. It
just isn’t going to happen. To grasp
what a long shot its existence is, visualize a standard Las Vegas slot machine
but broadened greatly—to about ninety feet, to be precise—to accommodate 1,055
spinning wheels instead of the usual three or four, and with twenty symbols on
each wheel (one for each common amino acid).
How long would you have to pull the handle before all 1,055 symbols came
up in the right order? Effectively
forever. Even if you reduced the number
of spinning wheels to two hundred, which is actually a more typical number of
amino acids for a protein, the odds against all two hundred coming up in a
prescribed sequence are 1 in 10^260 (that is, a 1 followed by 260
zeros). That in itself is a larger
number than all the atoms in the universe. . . .
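(For the mathematically inclined: Bryson's slot-machine figure is easy to verify. With 20 symbols per wheel and 200 wheels there are 20^200 possible combinations, and a two-line check, mine rather than his, shows that this is indeed about a 1 followed by 260 zeros.)

```python
import math

# Verify Bryson's odds: 200 wheels with 20 symbols each -> 20**200 outcomes.
wheels, symbols = 200, 20
digits = wheels * math.log10(symbols)       # log10 of 20**200
print(f"20**200 is about 10**{digits:.0f}")  # about 10**260
```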
For
random events to produce even a single protein would seem a stunning
improbability—like a whirlwind spinning through a
junkyard and leaving behind a fully assembled jumbo jet. . . .
Yet
we are talking about several hundred thousand types of protein, perhaps a
million, each unique and each, as far as we know, vital to the maintenance of a
sound and happy you. And it goes from
there. A protein to be of use must not
only assemble amino acids in the right sequence, but then must engage in a kind
of chemical origami and fold itself into a very specific shape. Even having achieved this structural
complexity, a protein is no good to you if it can’t reproduce itself, and
proteins can’t. For this you need DNA. DNA is a whiz at replicating—it can make a
copy of itself in seconds—but can do virtually nothing else. So we have a paradoxical situation. Proteins can’t exist without DNA, and DNA has
no purpose without proteins. Are we to
assume that they arose simultaneously with the purpose of supporting each
other? If so: wow.
And
there is still more. DNA, proteins, and
the other components of life couldn’t prosper without some sort of membrane to
contain them. No atom or molecule has
ever achieved life independently. Pluck
any atom from your body, and it is no more alive than is a grain of sand. It is only when they come together within the
nurturing refuge of a cell that these diverse materials can take part in the
amazing dance that we call life. Without
the cell, they are nothing more than interesting chemicals. But without the chemicals, the cell has no
purpose. As the physicist Paul Davies
puts it, ‘If everything needs everything else, how did the community of
molecules ever arise in the first place?’
It is rather as if all the ingredients in your kitchen somehow got
together and baked themselves a cake—but a cake that could moreover divide when
necessary to produce more cakes. . .
.
So
what accounts for all this wondrous complexity?
Well, one possibility is that perhaps it isn’t quite—not quite—so
wondrous as at first it seems. Take
these amazingly improbable proteins. The
wonder we see in their assembly comes in assuming that they arrived on the
scene fully formed. But what if the
protein chains didn’t assemble all at once?
What if, in the great slot machine of creation, some of the wheels could
be held, as a gambler might hold a number of promising cherries? What if, in other words, proteins didn’t
suddenly burst into being, but evolved?
Imagine
if you took all the components that make up a human being—carbon, hydrogen,
oxygen, and so on—and put them in a container with some water, gave it a
vigorous stir, and out stepped a completed person. That would be amazing. Well, that’s essentially what Hoyle and
others (including many ardent creationists) argue when they suggest that
proteins spontaneously formed all at once.[1] They didn’t—they can’t have. As Richard Dawkins argues in The Blind Watchmaker, there must have
been some kind of cumulative selection process that allowed amino acids to
assemble in chunks. Perhaps two or three
amino acids linked up for some simple purpose and then after a time bumped into
some other similar small cluster and in doing so ‘discovered’ some additional
improvement.
Chemical
reactions of the sort associated with life are actually something of a
commonplace. It may be beyond us to cook
them up in a lab . . . but the universe does it readily enough. Lots of molecules in nature get together to
form long chains called polymers. Sugars
constantly assemble to form starches. Crystals can do a number
of lifelike things—replicate, respond to environmental stimuli, take on a
patterned complexity. They’ve never
achieved life itself, of course, but they demonstrate repeatedly that complexity
is a natural, spontaneous, entirely commonplace event. There may or may not be a great deal of life
in the universe at large, but there is no shortage of ordered self-assembly. .
. .
So
powerful is this natural impulse to assemble that many scientists now believe
that life may be more inevitable than we think—that it is, in the words of the
Belgian biochemist and Nobel laureate Christian de Duve, ‘an obligatory
manifestation of matter, bound to arise wherever conditions are appropriate.’ De Duve thought it likely that such
conditions would be encountered perhaps a million times in every galaxy.
* * *
(Written January
1, 2004) At our annual
marvelous New Year’s Eve dinner, the 5 couples make predictions about the
upcoming year—where the Dow Jones will be in a year, the number of games the
Minnesota Twins will win, (in election years) who will be the president-elect,
and so on. This year the question-master
threw in a couple of new ones: What will
be the country of origin of the Pope on New Year’s Eve, 2004 (I said Brazil)
and will there be a “major” terrorist attack on/in the United States during
2004. (I guess we didn’t make it clear
whether an attack ON Americans but outside the boundaries of the U.S.
would count.) The breakdown of the
answers to this last question was fascinating:
the women all said no and the men all said yes. I have no idea what that means. Perhaps the women were hopelessly naïve;
perhaps the men were overly pessimistic.
I guess we shall see, during the course of the year, which sex was
right.
* * *
We have been running an animal exchange program at our
house this year.
— December 31: One dog,
one cat.
— January 1: Two dogs,
one cat
New Year’s Day 2004 we got a new puppy. Or, more accurately, Pat got a new puppy,
half Brittany and half Lab, from an animal-shelter kind of place halfway to Duluth (literally). We named her Lily, and played with her while
watching Michigan
get hammered by USC in the Rose Bowl.
Through the night and into the next morning, however, she was vomiting
repeatedly, so I finally took her to the vet.
$288 and about 4 tests later, they determined that she had
parvovirus. We could have taken her in
for treatment, for about $1000 for several days, with about 75% chance of survival. The shelter said they would treat her and
wanted her back immediately, so Pat and Krystin drove back up and dropped her
off. The woman at the shelter said there
was about a 50% chance the puppy would survive (and would not let Pat or
Krystin out of the car, for fear that they would track the parvovirus into the
shelter area). We received a call Sunday
morning that she had not made it. It is
astonishing to me how we could personalize the life of a small animal, and
mourn its death, even though we know perfectly well there are millions of
animals dying around the world in nature and in shelters as prey as well as
from disease, abuse, and starvation. She
was a delightful, playful little girl for the 18 hours we had her, and she
touched our lives. She does not know, of
course, that she is missed, but we do.
In the meantime, I had to clean the floors and carpeting
with a bleach solution and we had to wash virtually all our clothes in hot
water with bleach in order to minimize the presence of the very nasty
parvovirus.
So that was a depressing way to start the year, at a time of year I personally find
depressing in a larger sense (not in the psychiatric, clinical sense): the holidays, a time of seeing
friends and family and gift-giving, were over, and we now faced the longest part of the
Minnesota winter with its short days, long nights, and cold.
— January 2: One dog,
one cat
— January 18: Two dogs,
one cat
The story seemed to turn out OK in the end, because Pat
found a 3-month-old lab/springer spaniel mix a couple of weeks later and
he—Andy—has turned out to be a wonderful little dog. But boy, we were not used to a puppy! He had so much energy.
— February 18: Two dogs,
two cats
And then, in mid-February, some cruel jenny ass[2]
left a kitten in a canvas bag in one of the women’s restrooms in Pat’s
building. The security staff were
notified and brought it to Pat’s office, since they have animal research labs
as part of their complex. No researcher,
of course, would use an animal whose background was completely unknown, so Pat
volunteered to provide a home for this kitty.
He was black, just like our other cat and two dogs. The poor little guy, in our house, was
surrounded by two large animals (the dogs) that he hissed at for a day or so
and then decided to ignore, and our older cat (who would have nothing to do
with the kitten and just hissed at him).
He was brave, however; he walked right up to the dogs. Andy slobbered all over him and got him
slimed up. Cody just stuck his nose at
the cat. He wasn’t bothered by either,
and then spent half an hour bathing the dog slobber off himself when he walked
away. He was a little hellion and cute
as a button. Elliott and I each had
about two dozen cuts on the backs of our hands from playing with the cat. But we have to stop bringing animals into our
house.
Elliott has decided that having a black kitty in the
house “is like living in a horror movie.
You never know when this black creature will suddenly jump on you from
nowhere.” That’s what kittens do.
The introduction of the cat also provided an opportunity
for instruction in non-English naming.
The vet at Pat’s office named the kitten “Bella” because he looked like
a black bat with these big eyes and ears that stick up (after Lugosi, the
actor, who played Dracula—somewhat of a stretch, I thought). I told Pat, however, that the name was Bela
(pronounced BAY lah), not Bella (pronounced BELL ah)—I said Bella is a female name (as in
Abzug) and Bela is male. That led to
some quibbling about pronunciation, so I called our good friend Maria Bales,
who is Hungarian, and asked her how to pronounce the name spelled b-e-l-a and
what sex it was. It is BAY la and it is
male. So our black cat inherited in an
odd fashion the name Bela. And that he
remains. Mr. Lugosi.
— August 18: Two dogs,
one cat
Our old cat Vickie was not having a good time. The young Bela loved to jump on her and chase
her; the dogs frightened her. We finally
asked my dad if he would, on a trial basis, take Vickie for a month. They were about the same age (Vickie turned
17 in August) and moved at about the same speed. After a while, it turned out that my dad
liked the cat and I am certain, although she isn’t telling me, that Vickie
likes the calm and certainty of my dad’s apartment. I brought some cat food down to him in late
September and he insisted on paying for it.
I said she was my cat and I’d pay for the food. He said no, “she’s my cat.” I am sure she especially likes the fact there
are no dogs and no other cats.
— September 28: One dog,
one cat
After eight months, Pat finally decided Andy had to
go. Our cute little puppy in January had
turned into an 80-pound dog that loved to chew things and that bounded around
playing—and knocking things over. He
remained an extremely sweet, good-tempered, affectionate dog, but he was just
too big for a small city lot and house.
We had to put twine all across the top of the fences to keep him out of
the gardens, and we had to build a 2-foot extension on top of our chain-link
fence all the way around the back yard to keep him from leaping over it and
getting out. It was amazing to watch
him: From a sitting position, he could
leap like a gazelle in one smooth, beautiful movement and he cleared the fence
with grace and speed. But one cannot
have an 80-pound dog roaming the city.
The kids didn’t like him at all because he chewed so much.
So the neighbor of a woman who works with Pat came and
took him—they have a big yard and a park across the street, where Andy has much
room to run and romp.
— September 29: Two
dogs, one cat
We
were not long with only one dog, however.
Pat felt quite bad giving up her dog, in whom she had invested eight months of
training and tenderness and affection (and, actually, so did I and both the
kids; even though he had been a pain in the neck, he really is a nice
dog). So the day
after Andy left she drove to an animal shelter in Mankato and picked up Abby, a smaller (!) dog
that the vet guesses is part cocker, part English spaniel. She’s only a little over 30 pounds, doesn’t
chew, and is very affectionate. And
she’s a year old, so she won’t grow much.
Thank heavens.
— October 2: Two dogs,
two cats
And three days later, Pat and Krystin were at Petco
getting dog food. It was pet adoption
day, and Krystin found a lovely little kitten.
She didn’t want to go home without it, Pat said “OK,” and we got a
second cat, Jenny. She was not too fond
of the dogs, at first, but has accommodated to them. She and Bela run and play and sleep
together. She is also dangerous: I was napping on a sofa in our living room
and Jenny got into one of her wild kitten moods, ran across the floor, jumped
on my face, and used it as a launching pad to jump on the back of the sofa—and
with her back claw put a gash across my face, just above one of my eyes and across
the bridge of my nose. There was blood
everywhere. I recovered, but my nap sure
was rudely ended. The kitten now gets
locked upstairs if I want to nap on the sofa.
I am optimistic that we are now at our full animal
quota. I wish someone else would pay all
the vet bills and for all the pet food.
* * *
A faculty friend and I were having lunch one day and she
told me that as a joke, she asks incoming graduate students a question: if they had to choose one, which would they
be, a complete a__hole or irremediably stupid?
No modifiers are allowed and you cannot assume that because the one is
stupid, the other is smart, or that the stupid one is nice because the other is
not. My friend related that the answers
break down along gender lines: women by
and large chose “stupid” and men chose “a__hole.” She explained that this is most likely
related to the different orientations associated with masculinity versus
femininity. “Men—more so than
women—think about the social world hierarchically, concerned with power and
status attainment; in contrast, women—at least more so than men—tend to think
about the world in terms of their connections with others, and are more
concerned with harmonious relations between and among people than they are
about power or status, which being an a__hole, of course, often affords.” In fact, when queried about their reasons for
choosing “irremediably stupid,” “the majority of women spontaneously asserted
that more harm is done in this world by a__holes than by stupid people.”
I
hemmed and hawed when she asked me, and I finally said stupid because I can’t
stand to be around a__holes, so why would I want to be one? (This is a hard choice.) Pat said she knew immediately which she
would be: the a__hole, because most of
the ones she has known are also smart and successful. Her rationale was that she could go to a
therapist to get help with her behavior, to be less of a jerk, but if you’re
just stupid—as opposed to uneducated—there’s nothing that can really help. Hers was the better answer. My friend responded that “the one
problem with a__holes and therapy, however, is that although everyone else can
see the need for the a__hole seeking therapeutic assistance, the a__holes are
usually the last to see the need (well, at least the a__holes whom I’ve
known, anyway).”
* * *
We had quite the exciting first week in May. Krystin’s last final exams were May 7, and we
were to drive out that Friday to pick her up and bring her home for the
summer. Thursday night May 6, Pat and
Elliott took the two dogs for a walk after dinner, as is their wont from time
to time. They take them over to the
local schoolyard, where they can run off-leash for a bit and release some of
their pent-up energy. Well, they had
quite the adventure. Elliott will tell
the story:
After a refreshing little nighttime walk in the park,
it came time to put the dogs back onto the leash. Mom decided that we would go the long way out
to allow the dogs to run just a little bit more. After we reached the end of
the path, Mom reached over to grab Cody’s collar, but he had other plans. He ran like a bullet out into the street
after a little animal that we could not identify. Andy followed suit.
Mom’s first fear was for the animal, that the dogs
might kill it. Several seconds later the
animal was hanging off Andy’s face and both dogs were screeching
uncontrollably. By this time half the
neighborhood is awake. One lady in a
nearby house shouted “SHUT UP!” We ran
out into the street as fast as Cody had done five seconds earlier. I grabbed Cody by the collar; Mom did the
same with Andy. By that time the animal
was rolling around all over the place.
It eventually recovered itself and ran under a chain link fence.
Mom and I were trying to calm ourselves down and get
out of the street before a bus decided to come by. However, before we had a
chance to do anything, the animal reappeared from under the fence to finish
what it started. We didn’t stick around
to let it try. We darted home with the
animal in hot pursuit. When we reached
the backyard, the animal was nowhere to be seen. We weren’t complaining. As we walked up onto the deck to go inside, we
noticed that there was blood everywhere.
Almost literally. We couldn’t
figure out where it was coming from.
Then we noticed Cody. His mouth
was drenched in thick crimson liquid.
Mom washed him off a bit, thinking the blood belonged to the
critter. When the blood refused to stop
coming after five or ten minutes, we knew.
By now it was almost 11 at night, but apparently our
poor dog was more important than my next day at school. I accompanied Mom as we quickly drove out to
the emergency vet in Golden Valley. We sat in the waiting room for a good hour or
so, when finally the vet approached us.
Cody’s lip had been split all the way through in three places along the
upper left side. The wounds had ripped a
minor artery. The vet gave Cody
anesthesia, and cauterized the wound.
Cody eventually returned home about 1:30. I stayed home
the next day to make sure that Cody’s battle wounds didn’t decide to blow
open. He ended up alright in the end.
Was it the mysterious kamikaze gopher that bit
Cody? Did Andy bite him in all the
excitement? Did Cody accidentally bite
himself? What happened to the
animal? We’ll never know. We know one thing, though: Mom and I won’t be
going over to the schoolyard for a while.
* * *
We took the kids for a two-day trip to the Soudan
Underground Mine and the Soudan Underground Laboratory, the latter run by the
University in conjunction with some other organizations. The mine and laboratory are connected because
the latter took advantage of the existence of the mine.
The tour through the mine (which is no longer in operation)
provided me another example of a job I am glad I never had. These guys worked 12-hour days far
underground, and for the larger part of the time the mine was in operation they
had candles (which the miners paid for themselves) stuck in their helmets. They dug out iron ore in near dark in dirty,
dusty—AWFUL—conditions. I cannot imagine
that life expectancy among iron miners was long. The tour guide pointed out to us that the
people who came to work in the mines (before/around the turn of the 20th
century) left their home countries to get a better life. What they left must have been pretty terrible,
he commented, if working in the iron mine was better. We were amused when the guide told us that
the mine owners separated the Finns from each other, because they tended to be
rabble-rousers who wanted to form unions and improve working conditions.
The lab, on the other hand, represented the opposite end
of the spectrum—big science. The
underground cavern that was excavated adjacent to the mine contains two major
projects. One investigates neutrinos;
the other is looking at “dark matter,” which may “explain how galaxies are
formed.” I won’t go into the science of
the two experiments, which I do not understand very well, but what we saw is
interesting enough.
The neutrino work involves a set of over 400 octagonal
steel plates that are 26 feet across, suspended on I-beams, in a cavern
excavated for this purpose. The cavern
is enormous. Sandwiched between the
steel plates are what look like Styrofoam sheets of the same size and shape;
these sheets have a gazillion wires running out of them to multitudes of
computers. The whole contraption—which
is huge—weighs over 6,000 tons. The only
access to the mine is a small elevator that carried the miners down and the ore
up (which tourists also ride to get down there, all 2700 feet down); in order
to construct the neutrino gizmo, they had to bring the steel sheets down in
pieces and weld them together once they were underground. At some point Fermilab, outside Chicago, will direct a beam of neutrinos toward the Soudan
gizmo (straight through the ground, underneath Illinois,
Wisconsin, and Minnesota) and the physicists hope to start
learning more about neutrinos.
The other lab contains the “Cryogenic Dark Matter
Search.” Don’t ask. It has huge refrigerators that have dark
matter detectors that are cooled to about 1/100th of a degree above
absolute zero (about −460 degrees Fahrenheit). Almost as
cold as other parts of Minnesota
in January.
It was all quite interesting, even though the poor tour
guide, a young professor of physics, was trying to explain the projects to us
four numbskulls in a way that we could understand. We sort of got the idea. But this is “big science” at work today;
these projects cost about $170 million, if I recall correctly, with funding
from the U.S. Department of Energy, the United
Kingdom, the National Science Foundation, the State of Minnesota, and the
collaborating universities and laboratories.
No one institution can afford to conduct these kinds of experiments
alone—they are just too expensive.
And who cares about all this esoteric research? See my little narrative about SETI and
Bryson’s chapter on Manson,
Iowa (which follow), and his
chapter on space (the first excerpt).
* * *
Here is one of the more demoralizing comments any of my
colleagues has ever received on a student evaluation form from one of his
classes: “Dr. _____ is very patient and
helpful, but I sometimes worry that his carefully constructed, grammatically
precise language (both spoken and written) are actually confusing for some
students. Admittedly we need practice with academic-style language, but
occasional more colloquial explanations or examples would probably be
appreciated.” Heavens, we would not want
the faculty to use carefully constructed and grammatically precise
sentences. And the kid gave my friend a
great review as an instructor!
* * *
[What if a big meteor did hit the
earth?] People knew for a long time that
there was something odd about the earth beneath Manson, Iowa. In 1912, a man drilling a well for the town
water supply reported bringing up a lot of strangely deformed rock. . . . The water was odd, too. It was almost as soft as rainwater. Naturally occurring soft water had never been
found in Iowa
before.
In 1953, after sinking a series of
experimental bores, university geologists agreed that the site was indeed
anomalous and attributed the deformed rocks to some ancient, unspecified
volcanic activity. This was in keeping
with the wisdom of the day, but it was also about as wrong as a geological
conclusion can get.
The trauma to Manson’s geology had come
not from within the Earth, but from at least 100 million miles beyond. Sometime in the very ancient past, when
Manson stood on the edge of a shallow sea, a rock about a mile and a half
across, weighing ten billion tons and traveling at perhaps two hundred times
the speed of sound ripped through the atmosphere and punched into the Earth
with a violence and suddenness that we can scarcely imagine. Where Manson now stands became an instant hole
three miles deep and more than twenty miles across. The limestone that elsewhere gives Iowa its hard
mineralized water was obliterated and replaced by the shocked basement rocks
that so puzzled the water driller in 1912.
The Manson impact was the biggest thing
that has ever occurred on the mainland United States. Of any type.
Ever. The crater it left behind
was so colossal that if you stood on one edge you would only just be able to
see the other side on a good day. It
would make the Grand Canyon look quaint and
trifling. Unfortunately for lovers of
spectacle, 2.5 million years of passing ice sheets filled the Manson crater
right to the top with rich glacial till, then graded it smooth, so that today
the landscape at Manson, and for miles around, is as flat as a tabletop. Which is of course why no one has ever heard
of the Manson crater.
[Bryson talked with the Iowa geologists who knew
most about the crater.] I asked them how
much warning we would receive if a similar hunk of rock was coming toward us
today.
‘Oh, probably none,’ said Anderson breezily. ‘It wouldn’t be visible to the naked eye
until it warmed up, and that wouldn’t happen until it hit the atmosphere, which
would be about one second before it hit the Earth. You’re talking about something moving many
tens of times faster than the fastest bullet.
Unless it had been seen by someone with a telescope, and that’s by no
means a certainty, it would take us completely by surprise.’ . . .
What scientists can do—and Anderson and
Witzke have done—is measure the impact site and calculate the amount of energy
released. From that, they can work out
plausible scenarios of what it must have been like—or, more chillingly, would
be like if it happened now.
An asteroid or comet traveling at cosmic
velocities would enter the Earth’s atmosphere at such a speed that the air
beneath it couldn’t get out of the way and would be compressed, as in a bicycle
pump. As anyone who has used such a pump
knows, compressed air grows swiftly hot, and the temperature below it would
rise to some 60,000 Kelvin, or ten times the surface temperature of the
Sun. In this instant of its arrival in
our atmosphere, everything in the meteor’s path—people, houses, factories,
cars—would crinkle and vanish like cellophane in a flame.
One second after entering the
atmosphere, the meteorite would slam into the Earth’s surface, where the people
of Manson had a moment before been going about their business. The meteorite itself would vaporize
instantly, but the blast would blow out a thousand cubic kilometers of rock,
earth, and superheated gases. Every
living thing within 150 miles that hadn’t been killed by the heat of entry
would now be killed by the blast.
Radiating outward at almost the speed of light would be the initial shock
wave, sweeping everything before it.
For those outside the zone of immediate
devastation, the first inkling of catastrophe would be a flash of blinding
light—the brightest ever seen by human eyes—followed an instant to a minute or
two later by an apocalyptic sight of unimaginable grandeur: a roiling wall of darkness reaching high into
the heavens, filling an entire field of view and traveling thousands of miles
an hour. Its approach would be eerily
silent since it would be moving far beyond the speed of sound. Anyone in a tall building in Omaha or Des
Moines, say, who chanced to look in the right direction would see a bewildering
veil of turmoil followed by instantaneous oblivion.
Within minutes, over an area stretching
from Denver to Detroit and encompassing what had once been Chicago, St. Louis,
Kansas City, the Twin Cities—the whole of the Midwest, in
short—nearly every standing thing would be flattened
or on fire, and nearly every living thing would be dead. People up to a thousand miles away would be
knocked off their feet and sliced or clobbered by a blizzard of flying
projectiles. Beyond a thousand miles the
devastation from the blast would gradually diminish.
But that’s just the initial
shockwave. No one can do more than guess
what the associated damage would be, other than that it would be brisk and
global. The impact would almost
certainly set off a chain of devastating earthquakes. Volcanoes across the globe would begin to
rumble and spew. Tsunamis would rise up
and head devastatingly for distant shores.
Within an hour, a cloud of blackness would cover the planet, and burning
rock and other debris would be pelting down everywhere, setting much of the
planet ablaze. It has been estimated
that at least a billion and a half people would be dead by the end of the first
day. . . .
The amount of soot and floating ash from
the impact and following fires would blot out the sun, certainly for months,
possibly for years, disrupting growing cycles.
In 2001 researchers at the California Institute of Technology analyzed
helium isotopes from sediments left from the later KT impact and concluded that
it affected Earth’s climate for about ten thousand years. This was actually used as evidence to support
the notion that the extinction of the dinosaurs was swift and emphatic—and so
it was in geological terms. We can only
guess how well, or whether, humanity would cope with such an event.
And in all likelihood, remember, this
would come without warning, out of a clear sky. . . .
[We could not blow it up, if we saw one,
because we haven’t the rockets to do it with, and even if we did, it would take
a year’s notice at least. More likely is
that we would not spot it until it was fewer than six months away, far too late
to react.] Interestingly, because these
things are so difficult to compute and must incorporate such a significant
margin of error, even if we knew an object was heading our way we wouldn’t know
until nearly the end—the last couple of weeks, anyway—whether collision was
certain. For most of the time of the
object’s approach we would exist in a kind of cone of uncertainty. It certainly would be the most interesting
few months in the history of the world.
And imagine the party if it passed safely.
Think of the Earth’s orbit as a kind of
freeway on which we are the only vehicle, but which is crossed regularly by
pedestrians who don’t know enough to look before stepping off the curb. At least 90 percent of these pedestrians
[asteroids] are quite unknown to us. We
don’t know where they live, what sort of hours they keep, how often they come
our way. All we know is that at some
point, at uncertain intervals, they trundle across the road down which we are
cruising at sixty-six thousand miles per hour.
* * *
I spent way too much time with the
medical establishment the last half of August, but it wasn’t because anything
was wrong with me. Krystin developed a
tingling in her fingertips and up the side of her left hand and also lost much
strength in the arm. After I ignored her
for a week, on the theory that she had pulled a muscle or something, I finally
on a Monday emailed her endocrinologist to tell her about the symptoms. The doctor must have been working on her
computer, because she fired back a message telling me she wanted Krystin to be seen
that day and that this sounded like something progressive.
So starting with Neurology and
ending up in the emergency room because the regular departments had no
available time on the MRI machine, Krystin and I had a lot of “quality time”
together. For the next two weeks we had
appointments with Endocrinology, Family Practice, the MRI again, and then for
an EMG. That last one is a nasty test. I had one in 1992: as I lay on my stomach, with
the back of my arm up, the doc jabbed a needle that was, at the time, about
the size of a small screwdriver into my arm from my wrist to my shoulder, and
once inserted, moved it around to get electrical readings from the nerves and
muscles. I had little spots bleeding all
up my arm. No anesthetics allowed. And he had the nerve (no pun intended) to
tell me that I wasn’t “relaxed” as he was making his way up my arm jabbing me
and twisting this thing around. I didn’t
tell Krystin about the nature of the test, other than to say it was not the
most pleasant thing I had ever gone through.
In the intervening 12 years, at least the size of the needle has shrunk
and the readouts are now on a multi-colored computer monitor with all kinds of
nifty graphs and statistics. But the
test is the same: insert the needle and
then twist and turn it. Krystin bore up
pretty well, although she clenched her teeth several times and had tears in her
eyes from the pain. Oh yes, before they
start using the needle, they tape metal to your fingertips and shoulders and
then jolt you with little shocks of electricity to get readings on nerve
transmissions.
The upshot of all this was that
there was no identifiable nerve or muscle damage, although they did get
“abnormal” readings in her arms and legs (they did one insertion in the other
arm and a leg). Their default conclusion
was that high blood sugar numbers from her diabetes were causing neuropathy and
the likely “cure” was to keep the numbers under control. At least one friend on the Medical School
faculty, whom I told about all this, was skeptical that they could conclude it
was diabetic neuropathy absent any evidence.
Whatever the cause, by early October the tingling and weakness were
disappearing, Krystin told me. It may or
may not be coincidental that beginning in July, and especially after the tests,
she was doing a MUCH better job of controlling her blood sugars.
Fortunately, the other three of us
in the family had nothing other than the routine appointments with the health
care system.
* * *
Life in Minnesota
is wonderful. Pat and I went to the
theater on June 1 to see “The Pirates of Penzance.” We both had on long wool slacks, long-sleeved
shirts; I had on a fleece-lined jacket and Pat a light knee-length
overcoat. It was darn cold. And it didn’t warm up much immediately
thereafter; we were sitting on our back-yard deck on June 18 in jackets and
long pants because it was so cold.
(Don’t ask why we didn’t go in the house. In Minnesota
we sit outside whenever we can, since the long winters keep us
cooped up inside for months.) And we
were still sitting outside in jackets and long pants at the end of June! I finally turned on the air conditioning—my
measure of whether or not summer has arrived—on June 29.
We went to the theater again in August—and AGAIN we had
to wear jackets. It was among the
coldest Augusts since records have been kept.
* * *
Our international travel in 2004 consisted of a tour of
the United Nations building in New
York City.[3] After arriving home from Manhattan in June, where we had taken the
kids for 5 days, I sort of felt like I had arrived home from a foreign
country. That’s not intended to reflect
badly on Manhattan; it’s just that it is so
different from our lives in South Minneapolis
that it seems like a foreign place. The
small towns in northeastern Australia
were not quite as different.
As is our wont in traveling, we of course tried to cram
in as much as possible in a short period of time. We took the kids to see the Metropolitan
Museum of Art, the American Museum of Natural History, the United Nations, St.
Patrick’s Cathedral, Central Park, the World Trade Center site (via a walking
tour that included Battery Park, Wall Street, the first national capitol,
etc.), took the Circle Line boat tour around the entire island, walked across
the Brooklyn Bridge for dinner and then back to see the lights of
Manhattan. Pat and I saw the Cathedral
of St. John the Divine. Elliott and I
also visited Temple Emanu-el synagogue, on Central Park West, and went to the
top of the Empire
State Building. Pat and Krystin also went shopping (of
course) in mid-town and near the World
Trade Center
site.
A few observations about what we saw and did.
— St. John the Divine, an
Episcopal cathedral, is said to be the largest cathedral in the world. It is also sort of a pile, half Romanesque
and half Gothic, and not finished even though they have been trying to build it
since the early 1890s. The interior
makes some sense, as you look at the immensely vaulted ceiling, but the outside
is a complete jumble of construction that looks awkward and ugly. Some of the exterior stonework, completed in
recent decades, used living humans as models for what in an earlier day would
have been gargoyles or saints. Both St.
Patrick’s and Emanu-el, at least architecturally, make more sense and are more
coherent. They are also more
attractive. I was surprised to see a
rose window in Temple Emanu-el (along with a lot of other stained glass), which
I have always associated with Catholic and Episcopal cathedrals. All three religious edifices—St. Patrick’s, St. John, and Emanu-el—are
cavernous. (It was pointed out to me,
when I expressed surprise at a rose window in a synagogue, that a rose window
is/was an architectural/design element, not a religious symbol.)
— On our boat tour around Manhattan, we happened to see the
Queen Mary 2, which the tour guide said is the largest ship in the world (2600
passengers, 1200 crew, 17 decks, 4 swimming pools, 20 restaurants, etc.). I assume he meant “largest passenger ship”
because right next to where it had docked there was a WWII aircraft carrier
that is now a museum. He also told us
that a modern aircraft carrier is twice as long as the WWII version—which, just
looking at the two, would clearly make it MUCH larger than the QM2. And I once recall reading somewhere that the
crew of a modern aircraft carrier numbers nearly 5000 (or more). I cannot vouch for the accuracy of that
recollection.
— Museum restaurants have significantly improved the quality of
the food they offer. I was very
impressed both with the quality and the selections. And I have concluded that museum shops have
some of the most interesting merchandise to be found.
— Manhattan
is a great place to visit, and there are a million things to do, but I would
never want to live there (at least not at my age—I might have enjoyed it when
in my 20s). It is too noisy, too
crowded, and (for my tastes) too dirty.
The subway is an extremely effective way to move a large number of
people around, but most of the stations are grungy and ill-kempt. (The subways themselves, I should say, are
quite clean. The newer cars on some
routes were as pleasant as it is possible to be in a dark tunnel in the
ground.) I asked Elliott what he thought
about Manhattan;
he gave it the thumbs down (although he liked the food and the things he saw at
the museums).
— The kids said they enjoyed the trip, but they groused almost
constantly about how much we made them walk.
Gad, they are 30+ years younger than we are and you’d think we were
torturing them. But in general they are
getting better at traveling, and even traveling together!
— Elliott at one point asked where all the houses were. I recalled for him what the boat tour guy had
told us: 1.5 million people live on Manhattan. Then I pointed to all the tall buildings
around us and said “they live there.” He
didn’t like the idea of living in a condo or apartment. I am assured by a native New Yorker that in
the other boroughs there are neighborhoods like South
Minneapolis, with houses and yards. (According to Wikipedia, The Free
Encyclopedia on the Web, quoting the Census Bureau, “New
York County [Manhattan] is the most densely populated county in the United States. As of the census of 2000, there are 1,537,195 people, 738,644
households, and 302,105 families residing in the county. The population
density is 66,940 [per square mile].” The U.S.
average population per square mile is 79.6; in Minnesota,
it is 55 people per square mile; Hennepin
County is 1,854. I think you could say that Manhattan is densely populated. Duh.)
— Instead of doing what most people do when they visit New
York—go to Broadway plays—we went as a family to see the latest Harry Potter
movie (we had seen the first two together, along with the Lord of the Rings
movies, so this continued the habit). We
could not find a play that we could all agree on, and rather than spend $100
per ticket to make 1 or 2 of us see something we did not want to, we decided to
do a movie instead.
— Pat commented that she felt safer walking the streets of New
York at 11:00 at night (when there were still a LOT of people out and about,
presumably mostly tourists since sensible New Yorkers were probably home in
bed) than she would at home (where the streets of South Minneapolis would be
deserted and dark). The tour boat guide
said the crime rate in New York is the same as
Cucamonga, California.
(And with classic New York
narcissism, he added “whatever that is.”)
I don’t know if he made up the statistic, but he said it in all
seriousness.
Krystin provided me her summary of the trip.
My trip to NYC can be summed up in two words: Mentally Fulfilling.
My parents, and probably my brother, would beg to
differ, saying I complained all the time about my legs hurting or I’m thirsty
or have to go to the bathroom. But the
way I see it, in 10 years when I look back on it, I won’t remember all those
little details. I’ll be remembering the Museum of Modern Art,
lunch in Central Park with my brother, Ground Zero, walking across the Brooklyn Bridge,
and my mom sticking chop sticks up her nose at that fun little Brooklyn restaurant.
One of my most memorable moments would have to be visiting Ground Zero
during a 3-hour walking tour that the four of us took one morning. It was absolutely amazing to see how far
they’ve come in three years. Of course
I’ve seen the site on TV and heard it talked about all over the news, but
actually being there in person—it’s almost impossible to describe the emotions
that were going through my head. It was
incredible.
Of course I can’t go without mentioning one of my
favorite activities: shopping. There was plenty of it in New
York: 5th Avenue, Times
Square, some shopping mall.
I even stopped in Tiffany’s on 5th
Avenue just so I could get one of those famous
little turquoise Tiffany bags. I didn’t
have anything to put in it of course, so my mom took a white paper towel left
over from lunch and stuffed it in there so it looked like I got something. A girl can dream, can’t she? Then there was the day the four of us went to
the Natural History Museum. My mom and I
didn’t want to see some of the same things my dad and brother wanted to see, so
we split up. My mom and I were sitting
on the front steps of the museum taking a quick break and thought, if we sneak
off and go shopping instead, would they know?
But we figured if we came back with a bunch of shopping bags, it would
be a little obvious. So instead we
enjoyed the museum and all that it had to offer.
In thanks for all the entertainment New York provided us, I, in return, provided
some entertainment for a few lucky New Yorkers.
I might as well mention this incident because I know if I don’t, my
loving father will. We had just enjoyed
a nice dinner at a Mexican restaurant near Chinatown,
and were walking back to our hotel, me with leftover burrito in hand. We were crossing the street only a block from
our hotel, and in the middle of the street was one of those grates that workers
use to get underground. It was dark out,
and I was wearing flip-flop sandals (known to those of you from older
generations as “thongs”) and I remember thinking to myself, “I should watch out
for that grate, it could be dangerous.”
You know, it’s funny how once you put your mind to something else, how
fast you forget what you just told yourself not to do. So of course I misstep and trip right over
the grate, falling flat on my face and scraping my left elbow pretty badly. Luckily my right arm was safe because my
burrito was there to cushion the fall.
Both my dad and brother rush to help me up, and hold on to me as we
finish crossing the street and up onto the sidewalk. Where is my mom? Walking ahead of us so that I can’t see her
laughing. Thanks mom. Lucky to say, both my burrito and I were
fine; I was not hurt on the outside, aside from my scrape, but my ego was
hurting pretty badly for a while after that.
A lot of people, including my dad, have said that they
would never like to have grown up in New
York City. Although
the borough of Manhattan
itself is mostly devoid of individual houses and yards, I have come to the
personal conclusion that I would have enjoyed growing up in NYC. I have always been a big city girl (I say
that as I sit in my small-town college dorm room), and I love the city life,
with all the people and lights. A lot of
people would also say it is not a good place to raise a child, but children
that were raised in New York City
still grow up, still get married and get jobs, and have children themselves. So it doesn’t seem like that bad a place to live.
But, that’s just my opinion.
Besides, if you live in New
York, you wouldn’t need to buy a car because you
could just take the subway everywhere.
Which, by the way, is probably not really as dirty and grungy as my dad
makes it out to be; and I know he made a comment about them. It’s a subway in New York, dad, not the Orient Express!
All in all, the trip was a very good and very
memorable one. I’m sure that I will
never live there myself, but I would never turn down another opportunity to
visit the city of never-ending lights and people.
* * *
Elliott, in mid-November, took about $10 off me in
blackjack after I taught him how to play.
He uses a basic strategy card to maximize the odds (I’m always the
house), so he knows when to split pairs and double down. Should he ever choose to go to a casino
sometime when he’s older and gamble for fun, he’ll be a careful gambler. He plays for dimes with me (usually 20¢ per
hand). He starts out with a set amount,
and if he loses it, he quits. When he’s
ahead, he quits too. More often than not,
he ends a session having won more than he has lost. He’s
funny, though—if I get 20s or 21s a few times close together, he gets mildly
grumpy at losing 60¢ or 80¢, but he’ll go out and spend $50 on a video game or
$20 on a CD or a DVD and never think twice about it. Maybe that’s what will make him a good
gambler. Our thrifty son. When he was young, and accumulated his
allowance or birthday/Christmas money, he was like Ebenezer Scrooge. Now that he’s in his teens, he’ll drop money
on electronic stuff without hesitation.
But he gets surly at losing 60¢.
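His house rules amount to a simple stopping-rule algorithm, which can be sketched in a few lines of Python. (This is just my illustration, not anything he and I actually run; the per-hand win probability and even-money payout are simplifying assumptions, since real blackjack also has pushes, splits, doubles, and 3:2 payouts on naturals.)

```python
import random

def play_session(start=200, bet=20, stop_win=300, p_win=0.495, rng=None):
    """Simulate the stopping rule: bet a fixed 20 cents per hand,
    quit on going broke or on reaching a chosen win target.
    p_win is an assumed stand-in for basic-strategy odds."""
    rng = rng or random.Random(0)  # seeded for repeatability
    bankroll = start
    hands = 0
    while 0 < bankroll < stop_win:
        # win or lose one fixed bet per hand
        bankroll += bet if rng.random() < p_win else -bet
        hands += 1
    return bankroll, hands

result, hands_played = play_session()
# With these stakes the session always ends at 0 (broke) or 300 (target hit).
assert result in (0, 300)
```

The design point is the discipline, not the odds: because the bet size is fixed and the exit conditions are set before the first hand, no single session can cost more than the starting stake.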
* * *
As of the middle of fall semester, 2004, her third
semester in college, Krystin has decided that her major will be history. After she told us that she liked her history
course the most, and realized that she liked history most in high school, she
concluded she should be a history major.
Never in a million years did I think she would choose history—I think I
even ventured it as a suggestion before she started college and she reacted
like I had suggested she eat raw eggs.
But that’s what college is all about—learning what interests you, what
you’re good at, and following it up. As
someone who reads history, and who almost majored in it myself, I’m
delighted. But we’ll see if that
sticks. She doesn’t have to decide for
another semester.
* * *
When I look over this letter,
and the ones from prior years, someone who read them and did not know me very
well might think I go around with a sour look and a dour attitude about the
country and life in general. Of course
that’s not true; I’m actually pretty upbeat and of sunny disposition virtually
all the time (some might say foolishly so).
And while holiday letters are supposed to be good news and family
updates, mine are not only that; I like to use them to explore and think about
problems in the world that have my attention.
Since problems are by definition troubling, much of my letter that
follows probably conveys the impression I am troubled. I am, about the problems, but they don’t
haunt my every waking hour.
* * *
No matter how hard you try you will
never be able to grasp just how tiny, how spatially unassuming, is a
proton. It is just way too small.
A proton is an infinitesimal part of an
atom, which is itself of course an insubstantial thing. Protons are so small that a little dib of ink
like the dot on this i can hold
something in the region of 500,000,000,000 of them, rather more than the number
of seconds contained in half a million years.
So protons are exceedingly microscopic, to say the very least.
Now imagine if you can (and of course
you can’t) shrinking one of those protons down to a billionth of its normal
size into a space so small that it would make a proton look enormous. Now pack into that tiny, tiny space an ounce
of matter. Excellent. You are ready to start a universe. . . . [This is a spot where the material] is so
infinitesimally compact that it has no dimensions at all. It is known as a singularity.
Get ready for a really big bang. Naturally, you will wish to retire to a safe
place to observe the spectacle.
Unfortunately, there is nowhere to retire to because outside the singularity
there is no where. When the universe begins to expand, it won’t
be spreading out to fill a larger emptiness.
The only space that exists is the space it creates as it goes along. . .
.
And so from nothing our universe
begins.
In a single blinding pulse, a moment of
glory much too swift and expansive for any form of words, the singularity
assumes heavenly dimensions, space beyond conception. In the first lively second . . . is produced
gravity and the other forces that govern physics. In less than a minute the universe is a
million billion miles across and growing fast.
There is a lot of heat now, ten billion degrees of it, enough to start
the nuclear reactions that create the lighter elements—principally hydrogen and
helium, with a dash (about one atom in a hundred million) of lithium. In three minutes, 98% of all the matter there
is or ever will be has been produced. We
have a universe. . . . And it was all
done in about the time it takes to make a sandwich.
One thing that Bryson’s book made me realize is that it
must be very frightening or disturbing to live in a world in which, because of belief
or faith, one rejects many of the major advances made in the sciences. I have not reprinted any of Bryson’s
recounting of the advances in evolutionary biology or in the development of the
paleontological record because the tales, while as interesting as or more interesting
than the ones I quote, are too complex to squeeze even into collections of
paragraphs. But many in the world would
deny the findings of many of the biological sciences, as well as the cosmology
of the universe, because it either questions or contradicts long-held beliefs.
I
try to believe everything on a provisional basis, so I find all these things
quite exciting and interesting. One of
my great regrets about dying, apart from no longer being around my spouse and
children—assuming I die before they do—and losing all my friends, is that I
will miss all of the interesting discoveries in science that will inevitably be
made after I am gone.
In the words of a scientist friend of mine, “any viewpoint which
is provisional is in the realm of the intellect. Any viewpoint which cannot be revisited is
automatically disqualified and becomes a matter of faith.” The big bang theory (which Bryson talks
about, and which I excerpt later), while “provisional” in the technical sense,
is probably only slightly more provisional than the theory of gravity. I asked an astronomer faculty friend about
this. I was told this: “Most
astronomers don’t refer to the Big Bang as a theory anymore. We talk about Big Bang-type cosmologies or
models. The 1967 discovery of the
3-degree background radiation confirmed that the Universe began in a
super-dense state of matter and energy.
No doubt about it. . . . The Big
Bang was the creation and our observable Universe is a consequence of that
creation. Why it happened, we don’t
know, but we as astronomers try to understand what happened and is still
happening as we observe stars being born and dying. We are all products of the Universe.”
I do
want to make it clear, if and as you read this letter, that while I am not
religious, I am certainly not anti-religious, either. The choice of belief is personal. In some of the articles I quote, it may seem
that there is an anti-religious bias, which I do not intend. My point, in a couple of places, is rather to
look at the implications of injecting (certain) religious beliefs into public
policy. I think it fair to look at what
happens when church and state become intertwined, both here and elsewhere
around the world. But I do not want
friends and relatives of faith to think I am taking pot shots at them, because
I am not. And I trust that all of you
have known me long enough to know that I would not do so.
* * *
Those of you who read The New York Times regularly
will have seen this article a while ago, but I thought it was fascinating: “The Futile Pursuit of Happiness.”[4] The reporter spent time with psychologists at
Harvard and the University of Virginia, an economist at Carnegie-Mellon, and a
Nobel Laureate in economics at Princeton
talking about how people predict how they will feel in the future. (I checked with a friend in the Department of
Psychology at Minnesota,
who assured me that “they are respectable researchers and the work has been
replicated many times. . . . You
can trust it.”) These four guys have
been studying
the
decision-making process that shapes our sense of well-being: how do we predict what will make us happy or
unhappy—and then how do we feel after the actual experience? For example, how do we suppose we'll feel if
our favorite college football team wins or loses, and then how do we really
feel a few days after the game? How do
we predict we'll feel about purchasing jewelry, having children, buying a big
house or being rich? And then how do we
regard the outcomes? According to this small corps of academics, almost all
actions—the decision to buy jewelry, have kids, buy the big house or work
exhaustively for a fatter paycheck—are based on our predictions of the
emotional consequences of these events.
What they have found is that people are generally lousy
at understanding “what we want and . . . improving our well-being.” We make mistakes when we imagine “how we will
feel about something in the future.” We
don’t get things basically wrong—we all know we will react differently to
unhappy or aversive events and happy events.
What they found, however, is
that
we overestimate the intensity and the duration of our emotional reactions . . .
to future events. In other words, we
might believe that a new BMW will make life perfect. But it will almost certainly be less exciting
than we anticipated; nor will it excite us for as long as predicted. The vast majority of Gilbert’s [one of the
psychologists] test participants through the years have consistently made just
these sorts of errors both in the laboratory and in real-life situations. And whether Gilbert’s subjects were trying to
predict how they would feel in the future about a plate of spaghetti with meat
sauce, the defeat of a preferred political candidate or romantic rejection
seemed not to matter. On average, bad
events proved less intense and more transient than test participants predicted.
Good events proved less intense and
briefer as well.
As interesting is that we appear to have internal
defenses against big bad things (death of a family member, loss of a
job) but not little bad things.
“Our emotional defenses snap into action when it comes to a divorce or a
disease but not for lesser problems. We
fix the leaky roof on our house, but over the long haul, the broken screen door
we never mend adds up to more frustration.”
One of the researchers commented that “our research simply says that
whether it’s the thing that matters or the thing that doesn’t, both of them
matter less than you think they will.” “Things that happen to you or that you buy or
own—as much as you think they make a difference to your happiness, you’re wrong
by a certain amount. You’re overestimating how much of a difference they
make. None of them make the difference
you think. And that’s true of positive
and negative events.” We never seem to
learn, however. We will “adapt to the
new BMW and the plasma TV . . . but we seem unable to predict that we will
adapt. Thus, when we find the pleasure
derived from a thing diminishing, we move on to the next thing or event and
almost certainly make another error of prediction, and then another, ad
infinitum.”
The defense against negative events is also something
that we seem not to realize that we possess, even though many—most—have
experienced traumatic events. Typically,
if a loved one dies, “we project a permanently inconsolable future.” People do not “’recognize how powerful psychological
defenses are once they’re activated. . . . We’ve used the metaphor of the
‘psychological immune system’—it’s just a metaphor, but not a bad one for that
system of defenses that helps you feel better when bad things happen. . .
. What's surprising is that people don’t
seem to recognize that they have these defenses, and that these defenses will
be triggered by negative events.’”
So, nothing will be as bad as you expect—and nothing will be
as good as you expect. And you will get
over it, however bad it is.
* * *
If one reads history, as I like to do, one sometimes
wonders about how to judge the actions and beliefs of people in other times and
places. Four of the first five U.S.
presidents—Washington, Jefferson, Madison, Monroe—owned slaves. Only Washington
freed them upon his death. Alexander
Hamilton argued slavery was immoral (as did a lot of others). Martin Luther was a vicious anti-Semite (and
urged the princes of Germany
to squash peasant uprisings provoked by food shortages). The list is long, and not only American. How is one to measure these figures of
history?
Christopher Orlet, in an interesting piece on the
Butterflies and Wheels website (“fighting fashionable nonsense”), “A Defense of
Whig History,” argues that it is appropriate to judge historical figures
by some modern day standards.[5] The prevailing view, he writes, is that
“judging historical figures, particularly a nation’s heroes, by contemporary
moral standards is unfair. Among many
historians it is not only unfair, it is an academic abomination known
derisively as whig history.” Orlet
argues that whig history is used to rationalize “depraved behavior.”
The
Great Reformer [Martin Luther], it is repeatedly alleged, was but a product of
his time and place, i.e., a typical superstitious, Jew-hating, Medieval Saxon,
and as such modern society cannot hold him accountable for beliefs, ideas and
actions that only today in our hypersensitive, morally advanced times are
thought sinful. This scarcely corresponds with our innate need to hold our
ecclesiastical heroes--men like Luther, Augustine of Hippo, and the numerous
contemporary Muslim clerics thought to have God’s ear--to a higher standard of
moral accountability than the rest of us mere laypeople. . . . Yes, Luther was an ultra-nationalist who
loathed Jews, Anabaptists, Catholics, peasants, the Renaissance and reason, but
didn’t everyone? And yes, Augustine
advocated burning heretics and advised that the Jew “suffer and be continually
humiliated,” but then in his day that was simply par for the course. Taken to its logical conclusion, then, we
must concede that Adolf Hitler and Heinrich Himmler were but products of their
time and national character. Thus if
Martin Luther’s anti-Semitism is excused on grounds that it was normal for his
place and time, are we to absolve the Nazis of the Holocaust since their
anti-Semitism was similarly common in twentieth-century Deutschland?
One
problem with accepting some version of whig history is that it becomes difficult
to argue that moral and ethical standards evolve (and get better). If one engages in whig history, “many of our
historical heroes would come off looking no better than a Senator Joe McCarthy
or a Slobodan Milosevic.”
Orlet says he doesn’t buy the objection to whig
history. “Generally speaking, wrong has
always consisted of inflicting injuries on other people, whereas ‘right wrongs
no man,’ to quote the Scottish proverb.
It follows then that murder, hatred, exploitation, intolerance, and
bearing false witness have always been wrong, and have always been known to be
wrong.” Presumably the Ten Commandments
have been known for quite a few centuries, so one can make an argument “the
Christian rabble-rousers of the Middle Ages who led the persecution of
‘witches’ and ‘Jewish devils’ were fully aware of the viciousness of their
acts, despite the blessings of Mother
Church. Anti-Semitism and
slavery remain two of history’s popular moral benchmarks, though most modern
historians grant dispensations to historical heroes for their Jew hatred and
slave-owning. Difficulties arise,
however, when one recalls men, even in Medieval Europe, who condemned the
heinousness of anti-Semitism and slavery. One of these was the theologian Pierre
Abelard, who, in the tenth century, wrote in defense of the Jews. . . . We know what Abelard received for his pains:
murder attempts, condemnation and castration. Meanwhile Luther’s excuse was that Yahweh
expected too much from sinful man, that there was no way in hell mankind could
keep God’s rigorous commandments. May as
well then toss Holy Writ down the crapper.”
I have always been somewhat uneasy trying to absolve
historical figures—even the ones I admire—of what seem to me behaviors or
beliefs that should have been unacceptable at any time, even if that does seem
like imposing 21st century standards on the past. There are plenty of examples beyond
Abelard. At the same time, however, it
is probably not fair to judge those in the past by every ethical and moral
standard that is accepted today. All of
which means that an historian or biographer has to be a discriminating analyst
and judge.
* * *
I happened upon an article about a “discipline” that I
had never heard of, “science studies.”
The more I read, the more annoyed I became.
Science studies, . . . since its inception in the
1970s, . . . has produced a sizeable body of work that purports to show that
not just the agenda, but even the content of theories of natural sciences is “socially
constructed.” All knowledge, in
different cultures, or different historical times—regardless of whether it is
true or false, rational or irrational, successful or not in producing reliable
knowledge—is to be explained by the same causes. . . .
In other words, being
scientific about science means we no longer believe that modern science as we
practice it in universities and companies is discovering “truth.”
In principle, there is nothing whatsoever wrong in the
agenda of science studies: Modern
science is not a sacred form of knowledge that cannot be examined
skeptically. Science and scientists must
welcome a skeptical look at their enterprise from social critics. The problem with science studies comes in
their refusal to grant that modern science has evolved certain distinctive
methods (e.g., controlled experiments and double-blind studies) and distinctive
social practices (e.g., peer review, reward structures that favor honesty and
innovation) which promote a higher degree of self-correction of evidence, and
ensure that methodological assumptions that scientists make themselves have
independent scientific support. Science
studies start with the unobjectionable truism that modern science is as much a
social process as any other local knowledge.
Here
is the kicker. “Science studies scholars
invariably end up taking a relativist position. They argue, in essence, that
what constitutes relevant evidence for a community of scientists will vary with
their material/social and professional interests, their social values including
gender ideologies, religious faith, and with their culturally grounded
standards of rationality and success. . . .
The sciences of modern western societies are not any more ‘true’ or ‘rational’
than the sciences of other cultures. If
modern science claims to be universal, that is because Western culture has
tried to impose itself on the rest of the world through imperialism.” What this comes down to is that one cannot
discriminate between science and religion, or between science and belief based on nothing
at all or on ideology.
Those
who take this view want, for example, “’to pursue science . . . as
Christians, starting from and taking for granted what we know as Christians.’” The approach is not limited to Christianity,
however; there are those who also want to engage in “’Vedic science’ [Hindu] or
‘Islamic sciences,’ complete with miracles and other manifestations of the
supernatural.” If one eliminates the “godless,
materialist assumptions of modern scientists (who happen to be overwhelmingly
male, white, imperialist Westerners anyway),” the argument goes, scientific
evidence can lend support to an approach that includes “God as acting in
nature.”
[These other] movements are opposed to a naturalistic
worldview which happens to be fundamental and necessary for
science-as-we-know-it. They are keen on
asserting their right to their own sacred sciences because they want to bring
in the supernatural as an explanation of natural phenomena. All religious fundamentalists want a
full-blooded version of their faith, which is there not just for spiritual
solace, . . . but which can make propositional claims about the world. They want to believe in a God that actually
does some work in this world, both at the level of nature and at the level of
social life of men and women. Naturalism
as a worldview, and as a method, makes God, or Spirit, irrelevant and
unnecessary to explaining the workings of nature, human beings and
society. Religious fundamentalists
correctly sense that naturalism is the biggest threat there is to a strong
version of their faith.
Naturalism
is the opposite of supernaturalism and denies supernaturalism in the operation
of the universe in which we live. For
scientists who embrace naturalism (which is virtually all of them in the United
States and the rest of the developed world), there is no room for a deity “over
and above this world who can abrogate natural laws, as in Judeo-Christian and
Islamic traditions; or as a spiritual force or energy that resides in nature,
but is able to exist separately from it as well, as in the case of Hinduism and
Taoism.” The practical import of
naturalism is not to be anti-religious, but simply an affirmation that “all
hypotheses and events are to be explained and tested by reference to natural
causes and events which can be apprehended by sense experiences.” Religious fundamentalists, however, refuse to
accept that approach, and “hold that there are more valuable avenues of
knowledge than ‘mere’ sense experiences.”
Naturalism
also takes the position that “nature is best accounted for by reference to
material principles, i.e., by mass, energy and physical-chemical properties
which have been described by sciences employing naturalistic methodology.” Small segments of most major religions accept
naturalism (the Catholic Church, for example, accepts Darwinian evolution, with
the caveat that there is a soul).
Finally, naturalists maintain that “[t]here is no defensible evidence for
the existence of supernatural forces anywhere in nature. Naturalism is the only
possible inference from all the scientific evidence available to us, up to this
moment.”
This
is not, of course, a trivial issue. “In
the militantly conservative movements in all major world religions, then, the
secularization of science—the hard-won freedom of science from the churches,
the Brahmins and the mullahs—is seen as a modernist error that needs to be
corrected.” We also see it in lawsuits
across the country seeking to have creationism or “intelligent design” taught
alongside evolution in K-12 biology classes.
Even as I wrote this, I noted an article in the newspaper about a
lawsuit in Alabama
seeking to remove stickers from biology textbooks telling students that
evolution is only a theory and that they need to approach it critically.
My conclusion is that when I go into the operating room,
I want to be operated on by a surgeon trained in naturalistic science, not “Islamic
science” or “Vedic science” or “Christian science.” I just hope this kind of stuff doesn’t spread
so far that it inhibits the scientific advances I want to see in the world so
that, for example, Krystin’s diabetes can be cured. But I also noted Benjamin Franklin’s comment
to his grandson after he (Franklin) had been part of a commission that debunked
mesmerism, “some think it will put an end to Mesmerism, but there is a
wonderful deal of credulity in the world, and deceptions as absurd have supported
themselves for ages.”[6]
The only consolation one might take out of the spread of
“Islamic” or “Vedic” science is that those nations/groups that rely on such
“science” are unlikely to be very successful in the international competition
to develop knowledge (e.g., treatments or cures in medicine, advances in
engineering, and so on). I will wait to
see how many “Christian” or “Islamic” scientists win Nobel Prizes. The sad part of such ventures is that much
time and effort on the part of otherwise very bright scientists might be wasted
on fruitless ventures. One thinks of
Lysenko and the Soviet effort in the 1950s to advance an approach to genetics
that Western scientists knew was nonsense:[7]
When
the rest of the scientific world were pursuing the ideas of Mendel and
developing the new science of genetics, Russia
led the way in the effort to prevent the new science from being developed in
the Soviet Union. Thus, while the rest of the
scientific world could not conceive of understanding evolution without
genetics, the Soviet Union used its political
power to make sure that none of their scientists would advocate a genetic role
in evolution. It was due to Lysenko’s
efforts that many real scientists, those who were geneticists, . . . were sent
to the gulags or simply disappeared from the USSR. Lysenko rose to dominance at a 1948
conference in Russia
where he delivered a passionate address denouncing Mendelian thought [Gregor
Mendel is the “father” of modern genetics] as ‘reactionary and decadent’ and declared
such thinkers to be ‘enemies of the Soviet people.’ He also announced that his speech had been
approved by the Central Committee of the Communist Party. Scientists either groveled, writing public
letters confessing the errors of their way and the righteousness of the wisdom
of the Party, or they were dismissed.
Some were sent to labor camps. Some were never heard from again. Under Lysenko's guidance, science was guided
not by the most likely theories, backed by appropriately controlled experiments,
but by the desired ideology. Science was
practiced in the service of the State, or more precisely, in the service of
ideology. The results were predictable: the steady deterioration of Soviet
biology. Lysenko’s methods were not
condemned by the Soviet scientific community until 1965, more than a decade
after Stalin's death.”
* * *
No persuasive reason has ever been
advanced for why hominid brains suddenly began to grow two million years
ago. For a long time it was assumed that
big brains and upright walking were directly related—that the movement out of
the forests necessitated cunning new strategies that fed off of or promoted
braininess—so it was something of a surprise, after the repeated discoveries of
so many bipedal dullards, to realize that there was no apparent connection
between them.
‘There is simply no compelling reason we
know of to explain why human brains got large,’ says Tattersall. Huge brains are demanding organs; they make
up only 2 percent of the body’s mass, but devour 20 percent of its energy. . .
.
Tattersall thinks the rise of a big
brain may simply have been an evolutionary accident. He believes with Stephen Jay Gould that if
you replayed the tape of life—even if you ran it back only a relatively short
way to the dawn of hominids—the chances are ‘quite unlikely’ that modern humans
or anything like them would be here now.
‘One of the hardest ideas for humans to accept,’ he says, ‘is that we
are not the culmination of anything.
There is nothing inevitable about our being here. It is part of our vanity as humans that we
tend to think of evolution as a process that, in effect, was programmed to
produce us.’
* * *
I received from a friend and passed along to a few others
a few quotes from the right-ward end of the political spectrum. Here are a few samples.
“We need to execute people like John Walker in order
to physically intimidate liberals, by making them realize that they can be
killed, too. Otherwise, they will turn
out to be outright traitors.” - Ann Coulter, at the
Conservative Political Action Conference, 02-26-02
“Environmentalists are a socialist group of
individuals that are the tool of the Democrat Party. I’m proud to say that they are my enemy. They are not Americans, never have been
Americans, never will be Americans.” - Rep. Don Young
(R-Alaska), Alaska
Public Radio, 08-19-96
“I would warn Orlando
that you’re right in the way of some serious hurricanes, and I don’t think I’d
be waving those flags in God’s face if I were you. This is not a message of hate; this is a
message of redemption. But a condition
like this will bring about the destruction of your nation. It’ll bring about terrorist bombs; it’ll bring
earthquakes, tornadoes and possibly a meteor.” - Pat
Robertson, speaking of organizers putting rainbow flags up around Orlando to support sexual
diversity, The Washington Post, 06-10-98.
For the record, Orlando remains undestroyed by meteors.
“Emotional appeals about working families trying to
get by on $4.25 an hour are hard to resist. Fortunately, such families do not exist.”
- Rep. Tom DeLay (R-Texas), House Majority Whip, during a debate on increasing
the minimum wage, Congressional Record, H3706, 04-23-96
“Chelsea is a Clinton. She bears the taint; and though not
prosecutable in law, in custom and nature the taint cannot be ignored. All the great despotisms of the past—I’m not
arguing for despotism as a principle, but they sure knew how to deal with
potential trouble—recognized that the families of objectionable citizens were a
continuing threat. In Stalin’s penal
code it was a crime to be the wife or child of an ‘enemy of the people.’ The Nazis used the same principle, which they
called Sippenhaft, ‘clan liability.’ In
Imperial China, enemies of the state were punished ‘to the ninth degree’: that is, everyone in the offender’s own
generation would be killed and everyone related via four generations up, to the
great-great-grandparents, and four generations down, to the
great-great-grandchildren, would also be killed.” - John
Derbyshire, National Review, 02-15-01
“I know this is painful for the ladies to hear, but if
you get married, you have accepted the headship of a man, your husband. Christ is the head of the household and the
husband is the head of the wife, and that’s the way it is, period.”
- Pat Robertson again, The 700 Club, 01-08-92
“I tell people don’t kill all the liberals. Leave enough so we can have two on every
campus—living fossils—so we will never forget what these people stood for.”
- Rush Limbaugh, Denver
Post, 12-29-95
“Get rid of the guy. Impeach him, censure him, assassinate him.”
- Rep. James Hansen (R-Utah), talking about President Clinton, as reported by
journalist Steve Miner of KSUB radio who overheard his conversation, 11-01-98
“We’re going to keep building the party until we’re
hunting Democrats with dogs.” - Sen. Phil Gramm
(R-Texas), Mother Jones, 08-95
“My only regret with Timothy McVeigh is he did not go
to The New York Times building.” - Ann Coulter,
The New York
Observer, 08-26-02
“Homosexuals want to come into churches and disrupt
church services and throw blood all around and try to give people AIDS and spit
in the face of ministers.” - Pat Robertson again, The
700 Club, 01-18-95
“Two things made this country great: White men & Christianity. The degree these two have diminished is in
direct proportion to the corruption and fall of the nation. Every problem that has arisen (sic) can be
directly traced back to our departure from God’s Law and the disenfranchisement
of White men.” - State Rep. Don Davis (R-NC), e-mailed
to every member of the North Carolina House and Senate, reported by The
Fayetteville Observer, 08-22-01
When Pat read them, she wrote an email back to ask me if
there were a similar set of quotes one could compile on the left-ward end of
the political spectrum. I told her I did
not believe so, because in general I have not found those on the liberal end of
the spectrum to engage in the kind of viciousness that characterizes some of
these quotations. I asked a couple of
friends about my surmise and they thought I was generally right.
One possible answer to Pat's question appeared in an
essay in The Chronicle of Higher Education by Alan Wolfe, a political
scientist at Boston
College.[8] He wrote about an obscure German philosopher
named Carl Schmitt, who was a Nazi party member who survived the war (with
reputation mostly intact) and died in 1985 at the age of 96. Interestingly, Schmitt was embraced by those
on the left, “impressed by his no-nonsense attacks on liberalism and his
contempt for Wilsonian idealism.” Wolfe
says it is “not that they admire Schmitt's political views. But they recognize
in Schmitt someone who, very much like themselves, opposed humanism in favor of
an emphasis on the role of power in modern society.” The leftists understood that Marxism needed
reconsideration after the collapse of communism around the world, but “they
have clung fast to an authoritarian strain in Marxism represented by such
20th-century thinkers as V.I. Lenin.”
Perhaps more to my point, in Wolfe’s analysis “Schmitt's
way of thinking about politics pervades the [way] contemporary . . . conservatism
has flourished, often in ways so prescient as to be eerie. In particular, his
analysis helps explain the ways in which conservatives attack liberals and
liberals, often reluctantly, defend themselves.”
Schmitt
wrote that . . . in politics, the core distinction is between friend and enemy.
That is what makes politics different
from everything else. Jesus’s call to
love your enemy is perfectly appropriate for religion, but it is incompatible
with the life-or-death stakes politics always involves. Moral philosophers are preoccupied with
justice, but politics has nothing to do with making the world fairer. Economic exchange requires only competition;
it does not demand annihilation. Not so
politics.
”The political is the most intense and extreme antagonism,” Schmitt wrote. War is the most violent form that politics takes, but, even short of war, politics still requires that you treat your opposition as antagonistic to everything in which you believe. It’s not personal; you don’t have to hate your enemy. But you do have to be prepared to vanquish him if necessary.
Conservatives have absorbed Schmitt's conception of politics much more thoroughly than liberals. Ann H. Coulter, author of books with titles such as Treason: Liberal Treachery From the Cold War to the War on Terrorism and Slander: Liberal Lies About the American Right, regularly drops hints about how nice it would be if liberals were removed from the earth, like her 2003 speculation about a Democratic ticket that might include Al Gore and then-California Gov. Gray Davis. “Both were veterans, after a fashion, of Vietnam,” she wrote, “which would make a Gore-Davis ticket the only compelling argument yet in favor of friendly fire.”
Liberals, in contrast, according to Schmitt, “can never
be political. Liberals tend to be optimistic about human nature, whereas ‘all
genuine political theories presuppose man to be evil.’ Liberals believe in the possibility of
neutral rules that can mediate between conflicting positions, but to Schmitt
there is no such neutrality, since any rule--even an ostensibly fair one--merely
represents the victory of one political faction over another.” Schmitt liked thinkers “who treated politics
without illusions” and leaders who are “in no way in thrall to the
individualism of liberal thought, are willing to recognize that sometimes
politics involves the sacrifice of life. They are better at fighting wars than
liberals because they dispense with such notions as the common good or the
interests of all humanity. . . . Conservatives are not bothered by injustice
because they recognize that politics means maximizing your side’s advantages,
not giving them away. If unity can be achieved only by repressing dissent, even
at risk of violating the rule of law, that is how conservatives will achieve
it.”
Wolfe summarizes what he sees as the residual impact of
Schmitt’s thinking on American politics:
Liberals
think of politics as a means; conservatives as an end. Politics, for liberals,
stops at the water's edge; for conservatives, politics never stops. Liberals
think of conservatives as potential future allies; conservatives treat liberals
as unworthy of recognition. Liberals believe that policies ought to be judged
against an independent ideal such as human welfare or the greatest good for the
greatest number; conservatives evaluate policies by whether they advance their
conservative causes. Liberals instinctively want to dampen passions;
conservatives are bent on inflaming them. Liberals think there is a third way
between liberalism and conservatism; conservatives believe that anyone who is
not a conservative is a liberal. Liberals want to put boundaries on the
political by claiming that individuals have certain rights that no government
can take away; conservatives argue that in cases of emergency
-- conservatives always find cases of emergency -- the reach and capacity
of the state cannot be challenged.
There are, of course, no party lines when it comes to conservatives and liberals in the United States. Many conservatives, especially those of a libertarian bent, are upset with President Bush's deficits and unenthusiastic about his call for a constitutional amendment to ban gay marriage. And, on the other side of the fence, there are liberals and leftists who want to fight back against conservatives as ruthlessly as conservatives fight against them.
Still, if Schmitt is right, conservatives win nearly all of their political battles with liberals because they are the only force in America that is truly political.[9]
I
imagine that at least some conservatives would disagree with Wolfe’s
characterization—and no doubt many of them repudiate both Schmitt’s views and
Coulter’s language. But I think there is
a grain of truth in the description of the different approaches to politics
that the two ends of the spectrum bring. A
friend of mine sent me a copy of an editorial[10]
doing a post-mortem on the election; the author noted that a number of articles
“attack Kerry for his nuance over key issues and his reluctance to go for the
Republican jugular when it came to the war in Iraq.”
One
faculty colleague wrote back to me, after reading the list, that “this is
pretty appalling. This is stuff that
clearly would be right at home in Nazi Germany or Spain during the Inquisition.”
* * *
I had a hallway conversation last spring with one of the
senior faculty in one of the biological sciences, and he made a comment that I
thought portends ill for the country. In
light of the increasing secretiveness of the federal government (especially when
it comes to research), the general social climate, and the direction of public
policy, he told me, a number of his colleagues in the biological sciences have
said to him that it may be time to leave the United States. These are not foreigners who came to the
U.S.—these are Americans born and raised. I see that one of the leading scientists
doing stem cell research has given up; he left the University
of California and went to England,
where stem cell research is supported. I
wonder if he will now return to California.
Speaking of science, the University of Minnesota
found itself the target of proposed legislation, sponsored at one point by 35
members of the Minnesota House of Representatives, that would have eliminated or
reduced funding for the University if it conducted any embryonic stem cell
research. The University had announced
that it would conduct such research, without any public money, on stem cells
from embryos that were to be discarded from in-vitro-fertilization
clinics. As anyone who has followed the
stem cell research issue is aware, (1) the federal government has banned the
use of federal funds for stem cell research except on certain cell lines that
already existed (and which researchers have concluded are largely worthless for
their research), and (2) stem cell research holds enormous promise for the
treatment or cure of such diseases as childhood diabetes, Parkinson’s, and
cardiovascular disease, to name a few.
Fortunately, the legislation died a quiet death.
At the same time these folks in the Minnesota
legislature were proposing the restrictive legislation, the New
Jersey legislature was considering legislation that would have
appropriated about $10 million in state funds for stem cell research and California was considering
a referendum to provide $3 billion for stem cell research. (The ambition was breathtaking and the
measure passed handily, with support from Governor Schwarzenegger as well as
Christopher Reeve and the Reagan family, although it was opposed by the
Republican party in California.) Up to now, the pioneering work in stem cell
research has been taking place outside the United States because of the
federal restrictions. (The states may
fund such research—it is entirely legal—but universities may not use federal
funds for stem cell research.) Some of
my faculty colleagues pointed out that Minnesotans will certainly take
advantage of any treatments or cures that come from stem cell research; all the
proposed restriction would have meant is that none of the advances would be
discovered at the University of Minnesota.
The other drawback to this kind of legislative proposal
is the threat to academic freedom.
Academic freedom does not always play well in public, I know, but it is
not clear to me how else a society can effectively conduct research. A university is one of the few places in
society that (at least tries to) speaks truth to power and investigates (and
sometimes defends) that which is unpopular.
The University does not prescribe what research faculty should do. The idea of the state legislature prescribing
or proscribing otherwise legal research is scary. The U.S. (at least up until the
imposition of restrictions on stem cell research) has built up an incredible
record of medical triumphs, many if not most of which came out of university
laboratories. History suggests that research
ordered from on high isn’t particularly effective; the federal government
achieves some direction in research by seeking grants in certain areas (e.g.,
cancer treatments, heart disease, rocketry, engines, whatever). But a
faculty member can seek or not seek those federal research funds, as he/she
wishes. In this case, the legislature is
proposing to infringe on the right of faculty members (or anybody else, for
that matter) to conduct research that is perfectly legal. Recently Iowa
imposed a ban on stem cell research—and quite promptly some of the top
medical researchers in the country left the University of Iowa.
The history of American higher education is replete with examples of political
forces intervening to stop or hinder research that some group did not like—and
without exception they look rather stupid in retrospect. This incident, I am sure, will repeat that
pattern.
I wrote immediately to my state legislators asking that
they oppose the legislation; they agreed with me and said they would. Both also expressed support for a contrary
piece of legislation being introduced, in response, specifically providing
state funds to the University to conduct such research. I also pointed out to them that if the United
States does not create an environment that supports stem cell research, not
only will the discoveries be made elsewhere in the world, it may be that
Americans will have to fly to other locations (England, Germany, Korea, Japan)
in order to receive treatment. One could
have said “so much for America’s
leadership in the medical sciences,” but with the passage of the California initiative,
it will now likely become the world’s leading center of stem cell
research. I wrote to a friend shortly
after the election that, to quote Ross Perot, that “giant sucking sound” you will
hear will be the best and brightest young researchers leaving our universities,
all heading west to appointments at California
institutions. (I have asked my two
legislators if there is any chance that the Minnesota legislature will do an about-face from
last year and appropriate funds for stem-cell research in order that the
University can remain a leader in this kind of research. I suspect the likelihood is extremely small.)
Those who oppose embryonic stem cell research believe
that the research involves the taking of a life (the embryo)—even if the
embryos would have been dumped in the garbage.
This strikes me as an example of where preconceived and fixed
ideological notions clash with support for advances that would benefit all of
us. It is interesting, however, that
both Senator Orrin Hatch, otherwise a tenacious conservative, and Nancy Reagan
both support stem cell research. I am
particularly incensed about this because finding a treatment for diabetes would
prolong Krystin’s life and vastly improve its quality. As well as perhaps lead to an effective
treatment for Parkinson’s, for which I am at some risk because my father has
it. To say nothing of the millions of
other people the treatments have the potential to benefit. But some will insist on calling a
to-be-discarded embryo a life, and thus not to be used in research.
Michael Kinsley, writing in The New Republic, made
some interesting points on this issue, I thought. He began by writing that “the one unavoidably
admirable thing about the Right to Life movement is the selflessness of its
cause. For all the damage it has done to
American politics, . . . for all the misery it would create if it got its way,
for all the thuggishness of its rhetoric and sometimes its action, for all the
essential wrongness of what it believes and wants to impose on nonbelievers, .
. . almost uniquely among powerful interest groups, it is dedicated to the
interest of someone else. Or, rather,
something it believes to be someone else.”
Kinsley says the argument that life begins at conception is not a tough
one to make; “there is a comfortable intuitive logic to it. Any other line you might draw—birth? second trimester? age 18?—raises the question of how you can
confer humanity on a being just this side of the line and deny it to a
virtually identical one just the other side.”
And moreover, when one poses the right to terminate a pregnancy when
there is an identifiable human organism against a lesser right, such as a
woman’s right to choose, the fetus wins.
Kinsley argues that the stem cell debate is different
because there is no identifiable human being; to the human eye, there is no
identifiable anything because one needs a powerful microscope to see an embryo
just days after conception. “Any humanity
you confer on it must derive from faith, not observation nor logic.” On the other side, many human lives are at
stake if stem-cell research lives up to its promise (and the facts, thus far,
look very good). The stem-cell debate
puts the issue more starkly than the abortion debate, he maintains. Given that many of the embryos that would be
used for stem-cell research are discarded or frozen, byproducts of the in-vitro
fertilization business, “the selflessness required to say, ‘OK, I’ll suffer and
die prematurely so that this dot can stay frozen for the next thousand years’
is much more dramatic.”
The stem-cell debate, Kinsley says, also affects the
abortion debate. If the anti-abortionist
accepts the premise that these embryos can be used for stem-cell research,
“once you decide that a five-day-old embryo maybe isn’t as human as you and me,
the tempting logical clarity of the absolutist right-to-life position
disappears. . . . The possibility that
human life emerges gradually, like so much in nature, and doesn’t turn on
instantly like an electric light, doesn’t seem so implausible.”
In other words, when a fetus grows into a baby that is
born (or when the mother is forced by law to carry the fetus full term and give
birth to a baby), there is no cost to another living human being (except for
the non-trivial burdens on the mother, rarely father, grandparents, siblings,
society, etc., in providing for a child that was not wanted). But when a to-be-discarded-anyway microscopic
embryo is not used (assuming advances in medicine), then there is great cost to
many other living human beings.
I am not a single-issue voter, but if I were, my single
issue would be support for stem cell research.
* * *
The great Caltech physicist Richard
Feynman once observed that if you had to reduce scientific history to one
important statement it would be “All things are made of atoms.” They are everywhere and they constitute
everything. Look around you. It is all atoms. Not just the solid things like walls and
tables and sofas, but the air in between.
And they are there in numbers you cannot conceive.
The basic working arrangement of atoms
is the molecule. . . . A molecule is
simply two or more atoms working together in a more or less stable
arrangement: add two atoms of hydrogen
to one of oxygen and you have a molecule of water. Chemists tend to think in terms of molecules
rather than elements in much the way that writers tend to think in terms of
words and not letters, so it is molecules they count, and these are numerous to
say the least. At sea level, at a
temperature of 32 degrees Fahrenheit, one cubic centimeter of air (that is, a
space about the size of a sugar cube) will contain 45 billion billion
molecules. And they are in every single
cubic centimeter around
you. . . . Think about how many [sugar cubes] it would
take to build a universe. Atoms, in
short, are very abundant.
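(As a hedged aside: the standard back-of-envelope way to check a molecules-per-volume figure like Bryson's is the ideal gas law, n = P/(kT). The little sketch below, using standard atmospheric pressure and 32°F, gives roughly 27 billion billion molecules per cubic centimeter — the same staggering order of magnitude as Bryson's 45 billion billion, though the exact figure depends on the assumptions his source used.)

```python
# Back-of-envelope check of the molecules-per-cubic-centimeter figure,
# using the ideal gas law: number density n = P / (k * T).
# Assumed conditions: standard atmosphere at 0 degrees C (32 F).
K_BOLTZMANN = 1.380649e-23   # Boltzmann constant, J/K
PRESSURE_PA = 101_325        # standard atmospheric pressure, Pa
TEMP_K = 273.15              # 32 degrees Fahrenheit in kelvin

per_m3 = PRESSURE_PA / (K_BOLTZMANN * TEMP_K)  # molecules per cubic meter
per_cm3 = per_m3 * 1e-6                        # convert to per cubic centimeter

print(f"{per_cm3:.2e} molecules per cubic centimeter")  # roughly 2.69e+19
```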
They are also fantastically
durable. Because they are so long lived,
atoms really get around. Every atom you
possess has almost certainly passed through several stars and been part of
millions of organisms on its way to becoming you. We are each so atomically numerous and so
vigorously recycled at death that a significant number of our atoms—up to a
billion for each of us, it has been suggested—probably once belonged to
Shakespeare. A billion more came from
Buddha and Genghis Khan and Beethoven, and any other historical figure you care
to name. (The personages have to be
historical, apparently, as it takes the atoms some decades to become thoroughly
redistributed; however much you may wish it, you are not yet one with Elvis
Presley).
So we are all reincarnations—though
short-lived ones. When we die, our atoms
will disassemble and move off to find new uses elsewhere—as part of a leaf or
drop of dew. Atoms, however, go on
practically forever. Nobody actually
knows how long an atom can survive, but according to Martin Rees it is probably
about 10^35 years. [That is,
100,000,000,000,000,000,000,000,000,000,000,000 years—a very long time.]
Neutrons and protons occupy the atom’s
nucleus. The nucleus of an atom is
tiny—only one millionth of a billionth of the full volume of the atom—but
fantastically dense, since it contains virtually all of the atom’s mass. As Cropper has put it, if an atom were
expanded to the size of a cathedral, the nucleus would be only about the size
of a fly—but a fly many thousands of times heavier than the cathedral. . . .
It is still a fairly astounding notion
to consider that atoms are mostly empty space, and that the solidity we
experience all around us is an illusion.
When two objects come together in the real world—billiard balls are most
used for illustration—they don’t actually strike each other. ‘Rather,’ as Timothy Ferris explains, ‘the
negatively charged fields of the two balls repel each
other . . . were it not for their electrical charges
they could, like galaxies, pass right through each other unscathed.’ When you sit in a chair, you are not actually
sitting there, but levitating above it at a height of one angstrom (a hundred
millionth of a centimeter), your electrons and its electrons implacably opposed
to any closer intimacy.
* * *
There are varieties of lists of things that are variously
amusing, entertaining, disturbing, and so on.
Here is one of my favorites that came out earlier this year, from The
Guardian: “Francis Wheen’s top 10
modern delusions.” Wheen, a journalist
and author, wrote How Mumbo-Jumbo Conquered the World: A Short History of
Modern Delusions. Here are some from
his top 10. (The list is a quote from The
Guardian.[11])
“God is on our side” George W Bush
thinks so, as do Tony Blair and Osama bin Laden and an alarmingly high
percentage of other important figures in today’s world. After September 11 2001 Blair
claimed that religion was the solution not the problem, since “Jews, Muslims
and Christians are all children of Abraham” - unaware that the example of
Abraham was also cited by Mohammed Atta, hijacker of one of the planes that
shattered the New York
skyline. RH Tawney wrote in Religion and
the Rise of Capitalism that “modern social theory, like modern political
theory, developed only when society was given a naturalistic instead of a
religious explanation.” In which case
modern social and political theory would now seem to be dead.
The market is rational Financial
sophisticates in the 21st century smile at the madness of the South Sea Bubble
or the absurdity of the Dutch tulip craze. Yet only a few years ago they scrambled and
jostled to buy shares in dotcom companies which had no earnings at all nor any
prospect of ever turning a profit. To
justify this apparent insanity, they maintained that such a revolutionary
business as the internet required a new business model in which balance sheets
were irrelevant. In short, they thought
they had repealed the laws of financial gravity - until they came crashing down
to earth.
America's economic success is entirely due to private
enterprise In the 19th century, the American government promoted
the formation of a national economy, the building of railroads and the
development of the telegraph. More
recently, the internet was created by the Pentagon. American agriculture is
heavily subsidised and protected, as are the steel industry and many other sectors
of the world's biggest “free-market economy.” At times of economic slowdown, even under
presidents who denigrate the role of government, the US will increase its deficit to
finance expansionary fiscal and monetary policies. But its leaders get very cross indeed if any
developing country tries to follow this example.
Astrology
and similar delusions are "harmless fun" Those
who say this never explain what is either funny or harmless in promoting a
con-trick which preys on ignorance and anxiety.
Yet even the Observer, Britain's
most venerable and enlightened Sunday newspaper, now has a horoscope page.
There is no such thing as reality Hence the
inverted commas which postmodernists invariably place round the word. They see
everything from history to quantum physics as a text. . . . . But if all
notions of truth and falsity cease to have any validity, how can one combat
bogus ideas - or indeed outright lies? There
is, for instance, a mass of carefully empirical research on the Nazi
extermination of the Jews. As Professor
Richard Evans points out, “to regard it as fictional, unreal or no nearer to
historical reality than, say, the work of the ‘revisionists’ who deny that Auschwitz ever happened at all, is simply wrong. Here is an issue where evidence really counts,
and can be used to establish the essential facts. Auschwitz was
not a discourse.” [See “science studies”
earlier.]
We mustn’t be “judgmental” In
2002 the Guardian revealed that Christian fundamentalists had taken control of
a state-funded school in Gateshead and were
striving to “show the superiority” of creationist beliefs in their classes. When Jenny Tonge MP asked Tony Blair if he was
happy that the Book of Genesis was now being promoted as the most reliable
biology textbook, he replied: “Yes. . .
In the end a more diverse school system will deliver better results for our
children.” This is the enfeebling
consequence of unthinking cultural and intellectual relativism. If some schools start teaching that the moon
is made of Swiss cheese or that the stars are God’s daisy chain, no doubt that
too will be officially welcomed as a healthy sign of educational diversity.
(The
response of Robert Carroll, author of The Skeptic’s Dictionary, to this
one: “This is the rationale used by all
those folks—and their numbers seem to keep growing—who claim it’s only fair
that intelligent design be taught as an alternative to any other scientific
theory of evolution now taught.
Scientists need not concern themselves with what is good or bad science. They should concern themselves with ‘balance’
and ‘both sides.’ What is the other side’s
view on gravity? electricity? the Pythagorean theorem? Which begs the
question: Is our children learning?”[12])
“It could be you. . .” This
was the advertising slogan for the National Lottery, that monument to
imbecility, which was introduced (fittingly enough) by John Major. And millions of British adults apparently
believed it, even though the odds on winning the jackpot are 13m to one. It
could be you . . . but it bloody well won’t be. [True enough, but Pat and I still buy the
occasional Powerball ticket when we travel to small towns—because it seems like
most of the winners live in small towns or rural areas.]
* * *
Apropos science and society and government, David Denby
made an interesting observation in reviewing a book on the Scottish
enlightenment by James Buchan. Buchan
described the legacy of the Scottish enlightenment: “In demanding that experiment, not inherited
truth, define the business of living, the Edinburgh
philosophers stamped the West with its modern scientific and provisional
character.” Denby goes on to comment
that “these sentences may make us wince.
The wall between church and state that we are urging on the Iraqis has
grown thinner in America. And a country in which something like seventy
percent of the population believes in Satan and a considerable number disdains
evolutionary biology is unlikely to be one in which ‘experiment not inherited
truth’ is flourishing. At the moment
religiosity—the substitution of faith for common sense in practical affairs—is
running amok in the United States, and intrusions of religion into science,
politics, and culture, contrary to the spirit of the Founding Fathers, have
become commonplace.”
One disturbing example of the impact of fundamentalist
religious belief on public policy was reported on by William Hoyt on the web
site of the Committee for the Scientific Investigation of Claims of the
Paranormal. Hoyt reported on the growth
of anti-vaccination movements around the world, and looked specifically at
pertussis (whooping cough). First, he
notes that while few in the developed world have suffered from pertussis in
recent years, “before the 1940s, it was a major cause of infant and child
morbidity and mortality in the U.S.”[13] The reason for the decline, of course, was
vaccination.
When the prevalence of pertussis declined, some began to
argue that vaccination was no longer needed.
There were (later determined to be faulty) studies in a couple of places
that suggested problems with the vaccine and that because of economic and
public health changes, it was no longer needed.
Some countries began to discontinue vaccination. And guess what happened. In Sweden, where the rate had been about 300
cases per 100,000 people in the 1940s, and about 50 per 100,000 in 1975, by the
early 1980s—when vaccination had been discontinued—the rate was 3,370 and it
hit over 10,000 thereafter. The United Kingdom
sharply reduced vaccination; the number of cases skyrocketed. Japan reduced infant vaccination
from 90% to 10% and within a few years had an epidemic. Australia saw the same thing
happen. The countries that did not
abandon vaccination saw no increase in pertussis rates. Most of the countries that saw the huge
increase in number of cases quickly reversed policy and the pertussis rates
returned to earlier levels.
Hoyt wrote about all this that
distorted numbers, confusion of correlation with
causation, and statistical innumeracy certainly played roles in this sad
story. Sensationalist media campaigns
fanned the glowing embers. But in each
of the countries that experienced the raging fires of epidemics there were
other forces at work. Most prominent in
passive anti-vaccination movements were religious groups whose opposition was
based on religious or moral grounds.
Prominent in both passive and active anti-vaccination movements are
followers and practitioners of homeopathy, chiropractic, and natural and
alternative medicine. . . .
When anti-vaccination alarm takes hold—characterized
by sudden attacks of the media, mistaken researchers, fervent religious groups,
and alternative medicine quacks—the infected society begins to make horrid,
whoppingly bad decisions. There is, as
yet, no Latin name for this peculiar social disease.
Sam Harris, writing in the Los Angeles Times,
points out that “there are now more people in our country who believe that the
universe was created in six solar days than there were in Europe
in the 14th century. . . . We
elected a president who believes the jury is still out on evolution [a view which
no reputable biologist accepts] and who rejects sound, scientific
judgments on the environment, on medical research, on family planning and on
HIV/AIDS prevention in the developing world.”
But President Bush’s view is shared by a large number
of Americans, according to an article in the November, 2004, issue of National
Geographic:
Evolution by natural selection, the central concept of
the life’s work of Charles Darwin, is a theory. It’s a theory about the origin of adaptation,
complexity, and diversity among Earth's living creatures. If you are skeptical
by nature, unfamiliar with the terminology of science, and unaware of the
overwhelming evidence, you might even be tempted to say that it’s “just” a
theory. In the same sense, relativity as
described by Albert Einstein is “just” a theory. The notion that Earth orbits around the sun
rather than vice versa, offered by Copernicus in 1543, is a theory. Continental drift is a theory. The existence, structure, and dynamics of
atoms? Atomic theory. Even electricity is a theoretical construct,
involving electrons, which are tiny units of charged mass that no one has ever
seen. Each of these theories is an
explanation that has been confirmed to such a degree, by observation and
experiment, that knowledgeable experts accept it as fact. That’s what scientists mean when they talk
about a theory: not a dreamy and unreliable speculation, but an explanatory
statement that fits the evidence. They embrace such an explanation confidently
but provisionally—taking it as their best available view of reality, at least
until some severely conflicting data or some better explanation might come
along.
The rest of us generally agree. We plug our televisions into little wall sockets, measure a year by the length of Earth's orbit, and in many other ways live our lives based on the trusted reality of those theories.
Evolutionary theory, though, is a bit different. It's such a dangerously wonderful and far-reaching view of life that some people find it unacceptable, despite the vast body of supporting evidence. As applied to our own species, Homo sapiens, it can seem more threatening still. Many fundamentalist Christians and ultra-orthodox Jews take alarm at the thought that human descent from earlier primates contradicts a strict reading of the Book of Genesis. Their discomfort is paralleled by Islamic creationists such as Harun Yahya, author of a recent volume titled The Evolution Deceit, who points to the six-day creation story in the Koran as literal truth and calls the theory of evolution “nothing but a deception imposed on us by the dominators of the world system.” The late Srila Prabhupada, of the Hare Krishna movement, explained that God created “the 8,400,000 species of life from the very beginning,” in order to establish multiple tiers of reincarnation for rising souls. Although souls ascend, the species themselves don't change, he insisted, dismissing “Darwin's nonsensical theory.”
Other people too, not just scriptural literalists, remain unpersuaded about evolution. According to a Gallup poll drawn from more than a thousand telephone interviews conducted in February 2001, no less than 45 percent of responding U.S. adults agreed that “God created human beings pretty much in their present form at one time within the last 10,000 years or so.” Evolution, by their lights, played no role in shaping us.
Only 37 percent of the polled Americans were satisfied with allowing room for both God and Darwin—that is, divine initiative to get things started, evolution as the creative means. (This view, according to more than one papal pronouncement, is compatible with Roman Catholic dogma.) Still fewer Americans, only 12 percent, believed that humans evolved from other life-forms without any involvement of a god. . . . In other words, nearly half the American populace prefers to believe that Charles Darwin was wrong where it mattered most.[14]
Harris
argues that religion, in some of its vitriolic forms, is one of the major
problems in society. He suggested that
President Bush and those who agree with him, in seeking a constitutional ban on
gay marriage, have a few other prohibitions and sanctions they should consider
as well if they are going to rely on the Bible.
For example, Leviticus 20:13 requires that homosexuals be killed; the
Bible “also instructs us to murder people who work on the Sabbath, along with
adulterers and children who curse their parents.” The Bible also sanctions slavery. And much though Americans detest Osama bin
Laden, Harris notes that the Koran tells every Muslim “‘to make war on the
infidels who dwell around you.’” Even
though most Muslims choose to ignore that injunction, one cannot argue that bin
Laden is not following at least one directive in the Koran. Harris concludes that “whatever their import
to people of faith, ancient religious documents shouldn’t form the basis of
social policy in the 21st century.
The Bible was written at a time when people thought the earth was flat,
when the wheelbarrow was high tech.”
Similarly,
Harris notes that those who oppose stem cell research, “drawing from what
they’ve heard from the pulpit, believe that 3-day-old embryos—which are
microscopic collections of 150 cells the size of a pinhead—are fully endowed
with human souls and, therefore, must be protected as people. But if we know anything about the neurology
of sensory perception, we know that there is no reason to believe that embryos
at this stage of development have the capacity to sense pain, to suffer or to
experience death in any way at all.
(There are, for comparison’s sake, 100,000 cells in the brain of a
fly.)” Religious beliefs have consequences,
Harris points out; President Bush and many members of Congress, relying on
religious views, “have decided to put the rights of undifferentiated cells
before those of men and women suffering from spinal cord injuries, full-body
burns, diabetes, and Parkinson’s disease.”
The consequence of belief that “‘life starts at the moment of
conception’” is that “you will happily stand in the way of medical research
that could alleviate the sufferings of millions of your fellow human beings.”
A
reviewer of a book by Harris points out that
the
terrorists who flew jet planes into the World Trade
Center believed in the
holiness of their cause. The Christian
apocalypticists who are willing to risk a nuclear conflagration in the Middle East for the sake of expediting the second coming
of Christ believe in the holiness of their cause. In Harris’s view, such fundamentalists are
not misinterpreting their religious texts or ideals. They are not defaming or distorting their
faith. To the contrary, they are taking
their religion seriously, attending to the holy texts on which their faith is built. Unhappily for international comity, the Good
Books that undergird the world’s major religions are extraordinary anthologies
of violence and vengeance, celestial decrees that infidels must die.
Harris’s
argument is that religion should be subject to the same scrutiny that we insist
on for other beliefs. I think the better
argument is that America’s
great gift to political systems, the separation of church and state, needs more
respect than it has received in modern American politics. (I recognize that many Muslims, among others,
do not see this as a great gift, because they believe church and state are
inseparable, and while I intend no offense to Islam, I must say that I utterly
and completely disagree with its view.)
Garry
Wills wrote an interesting editorial in The New York Times after the
election. He wrote about a visit of the
Dalai Lama and his (Wills’s) role in asking questions at a public event in Chicago; the Dalai Lama
asked for tough questions, not deference.
“The only one I could think of was:
‘If you could return to your country, what would you do to change
it?’ He said that he would disestablish
his religion, since ‘America
is the proper model.’” America’s advantage is its
Enlightenment heritage. Wills wrote that
“I later asked him if a pluralist society were possible without the
Enlightenment. ‘Ah,’ he said. ‘That’s the problem.’”
The
Dalai Lama seemed to envy America
its Enlightenment heritage.
America, the first real democracy in history, was a product
of Enlightenment values—critical intelligence, tolerance, respect for evidence,
a regard for the secular sciences.
Though the founders differed on many things, they shared these values of
what was then modernity. They addressed
“a candid world,” as they wrote in the Declaration of Independence, out of “a
decent respect for the opinions of mankind.” . . . The secular states of modern Europe do not understand the fundamentalism of the
American electorate. It is not what they
had experienced from this country in the past.
In fact, we now resemble those nations less than we do our putative
enemies. Where else do we find
fundamentalist zeal, a rage at secularity, religious intolerance, fear of and
hatred for modernity? Not in France or Britain
or Germany or Italy or Spain. We find it in the Muslim world, in Al Qaeda,
in Saddam Hussein’s Sunni loyalists.
Americans wonder that the rest of the world thinks us so dangerous, so
single-minded, so impervious to international appeals. They fear jihad, no matter whose zeal is
being expressed.
* * *
[Plate tectonics, and why Australia
is sinking.] There was one major problem
with . . . Earth theories that no one had resolved, or even come close to
resolving. That was the question of
where all the sediments went. Every year
Earth’s rivers carried massive volumes of eroded material—500 million tons of
calcium, for instance—to the seas. If
you multiplied the rate of deposition by the number of years it had been going
on, it produced a disturbing figure:
there should be about twelve miles of sediment on the ocean bottoms—or,
put another way, the ocean bottoms should by now be well above the ocean
tops. Scientists dealt with this paradox
in the handiest possible way. They
ignored it. But eventually there came a
point when they could ignore it no longer.
In the Second World War, a Princeton
University mineralogist named Harry Hess was put in charge of an attack
transport ship. . . . Aboard this vessel
was a fancy new depth sounder called a fathometer, which was designed to
facilitate inshore maneuvers during beach landings, but Hess realized that it
could equally well be used for scientific purposes and never switched it off,
even when far out at sea, even in the heat of battle. What he found was entirely unexpected. If the ocean floors were ancient, as everyone
assumed, they would be thickly blanketed with sediments, like the mud on the
bottom of a river or lake. But Hess’s
readings showed that the ocean floor offered anything but the gooey smoothness
of ancient silts. It was scored everywhere
with canyons, trenches, and crevasses and dotted with volcanic seamounts. . . .
Then in 1960 core samples showed that
the ocean floor was quite young at the mid-Atlantic ridge but grew
progressively older as you moved away from it to the east or west. Harry Hess considered the matter and realized
that this could only mean one thing: new
ocean crust was being formed on either side of the central rift, then being
pushed away from it as new crust came along behind. The Atlantic floor was effectively two large
conveyor belts, one carrying crust toward North America, the other carrying
crust toward Europe. . . . .
When the crust reached the end of its
journey at the boundary with continents, it plunged back into the Earth in a
process known as subduction. That
explained where all the sediment went.
It was being returned to the bowels of the Earth. [This and other discoveries led to the
science of plate tectonics.]
Today we know that the Earth’s surface
is made up of eight to twelve
big plates (depending on how you define big) and twenty or so smaller ones, and
they all move in different directions and at different speeds. . . .
The connections between modern
landmasses and those of the past were found to be infinitely more complex than
anyone had imagined. Kazakhstan, it
turns out, was once attached to Norway
and New England. One corner of Staten
Island, but only a corner, is European. So is a part of Newfoundland. Pick up a pebble from a Massachusetts
beach, and its nearest kin will now be in Africa. The Scottish
Highlands and much of Scandinavia are substantially American. Some of the Shackleton
Range of Antarctica, it is thought,
may once have belonged to the Appalachians of the eastern U.S. Rocks, in short, get around. . . . There are . . . many surface features that
tectonics can’t explain. Take Denver. It is, as everyone knows, a mile high, but
that rise is comparatively recent. When
dinosaurs roamed the earth, Denver
was part of an ocean bottom, many thousands of feet lower. Yet the rocks on which Denver sits are not
fractured or deformed in the way they would be if Denver had been pushed up by
colliding plates, and anyway Denver was too far from the plate edges to
susceptible to their actions. It would
be as if you pushed against the edge of a rug hoping to raise a ruck at the
opposite end. Mysteriously and over
millions of years, it appears that Denver
has been rising, like baking bread. So,
too, has much of southern Africa; a portion of
it a thousand miles across has risen nearly a mile in 100 million years without
any known associated tectonic activity.
Australia, meanwhile, has been tilting and sinking. Over the past 100 million years as it has
drifted north towards Asia, its leading edge
has sunk by some six hundred feet. It
appears that Indonesia is
very slowly drowning, and dragging Australia with it.
* * *
Since terrorism seems to be on the mind of a lot of
people, including many in the U.S.
and other governments, I was interested to read an article in The Chronicle
of Higher Education about the (largely-ignored) contributions that the
social sciences can make to understanding and responding to terrorism. Scott Plous and Philip Zimbardo—the latter of
whom, a very distinguished scholar, served as President of the American
Psychological Association for the year after the 9/11 attacks—maintain that the
social sciences have made significant advances in studying terrorism. Citing a
vast literature, Plous and Zimbardo report several findings about which there
appears to be little dispute.
First,
studies suggest that, compared with the general public, terrorists do not
exhibit unusually high rates of clinical psychopathology, irrationality, or
personality disorders. . . . The idea of
a "terrorist personality" rests on unsteady empirical, theoretical,
and conceptual foundations. Indeed,
because terrorist cells require secrecy, terror organizations frequently screen
out unstable individuals who might compromise their security.
Nor do terrorists differ greatly from other people in self-esteem, religiosity, socioeconomic status, education, or personality traits such as introversion. Nasr Hassan, who spent years studying Palestinian terrorists, put it this way during a lecture she gave in 2002: “What is frightening is not the abnormality of those who carry out the suicide attacks, but their sheer normality.” Thus far, behavioral research has found only one psychological attribute that reliably differentiates terrorists from nonterrorists: a propensity toward anger.
In the words of a recent National Research Council report titled “Terrorism: Perspectives From the Behavioral and Social Sciences”: “There is no single or typical mentality—much less a specific pathology—of terrorists. However, terrorists apparently find significant gratification in the expression of generalized rage.”
It
seems that one of the primary reasons for becoming a terrorist is “the desire
for revenge or retribution for a perceived injustice.” Individuals are reacting to actions by
“police officers, soldiers, or others.”
But there is one demographic statistic on which terrorists vary from the
general population: “Although exceptions
exist, terrorists are usually males between 15 and 30 years of age -- the
same population most likely to commit violent crime in general, and the
demographic group least likely to be deterred by the threat of physical force.”
One
conclusion they draw from the research is that “large-scale military responses
to terrorism tend to be ineffective or temporarily to increase terrorist
activity. . . . Although every situation is different, researchers have found
that military responses to international terrorism can unwittingly reinforce
terrorists' views of their enemies as aggressive, make it easier for them to
recruit new members, and strengthen alliances among terrorist organizations.”
So
how is a nation to respond? One way is
to take measures against terrorism, presumably things the Department of
Homeland Security is supposed to be doing.
“Although self-protective measures will never be foolproof, they have
the virtue of being nonprovocative and less costly than war. . . . In the long run, research indicates that at
least three priorities are of paramount importance: reducing intergroup conflict, creating
incentives for the reduction of terrorism, and socializing young people to
reject violence as a means of problem solving.”
Plous and Zimbardo suggest that there is much the United States could do along these
lines that it is not now doing. They
also comment that “the futility of fighting terrorism with large-scale military
strikes is perhaps clearest in the case of Iraq,” and conclude that “the time
has come to rethink our global strategy on terrorism.”
* * *
There are some intriguing web sites that deal with
subjects I think are interesting to contemplate. Let me write about a series of connected
topics that might be called “speculative science”—and there is more speculation
than science with some of this.
As you probably know, there has been a reasonably
organized search for extra-terrestrial intelligence (SETI) going on for quite a
number of years: the searchers scan the
skies for organized or patterned (artificial) radio transmissions that would
emanate from an advanced civilization elsewhere in the universe. Thus far they have found nothing, although
the search of the skies has not been comprehensive.
I wondered how thoroughly and how far afield we were
searching, and asked an astronomer friend at the University if a civilization
out there somewhere perhaps 50 light years away could pick up the telecasts of “I
Love Lucy” from the 1950s.[15] He wrote back that “there are only a few
radio telescopes devoted to searching for artificial radio signals from other
planets and some would indeed say that the whole thing is a waste of
time. It really is like looking for a needle in a very large
haystack. The odds for finding something are very slim but the optimists
just keep improving their instruments and analysis techniques and one day hope
to find something. However, even current detectors would be able to pick
up TV or radio signals from systems almost 100 light-years away because the
equipment is so sensitive. So, some other civilization around a star 40
light years away could indeed be picking up ‘I Love Lucy’ or ‘Jack Benny.’”
I
wrote back to inquire about finding civilizations 100 million light-years
away. He replied: “I didn’t mention 100 million light years; I
said only 100 light years. The instruments are sensitive but not that
sensitive. That’s a factor of a million squared (= one trillion) in terms
of the original intensity of the signal.[16]
That’s impossible! Radio telescopes do detect signals from billions of
light years away but those are typically thousands of times more powerful than
all the radiation put out by our entire galaxy! There is no way
that any civilization can produce anything that powerful nor is there any
chance that we could detect any artificial signal from a civilization that far
away.”
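His point about a “factor of a million squared” is just the inverse-square law: the intensity of a radio signal falls off with the square of its distance from the source. A little sketch of my own, using the distances from his email:

```python
# Inverse-square law: received intensity scales as 1 / distance^2,
# so the ratio of two intensities depends only on the ratio of distances.
def intensity_ratio(d_near, d_far):
    """How many times weaker a signal is at d_far than at d_near."""
    return (d_far / d_near) ** 2

# 100 light-years versus 100 million light-years: a factor of a
# million in distance, hence a million squared (one trillion) in intensity.
print(intensity_ratio(100, 100_000_000))
```
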
This
strikes me as analogous to having an instrument in your house that could detect
life in other houses within 100 yards—but which would have no ability to detect
life anywhere on Earth beyond that distance.
If you lived in a rural or remote area, you would never know that there
was any other intelligent life on the planet.
Given the enormous distances between stars in the galaxy, much less
between stars in the universe, it is a virtual certainty that any civilization
would exist in the equivalent of a remote or rural area. Thus the challenge for SETI, I suspect: the range of search (now) is limited to only
an impossibly tiny part of the universe.
Be that as it may, an integral element of SETI thought is
the Drake equation, named after Frank Drake, the Cornell astronomer who devised
it. The Drake equation attempts to put
in numbers the likelihood that there are other advanced civilizations in the
Milky Way galaxy. The formula is this
(don’t be put off by it; it’s not complicated once you get past the apparent
algebra—it’s multiplication):[17]
N = R* • fp • ne • fl • fi • fc • L, where:
N = The
number of civilizations in The Milky Way Galaxy whose electromagnetic emissions
are detectable. We know N must equal at least 1 because we are here. The Big Question is whether it is greater
than 1.
R* = The rate of formation of stars suitable
for the development of intelligent life.
fp = The fraction of those stars with planetary
systems.
ne = The number of planets, per solar system,
with an environment suitable for life.
fl = The fraction of suitable planets on which
life actually appears.
fi = The
fraction of life-bearing planets on which intelligent life emerges.
fc = The
fraction of civilizations that develop a technology that releases detectable
signs of their existence into space.
L = The length of time such civilizations release detectable
signals into space.
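Since it really is nothing but multiplication, the equation is easy to play with. Here is a sketch; the parameter values are purely illustrative guesses of my own, not anyone's published estimates:

```python
def drake(R_star, f_p, n_e, f_l, f_i, f_c, L):
    """N = R* . fp . ne . fl . fi . fc . L -- just multiplication."""
    return R_star * f_p * n_e * f_l * f_i * f_c * L

# Purely illustrative values (the last three factors are sheer guesses):
N = drake(
    R_star=1.0,  # suitable stars formed per year in the galaxy
    f_p=0.5,     # fraction of those stars with planetary systems
    n_e=2,       # planets per system with a suitable environment
    f_l=1.0,     # fraction of suitable planets where life appears
    f_i=0.01,    # fraction of those where intelligence emerges
    f_c=0.01,    # fraction of those that emit detectable signals
    L=10_000,    # years such signals keep being released
)
print(N)  # with these guesses, N comes out to about 1
```

Change any one guess by a factor of ten and N moves by the same factor, which is exactly why the answer is anybody's guess.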
The great difficulty with the Drake equation is that we
have no good evidence for the numbers that should be inserted so that we can
estimate N. There is a small amount of
data to suggest numbers for R* (the rate of star formation suitable for life) and
f(p) (the fraction of stars that have planetary systems). The estimate for n(e) (number of planets, per
solar system, with an environment suitable for life) has dropped in recent
years with various astronomical discoveries about the nature of planets and
stars. As best anyone can tell, f(l) (fraction of
suitable planets on which life actually appears) should be reasonably high,
once a planet is capable of sustaining life (because the evidence from Earth is
that life began fairly promptly after the planet was formed). But of course that is a sample size of one,
and we don’t know if the development of life was an accident. The remaining three factors in the equation
are just guesses; no one really has any idea.
For example, as a Wikipedia article points out, the lower limit
for L (length of time such civilizations release detectable signals into space)
would seem to be about 66 years—because that is how long we have had radio
astronomy.[18] One contributor to Scientific American
estimated 420 years for L.
The author of the Wikipedia article observes, however,
that “the remarkable thing about the Drake equation is that by plugging in
apparently fairly plausible values for each of the parameters above, the
resultant expectant value of N is generally >>1. This has provided considerable motivation for
the SETI movement.” There is a certain
logic to the argument, in my opinion, especially if one expands it to cover the
universe, not just the Milky Way. If
there are at least 100 billion galaxies in the universe (probably the average
current estimate, although the number may be much higher), and there are
perhaps 200 billion stars, on average, in each galaxy, then there are 100 billion x 200
billion stars, some 20,000 billion billion in all, some of which will have planets and some of which may support
life (of some kind, not necessarily like ours).
These numbers are so large that it is hard to believe there is not life
somewhere else.
The SETI Institute takes a similar view (what a
surprise). “Is it possible that despite
our best efforts, we will fail to find signals from distant societies simply
because intelligent aliens don’t exist?
In a recent book, Rare Earth, University of Washington academics
Peter Ward and Donald Brownlee suggest that intelligence may be, in Shakespeare’s
words, ‘wondrous strange.’ In support of their view, the authors make an
inventory of specific properties of our solar system, properties they claim
will seldom be found elsewhere.” The editor
of the SETI News, however, disagrees.
In fact, most of these characteristics are not rare. Our solar system is neither unusual with
regard to its composition (sufficient heavy elements to make rocky planets) nor
location (away from the brutally hostile environment of the galactic center). . . .
One argument is not easily dismissed. Could it be that biology is commonplace, but
technologically-inclined intelligence is rare? There is precious little data to guide us.[19]
Course
materials from a physics course at the University of Hong Kong
help explain some of this.[20] “It must be remembered that this number [N in
the Drake equation] refers to the Milky Way only and that the universe contains
at least as many galaxies as there are stars in the Milky Way. The reason we chose to restrict ourselves to
the Milky Way is because intragalactic communication seems far easier than
extragalactic communication. The nearest
star in our galaxy would require several years for a round-trip message, while
the farthest stars require tens of thousands of years. In contrast, to exchange a message with a
civilization in the Andromeda galaxy would require about 4 million years, while
further galaxies would require several billion years (by which time we might
well be extinct). So, our chances for a
real conversation are much better if we limit ourselves to the Milky Way
galaxy.”[21]
According
to these physics class materials, “given the uncertainty in the various factors
in this equation, we can approximate by saying that N = L. The
number of civilizations in our galaxy with the ability to communicate is equal
to the lifetime of such civilizations. So,
depending on the value of L, N can range from about 100 to several
billion. Most likely, an intermediate
value is closer to the truth, so most scientists take N = 1 million. Thus, there may be a million or so
civilizations within our own galaxy with whom we might communicate.” Here are some estimating numbers that one
might use.
Estimated Probability Factors for Determining the
Number of Civilizations in the Milky Way

                                                  | Sagan (1974)  | Best estimate | Most favourable case | Least favourable case
Number of stars                                   | 100 billion   | 300 billion   | 300 billion          | 300 billion
Fraction of sunlike stars                         | 1             | 3/10          | 1                    | 1/15
Average number of planets per star                | 10            | 10            | 20                   | 5
Fraction of planets suitable for life             | 1/10          | 1/40          | 1/3                  | 1/1000
Fraction of planets where life does actually arise | 1            | 1/2           | 1                    | 1/1,000,000
Fraction of planets where civilizations develop   | 1             | 3/4           | 1                    | 1/1000
Ratio of civilization lifetime to galaxy lifetime | L/10 billion  | L/10 billion  | L/10 billion         | L/10 billion
Number of civilizations in the galaxy now         | L/10          | 0.8L          | 300L                 | 1/100 billion
If
the length of time an intelligent communicating civilization survived were 1000
years, Sagan’s estimate of the number now in existence would be 100, the
current best guess would be 800, the most favorable conditions suggest there
would be 300,000, and the least favorable conditions suggest the chance of
another civilization at 1 in 100 billion (that is, of the 100 billion stars in
the Milky Way, only ours and perhaps two others have an advanced civilization).
We can imagine two extreme cases: a technical society that destroys itself soon
after reaching the communicative phase [i.e., L is less than 100 years, about
where humans are now] and a technical society that learns to live with itself
soon after reaching the communicative phase; if the society gets past L= 100
years, it is unlikely to destroy itself from then on and its lifetime can be
measured on a stellar evolutionary scale [i.e., L is greater than 100 million
years].[22]
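For those who like to check the arithmetic, the table's multiplication can be sketched in a few lines of Python. The function name and the choice of L = 1,000 years are mine; the factors come from the "best estimate" column above.

```python
# Drake-style estimate: multiply the table's probability factors,
# then scale by the ratio of civilization lifetime to galaxy lifetime.
def civilizations(stars, f_sunlike, planets_per_star, f_suitable,
                  f_life, f_civ, lifetime_years, galaxy_lifetime=10e9):
    return (stars * f_sunlike * planets_per_star * f_suitable
            * f_life * f_civ * (lifetime_years / galaxy_lifetime))

# "Best estimate" column, with L = 1000 years:
n = civilizations(300e9, 3/10, 10, 1/40, 1/2, 3/4, lifetime_years=1000)
print(round(n))  # 844 -- close to the roughly 800 quoted in the text
```

Note that the estimate scales linearly with L, which is why everything hinges on how long a communicating civilization lasts.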
If
you guess that technologically advanced civilizations last longer, then the
estimates increase (so you might guess a million, as suggested by the physics
course materials). Of course, we have no
idea how long such a civilization can last.
We have only our own to measure by, and the chances of humanity
surviving are problematic. Recall, too,
there are 100 billion or more galaxies in the universe. Hence my personal view, anyway, that it is almost
inconceivable that there is no intelligent life anywhere else in the universe.
One response to the promulgation of the Drake equation
was the Fermi paradox, proposed by the physicist Enrico Fermi (one of the
contributors to the development of the atomic bomb): “If there were very many advanced
extraterrestrial civilizations in our galaxy, then, where are they? Why haven’t we seen any traces of intelligent
extraterrestrial life, e.g., probes, spacecraft, or transmissions?”
There are a number of what may be insurmountable
difficulties in ever discovering if there are other intelligent civilizations
in the universe. First, as Bryson makes
clear in his hypothetical solar system trip, and as the NASA table below
demonstrates, the distances are so great that unless the current understanding
of physics is incorrect—that it is not possible to travel at greater than the
speed of light—neither we nor any other civilization has any way of traveling
very far in the universe.
According
to the PBS program “Nova,” “the nearest big galaxy to our Milky Way, the
Andromeda galaxy, is two million light-years away. The most distant galaxies we can now see are
10 or 12 billion light-years away. We
could never see a galaxy that is farther away in light travel time than the
universe is old—an estimated 14 billion or so years. . . . This horizon describes the visible
universe—a region some 28 billion light years in diameter.”[23]
Out
of the trillions of stars in the entire universe, the closest one to Earth,
Alpha Centauri, is about four light-years away.
Bryson mentioned it in the first excerpt I included. I found this NASA chart to be another helpful
way to comprehend the distances of space (and the challenge of traveling very
far).
Destination             | Jet (600 mi/hr)    | Rocket (25,000 mi/hr) | Sunbeam (186,000 mi/sec)
Moon                    | 16.5 days          | 9.4 hr                | 1.2 sec
Sun                     | 17 years 8 months  | 4 months              | 8.5 min
Mercury                 | 10 years 10 months | 3 months              | 5 min
Venus                   | 5 years 5 months   | 1.5 months            | 2.5 min
Mars                    | 8 years 10 months  | 2.5 months            | 4 min
Jupiter                 | 74 years 3 months  | 1 year 9 months       | 35 min
Saturn                  | 150 years 5 months | 3 years 7 months      | 1 hr 11 min
Uranus                  | 318 years 6 months | 7 years 7 months      | 2 hr 30 min
Pluto                   | 690 years 1 month  | 16 years 5 months     | 5 hr 25 min
Alpha Centauri          | 4.8 million years  | 114,155.2 years       | 4.2 years
Sirius                  | 9.6 million years  | 228,310.4 years       | 8.4 years
Pleiades Cluster        | —                  | —                     | 400 years
Crab Nebula             | —                  | —                     | 4000 years
Center of the Milky Way | —                  | —                     | 38,000 years
Andromeda Galaxy        | —                  | —                     | 2.2 million years
So even traveling to Mars, in
a jet, would take almost 9 years. And I
thought the flight from Los Angeles to Sydney was long.
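The table's entries are just distance divided by speed; here is a quick check in Python. The mileage figure is a rough average I am supplying, so the results differ slightly from NASA's.

```python
# Travel time is simply distance / speed. The distance to Alpha Centauri
# (about 4.37 light-years) is my approximation, so the answers come out
# a little different from the NASA table's.
MILES_PER_LIGHT_YEAR = 5.88e12
HOURS_PER_YEAR = 24 * 365.25

def years_to(miles, mph):
    return miles / mph / HOURS_PER_YEAR

alpha_centauri_miles = 4.37 * MILES_PER_LIGHT_YEAR
print(years_to(alpha_centauri_miles, 600))     # jet: ~4.9 million years
print(years_to(alpha_centauri_miles, 25_000))  # rocket: ~117,000 years
```

Even for the very nearest star, no speed we can actually achieve makes the trip thinkable.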
“The
table above is much more than a chart of travel time, it represents a glimpse
into the past. When you look into the night sky you are looking into the
history of the universe. The sunlight that shines on us is 8.5 minutes old when
it reaches Earth. Sunlight reflected
from Pluto takes 5.5 hours to reach the astronomer’s telescope. When the light of Sirius hits your eye, those
photons have been traveling for over 8 years through space. This means you are seeing that star not as it
is tonight but as it was over 8 years ago.
And most of the stars we see in the sky are hundreds or thousands of
light years away. The Andromeda galaxy
[the closest one to the Milky Way], at a mere 2.2 million light years, is truly
a next door neighbor. All of the other
galaxies are millions upon millions of light years distant.”
The
universe is about 28 billion light-years across. Suppose we finally received and recognized
radio transmissions from another civilization 200 million light-years away
(something we cannot now do); that would mean the transmissions originated 200 million
years ago. It might be possible that
within a few dozen or hundred years after receiving those first transmissions,
aliens might come to our solar system, because if they follow the same
developmental path as humans, it would take a while between the development of the
technology for the first radio transmissions and the development of
interstellar travel—just as we have had radio transmitting capability since the
1930s but are not exactly tripping our way into intergalactic space quite yet. At the same time, however, if
faster-than-light travel is not possible, the likelihood that aliens will be able
to get here is as remote as the chance we would get to them—they would have to
travel for far more than 200 million years (because they would be
traveling at far less than the speed of light).
I
asked a physicist friend, “Does modern physics/astronomy hold out any hope that
human beings will ever be able to go faster than the speed of
light?” The answer was “no.” So unless some method of space travel now
unimaginable is devised, or unless a species lives for millions of years, we
won’t intersect with aliens. He wrote
that “in principle they wouldn't need to be composed of things that live
millions of years in order for a being to survive long enough to go enormous
distances. The reason is the relativistic phenomenon of space contraction
(or equivalently time dilation).” That
is, the closer a traveler gets to the speed of light, the more slowly time
passes for the traveler compared to someone stationary on Earth. “This is the source of the famous twins
‘paradox.’ One twin leaves the Earth at a high rate of speed, travels for
awhile, say a few years, turns around and returns to Earth. The
Earth-bound twin would have aged a great deal more than the traveling
twin. So, for example, the traveling twin could travel to the nearest
extraterrestrial planet and return still alive and young if his speed were
close enough to the speed of light, but his Earth-bound twin would have aged at
least 80 years by the time the traveling twin arrived back on Earth.”
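The twin arithmetic follows from the Lorentz factor of special relativity; here is a minimal sketch in Python. The 40-light-year distance and the 99.9%-of-light-speed figure are my illustrative choices, not my friend's.

```python
import math

# Time dilation: a traveler moving at fraction v of the speed of light
# ages earth_time / gamma, where gamma = 1 / sqrt(1 - v**2).
def gamma(v):
    return 1 / math.sqrt(1 - v**2)

# Round trip to a star 40 light-years away at 99.9% of light speed:
v = 0.999
earth_years = 2 * 40 / v               # just over 80 years pass on Earth
traveler_years = earth_years / gamma(v)
print(earth_years, traveler_years)     # ~80.1 years vs ~3.6 years
```

So the traveler comes home having aged less than four years while the Earth-bound twin has aged 80, just as the quotation describes.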
However,
he pointed out, “the rub is that, to an inhabitant of Earth, it still takes an
enormous amount of time for the traveler to arrive, so much so that the human
would have to live millions of years if the human were alive when the traveler
started his journey. Another consequence
of this is that an alien traveler whose life span is comparable to ours could
survive a trip from his planet to ours if he moved at a high enough speed, but
if he came from a location millions of light years away he would have had to
start millions of years ago in order to arrive in our lifetime.”
“A
practical problem with all of this, I believe, . . . is that it would take a
tremendous amount of energy to accelerate something of the mass of a human to a
velocity close enough to the velocity of light for this to be useful. And
to do it quickly enough to take advantage of the space contraction within a
lifetime would also take an enormous amount of power. . . . The latter is
probably the most limiting factor since there is a limit on how much
acceleration a human can tolerate. Of course we can imagine that there
are alien beings where this wouldn't be a problem, maybe, but if they are
composed of the elements that we know of it is hard to imagine.”
Even
aside from all that, if these aliens were able to travel quickly through space,
there is no reason to believe they would have any interest in coming to our
out-of-the-way, rather average solar system in an average galaxy. They would not know about us—if they were 200
million light years away, they would have no way of receiving any radio
broadcasts from Earth. Even though every
episode of “I Love Lucy” in the 1950s, every broadcast of “All Things
Considered” on National Public Radio, and every television and radio broadcast
since the technology was invented has leaked out into the universe, the
earliest ones are only 60-some light-years from Earth even now, so all but the
minutest part of the universe has no idea we are here (and unless their
technology is a lot better than ours, the aliens won’t find us because, as with
us trying to find theirs, once the radio waves get too far from Earth, they
become virtually undetectable).
One issue that causes misunderstanding about aliens and
UFOs and so on, I think, is L: how long
will or does a technologically advanced civilization last? Take my hypothetical example of humans
receiving radio waves from a civilization 200 million light years away. If the civilization existed for 1 million
years and then disappeared (and remember that recorded human history has
lasted about 7000 years, so far), we would receive transmissions for the next
million years and then no more—and by the time we receive the first
transmissions, that civilization would have been dead for 199 million
years. On the other hand, some probably
implicitly assume that humanity will go on into the indefinite future, barring
some catastrophe, and perhaps some day we shall indeed spread to the
stars. Once a civilization has spread to
more than one solar system or galaxy, it is difficult to imagine how it would
be completely eradicated short of the universe collapsing on itself—but at the
same time, as we now understand the physical laws of the universe, spreading to
the stars is unlikely to happen.
It
is for all of these reasons that Bryson made his wry comment that aliens might
“amuse themselves by planting crop circles in Wiltshire or frightening the
daylights out of some poor guy in a pickup truck on a lonely road in Arizona
(they must have teenagers, after all), but it does seem unlikely.”
With respect to the possibility of other advanced
civilizations existing elsewhere in the universe, it strikes me that unless we
find a means to travel faster than light, those civilizations (and ours) will
exist in isolation. Perhaps once in a
great while one civilization might by chance receive radio transmissions that
originated in another (which hasn’t happened for us yet), but apart from that
accident, we are almost guaranteed to have no knowledge, now or in the future,
ever, if there are other civilizations, unless one happens to be quite close by
in the Milky Way. (And if there are
civilizations that have figured out how to travel faster than the speed of
light, they obviously haven’t come in our direction. Yet.)
Another difficulty with contact with other civilizations
is that they may not survive. Perhaps
they destroy themselves through nuclear war, biological weapons, or gray
goo. According to Robin Hanson, an
economics professor at George Mason University,
this leads to the question of the “Great Filter.”[24] Given that we have been unable to find any
sign of intelligent life, much less intelligent life that travels in space, at
least anywhere near us in our part of the universe, there is at least the
suggestion that the odds are against such life developing. So there is an argument that no civilization
in the past has developed; “the Great Silence implies that one or more of
[the steps necessary for life to evolve and a civilization to reach the point
of traveling in space] are very improbable; there is a ‘Great Filter’
along the path between simple dead stuff and explosive life. . . . In fact, so far nothing among the
billion trillion stars in our whole past universe has made it all the way along
this path.” The last statement, of
course, goes beyond what one can reasonably assert. We don’t know that “nothing . . . has made it
all the way along this path.” All we
know for certain is that we haven’t found it/them.
Hanson looks at the various factors that led to the
present state of human civilization, which obviously began with having a star
and planet that was hospitable to life (as we know it), that life did evolve,
that it took a turn to intelligent species, and that we developed
technology. (That is a very
condensed list.) He makes the point that
there are plausible scientific explanations or hypotheses for each of the steps
that led to the present state of human civilization and that “our main data
point [there is only one civilization we know of] implies that at least one of
these plausible stories is wrong—one or more of these steps is much more
improbable than it otherwise looks. If
it is one of our past steps, such as the evolution of single-cell life, then we
shouldn’t expect to see such independently evolved life anywhere within
billions of light years from us. But if
it is a step between here and a choice to [spread out into the galaxy] that is
very improbable, we should fear for our future. . . . Rational optimism regarding our future, then,
is only possible to the extent we can find prior evolutionary steps which are
plausibly more improbable than they look.
Conversely, without such findings we must consider the possibility that
we have yet to pass through a substantial part of the Great Filter. If so, then our prospects are bleak.”
All one can do is lay out the “Great Filter” notion. It is hard to tell whether the sequence of
events that led to us is more or less probable than events that might lead to
our extinction or prevent us from traveling into the galaxy. There is also the large caveat that there may
be other civilizations, now or in the past, which we do not and cannot know
about because of the vast distances of space.
But one can look at the risks to
humanity, even if one cannot quantify them.
A faculty member in Philosophy at Oxford University,
Nick Bostrom, has written a paper titled “Existential Risks: Analyzing Human Extinction Scenarios and
Related Hazards.”[25] It sounds like a gruesome subject, but it
really isn’t, since all of the risks are somewhat remote. He tries to be matter-of-fact. Bostrom defines existential risk as “one
where an adverse outcome would either annihilate Earth-originating intelligent
life or permanently and drastically curtail its potential.” In other words, what would wipe out humanity,
or, at the least, what we call civilization.
He notes that these existential risks, except for a few, are recent
phenomena, and that humans have no mechanisms for managing such risks. “With the exception of a species-destroying
comet or asteroid impact (an extremely rare occurrence), there were probably no
significant existential risks in human history until the mid-twentieth century,
and certainly none that it was within our power to do something about.” There have been disasters of various
magnitudes—earthquakes, disease, war, industrial accidents (Bhopal), and the like, but they have not
presented a threat to the species.
Bostrom sets out four categories of risk, which he labels
Bangs, Crunches, Shrieks, and Whimpers (which he draws from the T. S. Eliot
poem “The Hollow Men”: “This is the way
the world ends, Not with a bang but a whimper”). While his categories may make sense for his
purposes, I am going to combine them into one summary and comment on them as I
go along. These are not in any order of
probability.
— Nuclear holocaust:
Bostrom reminds us that both the U.S. and Russia still have large
stockpiles of nuclear weapons and that other nations are in varying stages of
trying to develop them. Some humans
might survive a nuclear holocaust but they might exist under stone-age
conditions.
— A genetically-engineered biological agent.
— The accidental or deliberate misuse of nanotechnology[26] (“molecular
nanotechnology will enable the construction of bacterium-scale self-replicating
mechanical robots that can feed on dirt or other organic matter. Such replicators could eat up the biosphere
or destroy it by other means” such as poison, burning, or blocking sunlight). It could also lead to grey goo. It is easier for a malicious human or agency
to develop destructive nanobots than it is to develop a defense against them.
According to World Wide Words, www.quinion.com/words/turnsofphrase/tp-gre3.htm,
the term “gray goo” originated in 1986 in
Eric Drexler’s book The Engines of Creation. He is the guru of the world of
nanotechnology, in which individual molecules are manipulated as though they
are snooker balls. In fiction, but not (yet) in fact, intelligent
sub-microscopic machines do extraordinary things like building spaceships from
raw materials without human intervention or circulate in the bloodstream to
monitor our bodily fitness and cure every ill.
Bill Joy of Sun Microsystems, hardly a techno-Luddite, has written
about a negative side to this magical molecular mystery that may one day be
ours. He argues that these nanotechnological auto-assemblers might get out of
control and convert the planet and every living thing on it to a uniform but
useless mass of bits and pieces: the grey
goo (a term actually invented by Mr. Drexler). Mr. Joy goes as far as saying that there are
some areas of research we ought not to pursue, because the consequences might
be so dire.
“The nightmare is that combined with genetic materials and thereby
self-replicating, these nanobots would be able to multiply themselves into a “gray
goo” that could outperform photosynthesis and usurp the entire biosphere,
including all edible plants and animals.”
[American Spectator, Feb. 2001]
“Grey goo is a wonderful and totally imaginary feature of some
dystopian sci-fi future in which nanotechnology runs riot, and microscopic
earth-munching machines escape from a laboratory to eat the world out from
under our feet.” [Guardian,
July 2001]
According to Wired News, 6/17/03, “Most of today’s nanotech specialists say Drexler’s
vision of molecule-sized robots is science fiction, not science fact. Others predict disaster if Drexler’s dream
ever comes to pass. ‘I don’t rule out
anything that might happen in the 22nd or 23rd century,’ said Kevin Ausman, executive director of Rice University’s
Center for Biological and Environmental Nanotechnology. But Drexler’s itty-bitty robotics ‘isn’t
nanotechnology that anyone is working on experimentally, or even has the
beginning of a coherent plan to achieve.’
Instead, researchers are using the special properties of particles
measured in nanometers (billionths of a meter) to develop new tests for cancer
and Alzheimer’s disease, sturdier building materials, water- and
stain-resistant clothes, and flexible computer displays.”
In January 2004, a
news release from EurekAlert reported on an analysis from a medical think tank
at the University of Toronto. Apparently Prince Charles has expressed
opposition to nanotechnology research; the think tank said that to impose a
moratorium would increase the divide between the have and the have-not
countries because nanotechnology has the potential to improve the health,
environment, and economies of developing countries. The potential benefits include detection of
cancer and HIV/AIDS, detection of TB, miniature diagnostic devices suitable for
use in remote areas, better delivery of drugs and vaccines, repair of skeletal
tissue, monitoring of soil and crop toxicity, improved water purification, and
clean-up of oil spills.
I would not have put these two high on the list of likely
threats, but a friend of mine in computer science took a look at Bostrom’s
article and wrote to me that “I think his nanotech/biotech arguments have some
merit. We should be careful, and
sometimes a little scared.”
— We are living in a simulation and it
gets shut down. This is a weird
one: we are a computer simulation that
exists because of the huge amount of computing power that will likely become
available in the future and that could be used to run simulations of previous
human civilizations. This seems to me
the least likely of all of the disaster scenarios, and my computer science
friend agrees; he dismissed this one out of hand. It “seems like pure fantasy, on a scale with
the creationist argument that you can’t disprove that the world wasn’t created
moments ago, with artifacts and memories in place. So what?”
— Badly programmed superintelligence: We could make a mistake in programming a
superintelligence and it destroys humans, which its “enormous intellectual advantage” might
permit; or a superintelligence could be given flawed goals and faithfully carry them out. A related one is takeover by a “transcending
upload” (an upload is the transfer of the biological brain functions of a human
to a computer; an enhanced upload could improve itself so substantially that it
dominates all other humans/uploads). My
computer science faculty friend thought this one improbable. “There were a bunch of big-deal
retrospectives and forecasts during 2000, and I can’t think of a single one
that forecasted that we’d hit a tipping point where computers could suddenly
start to ‘outcreate’ us.”
— We could be killed by an extra-terrestrial civilization. Right now this risk also seems very low,
unless it is possible to travel faster than the speed of light and the aliens
just haven’t found us yet—and presuming the aliens are hostile rather than
friendly.
— A physics disaster (there was worry that the original atomic
bomb would set the atmosphere on fire, which it didn’t; there is corresponding
worry that some high-energy particle accelerator might create a vacuum in this
part of the universe, or some other physics experiment might have unforeseen
consequences).
— Naturally-occurring disasters, such as if AIDS were as
contagious as a cold. A friend of mine
on the Medical School faculty, however, points out that at no time in human
history, even in the past under far less hygienic circumstances, did any virus or
bacterium even come close to wiping out all or most of humanity. Even the plague in Europe in the Middle Ages
only took perhaps a third of the population—and it didn’t get China, India,
or the civilizations in the Western Hemisphere.
— An asteroid or comet impact, which while extremely rare, is
possible; there is decent evidence to suggest that some of the mass extinctions
on Earth in the past were caused by the impact of a large asteroid or some
other such body. (See the excerpt from
Bryson on Manson, Iowa, back a few pages.)
— Runaway global warming:
It may be that release of greenhouse gases is a “self-reinforcing
feedback process” that would make Earth similar to Venus, which has a surface
temperature of about 450˚C.
— A misguided world government or other static social
equilibrium stops technological progress; a fundamental religious or fanatic
ecological movement dominates the world.
A related possibility is a repressive totalitarian regime that, based on
mistaken religious or ethical convictions, allows only part of the good things
a post-human world could include. Read
Aldous Huxley or George Orwell.
— Resource depletion or ecological destruction.
— “Dysgenic” pressures (genetic selection for less
intellectually talented individuals, although the slow timescale for genetic changes
in the species, and the possibility of genetic engineering, may make the other factors more likely
to come into play before this one).
— Something unforeseen. One
possibility in this vein has been suggested by Olivia Judson, an evolutionary
biologist at Imperial College in London. Writing for The New York Times last
April, about the two robot explorers wandering around Mars, she notes that
there are plans to bring materials back from Mars. She recalled that H. G. Wells, in “The War of
the Worlds,” “imagined Earth invaded by space ships bearing monstrous
conquering Martians. All human defenses
prove impotent, but the Martians sicken and die when attacked by Earth’s
humblest living creatures, microbes.”
The evidence suggests that Mars might once have had life. There are places on Earth where it is
inconceivable that there is life—but there is (in ice sheets, acids, suspended
in salt crystals for millions of years, the cores of nuclear reactors). If there is or was (probably microbial) life
on Mars, and it has survived an extremely harsh environment for millions of
years, and it is brought back to Earth, who knows what might happen if it
escapes into the environment. Judson
notes what happened to the natives of Mexico when the Spaniards brought
smallpox and measles (90% died), but also points out that “most of the time
when members of a species arrive in a new place, they either die or harmlessly
settle in.” But we simply do not know
what will happen if there is life in those Martian rocks, and while NASA is
preparing a containment facility, that is no guarantee the organism will stay
contained. (What if the return ship
crashes?)
So,
I’ve ranged from the possibility that we might traverse the universe to
encountering another civilization to the possibility that we might be
extinguished or extinguish ourselves. If
we are ever to get out to the stars—and get the physicists to revise the laws
of nature for us so we can hop around the universe like we now hop around
cities in airplanes—we have to get past our existential risks. While we can’t do much about the meteor, we
can do something about the rest.
There you have it.
From extra-terrestrials to the end of humanity. A happy way to end the year!
* * *
A
related question is the “singularity” in computing, which is the postulation
that there will come such an enormous growth in computational power that it
will reach a point we can’t now imagine very easily (that it will in fact “take
off” so dramatically that it will be a new ball game—that there will be super
artificial intelligence that can replicate and improve itself). My friend on the computer science faculty
told me that “mostly, we’d say that it has already happened, and continues to
happen. Nobody imagined 30 years ago
that people could design a building by drawing on a computer tablet and then
put on a 3D headset and walk through it.
Nobody imagined the immense power computers have today, or many of the
applications. Certainly nobody imagined
the power of the Internet in disseminating all sorts of human-generated content
all over the world. So far, all this power seems to amplify human intelligence
more than create an artificial creative force.
Will that change? It seems unlikely. But, of course, nobody imagined we’d be where
we are now.”
* * *
[Humans are composed of roughly ten
trillion cells. That’s
10,000,000,000,000.] Every cell in
nature is a thing of wonder. Even the simplest
are far beyond the limits of human ingenuity.
To build the most basic yeast cell, for example, you would have to
miniaturize about the same number of components as are found in a Boeing 777
jetliner and fit them into a sphere just five microns [0.0002 inches] across;
then somehow you would have to persuade that sphere to reproduce. . . .
If you could visit a cell, you wouldn’t
like it. Blown up to a scale at which
atoms were about the size of a pea, a cell itself would be a sphere roughly
half a mile across, and supported by a complex framework of girders called the
cytoskeleton. Within it, millions upon
millions of objects—some the size of basketballs, others the size of cars—would
whiz about like bullets. There wouldn’t
be a place you could stand without being pummeled and ripped thousands of times
every second from every direction. Even
for its full-time occupants the inside of a cell is a hazardous place. Each strand of DNA is on average attacked or
damaged once every 8.4 seconds—ten thousand times in a day—by chemicals and
other agents that wham into or carelessly slice through it, and each of these
wounds must be swiftly patched up if the cell is not to perish.
The proteins are especially lively,
spinning, pulsating, and flying into each other up to a billion times a
second. Enzymes, themselves a type of
protein, dash everywhere, performing up to a thousand tasks a second. Like greatly speeded up worker ants, they
busily build and rebuild molecules, hauling a piece off this one, adding a
piece to that one. Some monitor passing
proteins and mark with a chemical those that are irreparably damaged or flawed. Once so selected, the doomed proteins proceed
to a structure called a proteasome, where they are stripped down and their
components used to build new proteins.
Some types of proteins exist for less than half an hour; others survive
for weeks. But all lead existences that
are inconceivably frenzied. As de Duve
notes, ‘the molecular world must necessarily remain entirely beyond the powers
of our imagination owing to the incredible speed with which things happen in
it.’
But slow things down, to a speed at
which the interactions can be observed, and things don’t seem quite so
unnerving. You can see that a cell is
just millions of objects—lysosomes, endosomes, ribosomes, ligands, peroxisomes,
proteins of every size and shape—bumping into millions of other objects and
performing mundane tasks: extracting
energy from nutrients, assembling structures, getting rid of waste, warding off
intruders, sending and receiving messages, making repairs. Typically a cell will contain some 20,000
different types of protein, and of these about 2,000 types will each be
represented by at least 50,000 molecules.
‘This means,’ says Nuland, ‘that even if we count only those molecules
present in amounts of more than 50,000 each, the total is still a very minimum
of 100 million protein molecules in each cell.
Such a staggering figure gives us some idea of the swarming immensity of
biochemical activity within us.’
* * *
One
of the most thoughtful contributing authors for The Atlantic Monthly,
William Langewiesche, wrote in the January/February 2004 issue about the space
program and the debate about sending humans into space (versus sending unmanned
robots and satellites). The title caught
my eye: “A Two-Planet Species?” Langewiesche contended that the “proponents
of manned space flight—particularly at NASA—are in an unenviably defensive
position.” They have weak arguments
about the need for astronauts, the benefits from laboratory experiments in
space are thin, and “finally, bizarrely, they argue that only the excitement of
a human-space-flight program can persuade the American public to foot the bill
for the robotic efforts—a self-indicting logic if ever there was one. When all
else fails, they fall back on the now empty idea that the shuttle program is a
matter of national prestige—or on the still emptier claim, made obliquely by
the NASA administrator Sean O’Keefe last spring, that a retreat from human
space flight would be akin to a return to the Stone Age for all humankind.”
That,
Langewiesche says, makes the opponents of manned space flight seem more
logical. They see the accidents, the
small amount of real science, and compare the expense “(currently about $6
billion annually) with underfunded research on Earth (for example, in
oceanography and clean energy).” Langewiesche
says that the arguments are all correct.
“But the critics here are not merely noting the problems in human space
flight; they are setting up a straw man—the shuttle—in order to knock a much
larger thing down. An honest national debate would demand more.” He contends that NASA could do a number of
things that would make sense (e.g., ground the shuttle fleet for now). “This would not mean, however, that the
opponents of human space flight had won. Indeed, it may be that a pause to regroup is
precisely what a vigorous human-space-flight program now needs. One thing for sure is that the American public
is more sophisticated than the space community has given it credit for. In the event of a grounding the public might
well be presented with a question now asked only of insiders—not whether there
are immediate benefits to be gleaned from a human presence in space but, more
fundamentally, whether we are to be a two-planet species. If upon due
consideration the public’s answer is ‘yes,’ as it probably should be, the
solutions will be centuries in coming. Compared
with the scale of such an ambition, a pause of a few decades now to rethink and
rebuild will seem like nothing at all.”
If
Langewiesche’s views represent sound policy—and they well might—then our first
steps to interstellar travel may be delayed a bit. I won’t see it, but Elliott, who is
interested in this stuff as well, may live long enough to do so.
* * *
Alexis de Tocqueville’s “Democracy in America will continue to be read with profit as long as the United States
survives as a republic and, indeed, as long as democracy endures,” wrote one
American professor recently, adding that it was “arguably the best book ever
written not only on America
but also on democracy itself.” Of
course, America “may well now be heading, in a big hurry, to a post-democratic
position; so the book only became out of date on September 11, 2001. Even so, you couldn’t
call it irrelevant. It is so
enlightened, so uncannily accurate, that it is impossible to gainsay even now.” (The Guardian, April 3, 2004)
It is this worry, I suspect, that led a friend of mine, a
Libertarian by political belief who doesn’t care much for either Democrats or
Republicans (he refers to them as the two wings of the federal party), to write
to me that “at best I may (*may*) pull the lever
beside his [Kerry’s] name in November while holding my nose, for fear that the
very Republic may come to an end if I do not.”
Which reminds me of Ben Franklin’s counsel: “Those who would give up essential
liberty to purchase a little temporary safety deserve neither liberty nor
safety.”[27]
* * *
Early in the year I thought a great
deal about income and wealth distribution in the U.S. because the present situation
and the trends bother my sense of justice and fairness and because I believe
they pose a considerable threat to the future peace and stability of the
nation. In ruminating about the subject
I also read considerably in the literature.
What I found interesting, as the year went along, is that the subject
started to receive considerable attention—in TIME magazine, in our local
newspaper, and even in the campaign speeches of the presidential election. So here is what I discovered, along with a
few pertinent (and impertinent) quotations, and then considered why this
subject warrants thought. (I should warn
you that this is a fairly long essay on one subject, so you may wish to skip to
the next * * * if you are not interested in the topic.)
“We can
either have democracy in this country or we can have great wealth concentrated
in the hands of a few, but we can’t have both.” – Supreme Court Justice Louis Brandeis
Larry Bartels,
Professor of Politics and Public Policy at Princeton, writes that “for the past
thirty years, the United States has been conducting what one observer . . . has
called ‘a massive social experiment’ regarding the political and social
consequences of increasing economic inequality. . . . The economic causes of these trends—technological
change? demography? global competition?—are a matter of some scholarly
controversy. But the important political
point is that, whereas most rich democracies have significantly mitigated
increasing economic inequality through government action, the United States has
mostly been content to let economic trends take their course, doing ‘less than
almost any other rich democracy to limit economic inequality’ through
employment and wage policies, taxes, and transfers.”[28]
I must point out that these data
come from a variety of sources and they cannot legitimately be compared with
each other. I am told by an economist
friend that it is not good form to compare income figures with wealth/asset
figures, which I can understand; they are not measuring the same thing. What I have found and incorporated here are a
variety of different measures that all speak, in different ways, to the same
issue.
Robert Greenstein and Isaac Shapiro of the Center on Budget and
Policy Priorities,[29]
citing data from the Congressional Budget Office, write that “in 1979, the share
of the nation’s after-tax income flowing to the top one percent of the
population was less than half the share received by the bottom 40 percent of
the population.”
Percentage of After-Tax Income

                                   1979    2000
Top 1%     (2.8 million people)     7.5    15.5
Bottom 40% (110 million people)    19.1    14.6
In other words, in 2000 more after-tax income went to the top 1%
than to the entire bottom 40% of the U.S. population. In dollar terms, the average after-tax income
of the top 1%, 1979 to 2000, went from $286,000 to $863,000, or increased by
about 201%. By comparison, the average
after-tax income of the bottom fifth, 1979 to 2000, rose $1,100, or about
9%. (The numbers are in constant dollars.)
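The top-1% growth figure quoted above can be checked with a line of arithmetic; a minimal sketch (using only the two dollar amounts given in the text):

```python
# Average after-tax income of the top 1%, in constant dollars,
# rose from $286,000 (1979) to $863,000 (2000) per the CBO figures cited.
growth = (863_000 - 286_000) / 286_000
print(f"{growth:.1%}")  # → 201.7%, i.e. the "about 201%" in the text
```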
“The
freest government cannot long endure when the tendency of the law is to create
a rapid accumulation of property in the hands of a few, and to render the
masses poor and dependent.” – Daniel
Webster, 1782-1852
Greenstein and Shapiro go on to report that “before-tax
income was more concentrated among the top one percent of the population in
2000 than at any point since 1929. (These data are from a National Bureau of
Economic Research[30]
study that covers years through 1998 and subsequently has been extended through
2000.) When these data and the CBO[31]
data are examined together, they suggest that the top one percent of the
population received a larger share of the national after-tax income in 2000
than at any time in the past 70 years.”
These data, of course, precede the most recent federal tax cuts, which
will very likely make the gap even wider (an increase in inequality which may
have been temporarily offset by the decline in stock market values for the last
three years).
The Urban
Institute-Brookings Tax Policy Center has looked at the impact of the
2001 and 2003 tax cuts. They find the
average annual increase in after-tax income as a result of the cuts will be as
follows (from Greenstein and Shapiro):
Millionaires    $112,925
Top 1%           $26,335
Middle fifth        $676
Bottom fifth          $3
One of my faculty reviewers observed, however, that “it is probably not unusual for the bottom fifth of the
income distribution to have experienced a reduction in income taxes of only $3
on average, since a family of three would pay no federal income taxes on the
first $18,000 of earnings and would, in addition, likely receive an earned
income credit.”
A Wall Street Journal reporter wrote, about the 400
highest-earning taxpayers, “so much money in so few hands.” Those 400 had “a combined $70 billion of
adjusted gross income in 2000 – an average of $174 million each, or nearly four
times the comparable 1992 figure of $46.8 million. . . . Meanwhile, the top 400’s tax burden plunged
from 26.4 percent to 22.3 percent, on average.
(It had risen briefly, peaking at 29.9 percent in 1995.)” In the meantime, according to one
calculation, “if the latest cuts had been in effect in 2000, the average member
of the 400 would have saved another $8.3 million in taxes, bringing the tax
rate down to just 17.5 percent.”[32]
[Sachs was writing about the 400 highest-earning taxpayers just before
President Bush’s at-the-time upcoming visit to Africa] “That’s
more than the combined incomes of the 166 million people living in four of the
countries that the president is visiting this week: Nigeria,
Senegal, Uganda and Botswana.” – Jeffrey Sachs, The
New York Times, July 9,
2003
“This
is going to blow up in our faces. There
is no way a society can have this much of its accumulated wealth distributed to
these few people.” – Henry Schacht,
chairman, Lucent Technologies
Another way to look at inequality is to look at the distribution
of net worth. According to data prepared
by Wolff, in 1998, the wealthiest top 1% of Americans owned 38.1% of the nation’s
net worth; the wealthiest 5% (which of course includes the wealthiest 1%) owned
56.4%. In a society of 100 people, 5 of
them would thus own over half of the net worth.
If one were to account for the 10% most well off, they would own 67.9%
of the net worth. By comparison, the
middle 20% own 4.5% while the 40% of the U.S. population least well off
owned 0.2% of the nation’s net worth.
And that bottom 40% owned 0.9% in 1983, so they lost ground over 15
years. Interestingly, so did all the
other income categories except the top 1%:
The nation’s wealthiest went from owning 33.8% of the nation’s net worth
(1983) to owning 38.1% (1998).[33] Everyone else lost share.
“To turn $100 into $110 is work. To turn $100 million into $110 million is
inevitable.” — Edgar Bronfman,
Seagrams Co.
Data from the U.S. Census Bureau show income by quintiles (fifths)
and the top 5% (note that these are different from net worth, above; net worth
likely shows greater inequality because it includes assets, such as stocks and
bonds, whereas income is simply that).[34] Here, in collapsed form, are the Census
Bureau data; for each fifth, the numbers jumped around a few tenths of a point
from year to year, but within a clear trend line, of which I am showing only a
few of the data points.
Household Shares of Aggregate Income by Fifths of the Income Distribution[35]

Year    Lowest  Second  Third   Fourth  Highest  Top 5%
        Fifth   Fifth   Fifth   Fifth   Fifth
1967     4.0    10.8    17.3    24.3    43.8     17.5
1970     4.1    10.8    17.4    24.5    43.3     16.6
1975     4.4    10.6    17.1    24.7    43.1     15.9
1980     4.3    10.3    16.9    24.9    43.7     15.8
1985     4.0     9.7    16.3    24.6    45.3     17.9
1990     3.9     9.6    15.9    24.0    46.6     18.6
1995     3.7     9.1    15.2    23.3    48.7     21.0
2000     3.6     8.9    14.8    23.0    49.8     22.1
2001     3.5     8.7    14.6    23.0    50.1     22.4
This says, for
example, that in 2001, the top fifth of the households in the U.S. had 50.1% of the income in the
country. The lowest fifth did their best
up until about 1980; after that it was pretty much downhill.
Here are the income ranges for each
of the fifths of the U.S.
population (households):
Bottom Quintile:  Income Range: $0 - $17,970;       Share of Total Income:  3.5%
Second Quintile:  Income Range: $17,970 - $33,314;  Share of Total Income:  8.7%
Middle Quintile:  Income Range: $33,314 - $53,000;  Share of Total Income: 14.6%
Fourth Quintile:  Income Range: $53,000 - $83,500;  Share of Total Income: 23.0%
Top Quintile:     Income Range: $83,500 and up;     Share of Total Income: 50.2%[36]
Being in the top quintile
doesn’t exactly mean you’re wealthy beyond imagination, at $83,500 plus.
One of the things I had to learn about in exploring this issue was
the Gini coefficient, a statistical measure of how unequally something is
distributed (it could be football team income in the NFL, state revenues in
the 50 states, family income in a country, and so on). The easiest way to
understand the Gini coefficient is this:
If you have 10 families and aggregate societal income of $10,000, if all
10 families each have $1000 in income, the Gini coefficient is 0 (perfect
equality). If one family has all $10,000
of the income and the other families have zero, the Gini coefficient is 1
(perfect inequality). The closer the
Gini coefficient is to 1.0, the more inequality there is. (I should note that there are many Gini
coefficients in the literature, depending on the source—U.N., Census Bureau,
Congressional Budget Office, etc.—and the authors apparently use different data
sources. For my purposes here, it is not
important that the studies all use exactly the same numbers; my inquiry is more
about trends and comparisons than the precise numbers.)
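The ten-family illustration above can be turned into a small calculation. This is a minimal sketch, not any official agency’s formula; it uses the standard mean-absolute-difference definition of the Gini coefficient, with the usual small-sample correction so that the one-family-has-everything case comes out to exactly 1, as in the example:

```python
def gini(incomes):
    """Gini coefficient: 0 = perfect equality, 1 = perfect inequality."""
    n = len(incomes)
    total = sum(incomes)
    if total == 0 or n < 2:
        return 0.0
    # Sum of absolute differences over all ordered pairs of incomes
    mad = sum(abs(x - y) for x in incomes for y in incomes)
    g = mad / (2 * n * total)   # = mean abs. difference / (2 * mean income)
    return g * n / (n - 1)      # small-sample correction

# The letter's two ten-family examples, each with $10,000 of total income:
print(gini([1000] * 10))        # all equal -> 0.0
print(gini([10000] + [0] * 9))  # one family has everything -> 1.0
```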
The recent trend toward increased income inequality has not
always held in the U.S. The Census Bureau has tracked the Gini
coefficient for the country for both households and families. In some years it declined, reflecting decreasing
income inequality, but the trend is unmistakable. The family Gini coefficient went from .376 to
.429 (1947-1997); the household Gini coefficient went from .399 to .488 (1967-2001).[37] Basically since 1976 both household and
family coefficients have been steadily increasing. (The household Gini declined for two years in
the mid-1970s, but then began to go up again.
It has declined slightly in other years as well—both family and
household numbers bounce around some.)
“What
we have here is a form of looting . . . .
The rich don’t need the money and are a lot less likely to spend it -
they will primarily increase their savings.
Remember that wealthier families have done extremely well in the US in
the past twenty years, whereas poorer ones have done quite badly. So the redistributive effects of this
administration’s tax policy are going in the exactly wrong direction.” – George A. Akerlof,
2001 Nobel laureate at the University of California
in Berkeley, in a July 29, 2003 interview with Der Spiegel (Berlin).[38]
Among the more provocative
conclusions about income inequality are those set forth by Professor Bartels at
Princeton.[39] Bartels examined annual pre-tax income growth
by quintiles under Democratic and Republican presidencies 1948-2001 (lagging
the calculation one year to allow the impact of a new administration). Here is his table (the numbers are
percentages of income growth):
                    All      Democratic   Republican
                    Years    Presidents   Presidents
20th percentile     1.58       2.63          .60
40th percentile     1.66       2.45          .93
60th percentile     1.86       2.46         1.32
80th percentile     1.97       2.37         1.60
95th percentile     2.10       2.11         2.09
About the first column of data,
Bartels observed that “it is clear . . . that, even in percentage terms, real
income growth near the bottom of the income distribution has been considerably
slower than at the top of the distribution in post-war U.S. Moreover, poor families have been subject to
considerably larger fluctuations in income growth rates than families at higher
income levels. For example, families at
the 20th percentile experienced declining real incomes in 17 of the
54 years covered by my analysis, including seven declines of three percent or
more; by comparison, families at the 80th percentile experienced 11
real declines, only one of which (in 1948) exceeded two percent.” My contrarian faculty reviewer observed that
it’s not unusual that people
in the bottom fifth would have more fluctuations in income growth in every period. The more elastic the supply of labor (quick
response to changes in wage rates), the more likely unskilled workers are to
face layoffs and the less likely their incomes are to rise
during expansions. And, throughout the
history of the U.S.,
there has been a decline in the demand for unskilled labor, precisely the group
that makes up the bottom quintile most of the time.
Bartels’s conclusion about the
second and third columns: “Democratic
presidents have produced slightly more income growth for poor families than
rich families, resulting in a modest decrease in overall inequality. Republican presidents have produced a great
deal more income growth for rich families, resulting in a substantial increase
in inequality. On average, families at
the 95th percentile of the income distribution have experienced
identical income growth under Democratic and Republican presidents, while those
at the 20th percentile have experienced more than four times as much
income growth under Democrats as they have under Republicans. These differences are attributable to
partisan differences in unemployment (which has been 30 percent lower under
Democratic presidents, on average) and GDP growth (which has been 30 percent
higher under Democratic presidents, on average); both unemployment and GDP
growth have much stronger effects on income growth at the bottom of the income
distribution than at the top.” He also
wrote that “the most affluent American families have done very well under both
parties, and have been relatively unaffected by changes in the structure of the
U.S.
economy over the past half-century.”
There is arguably a problem with these data. As one of my reviewer colleagues pointed out, “the income growth under Republican and Democrat
presidencies is confounded with control of the Congress. With the exception of the 1953-55 Congress,
from 1933 until 1995, in no year did the Republicans control Congress
regardless of who was president. Using
this same method of analysis, the Republicans could claim credit for the strong
economic growth during Clinton’s
second term. (I suspect that neither is
actually responsible.)” On the other
hand, Democratic and Republican presidents offer different kinds of budgetary
proposals to Congress, so what Congress enacts will in part be a reflection of
what the President proposes.
“Poverty
is an anomaly to rich people. It is very
difficult to make out why people who want dinner do not ring the bell.” – Walter Bagehot, 19th Century English
economist
I wonder about the extent to which
the decline in income inequality in the U.S. in the post-WWII period is a
reflection of the G.I. Bill, which sent millions of people to college who
probably otherwise would not have attended (including, most likely, both my
father and my father-in-law), thus giving them the chance to do far better than
their parents and to change quintiles in the income distribution (which is what
happened with both my father and my father-in-law).[40] I haven’t seen this effect mentioned anywhere
in the literature I looked at, but I have a hard time believing the G.I. Bill
didn’t have a noticeable impact on income levels.
“We
used to think of Great
Britain, with its castles and peerages, as
being the epitome of a class-based society.
Today, we far surpass Britain
in the disparity of income. That is
economically disastrous and morally wrong.” – Rep. David Obey (D-Wisconsin), March 11, 1996
The U.N. provides a Gini coefficient on income distribution in
nations throughout the world (for those countries for which data exist).[41] Here are the numbers:

>.500        Mexico (.519), Chile (.575), El Salvador (.508), Papua New Guinea,
Guatemala, Lesotho, Colombia, Paraguay, Honduras, South Africa, Nicaragua,
Brazil, Swaziland, Botswana, Namibia (.707)

.400 - .499  Trinidad and Tobago (.403), United States (.408), Singapore, Hong
Kong, Uruguay, Costa Rica (.459), Turkey (.400), China, Cambodia, Turkmenistan,
Tunisia, Iran, Thailand, Ecuador, Mongolia, Guyana, Bolivia, Russia,
Philippines, Peru, Dominican Republic, Panama, Venezuela, Malaysia (.492)

.350 - .399  United Kingdom (.36), Italy, New Zealand, Lithuania, Estonia,
Germany, Portugal (.385), Algeria (.353), Viet Nam, Moldova, Jordan,
Azerbaijan, Laos, India, Jamaica, Armenia, Georgia, Morocco, Ghana (.396)

.300 - .349  Belarus (.304), Luxembourg, Canada, S. Korea, Poland, Latvia,
Spain, Netherlands, France, Switzerland, Australia, Greece, Israel, Ireland
(.359), Romania (.303), Indonesia, Kazakhstan, Bangladesh, Sri Lanka, Egypt,
Tajikistan (.347)

.250 - .299  Sweden (.250), Belgium, Czech Republic, Finland, Norway, Slovakia,
Slovenia, Croatia, Uzbekistan (.268), Macedonia, Ukraine, Kyrgyzstan (.290)

<.250        Hungary (.244), Denmark, Japan (.249)
The U.S.
appears to have less equal income distribution than most of Europe
or other industrialized countries. I don’t
think of us as in good company in this respect, right there with Turkey, Singapore,
and Uruguay. The data for the U.S.
are from 1997; were they updated, one suspects that the U.S. data might be even more
discrepant from the rest of the developed world.
Professor Edward Wolff, an economist at New York University,
has commented on much of this.[42]
The top 5 percent own more than half of
all wealth. In 1998, they owned 59
percent of all wealth. Or to put it
another way, the top 5 percent had more wealth than the remaining 95 percent of
the population, collectively. Wealth
inequality in the United
States has a Gini coefficient of .82,[43]
which is pretty close to the maximum level of inequality you can have.
The bottom 20 percent basically have zero
wealth. They either have no assets, or their debt equals or exceeds their
assets. The bottom 20 percent has typically accumulated no savings. A household in the middle—the median
household—has wealth of about $62,000. $62,000 is not insignificant, but if you
consider that the top 1 percent of households’ average wealth is $12.5 million,
you can see what a difference there is in the distribution.
Things are even more concentrated if you
exclude owner-occupied housing. It is
nice to own a house and it provides all kinds of benefits, but it is not very
liquid. You can’t really dispose of it,
because you need some place to live. The
top 1 percent of families hold half of all non-home wealth. The middle class’s major assets are their
home, liquid assets like checking and savings accounts, CDs and money market
funds, and pension accounts. For the
average family, these assets make up 84 percent of their total wealth.
[The United States is] much more unequal
than any other advanced industrial country.
Perhaps our closest rival in terms of inequality is Great Britain. But where the top
one percent in this country own 38 percent of all wealth, in Great Britain it is more like 22 or
23 percent. What is remarkable is that
this was not always the case. Up until
the early 1970s, the U.S.
actually had lower wealth inequality than Great
Britain, and even than a country like Sweden. But things have really
turned around over the last 25 or 30 years. In fact, a lot of countries have
experienced lessening wealth inequality over time. The U.S. is atypical in that inequality
has risen so sharply over the last 25 or 30 years.
Thomas Piketty (Ecole des Hautes
Etudes en Sciences Sociales, Paris) and Emmanuel
Saez (University of California at Berkeley)
did a study entitled “Income Inequality in the United States, 1913-1998.” They found that wage inequality in the U.S.
was compressed by World War II and stayed compressed until the 1970s. Since then it has increased. They also looked at the differences between
the U.S., Great Britain, and France; the latter two also
suffered from World War II. Wealth
concentration is now less in both countries than it was a century ago (and less
so in France than Great Britain). Their conclusion is an interesting one. “In France, top incomes are still
composed primarily of dividend income. . . .
In the United States,
due to the very large rise of top wages since the 1970s, the coupon-clipping
rentiers[44]
have been overtaken by the working rich.
Such a pattern might not last for very long because our proposed
interpretation [of the data] also suggests that the decline of progressive
taxation observed since the early 1980s in the United States could very well
spur a revival of high wealth concentration and top capital incomes during the
next few decades.” (Great Britain was in between the U.S.
and France.)
Saez and a colleague from Columbia did another
study, this one of wealth shares over 84 years (as opposed to income).[45] They wrote that “top wealth shares were very
high at the beginning of the period but have been hit sharply by the Great
Depression, the New Deal, and World War II shocks. Those shocks had permanent effects. Following a decline in the 1970s, top wealth
shares recovered in the early 1980s, but they still are much lower in 2000 than
in the early decades of the century.”
Most of the changes they found were in the top 0.1%. “Evidence from the Forbes 400 richest
Americans suggests that only the super-rich have experienced significant gains
relative to the average over the last decade.”
These results are consistent with the Piketty and Saez study, and they
conclude that “the rentier class of the early century is not yet
reconstituted. The most plausible
explanations for the facts have been the development of progressive income and
estate taxation which has dramatically impaired the ability of large wealth
holders to maintain their fortunes.”
“Together
the 400 richest Americans are worth more than $1 trillion. Just 400 people —
they could all stay at New York’s Plaza Hotel
at the same time — are worth nearly one-eighth of the total gross domestic
product of the United States,
the world’s richest economy.” – Holly
Sklar, Houston
Chronicle, October 4,
1999[46]
The Congressional Budget Office also examined tax burdens. I have commented previously that I am
unimpressed with claims that taxes were too high, inhibiting economic
activity. Greenstein and Shapiro review
the CBO data:
Contrary to claims
that taxes were at or near record levels before the 2001 tax cut, the CBO data
show that the percentage of income that most Americans paid in federal taxes
declined between 1979 and 2000 and was actually at relatively low levels in
2000, in historical terms. Among the middle fifth of families, for
example, the percentage of income paid in federal taxes—including income,
payroll, and excise taxes—dropped from 18.6 percent of income in 1979 to 16.7
percent of income in 2000. The 16.7 percent level was the lowest during
the 21-year period the CBO data cover.
The CBO data also show that before-tax incomes shot up faster among the
top one percent of the population during the 1990s—when their federal taxes
were increased—than during the 1980s, when their federal taxes were
reduced. These results—and the fact that investment and productivity
growth accelerated, rather than slowed, in the 1990s—cast doubt on the simple
theory that action to increase the tax burdens of those households is
economically destructive.
“I
am conscious that an equal division of property is impracticable. But the
consequences of this enormous inequality producing so much misery to the bulk
of mankind, legislators cannot invent too many devices for subdividing
property. . . . Another means of silently
lessening the inequality of property is to exempt all from taxation below a
certain point, and to tax the higher portions of property in geometrical
progression as they rise.” – Thomas
Jefferson
Revenue statistics from the OECD[47]
show that the United States’
total tax revenue as a percentage of Gross Domestic Product in 2000 was
29.6%. Of the 30 OECD members, only Mexico, Japan,
and Korea
had a lower percentage. Most of the OECD
member countries had percentages in the mid-30s to the low 40s; the OECD
average is 37.4%. This suggests to me that
Americans, in comparison with citizens in other developed countries, are taxed
much more lightly. Another set of OECD
data, from 1998, confirms this impression:
The average effective household tax rate (personal income tax) for
1991-96 in the U.S.
was 14.4%. There were others that were
lower (Japan, France, Czech
Republic, Greece,
Hungary, Korea, Mexico,
Poland, Spain, and Sweden), but the majority were
higher. (Most other OECD countries also
have a value-added tax that ranges from 2.6% to 22%, so even those countries
with lower personal income tax rates have total tax burdens that are higher
than the U.S. I believe the U.S. is the only OECD country that
does not have a VAT.) A political science
faculty friend who teaches in international politics points out, however, that “most states have a sales tax, which operates
similarly. The VATs of the European
countries are actually regressive. It’s
their progressive income taxes, and their redistributive social policies, that
distinguish their societies from ours.”
Even putting aside questions of favorable genetic inheritance and
environment, inequality of income/assets would be less of a concern if there
were sufficient social mobility and access to the means to accomplish it (e.g.,
education[48]). The Heritage Foundation has
sponsored at least one paper which argues that there is great
inter-income-quintile mobility.[49] Serious research at the Federal Reserve Bank
of Chicago by Bhashkar Mazumder,
however, leads to another conclusion.[50] Alan Krueger,[51]
writing in the New York Times in 2002, described the research and noted that
the correlation between the earnings of father and son was about 65%. The same was true for fathers and daughters. After looking at other studies on
intergenerational income mobility, Krueger concludes that “the data challenge
the notion that the United
States is an exceptionally mobile
society. If the United States stands out in
comparison with other countries, it is in having a more static distribution of
income across generations with fewer opportunities for advancement. . . . Only South Africa and Britain
have as little mobility across generations as the United States.” If one is born in the economic bottom fifth
of the population, the chances one will get to the top are about 7% and about
18% to the middle. There is a 37% chance
one will stay in the bottom quintile.
Carol Graham and H. Peyton Young of the Brookings Institution[52]
have written that “the strong belief that the United
States is the land of opportunity helps explain why
support for income redistribution is low in America
compared to Europe. . . . The majority of Americans believe they will
be above average (mean) income in the future, a Lake Wobegon
fantasy if there ever was one.
“To what extent is the United States the land of
opportunity? How does the myth—which has
clear effects on political attitudes—square with reality? Evidence suggests that mobility in the United States
is not higher than in other countries in the Organization for Economic Cooperation
and Development and indeed may be lower.
A Stockholm University study finds more mobility in Sweden than the United States. . . . Indeed a
recent Brookings Institution book records more mobility in a 10-year period in Peru than in the United States. . . . While the public may still believe that we
live in the land of opportunity, the reality is one of increasing
stratification of incomes and of opportunities.”
Another
author, looking at Dr. Mazumder’s study,[53]
wrote that public beliefs are “rooted in an imaginary economy where anyone can
make it. Rolled-up sleeves, diligent
study, or a dollop of blind luck coupled with raw ingenuity can rocket a
high-school dropout or a penniless immigrant into the income stratosphere. If this were true, it would make the chasm of
American inequality mildly easier to bear.
We might forgive a skewed income distribution if we thought that
everyone had a fair shot at getting to the top.” However, in the U.S., “a good predictor of a person’s
future earnings is the income (and earnings) of his/her parents. Caste and class still stalk the American
economic landscape.”
“A
number of recent economic studies have corroborated the intuition that the United States
is not the free-wheeling rags-to-riches capitalism of lore. These are generally done by trying to
estimate the impact of parental income on child’s income, or, in economese, the
“intergenerational income elasticity.” This measures the percent change of a
child’s income correlated with a 1% change in the parent’s income, controlling
for the age differences at the time of measurement.
“The
latest estimate by Bhashkar Mazumder finds that this number is around 0.6,
which is extremely high relative to both past studies and other OECD
countries. Say Anna’s parents are 10% richer
than Alice’s, then Anna is likely to have a 6%
higher income than Alice. What’s worse is that this number itself
varies by income; the very rich and very poor are unusually good at passing
along their economic characteristics to their kids.”
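The “intergenerational income elasticity” described above is simply the slope of a regression of children’s log income on parents’ log income. As a minimal sketch (the data below are synthetic and purely illustrative; the 0.6 target is borrowed from the Mazumder estimate quoted above, not from his actual data):

```python
import math
import random

def income_elasticity(parent_incomes, child_incomes):
    """Estimate intergenerational income elasticity as the ordinary
    least-squares slope of log(child income) on log(parent income)."""
    xs = [math.log(p) for p in parent_incomes]
    ys = [math.log(c) for c in child_incomes]
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    return cov / var

# Synthetic illustration: generate children whose log income moves
# 0.6-for-1 with their parents' log income, plus noise, then recover
# that slope from the simulated sample.
random.seed(42)
parents = [random.lognormvariate(10.5, 0.7) for _ in range(5000)]
children = [math.exp(0.6 * math.log(p) + 4.2 + random.gauss(0, 0.3))
            for p in parents]
beta = income_elasticity(parents, children)
print(round(beta, 2))  # close to the 0.6 used to generate the data
```

An elasticity of 0.6 means a 10% difference in parents’ income translates, on average, into a 6% difference in children’s income, which is exactly the Anna/Alice arithmetic in the passage above.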
Professor Krugman echoes Graham and Young. He opened an article in The Nation
with the following:[54]
The other day I found myself reading a
leftist rag that made outrageous claims about America. It said that we are becoming a society in
which the poor tend to stay poor, no matter how hard they work; in which sons
are much more likely to inherit the socioeconomic status of their father than
they were a generation ago.
The name of the leftist rag? Business
Week, which published an article titled “Waking Up From the American
Dream.” The article summarizes recent
research showing that social mobility in the United States (which was never as
high as legend had it) has declined considerably over the past few
decades. If you put that research
together with other research that shows a drastic increase in income and wealth
inequality, you reach an uncomfortable conclusion: America looks more and more like a
class-ridden society.
Krugman argues that “the myth of income mobility has always
exceeded the reality: As a general rule,
once they’ve reached their 30s, people don’t move up and down the income ladder
very much.” There have been studies that
show people moving from lower-paid to higher-paid jobs, or that children of
parents in one of the lower fifths of the population move into higher
fifths. Krugman maintains, however, that
serious studies show that such movement is less than it appears.
Krugman agrees that “America
was once a place of substantial intergenerational mobility: Sons often did much better than their
fathers. A classic 1978 survey found
that among adult men whose fathers were in the bottom 25 percent of the
population as ranked by social and economic status, 23 percent had made it into
the top 25 percent. In other words,
during the first thirty years or so after World War II, the American dream of
upward mobility was a real experience for many people.” Again, I wonder about the effect of the G.I.
Bill. But in recent years, the number
has declined. Krugman: “Business
Week . . . finds that this number has dropped to only 10
percent. That is, over the past
generation upward mobility has fallen drastically. Very few children of the lower class are
making their way to even moderate affluence.
This goes along with other studies indicating that rags-to-riches
stories have become vanishingly rare, and that the correlation between fathers’
and sons’ incomes has risen in recent decades.
In modern America,
it seems, you’re quite likely to stay in the social and economic class into
which you were born.”
Part of this, Krugman agrees with Business Week, is because
of the growth in the number of jobs at places like Wal-Mart, “dead-end,
low-wage jobs and the disappearance of jobs that provide entry to the middle
class.”  Krugman argues, however, that “public
policy plays a role—and will, if present trends continue, play an even bigger
role in the future.” Krugman’s
conclusion:
Put it this way: Suppose
that you actually liked a caste society, and you were seeking ways to use your
control of the government to further entrench the advantages of the haves
against the have-nots. What would you
do? One thing you would definitely do is
get rid of the estate tax, so that large fortunes can be passed on to the next
generation. More broadly, you would seek
to reduce tax rates both on corporate profits and on unearned income such as
dividends and capital gains, so that those with large accumulated or inherited
wealth could more easily accumulate even more.
You’d also try to create tax shelters mainly useful for the rich. And more broadly still, you’d try to reduce
tax rates on people with high incomes, shifting the burden to the payroll tax
and other revenue sources that bear most heavily on people with lower incomes.
“The haves
are on the march. With growing inequality,
so grows their power. And so also
diminish the voices . . . of civil society, the voices of a democratic and
egalitarian middle class.” – James K.
Galbraith[55],
Created Unequal: The Crisis in American Pay (1998)
As
the refined data from the Federal Reserve Bank of Chicago demonstrate, and as
Krugman wrote, “in reality, moves from the bottom to the top quintile are
extremely rare; a typical estimate is that only about 3 percent of families who
are in the bottom 20 percent in one year will be in the top 20 percent a decade
later. About half will still be in the
bottom quintile. And even those 3 percent that move aren’t necessarily Horatio
Alger stories. The top quintile includes
everyone from a $60,000 a year regional manager to Warren Buffett.”
On the import of a sharp downward
revision on intergenerational income mobility in the U.S.,
the Chicago Federal Reserve’s Mazumder concludes that “given the rising
evidence from studies in other countries it appears that the U.S. may be among the most immobile
countries. This comparative view
suggests that there might be some important institutional features about the U.S.
that create such a high level of persistence of income.”[56]
“I
believe in a graduated income tax on big fortunes, and in another tax which is
far more easily collected and far more effective – a graduated inheritance tax on
big fortunes, properly safeguarded against evasion and increasing rapidly with
the size of the estate.” – Theodore
Roosevelt
The estate tax appears repeatedly in the discussions of the
mobility and inequality data. Professor
Saez, who wrote with Piketty about England, France, and the U.S.
(see above), was quoted in an article by Professor Krueger in The New York
Times: “So the capital-rich rentier
class has been replaced by handsomely paid executives, athletes, and
entrepreneurs. But Professor Saez
predicts ‘if the estate tax goes away and top income rates are cut, a new
generation of rentiers should reappear down the road.’”[57] Saez certainly hints at its importance when
he and Kopczuk write about “the development of progressive income and estate
taxation which has dramatically impaired the ability of large wealth holders to
maintain their fortunes.”
Bartels at Princeton has a view
that parallels Graham and Young (“the strong belief
that the United States
is the land of opportunity”). “The
apparent absence of outrage over inequality is sometimes attributed to a
peculiarly American faith in upward economic mobility. For example, a journalist observing strong
public support for repealing the estate tax wrote that
Many
voters say they can imagine this tax applying to them. While only 1 percent of Americans classified
themselves as wealthy in a 1999 Newsweek poll, another 41 percent said it was “very
likely” or “somewhat likely” that they would become wealthy. Other surveys show that Americans are much
less prone to class envy than Europeans are, and more optimistic that they or
their children will join the ranks of the affluent.”
Professor Bartels
studied this issue in some depth. The
results of his study are a depressing commentary on the level of sophistication
in the thinking of the American public.
He reported that
the
proportion of people favoring repeal of the tax in [the 2002 National Election
Study survey[58]]
varied with relevant circumstances and political views. In the sample as a whole, almost 70% favored
repeal. But even among people with
family incomes of less than $50,000 (about half the sample), 66% favored
repeal. Among people who want to spend
more money on federal government programs, 68% favored repeal. Among people who said that the difference in
incomes between rich people and poor people has increased in the past 20 years and
that that is a bad thing, 66% favored repeal.
Among those who said that government policy is a “very important” or “somewhat
important” cause of economic inequality, 67% favored repeal. Among those who said that the rich are asked
to pay too little in federal income taxes, 68% favored repeal. And finally, among those with family incomes
of less than $50,000 who want more spending on government programs and
said income inequality has increased and said that is a bad thing and
said that government policy contributes to income inequality and said
that rich people pay less than they should in federal income taxes—the 11% of
the sample with the strongest conceivable set of reasons to support the estate
tax—66% favored repeal.
Why is this so? Professor Bartels suggests the idea of “misplaced
self interest,” and goes on to write that “additional evidence to that effect
appeared in the 2003 survey of Americans’ views on taxes sponsored by NPR, the
Kaiser Foundation, and the Kennedy
School. Asked whether ‘most families have to pay the
federal estate tax when someone dies or only a few families have to pay it,’
half the respondents said that ‘most families have to pay,’ while an additional
18% said they didn’t know. Thus,
two-thirds of the American public seems not to understand the single most
important fact about the estate tax: that it is only paid by very wealthy people.”
“It
should be no surprise that when rich men take control of the government, they
pass laws that are favorable to themselves.
The surprise is that those who are not rich vote for such people, even
though they should know from bitter experience that the rich will continue to
rip off the rest of us.” – [Catholic Father] Andrew Greeley, novelist and columnist, Chicago Sun-Times,
February 18, 2001
A friend in Economics at the
University tells me that “there has been some work in economics on optimal
taxes, where ‘optimal’ is defined as balancing the disincentive effect of taxes
on effort against the desire to redistribute from the haves to the have nots.” He goes on to say that “in general,
economists are less enthusiastic about income redistribution than are others
because of the unintended consequences that political reformers sometimes don’t
consider. And the smaller the geographic
area, the less enthusiastic we are. If
an individual city . . . tries to help the poor by rent controls or other
devices to redistribute income, the property owners let their rental buildings
run down, or the rich pick up and leave.
Same to a lesser extent when an individual state does so. Same for a country, at least when the country
is small and Scandinavian.”
At the same time, the economist
Saez at Berkeley
writes that “the difficult question to answer is why large fortunes did not
recover from those shocks [World War I, the Great Depression, World War II]
during the very prosperous decades following World War II. The most natural and realistic candidate for
an explanation seems to be the creation and the development of the progressive
income tax (and of the progressive estate tax and corporate income tax).”
So
why is all of this even worth any attention?
I think there are several reasons, but first I want to address a
preliminary point.  Do I think a U.S.
income distribution with a Gini coefficient of 0 would be a good idea? I do not.
That kind of flat income distribution is reminiscent of the
(unattainable and, in my opinion, not even attractive) ideal espoused by
communism, I think: “from each according
to his ability, to each according to his need.”
As a prescription for society, that makes me shudder. It takes no account of human needs and
aspirations—or human competitiveness and differences in drive. I suspect such a society would dull all
innovation.
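For readers who have not run into the Gini coefficient: it is 0 when everyone has identical income and approaches 1 as a single person captures nearly everything. A small sketch of one standard way to compute it, from the mean absolute difference between all pairs of incomes (the income figures below are made up for illustration):

```python
def gini(incomes):
    """Gini coefficient of an income list: 0 means everyone has the
    same income; values near 1 mean one person has nearly everything.
    Computed from the mean absolute difference between all pairs,
    normalized by twice the mean income."""
    n = len(incomes)
    mean = sum(incomes) / n
    diff_sum = sum(abs(a - b) for a in incomes for b in incomes)
    return diff_sum / (2 * n * n * mean)

# A perfectly flat distribution -- the hypothetical Gini of 0
# discussed above:
print(gini([50_000] * 10))  # 0.0

# A skewed (hypothetical) distribution scores much higher:
print(round(gini([20_000, 30_000, 40_000, 200_000]), 3))  # 0.474
```

This pairwise formula is O(n²) and fine for illustration; real inequality studies compute the same quantity from sorted income data or grouped survey tables.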
“Nothing in
the fundamentals of capitalism requires a very steep pyramid of power or the
malignant maldistribution of wealth and income.” - William Greider
Richard
Freeman, writing in Boston Review a few years ago, put one concern this
way.[59] “Falling or stagnating incomes for most
workers and rising inequality threaten American ideals of political ‘classlessness’
and shared citizenship. Left unattended,
the new inequality threatens us with a two-tiered society—what I have elsewhere
called an ‘apartheid economy’—in which the successful upper and upper-middle
classes live lives fundamentally different from the working classes and the
poor. Such an economy will function well
for substantial numbers, but will not meet our nation’s democratic ideal of
advancing the well-being of the average citizen. For many it promises the loss
of the ‘American dream.’”
“In
a rich society, no one should be allowed to suffer from deprivation such as
homelessness, starvation and illness.
This ideal is essential, not simply as a matter of human good, but as
the price we pay for a measure of domestic tranquility.” - John Kenneth Galbraith
“History
has shown that large inequalities of wealth, power and knowledge are a natural
part of all human civilizations. But so
are the resentments and rebellions that develop when these inequalities become
too great.” – Anatole Kaletsky,
The Times (London),
February 4, 1999
One consideration is purely
pragmatic. One might recall Marie
Antoinette declaring “let them eat cake” (which she really didn’t say,
apparently). The period, however,
warrants recollecting: the economic
circumstances and the division between rich and poor in France were such that the peasants
led a revolution in 1789 which included, among other things, the beheading of
the king and queen and a lot of aristocrats (and various others as well during
The Terror). I don’t know that the U.S.
faces the possibility of a revolution in that sense, but one also doesn’t have
to go as far back as France in 1789 to find increasing popularity of
demagoguery in times of economic distress; think about Huey Long and Father
Coughlin in the 1930s, when the economy was in the dumps and income/asset
inequality was as marked as it is again today.
One can also think about the nation’s urban riots in the 1960s, which
were related in part to economic divisions in the country. (I did not include in all of the data here
the numbers for minority groups, which are significantly worse than the overall
data that I presented.)
[Sidebar] An amusing (at least to me) side story, from the web site “The Straight Dope.” According to Cecil Adams, “while Marie Antoinette was certainly enough of a bubblehead to have said the phrase in question, there is no evidence that she actually did so, and in any case she did not originate it. The peasants-have-no-bread story was in common currency at least since the 1760s as an illustration of the decadence of the aristocracy. The political philosopher Jean-Jacques Rousseau mentions it in his Confessions in connection with an incident that occurred in 1740. (He stole wine while working as a tutor in Lyons and then had problems trying to scrounge up something to eat along with it.) He concludes thusly: ‘Finally I remembered the way out suggested by a great princess when told that the peasants had no bread: “Well, let them eat cake.”’
“Now, J.-J. may have been embroidering this yarn with a line he had really heard many years later. But even so, at the time he was writing—early 1766—Marie Antoinette was only ten years old and still four years away from her marriage to the future Louis XVI. Writer Alphonse Karr in 1843 claimed that the line originated with a certain Duchess of Tuscany in 1760 or earlier, and that it was attributed to Marie Antoinette in 1789 by radical agitators who were trying to turn the populace against her.”
The Yahoo website provides further elaboration, citing “The Straight Dope”: “Actually, Rousseau wrote ‘Qu’ils mangent de la brioche,’ which essentially means ‘let them eat a type of egg-based bread’ (not quite cake, but still a bit extravagant). Rousseau claimed that ‘a great princess’ told the peasants to eat cake/brioche when she heard they had no bread.
“So it’s highly unlikely that Marie uttered the pompous phrase. Perhaps Rousseau invented it to illustrate the divide between royalty and the poor—which is certainly how the phrase has been used ever since.
“However, ‘Let them eat brioche’ isn’t quite as cold a sentiment as you might imagine. At the time, French law required bakers to sell fancy breads at the same low price as the plain breads if they ran out of the latter. The goal was to prevent bakers from making very little cheap bread and then profiting off the fancy, expensive bread. Whoever really said ‘Let them eat brioche’ may have meant that the bakery laws should be enforced so the poor could eat the fancy bread if there wasn’t enough plain bread to go around.”
And according to the Urban Legends web site, Lady Antonia Fraser, who has recently written a biography of Marie Antoinette, declared that she never said it. “It was said 100 years before her by Marie-Therese, the wife of Louis XIV,” Fraser explains. “It was a callous and ignorant statement and she [Antoinette] was neither.” Fraser and Adams appear not to agree about Marie Antoinette’s character.
While I think there is much the United States could do to improve itself and the world, a revolution would hardly serve anyone’s purposes (except perhaps Al Qaeda). At the personal level, those of us who have children do not want to see them grow up in a nation wracked by internal war and fighting. The folks I know who don’t have children share that sentiment. Galbraith and Kaletsky, however, have put their finger on one big potential problem with the trend in the income and wealth distribution. I don’t think anyone should be so naïve or complacent as to think this country is immune from tumult arising from economic circumstances. Part of the reason is the web and TV: while there have been great gulfs between the wealthy and the rest of the population in the American past, they almost certainly weren’t as well known to the mass public as the difference is today. Sure, President Bush and the political system appear to enjoy considerable support in the society (or at least support by those who respond to polls), but there is an underclass that could, at some point, rise up with unpleasant consequences for all of us. I am not predicting that this will happen, only noting that it could.
Another consideration is justice at
the personal level. Branko Milanovic, a
researcher with the World Bank who appears to have spent much of his career
studying economic inequality, posed one example which I modify for the purposes
of this letter.[60] Suppose we have a small classroom of
students; a good fairy arrives and gives one student $20,000 and everyone else
25¢ for no apparent reason. Everyone’s
welfare is increased, but the effects would not be positive. Some “might refuse to accept [the] quarter,
some might leave it in the room, others throw it away in disgust.” Milanovic surmises that “many (most) of those
who would have received 25 cents would not merely not feel better . . . but
would rather feel worse. They would feel
worse off because their feeling of justice and propriety would have been hurt.” The point is that income is not only a way to
buy things, it is a recognition of how society values people, an expression of
worth. “Hence a large (and particularly
if unjustified or unclear) difference in income will be viewed as a slight on
their own worth.”[61]
To make the example a little more
realistic, we only have to look at the distribution of faculty salaries at a
place like the University
of Minnesota, Twin Cities
(and its salaries are not dissimilar from other Big Ten/large research
universities). The faculty in law,
business, and medicine make more (in some cases a lot more) than
faculty, for example, in English, history, and philosophy. The faculty members in all fields have a
similar job description—they are to teach students, do research, and provide
service to the state/community. Some
teach more, some less; some do more research, some less. But at root they are all about the same
general tasks. The fact that there are
large discrepancies in salaries among fields does, in fact, leave some faculty
in the lower-paid departments resentful about the differences. I suspect most of them do not lead an
organizational rebellion because they ruefully concede the pressure of the
external market on the salaries of those in law, business, and medicine. As a colleague of mine wrote to me, “like it
or not, market forces here, and in all of society, are inevitable—not
necessarily popular or unpopular, good or bad, just inevitable. But market forces are but one factor” in
determining just compensation.
“At some
point, CEO pay is going to rub up against the gross national product, and I’ve
been told by some economists that will put a cap on their salaries.” – Graef Crystal, compensation analyst, April 20, 2000
What I have a hard time understanding is the astronomical salaries
and benefits arrangements that have been handed out to corporate executives in
recent years. (As I compose this, the
New York Stock Exchange is seeking help from the Securities and Exchange
Commission in recovering some of the $140 million the NYSE paid to Richard
Grasso when he was chair of the Exchange.)
The gap between the “normal” employees and the CEOs and other executives
is so great that it defies comprehension.
And most people’s sense of what is fair.
“There’s no
more central theme in the Bible than the immorality of inequality. Jesus speaks more about the gap between rich
and poor than he does about heaven and hell.” – Jim Wallis, editor, Sojourners magazine (1999)[62]
There is also justice at the social level. When there are large disparities in income
and wealth, life opportunities are not distributed fairly. If one were to lock up a child in a room for
10 years, providing only food and sanitary facilities, one would end up with a
very badly damaged human being. It seems
to me that we do that same thing when we have children being reared in places
like the slums of East St. Louis or in the rural poverty that afflicts various
parts of the country (to say nothing of the awful conditions in places like
Bangladesh or other desperately poor third-world nations). To imprison a child in a room for 10 years is
punishable in the criminal justice system; allowing children to be raised in
circumstances that almost guarantee they will fail in life (“fail” at least by
middle/upper class standards of success) is not. I do not say it should be criminal (who would
one put in jail?) but it violates my sense of justice that we as a society are
not more active in trying to remedy the situation.
“The
man of great wealth owes a peculiar obligation to the state because he derives
special advantages from the mere existence of government.” – Theodore Roosevelt
“I
personally think that society is responsible for a very significant percentage
of what I’ve earned. If you stick me
down in the middle of Bangladesh
or Peru
or someplace, you’ll find out how much this talent is going to produce in the
wrong kind of soil.” – Warren Buffett
This is also where I take issue with those who tend to the
libertarian end of the political spectrum:
There is an implicit (or not-well-explained-away) assumption that
everyone can make it in life, no matter where they start from—and that those
who have made it are largely responsible for their own success. Given what we know about the contributions of
genes to human abilities, and about environment to behavior, learning, and
attitudes, and about the role of pure luck in life, I would venture to surmise
that most of us are only slightly responsible for our own success (or
failure). We happened to pick the right
parents who provided us the intellectual and perhaps economic wherewithal to
make our way in the world with a modicum of success, as the intergenerational
income mobility data suggest. And we are
allowed to do that in a society that we have structured to permit such success.
“The good men
may do separately is small compared with what they may do collectively.” –
Benjamin Franklin
Thomas Hobbes famously wrote that the life of man in nature (that
is, in a state without organized society) is “solitary, poor, nasty, brutish,
and short.” So, the story goes, we came
together in a society to ameliorate the ills of living in the wild. In doing so, we set for ourselves the rules
by which we will live together.
Sometimes we set rules that favored a few over the many (e.g., ancient Egypt, societies that recognized the divine
right of kings); sometimes, more recently, we set rules that favor a larger
number of the group (e.g., the U.S.
and many industrialized countries, especially Western Europe, Australia,
etc.). Humans have set up societies in a
multitude of ways. Sometimes there are
revolutions because some group, large or small, doesn’t like the rules any more
and topples the whole system and starts over (e.g., France
in 1789, Russia in 1917).
What is key for me, however, is that nobody starts out with any
particular right to anything—I have come to conclude that there are no “fundamental”
rights, only rights that society gives—that we the people agree to give each
other when we live together as a group.
I don’t argue this as a matter of principle, merely as a recognition of
fact. In some countries, one person
decides the rights (e.g., Nazi Germany); in others, they are decided upon
through various legislative and judicial bodies. In the U.S., of course, fundamental rights
are set forth in the constitution. But
they are, as any student of American constitutional history knows, subject to
continuing interpretation and application by the courts and Congress (and
amendment from time to time).[63]
My point is that if we set rules that allow some people to become
fabulously wealthy and others to exist in poverty, we can change the
rules. I would certainly not argue
against allowing anyone with the energy, intelligence, and luck to become
wealthy. Wouldn’t mind getting there
myself, although given my career choice that isn’t going to happen. But one must recognize that the achievement
of wealth comes in the context of rules we all set—we allow it to happen, as
Roosevelt and Buffett suggest.  No one in
any organized society has an inherent right granted by the Almighty or anyone
else to keep all the money he or she earns, without limit. I submit that under the rules we have all
agreed to, we can require that those who achieve such wealth pay back some of
the money in order to help us alleviate the poverty of the others. If one assumes that not all start out at the
starting line of the race of life with an equal chance, then those who started
out with advantages (e.g., money, ability, personality, environment) can be
required to give some back to help those who started with disadvantages. One does not have to advocate confiscatory
taxation to take this position.
Bill Gates Sr., father of the Microsoft founder, argues that Bill Gates
Jr.—”along with every other wealthy American—owes an enormous debt to that
much-maligned entity, the federal government.
That government, Gates says, provides the security and stability that a
thriving economy needs: the legal
system, the financial markets, the basic orderliness and predictability that
let people engage in the kinds of entrepreneurial activities that produce great
wealth. Other preconditions of much
American wealth-building, he notes, are publicly funded education and research.
. . . He calls the United States government ‘the
greatest venture capitalist in the history of the world’ and points out that
‘there’d be no Internet today but for the federal government. Zero.
The software industry: to a large
extent dependent on things that happened on college campuses where research by
smart people was being supported by the federal government.’”  (“Sharing the Wealth?” Washington Post Magazine, April 13, 2003.)
The OECD publishes a table of data on public social expenditures
(as a percentage of Gross Domestic Product) for its member countries. (Public social expenditures include: old age cash benefits, disability benefits,
accident/injury benefits, sickness benefits, services for the elderly and
disabled, survivors, family cash benefits, family services, labor programs,
unemployment, health, housing, and other.)
The most recent data are for 1998.
The U.S.
spent 14.59% of its GDP on public social expenditures. The only OECD members that spent less were Turkey, the Slovak
Republic, Mexico,
and South Korea. The two countries that spent about the same
were Ireland and Japan. All of the others spent a significantly
greater percentage. Here are the
numbers:
Public Social Expenditures as a Percent of Gross Domestic Product
Sweden 30.98          New Zealand 20.97
Denmark 29.81         Spain 19.71
France 28.82          Czech Republic 19.42
Switzerland 28.28     Iceland 18.44
Germany 27.29         Portugal 18.21
Norway 26.97          Canada 18.03
Austria 26.80         Australia 17.81
Finland 26.54         Ireland 15.77
Italy 25.07           Japan 14.66
Great Britain 24.70   U.S. 14.59
Belgium 24.54         Slovak Republic 13.57
Netherlands 23.90     Turkey 11.59
Poland 22.83          Mexico 8.22
Greece 22.73          Korea 5.94
Luxembourg 22.09
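As a quick sanity check on the claim that only four OECD members spent a smaller share than the U.S., here is a small Python sketch using only the figures from the table above:

```python
# Public social expenditure, % of GDP, 1998 (figures from the table above).
spending = {
    "Sweden": 30.98, "Denmark": 29.81, "France": 28.82, "Switzerland": 28.28,
    "Germany": 27.29, "Norway": 26.97, "Austria": 26.80, "Finland": 26.54,
    "Italy": 25.07, "Great Britain": 24.70, "Belgium": 24.54,
    "Netherlands": 23.90, "Poland": 22.83, "Greece": 22.73,
    "Luxembourg": 22.09, "New Zealand": 20.97, "Spain": 19.71,
    "Czech Republic": 19.42, "Iceland": 18.44, "Portugal": 18.21,
    "Canada": 18.03, "Australia": 17.81, "Ireland": 15.77, "Japan": 14.66,
    "U.S.": 14.59, "Slovak Republic": 13.57, "Turkey": 11.59,
    "Mexico": 8.22, "Korea": 5.94,
}

# Which of the listed members spent a smaller share than the U.S.?
below_us = [c for c, v in spending.items() if v < spending["U.S."]]
print(sorted(below_us))  # only four countries: Korea, Mexico, Slovak Republic, Turkey
```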
One criticism I suppose you could offer, when drawing comparisons
with places like Japan and much of Europe, is that their economies have not
done nearly as well as the U.S. economy in the last 10-12 years, and the
question one could ask is “why would we want to be like them?” in either social
welfare expenditures or in greater wealth/income equality. One response that comes to mind is that part
of the reason the U.S. economy is doing well is because it is being built on
the backs of the poor, because places like Wal-Mart (by corporate policy) do
not offer full-time jobs that carry health and retirement benefits to the vast
majority of their retail workforce. One
only needs to read Nickel and Dimed by Barbara Ehrenreich to come to an
understanding of how people in these kinds of jobs only barely make it in life,
and with little or no health care, no retirement, and wages that most of us
would consider pretty dismal—and how it is that among the 20 wealthiest people
in the U.S. are several of the heirs of the Wal-Mart founder.[64] The safety net in the United States has gaping holes in
it. I will always remember the
conversation that Pat and I had with a Dutch book publisher while having drinks
at a table next to one of the canals in Amsterdam. He commented something to the effect that “we
take care of everyone because the Dutch are a wealthy and industrious people.” One does not have to suggest the U.S.
be like the nearly-socialist Scandinavian countries to take the position that
we could be doing a lot better than we are without bankrupting the state and
federal governments. Of course,
Americans would have to get away from their tax-cutting mania to begin to
afford even a modest increase in the public safety net that other
industrialized countries seem to manage.[65] We can change the rules on the distribution
of income, I suspect, without doing great damage to our economy.
One of my faculty commentators, however, made this
observation: “While the western Europeans currently have a
finely-woven safety net, they all experience persistent 10% unemployment, a
degree of ennui and hopelessness among younger people, strong prejudices
against immigrants, especially if they are non-Christian or a different color,
and a birth rate that cannot possibly support the survival of their present
system. Direct foreign (and domestic)
investment is declining. German
companies are looking to America
as a more productive environment. A lack
of incentives and heavy taxation of earnings reduce individual investment in
higher education. In France, a caste system among
universities dooms or predestines entrants to their future jobs based on where
they were admitted.” My only comment,
which I’ll follow up on later, is that there has to be some middle ground.
A British
researcher, Professor Frank Castles, has looked into public welfare spending in
the OECD countries.[66] He writes that there has been the assumption
that “the welfare state as we know it is doomed to extinction by globalisation of
the world economy. In its most dramatic
variant, this story is about an inevitable ‘race to the bottom’ in social
expenditure as governments cut back programmes in order to reduce levels of
taxation. Why must governments make such
cuts? The answer we are given is that
the costs of social provision, of better wages and conditions and of
environmental safeguards necessarily fall on the production costs of
businesses, making it more difficult for them to compete in international
markets. Unless these costs are cut
back, jobs will be lost and migrate overseas, while firms will close down, taking
their capital with them.”
Castles
examined public welfare expenditures in the OECD countries for the period
1980-1998 and concluded that “the idea of ‘race to the bottom’ is very far from
the truth. In this nearly two decade
period, when the international economy was rapidly becoming more closely
integrated, social expenditure was actually growing not declining, going up
from 18.7 to 22.7 [percent] of gross national product in the average Western nation. . .
. Over the period as a whole, in the
vast majority of these countries, spending on the welfare state became a more
prominent part of total government spending than in the past.” Castles may have been using the same data
that I did; I only included the data for 1998, but the OECD provides it back to
1980. My cursory glance at the numbers
over time suggests that Castles is correct in his analysis.
Here
is a representative sample of the numbers from the OECD data 1980-1998 for U.S.
spending on public welfare as a percent of GDP:
1980 13.13 1989 12.93
1983 14.04 1992 15.11
1986 13.04 1995 15.41
1998 14.59
Castles
contends that the reason some argue “the welfare state is falling apart when,
self-evidently, nothing of the kind is happening” is because “the intention is
to persuade. To persuade us of
what? And the answer there too is pretty
obvious. To cut back the spending on the welfare state—on pensions for the old,
on a decent health system, on services to families—and to make it appear that
such cuts are forced on us by outside forces.”
On the other hand, as one colleague pointed out to me, “all western
countries are faced with a dilemma that their politicians are unwilling to
address; i.e., how will they pay for their promises without bankrupting their
economies. We are in a pell-mell fight
between Democrats and Republicans to see who can most successfully pander to
the most successful lobby in the country, the AARP. It’s ludicrous to pass prescription drug
legislation without means testing.”
The
trick is to avoid the problems identified by the OECD at the ministerial
level: “high and persistent unemployment
and . . . [that] too many people of working age are reliant upon welfare
support for long periods, and there are significant risks that this will be
transmitted to future generations.” It
is interesting that the third of the seven reasons the ministers identified as
causes for a need for welfare reform was “growing income inequality and persistent poverty and increased polarisation of households into
work-rich and work-poor. There
are now more households where there is no adult in paid employment, at the same
time as there are also more families with two-earners.”[67] One reviewer wrote “and why is there ‘high
and persistent unemployment’ in western economies? Some is due to the high costs of employment
relative to productivity and some is due to inadequate individual investment in
human capital. And some is related to
draconian regulations, especially in western Europe, that prohibit employers
from dismissing marginally performing employees, leading to an unwillingness to
take on a risky applicant.”
Another instrumental reason to be concerned about income/wealth
inequality has been advanced by Professor Wolff. Apart from the social justice concerns, “inequality
is actually harmful to the well-being of a society. There is now a lot of evidence, based on
cross-national comparisons of inequality and economic growth, that more unequal
societies actually have lower rates of economic growth. The divisiveness that comes out of large
disparities in income and wealth, is actually reflected in poorer economic
performance of a country. Typically when
countries are more equal, educational achievement and benefits are more equally
distributed in the country. In a country
like the United States,
there are still huge disparities in resources going to education, so quality of
schooling and schooling performance are unequal. If you have a society with large
concentrations of poor families, average school achievement is usually a lot
lower than where you have a much more homogenous middle class population, as
you find in most Western European countries.
So schooling suffers in this country, and, as a result, you get a labor
force that is less well educated on average than in a country like the Netherlands, Germany
or even France. So the high level of inequality results in
less human capital being developed in this country, which ultimately affects
economic performance.”
Wolff goes on to comment, apropos the issue of taxes, that “one
reason we have such high levels of inequality, compared to other advanced
industrial countries, is because of our tax and, I would add, our social
expenditure system. We have much lower taxes than almost every Western European
country. And we have a less progressive
tax system than almost every Western European country. As a result, the rich in this country manage
to retain a much higher share of their income than they do in other countries,
and this enables them to accumulate a much higher amount of wealth than the
rich in other countries. Certainly our
tax system has helped to stimulate the rise of inequality in this country. We have a much lower level of income support
for poor families than do Western European countries or Canada. Social policy in Europe, Canada and Japan does a lot more to reduce
economic disparities created by the marketplace than we do in this
country. We have much higher poverty
rates than do other advanced industrialized countries.”
Richard
Freeman, in Boston Review in the mid-1990s, asked “what are the causes
of the new inequality? The short answer
is that analysts disagree. Pat Buchanan
and Ross Perot put immigration and trade at the top. The Clinton Administration
blames technology. Republicans blame
taxes and government regulations. The
AFL-CIO stresses declining unionism and the fall in the real value of the
minimum wage. It would be nice to know
how much each of these factors (and others) contributes to the problem. But such knowledge is largely irrelevant to
the ‘what is to be done?’ question that should be our principal concern.” Freeman concludes his article by urging that “we
must find a cure to the inequality problem that is gnawing, like a cancer, at
the soul of our country.”
Another
reason to be concerned is one voiced by Bill Gates, Sr., father of the
Microsoft founder. Gates has been
leading the “fight to preserve the federal estate tax. Currently levied on the assets of fewer than
2 percent of Americans at the time of their death, this tax works, Gates
believes—albeit in a modest way—to keep the wealth gap from growing even wider.
‘It really is not a good world where the disparity that was illustrated here
exists,’ he says, ‘and an element in that is the plain and simple fact that
wealth is power.’ Democracy is at risk
when the rich ‘can basically buy public policy.’ And this is precisely what he thinks the
other side in the estate tax battle has done.”[68]
Two
questions about the increasing inequality of income/wealth are whether
Americans understand it, and if they do, whether they care about it. Professor Bartels has provided some useful
data. He wrote that “In light of these
developments, business writer Robert Samuelson (2001) argued that ‘If Americans
couldn’t abide rising inequality, we’d now be demonstrating in the streets.’ Instead, quite to the contrary, the past
three years have seen a massive additional government-engineered transfer of
wealth from the lower and middle classes to the rich in the form of substantial
reductions in federal income taxes. . . .
As a result, according to projections by the Institute on Taxation and
Economic Policy, the total federal tax burden in 2010 will decline by 25% for
the richest one percent of taxpayers and by 21% for the next richest four
percent, but by only 10% for taxpayers in the bottom 95 percent of the income
distribution.”
Drawing
on survey data from the 2002 National Election Study survey, Professor Bartels
reports that the responses demonstrate “widespread public recognition of the
sheer fact of growing economic inequality in contemporary America. . . . Moreover, a majority of those who recognized
that income inequality has increased said they thought that was ‘a bad thing’;
most of the rest said they “haven’t thought about” whether it is good or bad,
while only about 5% said it was a good thing.”[69]
Half of the
American public thinks that rich people are asked to pay less than they should
in federal income taxes—but almost half do not think so. More than 60% agree that government policies
have exacerbated economic inequality by helping high-income workers more—but
more than a third deny that assertion, and more than 85% say that “some people
just don’t work as hard.” More than 40%
say the difference in incomes between rich and poor has increased over the past
20 years, and that that is a bad thing—but an even larger proportion either don’t
recognize the fact or haven’t thought about whether it is a good thing or a bad
thing.
On the other hand, what is pretty clearly absent in
these data is any positive popular enthusiasm for economic inequality. Americans may cling to their unrealistic
beliefs that they, too, can become wealthy; but in the meantime they do not
seem to cherish those who already are.
Fewer than 7% say that a larger income gap between the rich and the poor
is a good thing (or that a smaller gap is a bad thing). Fewer than 15% say the rich are asked to pay
too much in taxes, while three times that many say the poor are asked to pay
too much in taxes. And the public as a
whole likes “big business” even less than it likes people on welfare, liberals,
feminists, the news media, and the Catholic church. Thus, the mystery of apparent public
enthusiasm for tax policies skewed in favor of the rich remains a mystery.
Explaining all
this politically is a “bank shot,” to use a billiards term. It requires trusting the voters with
complexity. Will they see that their new
$400 child credits are chump change compared with all the new fee hikes and
service cuts? Will they understand that
they’re paying more in state and local taxes so that a guy with a Jaguar
putting up a McMansion down the block can pay less in federal taxes? Will they connect those 30 kids cramming
their child’s classroom to decisions in far-away Washington?
Bartels
concludes: “The answer to these
questions suggested by my analysis is:
Not likely.”
A friend of mine on the faculty
wrote to me, after previewing what I had written, that “it is ‘not likely’ that
Americans will wake up—for a variety of reasons, all of them portending the
further erosion of democracy and increase in economic disparities; perhaps the
main reason is that they are too damn busy to think, and absolutely not likely
to get any less busy in the foreseeable future.
I can think of all kinds of reasons why many Americans are clueless
about most things economic and political that matter, but very few ways of
raising their cognition level. It is
worse now than it has been in my lifetime, and it wasn’t great even when I was
a kid.”
I have no solutions to offer, other
than dancing a little around the issue of the estate tax; solutions require far
more expertise than I can bring to bear on the subject. I do know, however, that I shall be listening
carefully to those in political office to hear what they have to say on this
subject. As my friend Professor Chuck
Speaks wrote to me, “that inequality exists cannot be argued. What we can debate is 1) is inequality good
or bad?, and 2) if bad, what is an acceptable level of inequality that can be
sustained without placing our economy and society in jeopardy?” I think it is a problem worthy of our
attention.
I
did find it interesting that in the article on “The Futile Pursuit of
Happiness,” the research has public
policy implications, such as, for example, with respect to changes in tax
laws. “The data make it all too clear
that boosting the living standards of those already comfortable, such as
through lower taxes, does little to improve their levels of well-being, whereas
raising the living standards of the impoverished makes an enormous difference.”
That old radical Benjamin Franklin argued that property “is
a ‘Creature of Society and is subject to the Calls of that Society whenever its
Necessities shall require it, even to its last Farthing.’” Franklin also
endorsed for the constitution of Pennsylvania,
at the 1776 convention drafting it, a provision providing “That an enormous
Proportion of Property vested in a few Individuals is dangerous to the Rights,
and destructive of the Common Happiness, of Mankind; and therefore every free State hath a Right
by its Laws to discourage the Possession of such Property.’” Toward the end of his life he wrote that “when
societies were formed they passed laws, and when ‘by virtue of the first Laws
Part of the Society accumulated Wealth and grew Powerful, they enacted others
more severe, and would protect their Property at the Expense of Humanity. This was abusing their Powers, and commencing
a Tyranny.’”[70] Franklin
also wrote that everyone has “a ‘natural right’ to all he earned that was
necessary to support himself and his family, . . . but all property superfluous
to such purposes is the property of the public, who by their laws have created
it.”[71]
Christopher Jencks, at Harvard’s John F. Kennedy School
of Government, writing about these very issues, commented that
One
might hope that rising levels of education would help voters understand their
economic interests, but the evidence on this score is discouraging. The correlation between higher income and
voting Republican has risen as the Republican Party has become more
homogeneous, but it is still relatively weak because voters care more about
issues like race and abortion than about economics. More than half the low-income voters who might
benefit from electing a Democratic senator or president do not bother to vote
at all. . . . All of this suggests that
the American political system may not be capable of reversing the growth of
economic inequality. On the other hand,
that is exactly what sensible critics would have concluded in 1928, and they
would have been wrong.[72]
As I
said at the beginning of this little disquisition, I thought few were paying
much attention to the issue of income and wealth inequality when I wrote these
paragraphs in late December ‘03 and January ‘04. Then I read in the February 2, 2004, issue of Time an
article in which authors argued that the federal government has “turned your
life into a spin of the roulette wheel. . . .
Overall, Washington
has structured the game just as any gambling house would, so there are a few
winners but a lot more losers.
It’s why many of us are falling further behind the
harder we work, why our debt dwarfs that of our parents, why some of us receive
world-class health care and others almost none, why some can afford college but
for others it has been priced out of reach, and why the wage gap between rich
and poor has started growing again. In
1992 the 400 individuals and families with the highest income in the U.S.,
according to tax returns filed with the IRS, received on average $12.3 million
in “salaries and wages.” By 2000, the
latest year available, that figure had more than doubled, to $29 million.
More significant, in 1992 it took the combined wages
of 287,400 retail clerks at, say, Walmart, to equal the pay of the top
400. By 2000 it required the combined
pay of 504,600 retail clerks to match the pay of the top 400.
My only dispute with the Time authors is “started.” It’s been going on for some time.
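The quoted figures imply an average retail clerk's annual pay in each year; a quick back-of-the-envelope check in Python, using only the numbers quoted above:

```python
# Figures quoted from the Time article above.
top400_avg_1992 = 12_300_000   # average "salaries and wages" of the top 400, 1992
top400_avg_2000 = 29_000_000   # same figure for 2000
clerks_1992 = 287_400          # clerks whose combined pay equaled the top 400's
clerks_2000 = 504_600

# Implied average retail-clerk pay in each year:
wage_1992 = top400_avg_1992 * 400 / clerks_1992
wage_2000 = top400_avg_2000 * 400 / clerks_2000
print(round(wage_1992), round(wage_2000))  # roughly 17,119 and 22,989

# The top 400's pay rose about 136% over the period;
# the implied clerk pay rose only about 34%.
print(round((top400_avg_2000 / top400_avg_1992 - 1) * 100),
      round((wage_2000 / wage_1992 - 1) * 100))
```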
Jim Holt wrote an article for Slate (msn.com) wondering
about the “difference principle” that the great philosopher John Rawls
advanced. In essence, the principle
holds that “differences in wealth, status, etc., can be defended only if they
create a system of market forces and capital accumulation whose productivity
makes the lowliest members of society better off than they would be under a
more egalitarian system.” If
“trickle-down” economics works better than an egalitarian system, it can be
justified. “To assess how economically
just a society is, . . . all you have to do is look at how well its worst-off
members are doing.” (Holt noted the
increasing Gini coefficient in the U.S.)
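For readers unfamiliar with it, the Gini coefficient Holt mentions is a standard summary of inequality running from 0 (everyone has the same income) toward 1 (one person has nearly everything). A minimal sketch of the computation:

```python
def gini(incomes):
    """Gini coefficient: 0 = perfect equality, values near 1 = extreme inequality.
    Uses the standard closed form over the sorted incomes:
    G = 2 * sum(i * x_i) / (n * total) - (n + 1) / n, with i running from 1."""
    xs = sorted(incomes)
    n = len(xs)
    total = sum(xs)
    weighted = sum((i + 1) * x for i, x in enumerate(xs))
    return (2 * weighted) / (n * total) - (n + 1) / n

# A perfectly equal society scores 0; concentrating income raises the score.
print(gini([10, 10, 10, 10]))          # 0.0
print(round(gini([1, 1, 1, 37]), 3))   # 0.675 -- one person holds most of the income
```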
In order to figure out how to get to the original
position, Rawls posited that everyone collaborating in creating a society would
do so from a “veil of ignorance”: no one
would know what position he or she would occupy in the new society. Rawls argued that everyone in that situation
will push for a society in which the worst-off citizens are in a position he or
she finds acceptable—because they don’t know if that’s where they will end up
once the veil is lifted. Holt is
critical of Rawls’s construct because it assumes people have no idea about
the distribution of wealth or the odds of being in one economic class or
another, that they will not take any risk, and that the society will be a zero-sum game
(where you start is where you must end up).
Are these risk-averse agents, Holt asks, “more reasonable than their
diametrical opposites: agents who would
choose a society that maximizes the well-being of the best-off class? (That, by
the way, seems to be the sort of society we are drifting toward.)” Not only has Rawls’s goal of a just society
not been achieved, we are going in the opposite direction.[73]
* * *
I passed around to a few friends an email message I had
received from another friend. The author
compared the $87 billion cost of the Iraq war to what that money could have
bought in other ways. “To get some
perspective, here are some real-life comparisons about what $87 billion means:
$87
billion is more than the combined total of all state budget deficits in the United States.
The Bush administration proposed absolutely zero funds to help states deal with
these deficits, despite the fact that their tax cuts drove down state
revenues. [Source: Center on
Budget and Policy Priorities].
$87
billion is enough to pay the 3.3 million people who have lost jobs under George
W. Bush $26,363.00 each! The unemployment benefits extension passed by
Congress at the beginning of this year provides zero benefits to workers who
exhausted their regular, state unemployment benefits and cannot find work
[Source: Center on Budget and Policy Priorities].
$87
billion is more than double the total amount the government spends on Homeland
Security. The U.S.
spends about $36 billion on homeland security. Yet Sen. Warren
Rudman (R-NH) wrote that America
will fall approximately $98.4 billion short of meeting critical emergency
responder needs for homeland security without a funding increase. [Source:
Council on Foreign Relations].
$87
billion is 87 times the amount the Federal Government spends on after-school
programs. George W. Bush proposed a budget that reduces the $1 billion
for after-school programs to $600 million, cutting off about 475,000 children
from the program. [Source: The
Republican-dominated House Appropriation Committee].
$87
billion is more than 10 times what the government spends on all environmental
protection. The Bush administration requested just $7.6 billion for the
entire Environmental Protection Agency. This included a 32 percent cut to
water quality grants, a 6 percent reduction in enforcement staff and a 50
percent cut to land acquisition and conservation. [Source: Natural Resources Defense Council].
A friend of mine who is not given to responding to my humorous emails
or my various political/social commentaries and who is also not prone to
outbursts fired back a response within an hour after receiving it. I was startled.
“I
absolutely agree that this is the best argument about the Iraq war: money. Of
course we were lied to about the reason. Of course there was no realistic
planning for nation building. Of course we are losing lives. Of
course we have lost tremendous credibility internationally because of our new
foreign policy. Of course we have made the war on terrorism worse by
creating new martyrs and new recruits. But the clearest, best reason to
me is that we can’t afford it. This chart is exactly the right
tool. We needed the money for other things. Whether your issue is
state deficits, environment, education, deficit reduction, infrastructure,
whatever, it needed the money. This $87 billion was a conscious choice
that the fun of military adventurism was more important than any of the above.”
* * *
A conservative friend was talking with me about the
courses that Krystin was taking at college this year. I related the ones she was planning on in the
spring: anthropology, sociology,
statistics, and Spanish. He allowed that
the statistics and the Spanish were OK but implied that the other two were a
waste of time. It dawned on me some time
later that there may be a lesson there.
Those on the political right (and I include those of a libertarian bent
in this category, although they would not) are sometimes at least ignorant, if
not downright disdainful, of how people and societies actually function and
live. They need to take courses in
sociology and political science and anthropology. Instead of prescribing public policy on the
basis of what works, it often seems to me they prescribe on an “ought” basis—people
ought to do this, they ought to do that, they ought to be this way—and the “ought”
may bear no relationship to the reality of the larger society or of the
psychology and sociology of individuals.
Or they generalize from their own personalities, their own backgrounds,
and their own abilities, to the larger society.
It
is engineers and physicists who make claims about evolution that few biologists
find credible.
And
speaking of public policy based on “ought,” I am always both amused and
dismayed when people in and out of political office make declarative statements
about public policy based on the Bible. I
would be the last to argue that the Bible is not a wonderful document as
history, as literature, and as a spiritual text—but I would be among the first
to argue that the social and religious views of a group of defiant Jews (the
New Testament) and earlier Jews (the Old Testament) in the Middle East of 2000-3000
years ago is not an appropriate basis for public policy in the 21st
Century.
* * *
There was an interesting piece in The Economist
last spring about “The lunatic you work for.”
It reported on a movie that asks what sort of person the
corporation is, given that the corporation is treated as a “person” in the
law.
The answer, elicited over two-and-a-half hours of
interviews with left-wing intellectuals, right-wing captains of industry,
economists, psychologists and philosophers, is that the corporation is a
psychopath. Like all psychopaths, the
firm is singularly self-interested: its
purpose is to create wealth for its shareholders. And, like all psychopaths, the firm is
irresponsible, because it puts others at risk to satisfy its profit-maximising
goal, harming employees and customers, and damaging the environment. The corporation manipulates everything. It is grandiose, always insisting that it is
the best, or number one. It has no
empathy, refuses to accept responsibility for its actions and feels no
remorse. It relates to others only
superficially, via make-believe versions of itself manufactured by
public-relations consultants and marketing men.
In short, if the metaphor of the firm as person is a valid one, then the
corporation is clinically insane. . . .
The main message of the film is that, through their psychopathic pursuit
of profit, firms make good people do bad things.
The Economist writer(s) argues, however, that the
greater threat lies in the bureaucracies of state socialism. They maintain that the state, even democratic
states, has the capacity to behave “as a more dangerous psychopath than any
corporation can ever hope to become.” As
perhaps one would expect from The Economist, they say the “film also
invites its audience to weigh up the benefits of privatisation versus public
ownership. It dwells on the familiar problem of the corporate corruption of
politics and regulatory agencies that weakens public oversight of privately
owned firms charged with delivering public goods. But that is only half the story. The film has nothing to say about the immense
damage that can also flow from state ownership.
Instead, there is a misty-eyed alignment of the state with the public
interest.” It is true that the state has
the potential to be very dangerous (some of us think that Attorney General
Ashcroft symbolizes that potential in action), but it is nonetheless
interesting to read this analysis of the corporation. I have often felt that the people I know who
work in such organizations are perfectly decent people I am glad to count as
friends, but I wonder how they can work for organizations that sometimes do
things I find reprehensible—or at the very least ethically questionable.
* * *
And
now the last Bryson excerpt, on living with our germs.
It’s probably not a good idea to take
too personal an interest in your microbes.
Louis Pasteur, the great French chemist and bacteriologist, became so
preoccupied with them that he took to peering critically at every dish placed
before him with a magnifying glass, a habit that presumably did not win him
many repeat invitations to dinner.
In fact, there is no point in trying to
hide from your bacteria, for they are on and around you always, in numbers you
can’t conceive. If you are in good
health and averagely diligent about hygiene, you will have a herd of about one
trillion bacteria grazing on your flesh plains—about a hundred thousand of them
on every square centimeter of skin. They
are there to dine off the ten billion or so flakes of skin you shed every day,
plus all the tasty oils and fortifying minerals that seep out from every pore
and fissure. You are for them the
ultimate food court, with the convenience of warmth and constant mobility
thrown in. By way of thanks, they give
you B.O.
And those are just the bacteria that
inhabit your skin. There are trillions
more tucked away in your gut and nasal passages, clinging to your hair and
eyelashes, swimming over the surface of your eyes, drilling through the enamel
of your teeth. Your digestive system
alone is host to more than a hundred trillion microbes, of at least four
hundred types. . . . Every human body
consists of about 10 quadrillion cells, but about 100 quadrillion bacterial
cells. They are, in short, a big part of
us. From the bacteria’s point of view,
of course, we are a rather small part of them.
Because we humans are big and clever
enough to produce and utilize antibiotics and disinfectants, it is easy to
convince ourselves that we have banished bacteria to the fringes of
existence. Don’t you believe it. Bacteria may not build cities or have
interesting social lives, but they will be here when the Sun explodes. This is their planet, and we are on it only
because they allow us to be.
Bacteria, never forget, got along for
billions of years without us. We
couldn’t survive a day without them.
They process our wastes and make them usable again; without their
diligent munching nothing would rot.
They purify our water and keep our soils productive. Bacteria synthesize vitamins in our gut,
convert the things we eat into useful sugars and polysaccharides, and go to war
on alien microbes that slip down our gullet.[74]
I
include these excerpts from Bryson, and rail about stem-cell research, for
example, for a reason. In the words of
George Dvorsky, “we all need to know about science. Without this knowledge we are powerless,
forced to live in a fog about how things work.
Without it, we are utterly dependent on others to form our opinion. Without it, we cannot properly participate in
society as informed, critical and responsibly opinioned [sic] citizens. Moreover, in today's hi-tech information age
world, democracy cannot work without a scientifically literate society.” Dvorsky cited a number of polls from both the
United States and Europe to suggest a disturbing lack of scientific
knowledge (e.g., “only 48% of Americans knew that the earliest humans did not
live at the same time as the dinosaurs,[75]
and that only 22% could properly define a molecule. The survey also showed that
only 45% knew what DNA was and that lasers don't work by focusing sound waves,
and that 48% knew that electrons were smaller than atoms.”). More amusing, if it were not so sad, is that
“60% of Europeans believe that ordinary tomatoes do not contain genes while
genetically engineered tomatoes do, while 50% believe that eating genetically
modified fruit can cause a person's genes to become modified.”[76]
Perhaps
more important, many people don’t understand how science is done. “Only 21% of those surveyed were able to
explain what it means to study something scientifically. Slightly over half
understood probability, and only a third knew how an experiment is conducted. .
. . [The implication is] that those with a deficiency in
scientific comprehension have underdeveloped critical thought faculties. In other words, they might as well be
suffering from some kind of cognitive disorder.”
And
why is this a problem? “The trouble with
ignorance is not so much what people don't know but what this causes them to
believe. There is a direct correlation
between scientific illiteracy and a propensity for belief in superstitions, . .
. the paranormal and pseudoscience. Those
unacquainted with science also tend to be more prone to scam artists, unwise
investments, fiscal schemes and bogus health and medical practices.”
As
cognitive scientist Steven
Pinker has noted, “As our economy comes to depend
increasingly on technology, and as modern media present us with unprecedented
choices—in our lifestyles, our workplaces, and our political commitments—a
child who cannot master an ever-increasing body of skills and knowledge will be
left farther and farther behind. . . .”
The late Carl Sagan similarly worried about the effects
of a scientifically illiterate society. “We live in a society exquisitely dependent on
science and technology, in which hardly anyone knows anything about science and
technology,” he lamented. “We have also arranged things so that almost no one
understands science and technology. This is a prescription for disaster. We
might get away with it for a while, but sooner or later this combustible
mixture of ignorance and power is going to blow up in our faces.” Indeed, scientific illiteracy cripples
culture, justice, democracy and society in general. When you have misinformed
individuals you get unhealthy societies.
And
you get public policies that some of us find both reprehensible and backwards. Ultimately, there is the specter of George
Orwell and 1984, because if a government and its scientists are among the few
who understand science and technology, the rest of us are in jeopardy.
This
is not, of course, about knowing how everything actually works, or all the details
of chemistry or physics. It does mean
you have to understand how science works.
I have often marveled that although I use a computer a great deal, I
have not a clue how it works—why when I press the “a” key an “a” appears on
this screen in front of me. Nor do I
begin to understand how I can enter something in Google and it instantly
provides me access to reams of information about my entry. I can understand the processes that led to
the development of these nifty devices, but I do not and never will understand
how they work.
It is for this reason, among others, that Pat and I chose
for Krystin and Elliott the middle school in Minneapolis that required science
classes all three years.
* * *
So,
the election was held and George Bush was re-elected. The point, for those who do not agree with
his policies, is to figure out what attracted the majority of American voters
to him, why they voted for him. I read a
few articles in the months before and the weeks after the election and reached
a few conclusions of my own. First of
all, I realized it is pointless to take the position of Britain’s Daily Mirror,
which on the Thursday after the election had a picture of President Bush on its
front page and a large headline underneath it:
“How can 59,054,087 people be so DUMB?”
It might make the losers feel better for a few minutes, but it won’t do
anything to advance the issues that more progressive people endorse.
There
was also floating around the web after the election a graph that purported to
list states by their average IQ and how they voted, with all the “smart” states
voting for Kerry and the not-so-smart states voting for Bush. I poked around a little on the web myself,
and while I did not find a definitive answer, I am reasonably confident the
table was a hoax.
One
reaction to the election is simply to say that the outcome reflects the mood of
the country. It’s a conservative time,
with a revival of religiosity in some parts of the country in some groups, and
the conservatives won. There is a swing
in American politics, over the decades, and this was not a time when the more
liberal candidate was going to win. I am
not sure that someone other than Kerry could have defeated Bush. One of my attorney friends related to me a
prediction he heard at a bar association lunch last summer. The speaker was David Gergen, the moderate
Republican editor of U.S. News and World Report who has also served in both
Democratic and Republican presidential administrations. According to my friend, Gergen predicted that
Bush would win in 2004—but that the Democrats would win in a landslide in 2008,
in reaction to the policies enacted through eight years of a Bush
administration. I guess we’ll see about
that.
It
may also be time to reassess the presidency of Abraham Lincoln, according to
one contributor to the Op-Ed page of the Star-Tribune. She opined that President Lincoln should have
let the Confederacy go—from Missouri south to Texas and east to the Atlantic,
there could be another nation. Yes, that
would have prolonged slavery, but eventually it would have been abandoned as
the Confederate States were pressured by the world. Today there could then be the Christian
theocracy in the South, where prayer is mandatory in schools and at public
events, abortion is illegal, church attendance is perhaps mandatory, militarism
is a valued part of the social order, and creationism is taught in the
schools. Among other things. The United States, in contrast, would
be secular, certainly more advanced scientifically, and more tolerant,
and religion would be a private matter.
(Another option, which appeared on a map that circulated shortly after
the election, is that the blue states—the west coast, Minnesota, Michigan, and
Wisconsin, and Pennsylvania, New York, and New England—all join the United
States of Canada; the rest of the country, the red states, would be renamed
Jesusland.)
The brilliant historian Simon Schama[77]
wrote for The Guardian a week after the election; his analysis parallels
Huntington’s,
with more emphasis on geography, and one wonders if Schama saw the “US of
Canada/Jesusland” map.
In the wee small hours of November 3 2004, a new country appeared on the
map of the modern world: the DSA, the Divided
States of America. Oh yes, I know, the obligatory pieties about
"healing" have begun; not least from the lips of the noble Loser.
This is music to the ears of the Victor of course, who wants nothing better
than for us all to Come Together, a position otherwise known as unconditional
surrender. . . . I don't want to heal
the wound, I want to scratch the damned thing until it hurts and bleeds - and
then maybe we'll have what it takes to get up from the mat.
"We are one nation," the newborn star of
Democrats, Senator-elect Barack Obama, exclaimed, even as every salient fact of
political life belied him. Well might he
invoke Lincoln,
for not since the Civil War has the fault line between its two halves been so
glaringly clear, nor the chasm between its two cultures so starkly
unbridgeable. . . . It is time we called
those two Americas
something other than Republican and Democrat, for their mutual alienation and
unforgiving contempt is closer to Sunni and Shia, or (in Indian terms) Muslim
and Hindu. How about, then, Godly America and Worldly America?
Worldly America, which of course John Kerry won by a
massive landslide, faces, well, the world on its Pacific and Atlantic coasts
and freely engages, commercially and culturally, with Asia and Europe in the
easy understanding that those continents are a dynamic synthesis of ancient
cultures and modern social and economic practices. . . . Godly America, on the other hand, . . . turns
its back on that dangerous, promiscuous, impure world and proclaims to high
heaven the indestructible endurance of the American Difference. If Worldly America is, beyond anything else,
a city, a street, and a port, Godly America is, at its heart . . . a church, a
farm and a barracks; places that are walled, fenced and consecrated. Worldly America is about finding civil ways
to share crowded space, from a metro-bus to the planet; Godly America is about
making over space in its image.
Worldly America is pragmatic, practical,
rational and sceptical. In California it passed
Proposition 71, funding embryonic stem cell research beyond the restrictions
imposed by Bush's federal policy. Godly America is mythic, messianic,
conversionary. No wonder so many of us
got the election so fabulously wrong even into the early hours of Tuesday
evening, when the exit polls were apparently giving John Kerry a two- or
three-point lead in both Florida and Ohio. For most of us purblind writers
spend our days in Worldly America and think that Godly America is some sort of
quaint anachronism, doomed to atrophy and disappear as the hypermodernity of
the cyber age overtakes it, in whatever fastness of Kentucky or Montana it
might still circle its wagons. The shock
for the Worldlies is to discover that Godly America is its modernity.
Well, the autumn leaves have, just this week, fallen
from the trees up here in the Hudson
Valley and the scales
from the eyes of us deluded worldlies.
If there is to be any sort of serious political future for the
Democrats, they have to do far more than merely trade on the shortcomings of
the incumbents. . . . The real challenge
is to voice an alternative social gospel to the political liturgy of the
Godlies; one that redefines patriotism as an American community, not just a
collection of wealth-seeking individuals; one that refuses to play a zero-sum
game between freedom and justice; one in which, as the last populist president
put it just a week ago, thought and hope are not mutually exclusive.
But
again, those are visceral reactions on the part of the losers, certainly not
concrete or realistic proposals.
The
National Interest carried in its
Spring 2004 issue an extremely interesting article by Samuel Huntington
entitled “Dead Souls: The
Denationalization of the American Elite.”[78] The article suggests that certain segments of
American “elites” have far less commitment to their country and more to the
forces of globalization, the “huge expansion in the international interactions
among individuals, corporations, NGOs and other entities; growth in number and
size of multinational corporations investing, producing and marketing globally;
and the multiplication of international organizations, regimes, and
regulations.”
Transnational ideas and
people fall into three categories: universalist, economic and moralist. The universalist
approach is, in effect, American nationalism and exceptionalism taken to the
extreme. In this view, America is exceptional not because
it is a unique nation but because it has become the “universal nation.” It has merged with the world through the
coming to America
of people from other societies and through the widespread acceptance of
American popular culture and values by other societies. The distinction between America and the world is
disappearing because of the triumph of American power and the appeal of
American society and culture. The economic
approach focuses on economic globalization as a transcendent force breaking
down national boundaries, merging national economies into a single global
whole, and rapidly eroding the authority and functions of national governments. This view is prevalent among executives of
multinational corporations, large NGOs, and comparable organizations operating
on a global basis and among individuals with skills, usually of a highly
technical nature, for which there is a global demand and who are thus able to
pursue careers moving from country to country.
The moralistic approach decries patriotism and nationalism as
evil forces and argues that international law, institutions, regimes and norms
are morally superior to those of individual nations. Commitment to humanity must supersede
commitment to nation. This view is found
among intellectuals, academics and journalists.
Economic transnationalism is rooted in the bourgeoisie, moralistic
transnationalism in the intelligentsia.
(Underlining added, just for clarity.)
Globalization,
according to Huntington, “is proving right Adam Smith’s observation that
while ‘the proprietor of land is necessarily a citizen of the particular
country in which his estate lies . . . the proprietor of stock is properly a
citizen of the world, and is not necessarily attached to any particular
country.’” That was 1776. Following interviews with a number of
executives of multinational corporations and non-profits, two authors of a
study wrote that “‘again and again we heard them say that they thought of
themselves more as “citizens of the world” who happen to carry an American
passport than as U.S.
citizens who happen to work in a global organization.’”
Huntington cites sources to suggest there are about 20 million “denationalized
elite” individuals in the United
States.
These “transnationalists have little need for national loyalty, view
national boundaries as obstacles that thankfully are vanishing, and see
national governments as residues from the past whose only useful function is to
facilitate the elite’s global operations.”
Further, these folks point out that career advancement in business, the
media, the professions, and the academic world is more limited if one has a
purely national identity and involvement.
“Outside politics, those who stay home stay behind.”
One
result of global involvement is an erosion of the “sense of belonging to a
national community. . . . ‘The higher
people’s income and education . . . the more conditional the allegiance.’” Those who fall in the “moralist
transnationalist” category “abandon the commitment to their nation and their
fellow citizens and argue the moral superiority of identifying with humanity at
large.” They see patriotism as dangerous
(as the refuge of scoundrels in politics) and come to dislike national
sovereignty. These people are
internationalists.
While
these internationalist “elites” are developing, “identifying more with the
world as a whole and defining themselves as ‘global citizens,’ Americans as a
whole are becoming more committed to their nation. Huge majorities of Americans claim to be
patriotic and express great pride in their country” (well over 90% both before
and after the attacks of 9/11). This
might not mean much if the same were true elsewhere in the world, but it is
not; Americans, according to the article, have ranked first among nationals in
supporting their country since at least the first surveys were done in 1981.
What
matters about all this is that “growing differences between the leaders of
major institutions and the public on domestic and foreign policy issues . . .
form a major cultural fault line” in American society. “In a variety of ways, the American
establishment, governmental and private, has become increasingly divorced from
the American people. Politically, America
remains a democracy because key public officials are selected through free and
fair elections.[79] In many respects, however, it has become an
unrepresentative democracy because on crucial issues—especially those involving
national identity—its leaders pass laws and implement policies contrary to the
views of the American people.
Concomitantly, the American people have become increasingly alienated
from politics and government.” Not
surprisingly, perhaps, the elites in “media, labor, religion, law and
bureaucracy were almost twice to more than three times as liberal as the public
as a whole.” On significant issues, the
elites have quite different views. “The
public is overwhelmingly concerned with the protection of military security,
societal security, the domestic economy and sovereignty. . . . Elites are more concerned with U.S.
promotion of international security, peace, globalization and economic
development of foreign nations than is the public.” The plays out, according to one study, in America’s role in the world: “the dwindling of consensus about America’s
international role follows from the waning of agreement on what it means to be
an American, on the very character of American nationalism. The domestic underpinnings for the long
post-World War II hegemony of cosmopolitan liberalism and internationalism have
frayed.”
One
result of the difference in views between the general public and the elites,
with political leaders enacting policies at variance with public views, is that the
public loses trust and confidence in government. Statistics from the 1960s to the 1990s show
that to be the case. A second result is that participation in
government declines, which it has.
A third result, the article suggests, is an upsurge in the use of
initiatives, which have increased considerably since the late 1970s. In many cases, even though all of a state’s
leaders—political, academic, media, religious, etc.—opposed an
initiative, the proposals were nonetheless approved by substantial
majorities. There is a deep divide
between the elites and the public on national identity and America’s role in the world.
Part
of this is related to religion, in Huntington’s
view. The countries that are more
religious are more nationalistic, and vice-versa. “Religiosity distinguishes America from most other Western
societies. . . . Their
religiosity leads Americans to see the world in terms of good and evil to a
much greater extent than most other peoples.
The leaders of other societies often find this religiosity not only
extraordinary but also exasperating for the deep moralism it engenders in the
consideration of political, economic, and social issues.”[80] In this, Huntington echoes Garry Wills’s editorial.
Huntington concludes: “Significant
elements of American elites are favorably disposed to America becoming a cosmopolitan
society. Other elites wish it to assume
an imperial role. The overwhelming bulk of the American people
are committed to a national alternative and to preserving and strengthening the
American identity of centuries. America
becomes the world. The world becomes America.
America
remains America.
Cosmopolitan? Imperial? National? The choices Americans make will shape their
future as a nation and the future of the world.”
Even
without necessarily agreeing that Huntington
is right in the particulars, I think his general sociology may be right. This article says a great deal to me about
the 2000 and 2004 presidential elections.
If ever there was someone who was NOT a member of the elite as
Huntington describes it, it is George W. Bush (even though he was born into and
educated in it), and, equally clearly, his father George H. W. Bush is a member
of that elite. Given Huntington’s
use of Scott’s poem, one presumes Huntington
does not think much of these “dead souls.”
But even putting aside a particular political or partisan view, I think Huntington has an
interesting thesis here, one that explains a number of recent events in
American politics.
While
I am not a member of any elite, the article struck me also because I find
myself (1) disappointed and dismayed at what seems to be motivating a
significant chunk of the American electorate, a group of people with whom I
apparently have little in common, given election outcomes, and (2) tending in
the direction of being “denationalized” myself (although I never would have
used that term or even thought about it that way before reading this
article—and I guess that suggests it is possible to be “denationalized” without
being a member of an elite). I have some
sympathy with the moralist approach to internationalism because I do not
believe that our sometimes-excessive patriotism and nationalism are serving us well (either as a country or humanity
in general), particularly when those emotions are directed to war and fractious
foreign relations. A more
internationalist approach, it seems to me, is the only way in which we shall
address enormous problems such as climate change and the underdeveloped world,
to name but two that no nation alone can address effectively. And the only way we can effectively remain at
peace. But it appears that the majority
of American voters in 2004 may not have agreed.[81]
Another
facet of this “denationalization” and the differing views about American
national identity struck me when I read a little essay in The Chronicle of
Higher Education about sovereignty.
Not a concept that gets people (including me) fired up, to be sure, but
the essay author wrote about a little book left in the papers of the late Senator
Alan Cranston (California). Cranston
argues
that when people understand sovereignty as the absolute power of a government
over its own territory and citizens, a shield against the intervention of other
governments, nongovernmental organizations, and outside powers, it is an
illegitimate and dangerous medieval idea.
At best, sovereignty should be understood as the right of a people to
determine their own destinies. Such
sovereignty, he maintains, delegable to governments through democratic process,
is the only legitimate form, and political history in the West happily
continues to head in that direction.
Finally,
sovereignty as a defense against outside intervention to stop extraordinarily unacceptable behavior by
a government against its peoples is always, in Cranston’s view, heinous and
unjustified. International covenants on
genocide and human rights similarly demonstrate the world community’s declining
appetite for claims of such absolute state sovereignty (emphasis in original).
I imagine that many in the country would defend
America’s
absolute sovereignty. While there is
nothing to suggest that the U.S.
government is treating us unacceptably (in the conventional definitions that Cranston would likely have accepted, such as torture,
murder, genocide, and so on), I suspect only a minority of U.S. citizens would accept the
proposition that our national sovereignty could be abrogated were the federal
government to start doing such things.[82] But the denationalized folks would probably
argue that the U.S.
government should be subject to the same kinds of constraints as any other
government in this regard. I would agree
(e.g., the Geneva Conventions on the treatment of POWs). Of course, when the U.S. has the largest military in the world, it
isn’t likely that others around the globe are going to be able to infringe on U.S. sovereignty, no matter whether the federal
government is abusing or starts abusing U.S.
citizens. (And much though I think we
are way off base in holding people in Guantanamo
and elsewhere without access to lawyers or relief, even that hardly amounts to
murderous attacks approaching genocide.)
There
is potentially an ominous undercurrent to the 2004 elections. Dale Maharidge wrote an article for The
Nation last September recalling his upbringing in blue-collar,
steel-working Ohio. He recently conducted a series of interviews
in the area and tells of the despair that is affecting the area as the steel,
textiles, and other industries in the rustbelt go into decline. People are bitter and angry, he wrote, and
that anger has combined with fear after the 9/11 attacks. People are “unemployed, underemployed,
hurting economically in some way. This group
of Americans, who number in the millions, harbors deep-seated anger over
corporate shenanigans, their lack of healthcare and good jobs, yet in interview
after interview I found they are often the most fervent in their support of
George W. Bush and his tough rhetoric.”
The
question in Maharidge’s mind was “why.”
He recalled talking with a labor studies faculty member at Youngstown State
University in Ohio, John Russo. “Russo worried about the workers’ growing
wrath—he was seeing it mature into xenophobia and right-wing radicalism. ‘It’s not unlike the anger in prewar Germany and prewar Italy,’ Russo said. And, he added, it was akin to the United States
in the Great Depression. ‘In the 1930s, America
could have gone in either direction.’”
Maharidge
didn’t buy Russo’s worry, at first, but the more he interviewed people the more
he began to accept it. Now it appears
that no other outcome except the election of FDR was possible in the 1930s, but
he recalled Walter Lippmann’s comment that the nation would have “‘followed
almost any leader anywhere he wanted to go.’
A cynical leader could have exploited fear, a course taken by so many
other inferior leaders in times of chaos.”
FDR, of course, did not follow that path. It was shortly after FDR was first elected
that Sinclair Lewis wrote his novel It Can’t Happen Here, about a fascist
American government. The anger in the
nation now is similar; the hate ministry of Charles Coughlin in the 1930s has
been replaced by right-wing talk radio in the 2000s, he argues, and those who
listen to Limbaugh and O’Reilly are angry people who have seen good jobs
disappear. One result is that when “many
of them go into the voting booth they will punch the card or pull the lever for
a candidate who appears strong.” What
will happen if the economy turns down, or if there is another terrorist attack? There is a risk that we will slide toward the
fascism that Lewis feared.
Another
interesting take on voting and election returns was provided by Louis Menand,
in an article he wrote for The New Yorker in August entitled
“The Unpolitical Animal: How Political
Science Understands Voters.” Menand
looked at the work of political scientist Philip Converse, from the 1960s
(which, my political scientist friends tell me, is still considered valid), in
an attempt to understand how people cast their votes.
The
myth is that undecided voters want to hear more about a candidate’s position on
education or health care or some other issue.
But “to voters who identify strongly with a political party, the
undecided voter is almost an alien life form.
For them, a vote for Bush is a vote for a whole philosophy of governance
and a vote for Kerry is a vote for a distinctly different philosophy. The difference is obvious to them, and they
don’t understand how others can’t see it, or can decide whom to vote for on the
basis of a candidate’s personal traits or whether his or her position on a
particular issue ‘makes sense.’ To an undecided voter, on the other hand, the
person who always votes for the Democrat or the Republican, no matter what,
must seem like a dangerous fanatic. Which voter is behaving more rationally and
responsibly?”
Menand notes the advice that political campaign experts
give to candidates, such as not assuming issues will make a difference but that
packaging will, that a logo is important, that colors are important. (“‘Blue is a positive color for me, signaling
authority and control. . . . but it’s a negative color for women, who perceive
it as distant, cold, and aloof. Red is a
warm, sentimental color for women—and a sign of danger or anger to men. If you use the wrong colors to the wrong
audience, you’re sending a mixed message.’”
I wonder if this is why, in the third presidential debate, both Bush and
Kerry wore navy blue suits and red ties.)
But is that how voters actually make up their mind?
Converse’s research isn’t encouraging. He found that about “ten per cent of the
public has what can be called, even generously, a political belief system. He named these people ‘ideologues,’ by which
he meant not that they are fanatics but that they have a reasonable grasp
of ‘what goes with what’—of how a set of
opinions adds up to a coherent political philosophy.” Those who are not “ideologues” in the Converse
sense call others “liberal” or “conservative” but “Converse thought that they
basically don’t know what they’re talking about” and that the non-ideologues
are inconsistent: “they can’t see how
one opinion (that taxes should be lower, for example) logically ought to rule
out other opinions (such as the belief that there should be more government
programs).”[83]
According to Converse, a little over 40% of the
electorate voted not according to ideology but “perceived self-interest.” The rest, the remaining half of the
electorate, vote “either from their sense of whether times are good or bad
(about twenty-five per cent) or from factors that have no discernible ‘issue
content’ whatever. Converse put
twenty-two per cent of the electorate in this last category. In other words, about twice as many people
have no political views as have a coherent political belief system.”
“Converse
concluded that ‘very substantial portions of the public’ hold opinions that are
essentially meaningless—off-the-top-of-the-head responses to questions they
have never thought about, derived from no underlying set of principles. These
people might as well base their political choices on the weather. And, in fact,
many of them do.” (In a paper written in
2004, the Princeton political scientists
Christopher Achen and Larry Bartels estimate that ‘2.8 million people voted
against Al Gore in 2000 because their states were too dry or too wet’ as a
consequence of that year’s weather patterns.[84])
Only about thirty per cent name an issue when they
explain why they voted the way they did, and only a fifth hold consistent
opinions on issues over time. Rephrasing poll questions reveals that many
people don’t understand the issues that they have just offered an opinion on.
According to polls conducted in 1987 and 1989, for example, between twenty and
twenty-five per cent of the public thinks that too little is being spent on
welfare, and between sixty-three and sixty-five per cent feels that too little
is being spent on assistance to the poor.
Converse’s research, and similar research in the
intervening years, suggests that elections do not express the preference of the
voters. Given this situation, Menand reports that there are three different
theories about the legitimacy (or lack thereof) of democracy.
The
first is that electoral outcomes, as far as “the will of the people” is
concerned, are essentially arbitrary. The fraction of the electorate that
responds to substantive political arguments is hugely outweighed by the
fraction that responds to [other things]. . . .
Even when people think that they are thinking in political terms, even
when they believe that they are analyzing candidates on the basis of their
positions on issues, they are usually operating behind a veil of political
ignorance. They simply don’t understand,
as a practical matter, what it means to be “fiscally conservative,” or to have
“faith in the private sector,” or to pursue an “interventionist foreign
policy.” They can’t hook up positions
with policies. From the point of view of democratic theory, American political
history is just a random walk through a series of electoral options. Some
years, things turn up red; some years, they turn up blue.
A second theory is that although people may not be
working with a full deck of information and beliefs, their preferences are
dictated by something, and that something is elite opinion. Political campaigns, on this theory, are
essentially struggles among the elite, the fraction of a fraction of voters who
have the knowledge and the ideological chops to understand the substantive
differences between the candidates and to argue their policy implications.
These voters communicate their preferences to the rest of the electorate.
The third theory of democratic politics is the theory
that the cues to which most voters respond are, in fact, adequate bases on
which to form political preferences. People use
shortcuts
. . . to reach judgments about political candidates, and, on the whole, these
shortcuts are as good as the long and winding road of reading party platforms,
listening to candidate debates, and all the other elements of civic duty. [In other words, people use “gut reasoning”
to decide.] The will of the people may
not be terribly articulate, but it comes out in the wash.
Menand says the third theory is the one that could
legitimate the democratic process, but he expresses doubt about its
validity. There is no evidence that
people vote in their own interest, or that they are able to do so. “When political scientists interpret these
seat-of-the-pants responses as signs that voters are choosing rationally, and
that representative government therefore really does reflect the will of the
people, they . . . are not doing the math.
Doing the math would mean demonstrating that the voters’ intuitive
judgments are roughly what they would get if they analyzed the likely effects
of candidates’ policies, and this is a difficult calculation to perform. One
shortcut that voters take, and that generally receives approval from the elite,
is pocketbook voting. If they are
feeling flush, they vote for the incumbent; if they are feeling strapped, they
vote for a change. But, as Larry Bartels . . . has pointed out, pocketbook
voting would be rational only if it could be shown that replacing the incumbent
did lead, on average, to better economic times. Without such a demonstration, a
vote based on the condition of one’s pocketbook is no more rational than a vote
based on the condition of one’s lawn.
It’s a hunch. . . . Bartels has
also found that when people do focus on specific policies they are often unable
to distinguish their own interests. . . .
[In terms of the estate tax, which I touched on earlier,] most Americans
simply do not make a connection between tax policy and the over-all economic
condition of the country. . . . This
helps make sense of the fact that the world’s greatest democracy has an
electorate that continually ‘chooses’ to transfer more and more wealth to a
smaller and smaller fraction of itself.”
I also saw a piece, which I cannot now remember where but
it was authored by someone aligned with Republican views, that suggested the
“elite” expect most people to vote in their own narrow self-interest but they
themselves—the “liberal elite”—vote on more communitarian interests (e.g., all
the rich Hollywood types voted for Kerry even though he would have raised their
taxes). The point the person made was
that the folks of middle America may also vote
in favor of what they see as the interests of the country rather than their own
pocketbooks and that it is condescending for those on the left to assume that
they do not. I think the point is
well-taken.
So what is one to make of this? Menand offers some modest amount of
comfort. “Morris Fiorina, a political
scientist at Stanford, thinks that [the country has not fallen into two starkly
opposed camps], and that the polarized electorate is a product of elite
opinion. ‘The simple truth is that there is no culture war in the United
States—no battle for the soul of America rages, at least none that most
Americans are aware of,’ he says.” Fiorina
argues, according to Menand, that “public-opinion polls . . . show that on most
hot-button issues voters in so-called red states do not differ significantly
from voters in so-called blue states.
Most people identify themselves as moderates, and their responses to
survey questions seem to substantiate this self-description. What has become
polarized, Fiorina argues, is the elite. The chatter—among political activists,
commentators, lobbyists, movie stars, and so on—has become highly
ideological.” People, by and large, are
not politically interested; “this absence of ‘real opinions’ is not from a lack
of brains, it’s from lack of interest. . . .
It’s not that people know nothing.
It’s just that politics is not what they know.” So elections may, except in the most vague and
general sense, reflect a random walk through red and blue states. Anyone who saw the election map cast in
shades of purple, rather than red and blue, might be inclined to concur.
In a New York Review of Books review of several
books about the “American empire,” Tony Judt made the observation that “even if
it could be demonstrated beyond a doubt that American hegemony really was a net
good for everyone, its putative beneficiaries in the rest of the world would
still reject it. Whether they would be
acting against their own best interests is beside the point—a consideration not
always well understood even by proponents of ‘soft power’ [persuasion as
opposed to use of military force]. As
Raymond Aron once observed, it is a
denial of the experience of our century (the twentieth) to suppose that men
will sacrifice their passions for their interests” (emphasis added). I find that last quote absolutely
fascinating, and one that explains much about the 2004 presidential election
that I had hitherto been unable to grasp.
Forget pocketbook voting; people vote their passions rather than their
interest. That observation, to the
extent it is correct, explains much that Bartels and Menand do not (but that
Maharidge hints at). I suppose this
observation has been documented repeatedly in political science and social
psychology studies, but if so I have never read them, so I was struck by the
quote.
Judt also summarized how many felt about America's role
in the world now. “With our growing
income inequities and child poverty; our underperforming schools and
disgracefully inadequate health services; our mendacious politicians and crude,
partisan media; our suspect voting machines and our gerrymandered congressional
districts; our bellicose religiosity and our cult of guns and executions; our
cavalier unconcern for institutions, treaties, and laws—our own and other
people’s: we should not be surprised
that America has ceased to be an example to the world. The real tragedy is that we are no longer an
example to ourselves.”
After the election I also
read a number of articles by those who vehemently opposed Bush but who were
also angry at those who worked to defeat him.
The general tenor was similar; one went like this: “I have to be honest. I was angry at the left before the election. Now, I’m furious. . . . For years, I’ve watched progressive activists
shrug their shoulders with hopelessness as religious conservatism grew—writing
off people of faith or downright insulting and alienating them. . . . We continue to watch with wonderment,
decrying how the country has moved to the right—as though this is a natural
phenomenon, an irreversible trend, a done deal.”
Needless to say,
there are opinions all over the place about what those who disagree with the
Bush administration should do in the future.
One that struck me came from a senior fellow at the Brookings
Institution, Michael O’Hanlon. Democrats, he argues, “need
bolder, newer ideas, particularly in this post-9/11 world in the realm of
foreign policy. . . Neocons have provided
much of the spark and intellectual energy behind modern-day Republicanism. . .
. Big ideas are needed in a changing,
challenging international environment.
They are also good politics.
Candidates with big ideas convey purpose and gravity. They also convey resoluteness and firm
beliefs—traits that helped President George W. Bush appeal to voters on the
grounds that he had character and shared their values.”
Neocons
have shown how to come up with big ideas in recent years. They provided some of
the intellectual heft and vision behind President Ronald Reagan’s outlandish
belief that the Berlin Wall should come down. More recent notable examples are
Assistant Secretary of Defense Paul Wolfowitz’s conviction that the overthrow
of Iraqi President Saddam Hussein could help remake the Middle East, former
foreign-policy adviser Richard Perle’s willingness to confront Saudi Arabia
over its internal policies and the beliefs of John Bolton, Bush's
undersecretary of state for arms control and international security, that arms
control can be used in a more confrontational way to put pressure on extremist
regimes.
O’Hanlon comments
that "one need not agree with much of the neocon movement to admire its
intellectual vigor and its ambitious approach. . . . But big ideas are better than no ideas. The key is to ensure that they are debated
and vetted, not to squelch them in advance."
Some
might disagree with this assessment, at least in political terms, claiming that
what Democrats need is simple credibility on foreign policy so that they can
neutralize the issue and out-compete Republicans on domestic turf. This perspective, which seems to have guided
much of the Kerry campaign this year, begs the question of how one obtains
credibility in the first place. Purple
hearts from Vietnam, however
commendable, do not suffice—which should be no surprise since Bill Clinton
defeated two war heroes and Ronald Reagan defeated a Naval Academy
graduate in their respective runs for the White House.
Herewith
a comment/prediction for 2008. If the
Democrats want to win in 2008, they cannot nominate Hillary Clinton. She will do for the right what George Bush
has done for the left: energize it. I have no guess who they should or might
nominate. One of the most attractive
Republican tickets, in terms of electability, might be Rudy Giuliani and Arnold
Schwarzenegger, in some combination.
With the latter, of course, there is the minor problem of the need for a
constitutional amendment to make him eligible to run, since he is not a natural-born
citizen.  Absent Schwarzenegger, it
could be Jeb Bush.
* * *
Have a good holiday season and a good 2005.
[1] I
asked friends in biochemistry and genetics about creationism. I wondered if creationism is not a solution
looking for a problem. They agreed that
it is. Biologists don’t have a problem
with complex organs (e.g., the eye) or complex chemical structures; they
evolved. Just as proteins did not spring
into place fully formed, neither did humans or their organs.
[2] Or at
least that’s what Nero Wolfe called a female ass. And since the cat was left in the women’s
restroom, I make the assumption it was a woman who left it there.
[3] Our tour
guide at the U.N. very politely pointed out to us that on entering the grounds
of the U.N., we had departed from U.S. territory. I don’t know what happens to someone who
commits a crime on the grounds of the U.N.
[4] The
article can’t be obtained on the Times website for free any longer, but
it is also posted at
http://www.matr.net/print-7954.html
[5] http://www.butterfliesandwheels.com/articleprint.php?num=51
[6] Benjamin Franklin:
An American Life, by Walter Isaacson, p. 427. According to the Oxford English Dictionary,
“Mesmerism” is “a therapeutic doctrine or system, first popularized by Mesmer,
according to which a trained practitioner can induce a hypnotic state in a
patient by the exercise of a force (called by Mesmer animal magnetism). Mesmer’s claims were not substantiated by a
scientific commission established by Louis XVI in 1784 including Benjamin
Franklin and Antoine-Laurent Lavoisier.
His techniques, however, had great popular appeal and were variously
developed by other practitioners in the late 18th and early 19th centuries,
ultimately forming the basis of the modern practice of hypnosis.”
[7] http://skepdic.com/lysenko.html
[8] http://chronicle.com/free/v50/i30/30b01601.htm
[9] Or, in
the tart words of one of my faculty colleagues in a message to me, talking
about liberals, “not being cocksure, they can easily be taken advantage of by
blockheads who are.”
[10] From,
of all things, “gulfnews.com,” which describes itself as “your one-point online
connection to Dubai, the United Arab Emirates and the Gulf
region for news and information.” The
editorial writer in this case was an American.
[11] http://books.guardian.co.uk/top10s/top10/0,6109,1140156,00.html
[12] Carroll also reports:
“Finally, this just in from the Raelians:
His Holiness Rael draws the exceptional accuracy of
his scientific and humanitarian vision from the Message He received in 1973
from the Elohim, a very advanced race of human beings from a distant planet
within our galaxy. The Elohim created
all life on Earth scientifically using DNA (including humans in their image)
and were mistaken for God, which explains why the name Elohim is present in all
original Bibles. The Bible is, in fact,
an atheist book describing the scientific creation of life on Earth. The new concept of “Intelligent Design” fits
perfectly with this explanation of our origins. Thirty years ago the Elohim explained to Rael
that human cloning coupled with memory transfer would one day allow humans to
live forever on Earth. Today this
prediction is close to becoming a reality, as it has been for millennia on the
Elohim’s planet. It is, in fact, how the
Elohim resurrected Jesus, their messenger, as well as many others whom they
sent to guide humanity and who now live on their planet.
I don’t see why a “critical
analysis” of evolution shouldn't include Rael’s vision, especially since the
master himself thinks ID fits with his godless religion and an atheistic Bible. Now, that’s certainly an ‘alternative’
viewpoint that you won’t find in most science texts.”
[13]
“Neither the common nor the Latin name gives any indication that the hacking
cough and haunting whoop are often followed by vomiting. Nor does either name indicate that this
distressing paroxysmal phase can last up to four weeks, and that this phase, in
which the victim most clearly needs constant assistance, cruelly is also the
phase in which this deadly disease is the most highly contagious. Since highly and deadly are
relative terms, I should tell you that pertussis infections occur in 70 to 100
percent of all unimmunized household contacts that have been exposed to an
infected person.
The force of pertussis
coughing is so severe that many patients develop facial suffusions
(discolorations), and small hemorrhages in the skin or conjunctivae. The coughing alone can also lead to hernias,
rectal prolapse (protrusion of the rectal mucous membrane or sphincter muscle
through the anus), or even hypoxic encephalopathy (degenerative disease of the
brain). An adult can literally cough his
way into a proctologist’s or neurologist’s office. Vomit, food particles, or
mucus aspirated while whooping can result in secondary pneumonia infection. Some
children even become malnourished because they literally can’t stop coughing
long enough to eat. And some, usually
infants, die.” The article can be found
at http://www.csicop.org/si/2004-01/anti-vaccination.html.
[14] I see
that in November of this year, researchers at the European Molecular Biology
Laboratory have traced the evolutionary origin of the human eye. “When Darwin's
skeptics attack his theory of evolution, they often focus on the eye. Darwin
himself confessed that it was ‘absurd’ to propose that the human eye, an ‘organ
of extreme perfection and complication’ evolved through spontaneous mutation
and natural selection. But he also
reasoned that ‘if numerous gradations from a simple and imperfect eye to one
complex and perfect can be shown to exist’ then this difficulty should be
overcome.” The scientists at EMBL did
precisely that.
[15] For
those of you not up on your astronomy, a light-year is the distance light
travels in one year, or about 6,000,000,000,000 miles (six trillion, because light
travels at approximately 186,000 miles per second).
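The arithmetic in the footnote checks out, and can be verified in a couple of lines of Python (a sketch; the 186,000 miles-per-second figure is the rounded value used above):

```python
# Distance light travels in one year, using the rounded speed from the note.
MILES_PER_SECOND = 186_000
SECONDS_PER_YEAR = 60 * 60 * 24 * 365.25  # seconds in a (Julian) year

light_year_miles = MILES_PER_SECOND * SECONDS_PER_YEAR
print(f"{light_year_miles:.2e} miles")  # about 5.87e12, i.e. roughly six trillion
```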
[16] Another
faculty friend, Professor Chuck Speaks, who understands the physics of light
and sound, wrote to me that radio waves “obey the inverse square law.
Think of the radiation pattern as an ever increasing larger sphere. As
the radius of the sphere increases by some factor, the surface area of the
sphere increases by the square of that factor. Thus, the intensity [of
the energy], at distance 2X is only 1/4th of the value at X.” So a radio wave
from Earth 100 million light years out has an intensity only one trillionth that
of a radio wave only 100 light years from Earth (the distance ratio is one
million, and one million squared is one trillion). According to a physicist
friend, this “explanation
also applies to radio waves and other forms of electromagnetic radiation.
Since light is also electromagnetic radiation, you might imagine flashing a
light (i.e., sending a light signal) and ask whether, or where and when it can
be detected. If it is a point source, the electromagnetic rays will
spread out spherically, and thus fall off in intensity following the
inverse square law. If the signal is perfectly collimated (the best would
be to have a fiber-optic line), then it would only decrease due to losses
[from] atoms and molecules in its path, each of which serves to scatter the
beam essentially spherically.” So all we
need is a 100-million-light-year-long fiber-optic line to detect civilizations
that far away.
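Professor Speaks's inverse-square point can be restated in a few lines of Python (a sketch; the 100-light-year and 100-million-light-year distances are the ones used in the footnote):

```python
# Inverse-square law: for a point source, intensity falls off with the
# square of the distance from the source.
def relative_intensity(d_near, d_far):
    """Intensity at d_far relative to the intensity at d_near (same source)."""
    return (d_near / d_far) ** 2

# Distances in light years, as in the footnote.
ratio = relative_intensity(100, 100_000_000)
print(ratio)  # 1e-12: one trillionth, because the distance ratio (one million) is squared
```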
[17] From
the SETI Institute, seti.org/litu/Welcome.html.
[18] “Drake
Equation,” Wikipedia, the free encyclopedia on the web.
[19] http://www.seti-inst.edu/publications/newsletters/setinews/vol9no1.html
[20] www.physics.hku.hk/~tboyce/sfseti/contents.html.
[21] seti.org/publications/newsletters/setinews/vol9no1.html.
[22] www.physics.hku.hk/~tboyce/sfseti/37drake.html.
[24] Robin
Hanson, “The Great Filter—Are We Almost Past it?”
www.hanson.gmu.edu/greatfilter.html.
[25] Journal
of Evolution and Technology, Vol. 9, March, 2002.
[26]
Nanotechnology is “the building of working devices, systems, and materials
molecule by molecule by manipulating matter measured in billionths of a
meter. The research seeks to exploit the
unique and powerful electrical, physical, and chemical properties found at an
infinitesimally small scale.” www.eurekalert.org/bysubject/index.php?kw=280; “EurekAlert”
is sponsored by the American Association for the Advancement of Science.
[27] Benjamin Franklin:
An American Life, p. 169.
[28] “Homer Gets a Tax Cut: Inequality and Public Policy in the American
Mind,” August, 2003.
[29] From its website, “the Center conducts
research and analysis to inform public debates over proposed budget and tax
policies and to help ensure that the needs of low-income families and
individuals are considered in these debates. We also develop policy
options to alleviate poverty, particularly among working families. In addition, the Center examines the short-
and long-term impacts that proposed policies would have on the health of the
economy and on the soundness of federal and state budgets.”
[30] From its website, a “private, nonprofit,
nonpartisan research organization dedicated to promoting a greater
understanding of how the economy works.
The NBER is committed to undertaking and disseminating unbiased economic
research among public policymakers, business professionals, and the academic community.”
[31] Congressional Budget Office.
[32] www.inequality.org/newsfr.html.
[33] All these data are from www.inequality.org/factsfr.html.
[34] It should be noted that Census Bureau data probably
understate the degree of income inequality while the CBO data are less likely
to do so. Census data do not include
capital gains and exclude earnings above $999,999 (so 80% of the income of
someone making $5 million per year is not included). The CBO data combine Census data with IRS
data from federal tax returns so include total income. Census data also do not include government
transfer payments—food stamps, housing subsidies, and so on—while the CBO data
do.
[35] www.census.gov/hhes/income/histinc/ie3.html: Census Bureau, Historical Income Data, Income
Equality. (What a misnomer of a title—any college
student who labeled these data with that title would get a raised eyebrow and a
lower grade from his or her professor.)
[36] These
numbers are slightly different from the table immediately preceding because the
data come from different points in time.
These are from OMB, http://www.ombwatch.org/article/articleview/1615/1/178/
[37] www.census.gov/hhes/income/histinc/ie3.html: Census Bureau, Historical Income Data, Income
Equality, and www.census.gov/prod/3/98pubs/p60-203.pdf: Census Bureau, Current Population Reports,
P60-203, Measuring 50 Years of Economic Change.
I spoke with a faculty colleague in Statistics about the numbers to find
out if they are statistically significant.
He said that since these are based on census data, and thus millions of
pieces of data, not samples, any change in them is statistically
significant. The question of whether
they are meaningful is not something statistical significance can
indicate—but he said he thought these data were meaningful because the changes
over time are in one direction and are not trivial. Year-to-year fluctuations in the coefficient
probably do not signal anything important, but the trend line is unmistakable.
[38] One
faculty colleague reviewer wrote to me that “with all due respect to Akerlof,
elementary macroeconomics suggests that while consumption has a positive effect
on the economy, investment is more powerful.
People who save don’t stuff it in a mattress, they invest it. (Just like your Bronfman quote on the
preceding page). Investment is what
enables creation of a greater supply, higher productivity, and less expensive
goods for consumption. I would submit
that the only examples of real hoarding are by a variety of the world’s present
and past dictators who extract taxes and tributes and send them to numbered
accounts in Switzerland (and a small collection of malefactors like Ebbers and
Kozlowski who are currently being prosecuted, and Marc Rich who escaped
prosecution through the generosity of our last president’s departure pardons).”
[39] “Partisan Politics and the U.S. Income Distribution,” May,
2003.
[40] In both of the personal cases, the wives/mothers did not
have an effect on income because both my mother and my mother-in-law were
stay-at-home moms.
[41] United Nations, “Human Development Indicators 2003,” www.undp.org/hdr2003/indicator/indic_126_1_1.html
[42] “The Wealth Divide:
The Growing Gap in the United States Between the Rich and the Rest,” May,
2003, www.multinationalmonitor.org/mm2003/03may/may03interviewswolff.html.
[43] Note
that he is talking about wealth, not income.
[44] “One
who makes an income from property or investment,” according to the Oxford
English Dictionary.
[45]
Emmanuel Saez and Wojciech Kopczuk, “Top Wealth Shares in the United States,
1916-2000: Evidence from Estate Tax
Returns.”
[46] A faculty friend observed that
this quotation “exaggerates the problem by comparing wealth with annual incomes
(GDP). It gives the impression that the
400 richest people get one-eighth of the GDP.
They’re disgustingly rich, but not that disgustingly rich.” I agree that the quote does mix up two
concepts but the point remains: the
assets/wealth of the 400 equal one-eighth of the GDP. Even if it’s comparing apples to oranges, it
highlights the point in an interesting way.
[47] Organization for Economic Co-operation and
Development, “30 member countries sharing a commitment to
democratic government and the market economy.”
The 30 countries are all “developed” or very close, and range from the
US, Canada, and Mexico to Japan, Korea, Great Britain, Germany, Greece,
Hungary, the Scandinavian countries, Spain, Australia, New Zealand, etc.
[48] And on
this point the data are ominous:
According to Jennifer Washburn in “The Tuition Crunch” in The Atlantic, January, 2004: “In the past two decades our commitment to
equal opportunity in post-secondary education has deteriorated markedly. In 1979 students from the richest 25% of American
homes were four times as likely to attend college as those from the poorest
25%; by 1994 they were ten times as likely.”
[49] “Income Mobility and the Fallacy of
Class-Warfare Arguments Against Tax Relief,” by D.
Mark Wilson, http://www.heritage.org/Research/Taxes/BG1418.cfm
[50] “Revised Estimates of Intergenerational Income Mobility in
the United States,”
November, 2003 (an earlier version of the paper was released in 2002).
[51] Professor of Economics and Public Policy at Princeton University.
[52] “Ignorance
Fills the Income Gulf,” June, 2003, www.inequality.org/grahamyoungfr.html.
[53] “The
myth of mobility,” July, 2003, Suresh Naidu, The Ultimate Field Guide to the U.S. Economy, www.fguide.org.
[54] “The Death of Horatio Alger,” December,
2003, www.thenation.com/docprint.mhtml?i=20040105&s=krugman
[55] Galbraith is Lloyd M. Bentsen Jr. Chair in
government-business relations at the Lyndon B. Johnson School of Public Affairs
at the University of Texas at Austin.
[56] Data
that suggest reinforcement of the trend to intergenerational immobility are
college graduation data. According to
Mortimer Zuckerman, Editor-in-Chief at U.S. News & World Report, “a mere
4.5 percent of young people from the lowest income quartile get a B.A. degree
by the age of 24; 12 percent from the next quartile get one; 25 percent from
the third quartile, and 51 percent of students in the top quartile. Could it be that America is now resegregating higher
education, only this time not by color but by class?” At the same time, college costs are
skyrocketing, federal grants cover far less of the cost of college attendance
than they did in the past, and the number of students turned away because of
lack of space is increasing. Those who
do attend (or their parents) are facing increased loan burdens. Because there is much greater reluctance
among lower-income families (and minority families) to incur debt, the prospect
of increased loans is a deterrent to students from those families to attend
college at all. A regular paycheck, even
in a low-paying, dead-end job, looks more attractive than thousands of dollars
of loans for an education, the outcome of which is uncertain.
[57] April 4,
2002. For those in political
life who want the estate tax eliminated, it was a nifty public relations
decision to begin referring to it as the death tax.
[58] According to Bartels, “the survey included 1,511
respondents interviewed by telephone in the six weeks before the 2002 midterm
election; 1,346 of these respondents (89%) were reinterviewed in the month
after the election. The respondents
answered a series of questions about their perceptions of economic inequality
and its causes and consequences, the 2001 Bush tax cut, the proposed repeal of
the federal estate tax, and related issues.
Thus, the 2002 NES data provide an unusual opportunity to probe how
ordinary Americans reason about economic inequality and public policy.”
[59] “Solving the New Inequality,” December
1996/January 1997 issue.
[60] Branko Milanovic, “Why we all do care about inequality
(but are loath to admit it),”
www.worldbank.org/research/inequality/pdf/freldstein.pdf.
[61]
Actually, I don’t think this is a very strong argument. If, however, one imagines the good fairy as
the government, and the gifts from the fairy as your tax reductions, and that
all the money distributed by the good fairy means less money for roads,
schools, snow plowing, and so on, then the gifts from the good fairy are even more
insulting.
[62] From their web site: “Sojourners, www.sojo.net,
is a Christian ministry whose mission is to proclaim and practice the biblical
call to integrate spiritual renewal and social justice.”
[63] At one
time our “fundamental rights” did not include possession of alcoholic
beverages. Now they do.
[64] One of
my reviewers (who knows labor markets) made this comment about Wal-Mart. “You need to look more closely at
Wal-Mart. It never has suggested that it
would be a high payer, but relative to other retailing jobs in the areas in
which it operates, it is not a low-payer.
Further, it does offer health care and retirement benefits to employees
who intend to make a career with Wal-Mart.
Job security is quite high given their rate of growth in the areas in
which they operate. I doubt seriously
whether the majority of Wal-Mart employees are dissatisfied with either their
employment or their employer. Should an
employer be expected or required to pay what social commentators and assorted
policy makers think they should pay? . . .
I don’t find it unusual that several of the Walton heirs are among the
wealthiest Americans. Their father
didn’t will them the stock, but probably gave it to them when it had relatively
little value. Their progeny may or may
not be among the most wealthy Americans of the future. It all depends on how well they develop their
own talents.”
[65] The New
York Times editorialized after the President’s State of the Union address
in January: “The president’s domestic
policy comes down to one disastrous fact:
his insistence on huge tax cuts for the wealthy has robbed the country
of the money it needs to address its problems and has threatened its long-term
economic security. Everything else is beside the point.”
[66] “Welfare spending: A race to the bottom?” National Europe Centre, Australian National
University
5-11-2003, www.apo.org.au/webboard/items/00499.shtml.
[67] Commonwealth
Department of Family and Community Services (Australia), “International
Comparisons and the International Experience with Welfare Reform,” 2002, www.facs.gov.au/internet/facsinternet.nsf/whatsnew/interwelfare.htm
[68]
“Sharing the Wealth?”
[69] In a
briefing sponsored by the Brookings Institution, Professor Bartels said that
“people who are better informed in general about politics are also much more
likely to say that growing economic inequality is a bad thing. They’re also more likely to realize that
economic inequality has broader political and social consequences.” www.brookings.edu/dybdocroot/comm/events/20031216.pdf
[70] The
quotes are from Benjamin Franklin, by Edmund S. Morgan.
[71] Benjamin
Franklin: An American Life, by
Walter Isaacson, p. 424.
[72] http://www.prospect.org/web/page.ww?section=root&name=ViewPrint&articleId=7748
[73] http://slate.msn.com/?id=2074780
[74] From The New York Times 11/9/04: “I saw a television advertisement recently
for a new product called an air sanitizer.
A woman stood in her kitchen, spraying the empty space in front of her
as though using Mace against an imaginary assailant. She appeared very determined. Where others are satisfied with
antibacterial-laced sponges, dish soaps, hand sanitizers and telephone wipes,
here was a woman who sought to sterilize the air itself.
As a casual student of
microbiology, I find it hard to escape the absurdity here. This woman is, like any human being, home to
hundreds of trillions of bacteria.
Bacteria make up a solid third, by weight, of the contents of her
intestines. . . . The fantasy of a
germ-free home is not only absurd, but it is also largely pointless.” Mary Roach, the author, went on to comment
that if one is worried about food poisoning, don’t eat moist food left for 4-5
hours at room temperature. If one is
worried about disease, the few bacteria on doorknobs and countertops won’t hurt
you. New antibacterial cleaning liquids
are a scam. According to research she
cited, “an active adult touches an average of 300 surfaces every 30
minutes. You cannot win at this. You will become obsessive-compulsive. Just wash your hands with soap and water a
few times a day, and leave it at that.”
So ends Gary’s
helpful household hints.
[75] This
may not be surprising inasmuch as “45 percent of
responding U.S. adults agreed that ‘God created human beings pretty much in
their present form at one time within the last 10,000 years or so,’” according to
the National Geographic article I cited earlier, so why would they have
any question about whether humans co-existed with dinosaurs?
[76] http://www.betterhumans.com/Features/Columns/Transitory_Human/column.aspx?articleID=2003-12-22-2
[77] I have a faculty colleague who was a college
classmate of Schama’s. She told me that
when Schama decided to major in history, she picked another field because she
could not begin to compete with anyone as smart as he is.
[78] The “Dead Souls” comes from Walter Scott’s “‘The Lay of the Last Minstrel.’ Therein, he asked whether

‘Breathes there the man with soul so dead
Who never to himself hath said:
‘This is my own, my native Land?’
Whose heart hath ne’er within him burned
As home his footsteps he hath turned, . . .
From wandering on a foreign strand?’

A contemporary answer to Scott’s question is: Yes, the number of dead souls is
small but growing among America’s business, professional, intellectual and
academic elites.” The article is not available free at The National Interest
website, but can be found at
http://www.freerepublic.com/focus/f-news/1111567/posts. I would like to point
out that both Huntington and The National Interest are conservative,
although where they fall on the scale in the multitude of places one can be a
conservative, I don’t know.
[79] One is
tempted to question the validity of this assertion, given the number of voter
challenges and disputes about electoral machinery in the 2000 presidential
election and the concerns that continue to be expressed about electronic voting
machines.
[80] “In a 2003 Harris poll 79 percent of Americans said
they believed in God, and more than a third said they attended a religious
service once a month or more. Numerous
polls have shown that these figures are much lower in Western
Europe. In the United States a majority of respondents in
recent years told pollsters that they believed in angels, while in Europe the issue was apparently considered so
preposterous that no one even asked the question. When American commentators warn about a new
fundamentalism, they generally mention only the Islamic one. European intellectuals include two other
kinds: the Jewish and Christian variants.”
Peter Schneider, The New York
Times, March 13, 2004.
[81] One
likely member of this “denationalized elite,” Thomas Friedman, the Minnesota
native who writes for The New York Times, wrote after the election about
“two nations under God” that “we don't just disagree on what America should be
doing; we disagree on what America is.”
[82] I put
aside for this purpose the treatment of Native Americans over the last 400
years, the existence of slavery until 1865, and other arguably bad treatment of
certain groups of Americans or immigrants.
The U.S.
government is not carrying out carefully-planned murderous attacks on any
subgroups in the population, to my knowledge.
[83] That
certainly described Krystin this summer.
She came home after getting her first paycheck and exclaimed, “Where did
all the money go?” We pointed out to her
that it was taxes, and reminded her that we have generally supported higher
taxes and so has she, and that all the programs she thinks would be a good idea
must be paid for somehow. Elliott,
hearing about this later, declared that he would never complain about higher
taxes. Even I wouldn’t go that far:
I would certainly complain if I thought taxes were too high,
but so far in the U.S.
they haven’t come anywhere near too high by my standards. See the earlier commentary on income and
wealth distribution.
[84] And see
Bartels’s work on the estate tax, which I referred to earlier. It follows in this same vein.