December 2015
Thursday, not Wednesday
My letter comes with warm greetings!
Well, we made it another year. That's always encouraging. In the words of George Carlin, "I'm
always relieved when someone is delivering a eulogy and I realize I'm listening
to it." So I've been here all year
to write again.
This letter could have been
longer. Beginning in August and through
the end of the year, my efforts were directed to preparing a graduate seminar,
and I learned that I do not have more than 5-6 hours per day of intellectual
energy. Each day I was sure I'd continue
working on the lecture notes in the evening, but after spending much of
the day on them I had brain fatigue, so a scotch and soda with Kathy was much
more appealing. The labor on the seminar
meant I hadn't the ambition to even write about family and friends, much less spend
time reading and commenting on the many articles and bits of news that caught
my attention. I'll do better in 2016.
If he smiled much more, the ends of his mouth
might meet behind, and then I don't know what would happen to his head! I'm
afraid it would come off!
The U of M Gopher football
team got into a New Year's Day bowl, the Citrus Bowl, for the first time since
1962 (when they defeated UCLA in the Rose Bowl). I worried that our friends Joe & Genie
Dixon, who are Gopher football fans, would cancel their annual NYE dinner and
go to Orlando for the game. Joe assured
me that a 43-year tradition of NYE dinners would not be upended for a football
bowl game. I was glad to hear that,
because I would have been at a loss for what to do NYE if we didn't have that
dinner to attend—to say nothing of the loss of one of the best meals of the
year (apart from the meals Kathy prepares regularly, of course).
For instance, take the two words "fuming"
and "furious." Make up your mind that you will say both words, but
leave it unsettled which you will say first. Now open your mouth and speak. If
your thoughts incline ever so little towards "fuming," you will say "fuming-furious;"
if they turn, by even a hair's breadth, towards "furious," you will
say "furious-fuming;" but if you have the rarest of gifts, a
perfectly balanced mind, you will say "frumious."
An
excerpt from one of my daily news updates about books:
All
serious artists, no matter how they work — whether at dawn or midnight,
whether indoors or out, clothed or naked, intoxicated or sober — share one
trait. They work.
Here's
Chuck Close: "Inspiration is for amateurs. The rest of us just show
up and get to work."
Here's
Henri Matisse: "For over fifty years I have not stopped working for
an instant."
Here's
Gustav Mahler: "You know that all I desire and demand of life is to
feel an urge to work!"
Here's
Stephen Jay Gould: "I work every day. I work weekends. I work
nights…"
Here's
Sigmund Freud: "I cannot imagine life without work as really
comfortable."
Here's
H. L. Mencken: "Looking back over a life of hard work . . . my only
regret is that I didn't work even harder."
Given
the small sample size, one cannot reasonably argue that these statements
represent the views of all or most artists, despite the claim made. But the point is an interesting one when
juxtaposed with the oft-made point in the 21st century that because
of the Internet, professionals often find themselves working all the time, even
at home on weekends and in the evenings.
This excerpt uses a rather broad definition of the term "artist"
when it includes Gould, Freud, and Mencken, but that makes the point even more
emphatically: it is not just traditional
"artists," the ones who paint and compose, but a broad spectrum of
successful people who work all the time.
What
differentiates these people, of course, is that they were all doing things they
loved to do. One can imagine people in
hundreds of jobs who cannot wait for the work day to end to get away from work
they find uninteresting, boring, repetitious, physically demanding, or downright
aversive. For those who love what they
do, of course, "work" isn't work.
It is also possible, in those endeavors, to achieve greatness to some
degree or another. A lawyer can be "great"
in his or her field, even if only on a local level. The same with a doctor or architect or
professor. Or even an administrator. But it's challenging to think about "greatness"
in jobs that entail no creativity or independent thought. (I write that at the same time that I have a
profound appreciation for people who can do an excellent job, no matter the
task—whether fixing our dishwasher or the plumbing or installing concrete. I am reminded of former Health, Education, and
Welfare Secretary John W. Gardner's comment that "the society which scorns excellence in plumbing as a humble
activity and tolerates shoddiness in philosophy because it is an exalted
activity will have neither good plumbing nor good philosophy: neither its pipes
nor its theories will hold water.")
Apart
from those around the world who go hungry daily, face illness without medical
care, or see violence many days of their lives, one of the groups of people I
feel most sorry for is those who have jobs that make them want to do nothing
but get home and consume a six-pack of beer.
It is not real work unless you would rather be
doing something else.
I have contended in an earlier edition of
this letter (tongue in cheek, in part) that learning history is not required
for everyday life. The great British
historian A. J. P. Taylor suggests there is another reason for the futility of
learning history (at least on instrumental grounds): "We learn from history not to repeat old
mistakes. So we make new ones instead."
I wonder if I've been changed in the night. Let
me think. Was I the same when I got up this morning? I almost think I can
remember feeling a little different. But if I'm not the same, the next question
is 'Who in the world am I?' Ah, that's the great puzzle!
One of the more interesting
electronic publications around, in my opinion, is Aeon. It included an article by a
Canadian philosopher, John Schellenberg, about how little we think about the
nearly eternal future. All who
understand the findings of physics, astronomy, and Darwin know about what can
be called "deep time." The Big
Bang was about 14 billion years ago; the earth is about 4.5 billion years old;
evolution has been under way for billions of years. To the extent we can absorb and understand
these numbers, in Schellenberg's view, we begin making the transition from
human time to scientific time.
But humans tend to think of
themselves as the end product of time, despite the fact we know perfectly well
that the sun will continue to shine on the planet, nurturing life, for another billion
or more years (before it expands and then contracts into a white dwarf). "Deep time" is, in most thinking,
the deep past. We're also at the
beginning of a process, not just the end of it.
Schellenberg asks us to visualize a 20-foot line that represents a
billion years into the future. How far
along the line have humans traveled? If
you start with the very beginnings of human society, perhaps 50,000 years ago,
we have not traveled far enough along the line to measure the distance with
normal measuring devices. Organized society would have to last another
500,000 years for us to travel 1/8th of an
inch. So, he points out, "the beginning
is near!" Not the end.
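Schellenberg's arithmetic is easy to check; here is a minimal sketch (the names and rounding are mine, not his):

```python
# Schellenberg's visualization: a 20-foot line stands for one billion
# years of future. How far along it have humans traveled so far?

LINE_INCHES = 20 * 12                         # 240 inches represent 1e9 years
YEARS_PER_INCH = 1_000_000_000 / LINE_INCHES  # ~4.17 million years per inch

def inches_traveled(years):
    """Distance along the line corresponding to a span of years."""
    return years / YEARS_PER_INCH

# ~50,000 years of human society: about 0.012 inch -- too small to measure
print(round(inches_traveled(50_000), 3))

# Roughly 520,000 years are needed to cover 1/8 (0.125) of an inch
print(round(inches_traveled(520_000), 3))
```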
There are several reasons why
humans have been unable to comprehend or think much about the deep future. One is that we are preoccupied with
ourselves; we can't imagine what might succeed us (whether biologically or
technologically). Another is that astronomy and archaeology have made the
deep past increasingly knowable, so our attention runs backward rather
than forward. Schellenberg also
maintains that (at least in the West), the Bible bears some
responsibility: the New Testament, in
particular, claims that "the end is near," and there are members of
plenty of sects around the world who believe it. There is, he points out, very little about
the future in the Bible.
The science, however, suggests the
contrary; a glance at Wikipedia tells me that the "earliest documented
members of the genus Homo . . . evolved around 2.3 million years ago; [it is]
the earliest species for which there is positive evidence of the use of stone
tools." If one is willing to assume
humans continue to exist and evolve without destroying the planet or retreating
to a more barbarian level of civilization because of people like Senator Inhofe,
2.3 million years out of at least a billion in front of us isn't much. It may be that the end-timers are right in
proclaiming the imminent end of human civilization, but they are surely wrong
if the claim includes the end of the planet itself.
What Schellenberg's article
suggests to me is that we could think a little more seriously about the future
and where we stand in the long run of scientific time. It is not even remotely possible that we
represent the end of evolution and the development of intellect; humans today
may represent merely one small incidental step on the way to wherever it is we're
going. We cannot fathom, surely, what
our descendants in thousands of years might be like or what they may have
discovered or invented. (Again, assuming
we manage to overcome the antediluvian views of a number of political leaders
in the U.S. and around the world; I'm not sure that the odds of that assumption
being correct are better than 50/50.)
The one depressing fact that we
cannot, so far, get around is the speed of light. If some evolved version of human beings is
to survive the eventual scorching of the earth by the expanding sun, we will
have to get off the planet. But unless
physicists find us a way to travel across interstellar space at speeds far
faster than light, we're doomed to live on space ships traveling for hundreds
of thousands or millions of years to other younger stars with habitable
planets. Our descendants would have no
experience of Earth except to the extent the ships could keep records over all
those millions of years.
In any case, we are in the infancy
of humanity. Whether we'll make it even
to the toddler stage remains to be seen.
Imagination is the only weapon in the war
against reality.
On a related tack: ScienceDaily
reported last year on the development of a new atomic clock that will "neither
gain nor lose one second in about 300 million years." This replaces the existing clock, which can
gain or lose a second every 100 million years.
One friend of mine snorted that "while
I am a huge proponent of increased funding on scientific endeavors, it's hard
to make this sound as if it's not a waste of money. I can't imagine any taxpayer anywhere cares
at all about this. And it's hugely
arrogant and borderline insane that we think we'll still be here in 300 million
years at the rate we're going."
I'm not the only one somewhat
pessimistic about what the near future might bring, to say nothing of the deep
future.
Flamingoes and mustard both bite. And the moral
of that is--Birds of a feather flock together.
I wrote last year about the research
by two faculty members at the University of Warwick in England suggesting that
there may be a genetic link to happiness and that the closer one's genetics
match those of the Danish, the happier one will be. The research follows from earlier research
from University College London estimating that genes may account for a
substantial portion of differences among people in happiness or life
satisfaction.
Reporter Scott Timberg at Salon noted the happiness data this
year; the U.S. comes in 15th in the world in level of
happiness—always lagging behind Scandinavia, Switzerland (and also behind
Australia, Costa Rica, Israel, and Mexico, among others). He points out the obvious:
What most of these really happy countries have in common
is a higher level of taxes and services than a go-it-alone country like the
U.S., which conflates superficial ideas of liberty with real freedom, would
ever allow. So we've gotten used to
trailing Scandinavia, much of the rest of Europe, fellow English speakers like
Canada and New Zealand, and so on.
'That's the effect of living backwards,' the
Queen said kindly: 'it always makes one a little giddy at first—'
'Living backwards!' Alice repeated in great
astonishment. 'I never heard of such a thing!'
'--but there's one great advantage in it, that
one's memory works both ways.'
'I'm sure MINE only works one way,' Alice
remarked. 'I can't remember things before they happen.'
'It's a poor sort of memory that only works
backwards,' the Queen remarked.
One of the questions that has
intrigued me for most of my life is how humans are different from all other
species on the planet (if, indeed, we are different in any fundamental
sense). I wrote to a friend in 1973
after finishing The Difference of Man and the Difference It Makes by Mortimer Adler asking the question. (Adler maintained that it was the Turing test
that would make the difference:
development of a speaking computing machine that could not be
distinguished from a human speaker, thus demonstrating the ability of
conceptual thought. That's not a test of
human difference that most theorists would accept today, but at the time it
seemed extremely far-fetched.)
One
recent answer is the use of symbolic language (e.g., hieroglyphs, alphabets),
which appears to have emerged roughly 200,000–300,000 years ago, and that may
be the point at which "modern" humans emerged. The importance of symbolic language is that
it allows more precise language, communication of a wider range of ideas, and
perhaps most important, the ability to save and communicate what had been
learned—we are not limited to evolutionary changes in our genetic makeup but
can instead build on what has gone before to change behavior. It is true that animals can communicate with
each other, and very likely have emotions, but they do not have the ability to
accumulate and store knowledge nor is it likely they can conceptualize
non-physical abstractions such as truth, justice, the future or past, evil, and
so on.
Stephen
Cave, a Cambridge Ph.D. in metaphysics and writer on science, ethics, and
metaphysics for several notable international publications, reviewed a
number of books that touched on how humans are different ("What sets
humanity apart?"). Two of the
authors maintained that there isn't any difference between humans and other
animals except in degree; the other two maintained that there is a significant
qualitative difference.
The
argument that we are not different (once one puts aside the biblical answer
that humans are made in the image of god) stems from the understanding that we
are but one piece of the evolutionary tree.
We share characteristics with other animals (bipedalism, opposable
thumbs, relatively large brains). There
is also the claim that we are accidental—50,000 years ago or more there was
more than one species of human, and by luck and extermination we survived or
triumphed. Another line of thinking is
that of "evolutionary continuity":
humans have argued that they are the only ones with "reason,
emotion, consciousness or morality. But
we share a long evolutionary history and basic biology with other animals,
particularly other mammals. We should therefore expect that 'if some mammals
experience something, most or all mammals probably do, too.'" There's evidence to support the proposition
that other species demonstrate altruism, tool-making skills, and emotional
lives.
The other point
of view—that there is a difference that makes a difference, to echo Adler—is
that while humans share traits with many species, especially mammals, humans
have more of the important ones, such as an understanding of past and future,
theory of mind, culture, and morality.
Some animals show glimpses of possessing the traits, but to nowhere near
the same extent as humans. One argument,
by someone who's been studying the differences between apes and humans for
decades, is that it is humans' ability to cooperate that leads to
behavior and advances other animals cannot match. While there's evidence that chimps and others
can cooperate, it is very limited and very brief, and they abandon it within
minutes. Humans, in contrast, can
cooperate for decades or a lifetime. (A Natural History of Human Thinking,
Michael Tomasello)
I just don't buy the argument that we aren't qualitatively
different. All mammals are off the same
evolutionary tree, so of course there are similarities with other species, but
when sharks write sonnets and chimps build computers, I'll buy the argument
that the differences aren't that great.
It seems to me one of those things that is as plain as the nose on your
face but that is extraordinarily difficult to define or measure adequately.
I'd give all the wealth that years have piled,
The slow result of life's decay,
To be once more a little child
For one bright summer day.
This is the cartoon version of the issue
that's been debated heatedly, particularly in Europe: one's "right" to remove materials
from the web that are damaging. Here's
what the Wall Street Journal
reported, and it summarizes the issues quite well:
Europe's top court ruled that Google Inc.
can be forced to erase links to content about individuals on the Web, a
surprise decision that could disrupt search-engine operators and shift the
balance between online privacy and free speech across Europe.
Under [the] ruling . . . individuals can
request that search engines remove links to news articles, court judgments and
other documents in search results for their name. National authorities can
force the search engines to comply if they judge there isn't a sufficient
public interest in the information, the court ruled.
The European Court of Justice's decision represents the strongest legal backing of what is often called the "right to be forgotten," a concept born out of 19th-century French and German legal protections. . . .
Proponents of the "right to be forgotten" argue that individuals should be able to force the removal from the Internet of information that is old or irrelevant, and could be deemed to infringe on their right to privacy. Detractors say that the ruling could lead to a massive wave of takedown requests that would swamp companies and privacy regulators with legal costs, while whitewashing the public record.
The Washington Post chimed in with support for the idea, even if not
the specific European approach:
While people forgive and forget over time,
the Internet punishes relentlessly. . . .
Behind the trappings of European regulation, the "right to be
forgotten" is really a right to be forgiven; a right to be redeemed; or a
right to change, to reinvent and to define the self anew. A person convicted of a crime deserves a
chance at rehabilitation: to get a job or a loan. A person wrongly charged or convicted
deserves even more freedom from search-engine shackles.
We can diverge from Europe over
bureaucratic process. And we can debate
and decide for our society when the right to be forgotten is forfeit. But we should adopt—we should own—the concept
of erasure online. For there could be
nothing more American than a second chance in a new world.
This seems like a great idea for the
obscure among us. But for those who are
in public life, and in political life in particular, the idea that they can
erase past records of (mis)behavior strikes me as inappropriate—and a little
scary. If someone has defrauded people or
entities in the private or public sector or committed any of a thousand
different offenses, and that person is running for office, I think the voters
should be able to find out. The point
about whitewashing seems to me legitimate.
But it presumably is not possible to have an exception for candidates
for public office; someone could erase items and only decide later to run for
office. Or, more likely, decide to run
for office and then erase them before announcing one's candidacy. The possibilities for flummoxing the law, if
there were one, are endless.
At the same time, the idea of
forgiveness isn't without its attraction.
I'll let someone else write the regulations governing the circumstances
under which one can remove items from the web.
"Then you should say what you mean,"
the March Hare went on.
"I do," Alice hastily replied; "at least—at least I mean what I say—that's the same thing, you know."
"Not the same thing a bit!" said the Hatter. "You might just as well say that 'I see what I eat' is the same thing as 'I eat what I see'!"
I wrote a note to myself last year that I
would never write pieces of an autobiography, and then changed my mind, as I
recorded in my 2014 letter: "I may get part way through the writing and
decide it's not worth it." As I told Elliott in a text message while
looking through saved papers, "I've now remembered events that I would
have been quite pleased to have left forgotten."
It has been an interesting exercise thus
far—put in abeyance, along with the work on this letter, while working on
seminar talks—but given that virtually all the materials I am relying on have
never been on the web, one can wonder why I would dredge up events that I would
just as soon forget and no one else could remember with any clarity. In this case, I don't have to erase; I could
just shred, and the events, like almost all events in almost all of our lives,
would vanish. That's what human memory (and its failings) accomplishes: it lets certain things fade into the fuzzy
mists of the past, to be left there and then to vanish forever when one dies.
At the same
time, it has been rewarding to see that I have learned certain things over the
course of the nearly 50 years for which I have notes and letters. One thing for sure: my writing has improved considerably. It's also been gratifying to see that
convictions I formed in my 20s, based on the knowledge and data available to me
at the time, largely haven't been overturned by new findings. I suppose one could argue that my views froze
at that point, and have been invulnerable to change ever since (which is a
common enough psychological phenomenon), but I try to keep an eye out for
credible evidence that contradicts what I believe. Sometimes I discover it, and have to adjust
what I believe, but there haven't been defensible reasons to make fundamental
shifts in my outlook.
What's also
been interesting is being forced to try to categorize parts of one's life, or
phases of life. Some parts I would not
change at all—I'd still get my undergraduate degree in Political Science—and
others I would change quite a bit. In
the case of the latter, I'd have many more relationships with members of the
opposite sex while in my 20s—I'd like a do-over on that part. I've also discovered that lives (or mine, at
least) are characterized by periods of tranquility—living out daily life
without great storm or stress—that are punctuated by incidents of great
upheaval. I can't believe that any of us
is particularly interested in the routine lives of our peers and friends: we go to work, we raise our families, we
interact with friends and relatives, we may travel a bit, we may have hobbies,
we cook meals, we tend to our homes, and so on.
It's the variations from the norm that are worth relating: the trauma,
the difficult decisions, the heights of elation and the depths of sadness. Who cares about the trips to the grocery
store? I've discovered that I faced
several periods of great upheaval during the last five decades (and I'm certain
I'm no different from most people in that regard), and it is those that are
most worth attention. (Where I may be
different from many is that when I face prolonged periods of stress, I write a
lot. So it's easy for me to reconstruct
my thoughts in those periods, even from 30 and 40 years ago.)
In
retrospect, I would have tried to be more thoughtful about what I was going to
do in life. What I did I largely
stumbled into; neither of the two largest chunks of my career was intentional
(administrative oversight of college athletics for a dozen years—an involvement
that continues to this day—and serving as Secretary to the Faculty for 26
years). In both cases, the work was to
last a year or so until I went on to whatever else it was I was going to
do. The "whatever else" never
came to pass. I can't say I was unhappy
doing what I did, and I believe I did a few things that I can look
back on with some modest sense of accomplishment.
Let me
return to my source of quotations from a couple of letters ago, H. L. Mencken. In
July of 1931, author and philosopher Will Durant wrote to a number of notable
figures and asked, essentially, "What is the meaning of life?" His letter concluded: "Spare me a moment to tell me what
meaning life has for you, what keeps you going, . . . what are the sources of
your inspiration and your energy, what is the goal or motive-force of your
toil, where you find your consolations and your happiness, where, in the last
resort, your treasure lies." While
Mencken is today embraced by some on the conservative right for some of his
views, in other areas (e.g., religion, which I won't pursue here), he differed
dramatically from conservatives then and now.
But his response to Durant on his work almost perfectly captures my
perceptions about myself—and I suspect they are true for many in professional
fields.
You ask me, in brief, what satisfaction I get out of life,
and why I go on working. I go on working for the same reason that a hen goes on
laying eggs. There is in every living creature an obscure but powerful impulse
to active functioning. Life demands to be lived. Inaction, save as a measure of
recuperation between bursts of activity, is painful and dangerous to the
healthy organism—in fact, it is almost impossible. Only the dying can be really
idle.
The precise form of an individual's activity is determined,
of course, by the equipment with which he came into the world. In other words,
it is determined by his heredity. I do not lay eggs, as a hen does, because I
was born without any equipment for it. For the same reason I do not get myself
elected to Congress, or play the violoncello, or teach metaphysics in a
college, or work in a steel mill. What I do is simply what lies easiest to my
hand. It happens that I was born with an intense and insatiable interest in
ideas, and thus like to play with them. It happens also that I was born with
rather more than the average facility for putting them into words. In
consequence, I am a writer and editor, which is to say, a dealer in them and
concoctor of them.
There is very little conscious volition in all this. What I
do was ordained by the inscrutable fates, not chosen by me. In my boyhood,
yielding to a powerful but still subordinate interest in exact facts, I wanted
to be a chemist, and at the same time my poor father tried to make me a
business man. At other times, like any other relatively poor man, I have longed
to make a lot of money by some easy swindle. But I became a writer all the
same, and shall remain one until the end of the chapter, just as a cow goes on
giving milk all her life, even though what appears to be her self-interest
urges her to give gin.
I am far luckier than most men, for I have been able since
boyhood to make a good living doing precisely what I have wanted to do—what I
would have done for nothing, and very gladly, if there had been no reward for
it. Not many men, I believe, are so fortunate. Millions of them have to make
their livings at tasks which really do not interest them. As for me, I have had
an extraordinarily pleasant life, despite the fact that I have had the usual
share of woes. For in the midst of these woes I still enjoyed the immense
satisfaction which goes with free activity. I have done, in the main, exactly
what I wanted to do. Its possible effects on other people have interested me
very little. I have not written and published to please other people, but to
satisfy myself, just as a cow gives milk, not to profit the dairyman, but to
satisfy herself. I like to think that most of my ideas have been sound ones,
but I really don't care. The world may take them or leave them. I have had my
fun hatching them.
Next to agreeable work as a means of attaining happiness I
put what Huxley called the domestic affections—the day to day intercourse with
family and friends. My home has seen bitter sorrow, but it has never seen any
serious disputes, and it has never seen poverty. I was completely happy with my
mother and sister, and I am completely happy with my wife. Most of the men I
commonly associate with are friends of very old standing. I have known some of
them for more than thirty years. I seldom see anyone, intimately, whom I have
known for less than ten years. These friends delight me. I turn to them when
work is done with unfailing eagerness. We have the same general tastes, and see
the world much alike. Most of them are interested in music, as I am. It has
given me more pleasure in this life than any external thing. I love it more
every year.
I
would take the view that it's more than heredity that determines what it is we
do in life; the social sciences have convincingly demonstrated that environment
plays a significant role (the debates continue, of course, about the relative
role of each, and the extent of the interaction effects between the two).
Apropos of the owl of Minerva quote ["Only when the dusk starts to fall does the owl
of Minerva spread its wings and fly"] and this poking around on fragments of an
autobiography, "life, the philosopher Kierkegaard believed, is lived
forward but can only be understood backwards" (History Today book review, The
Long Shadow [WWI and 20th Century]). I think that's absolutely right. There is some, perhaps much, about earlier
events in my life that I understand now better than I did before. An interesting question, I think, and
contrary to the assertion attributed to Kierkegaard, is whether a better
understanding of one's past—if that's what it indeed is—can help in any way in
figuring out where one is now and where one might be going. I'll tell you what I discover in a couple of
decades.
Philip Guedalla, a British barrister
and author, has written that "autobiography is an unrivalled vehicle for
telling the truth about other people."
That's the one thing I would—will—not do. My own foibles, yes; those of others, no.
'Cheshire
Puss,' [Alice] began, rather timidly, as she did not at all know whether it
would like the name: however, it only grinned a little wider. 'Come, it's
pleased so far,' thought Alice, and she went on. 'Would you tell me, please,
which way I ought to go from here?'
'That depends a good deal on where you want to get to,' said the Cat.
'I don't much care where—' said Alice.
'Then it doesn't matter which way you go,' said the Cat.
'--so long as I get SOMEWHERE,' Alice added as an explanation.
'Oh, you're sure to do that,' said the Cat, 'if you only walk long enough.'
Kathy and I went to India for 19 days in
January-February. The worst part of the
trip was simply the going and coming.
Two flights each way, through Amsterdam to Delhi. Going over wasn't a trial—the flights were
uneventful and we were anticipating the adventure—but coming home, it seemed
like the flight from Amsterdam to Minneapolis took forever. We were awake 47 hours, from the time we
awoke in our hotel in Delhi on Tuesday, February 3, to the time we went to bed
at home on Wednesday, February 4 (with the exception of largely unsuccessful
attempts at sleep in the VIP lounge in Delhi and on the two flights). Every time I looked at the flight data, when
I thought an hour had elapsed, it was only 10 or 15 minutes later. But we got home and felt like zombies.
India was a treat and a trial. This was a study and travel venture sponsored
by St. Olaf College in Northfield, Minnesota and led by a retired professor of
philosophy and his wife. The theme of
the tour was the wonders and religions of India. We went with our friends Rolf & Roberta
Sonnesyn; Roberta is a St. Olaf graduate, hence the connection. There were about 30 of us in the group,
including 3 physicians and 2 lawyers and a number of other professionals and
spouses.
We had an absolutely marvelous
guide, Harsh, who's been in the tourist business for many years and who's a
native of Delhi. (His name means "happiness"
in Hindi, which is an official language of India.) He spent the entire time with us, making sure
we got from place to place (9 cities in 17 days), got on the bus, got fed, got
through the airports, and so on. He was
a fount of wisdom about India, although he offered some views that seemed a
little rosier than what the data might suggest were warranted. But a great guy. He mostly gives tours to Americans, and he's
no doubt helped by the fact that he lived in Los Angeles for five years (as
well as London and Singapore at other times).
In general, I should say that we
probably had a pretty antiseptic experience:
we stayed in decent hotels, our tour bus was comfortable and clean, and
our itinerary was focused on religious and cultural monuments. We weren't exposed (much) to the darker side
of the widespread poverty in India (among the worst of which is in Mumbai and
Calcutta, neither of which we visited).
We traveled by train, planes,
automobiles, auto rickshaws (imagine a 3-wheeled golf cart with a much more
powerful engine), bike rickshaws, most of the time a big tour bus, once an
elephant, once a boat (on the Ganges), and once a horse-drawn rickshaw (at the
Taj Mahal). The traffic, especially in
the center cities, was something to behold and left us dumbfounded. A street in Minneapolis that would
accommodate two lanes of traffic, one each way, would have roughly six "lanes"
of traffic on it in India, although the term "lanes" is wildly
misleading. The traffic, composed of all
of the vehicles I mentioned above plus motorcycles (by the millions), trucks of
various sizes, and pedestrians, is utterly chaotic and frequently
congested. All of the vehicles weave in and
out among each other, always honking, along with pedestrians darting through
the often-slowed traffic. Vehicles are
routinely within a couple of inches of each other. Drivers of all kinds do not believe in gradually
slowing down (perhaps because some other vehicle will dart in ahead of them),
so they go fast up to whatever is ahead of them and then stop quickly. Lane markers appeared to be largely
decorative. In the center of cities
there were no stoplights or stop signs, so it was every vehicle going all ways
all the time. Honking, we came quickly
to realize, is not a mark of annoyance in India, it's an alert to drivers that
someone is coming upon them from behind.
Many vehicles have "use horn" or "please honk" on
their backs. If I ever had occasion to
live in an Indian city, I would promptly give up my driver's license because I
would either become a psychiatric case or I would get killed trying to
drive. It appeared that everyone there
is used to the complete mayhem on the roads and simply handles it without much
stress.
Riding in either bike or auto
rickshaws felt, to us, like we were taking our lives in our hands. The seating was somewhat precarious and as a
passenger one is not protected from a fall or accident. The drivers wove among the other vehicles
with abandon. Inasmuch as the streets
are not all as well paved as they are in the U.S.—even given pothole problems
in Minnesota—one is jolted and bumped around and sometimes hangs on for dear
life. And because the rickshaws are
open, the passengers are in the cloud of all of the exhaust fumes from the
multitude of vehicles surrounding them.
Then there is the honking plus the general noise level to add to the
enjoyment.
As we traveled about, especially in
the cities, we were dismayed by the amount of trash or litter or garbage strewn
about everywhere. Millions of plastic
bags and cups, pieces of paper, general trash, and sometimes animal droppings. At one point I was waiting to use an ATM, and
a guy who had come out of it while I was waiting stood and went through his
wallet cleaning out old receipts. He
ripped each one in half and dropped the pieces on the ground, leaving perhaps a
dozen or more pieces of paper. I wanted
to scold him. All this trash all over
the place was depressing to contemplate and it sure didn't do much for the
scenery. I will give credit to the new
prime minister: he's launched a "clean
up India" campaign and involved all kinds of celebrities in the
effort. So far, however, it doesn't
appear to have had much effect.
We were also struck by what seems to
be a lack of facilities maintenance. The
exteriors of all the buildings appeared worn, down-at-heel, ill-kept. It seemed that once a building was
constructed, nothing more was done to maintain its appearance. As a result, they all looked decrepit, no
matter how new they might have been.
One widespread practice—not confined
to India—is that of having attendants in toilet facilities. They must be tipped. I confess to intensely disliking having to
pay someone to hand me a piece of paper to wipe my hands on or to turn on the
faucet for me. I didn't like it in
Europe and I didn't like it in India. But
we always had to keep 10-rupee notes in our pockets to tip the attendants; 10
rupees is about 16 cents. On the other
hand, the "job," such as it is, gives a very modest income to some
guy for a task that most of us would never want to perform. So I suppose I shouldn't grouse.
We observed many, many makeshift "residences"
in the larger cities we were in, along the streets, on rooftops, under bridges,
and so on. They were no more than 8 x
10 feet, it appeared in most cases, and had plastic tarps or old rugs or other
flimsy materials as roofs. On the other
hand our hotels, in some cases little oases of America, were lavish by
comparison to the surrounding areas. It
struck me as a little strange, and uncomfortable, to slide into bed between
clean sheets in our hotels—which were, with one exception, similar to most
American mid-range hotels—knowing that within only hundreds of feet there were
many people sleeping in dirty, tiny spaces that were open to the weather. (The temperatures were in the 50s to low 70s,
which the locals thought was really cold but which we, of course, found
perfectly comfortable.)
In a related vein, I sometimes
wondered what people who looked at us in the tour bus thought. In looking out the windows, we were about 8-9
feet above the ground (the buses were big, like Greyhound buses, although not
quite that comfortable); here we were, whites of European ancestry, behind
glass in clean seats, looking down at locals walking in dust and dirt and noise
and traffic. They sometimes stared at
us, although these tour buses were not uncommon. What were they thinking about? I felt that we were almost alien observers
visiting from another planet.
(Sometimes, if we passed a school bus full of youngsters, we'd wave and
smile and they'd all laugh and wave in return.
But that was the kids, not the adults.)
Were they simply curious?
Envious? Somewhat angry about the
clear difference in circumstances? I
never did know.
Male public urination was common, and the
presence of other people was no deterrent.
One of the guys on the trip had been counting the number of times he'd
seen it; I believe he had reached 230 by the time of one of our last bus
rides. On the one hand, that's a fairly
disgusting practice. On the other, with
so many people living in quarters that have no indoor plumbing, and with few
public toilets, what is one to do? And
what do the women do? They find places
that are more discreet, apparently.
Our tour guide gave us an example of
the living caste system. His wife is
from the Brahmin class (the priestly class, the highest-status); he is from the
Kshatriya class (the ruling and military elite), the second-highest caste. He has been married for something like 15
years and has two daughters; his father-in-law has never spoken to him because
his daughter married below her caste.
His mother-in-law does speak to him, and his father-in-law maintains communication
with his daughter and with his two granddaughters, but will not speak to his
son-in-law. I wise-cracked to Harsh that
not having your father-in-law speak to you wasn't necessarily all bad; he
understood the humor.
Because one of the foci of the trip
was the religions of India, we visited a number of temples. There are some huge ones that give the large
European cathedrals a run for their money in terms of being breath-taking,
although they are very different from cathedrals. For example, in Amritsar, both the big Hindu
temple and the Sikh Golden Temple cover a large amount of ground with
highly-sculpted walls enclosing a considerable amount of land and buildings: no single building as large as a cathedral
but the establishment as a whole of equivalent effect. Again it felt a little odd to be traipsing
through these places while there were religious ceremonies taking place (which
they do pretty much around the clock).
We always had to be barefoot (or wear socks), and in the case of the
Golden Temple, we had to do a lot of walking to get around the place—on an
overcast, windy, damp, cool day, so our feet were freezing most of the
walk. (We had a lot of shoe removal
sites during this trip, which for the most part was not a problem, but we did
sometimes wonder what kind of bugs we were picking up on our feet. Nothing happened, however, so our feet must
have protected themselves.)
At one point later in the trip, in a
post-prandial chat, a couple of the people on the tour said they were going to
throw away the shoes they'd been wearing in India (because of the places they'd
been and the surfaces and substances they'd walked on and the germs the shoes
probably picked up). I was surprised; I'd
never thought about that. I later talked
with one of the three physicians on the tour; he said that he'd never thought
about doing so, either, and that if one were really concerned, one need only
wipe down the bottoms of the shoes with alcohol. I suggested that leaving them outside in
Minnesota on some -10 degree night might also work; he agreed that would be a
solution as well. But that sentiment
about discarding shoes gives you an idea of what some of our walking was like.
I learned something about food that
probably most of you know but I didn't:
curry is not a single spice. I'd
always assumed—without giving it any thought—that it came from some plant. Nope. "Curry"
is a mixture of spices, with the content and proportions varied with the food
being prepared. We figured out early on
that our menus—almost always a buffet—were considerably toned down from what
Indians normally eat. In one hotel the
dining room was split in two; we wandered into one at meal time and were
directed to the other: the first one had
only Indians eating; the other was for our tour group. The normal Indian food was too spicy for most
of us. On occasion Harsh would have a
separate meal: real Indian food, instead
of the Scandinavian version. (It was all
Indian food, just less spicy for us.)
The second night after we got home we went to a local restaurant and had
a cheeseburger—even though we don't eat that much red meat at home, we had had
none while in India, as well as no milk or cheese. That cheeseburger tasted really good.
We drank a fair amount of beer with
meals; it was provided at many places and in most hotels. There are Indian beers; the one we were
always given was Kingfisher. It is the
Bud of India: bland to the point of
being nearly tasteless. I bet somewhere
there are local breweries that produce much better-tasting beer.
In learning about the various
religious beliefs of Indians, as a secular humanist I was reminded of a phrase,
altered slightly for my own purposes, of Justice Sutherland in Adkins v. Children's Hospital (1923), in
response to social science data presented to the Court: interesting but not persuasive. (Sutherland's exact words were "interesting,
but only mildly persuasive.")
Something I had thought about earlier with respect to Christianity
occurred to me again on this trip. Just
as I suspect the imagined itinerant Jewish preacher of 2000 years ago would be
astounded at the architecture (think St. Peters in Rome, Westminster Abbey, or
even Mount Olivet in Minneapolis) and ceremonies attached to his name, so also
I suspect Siddhartha Gautama and Mohammed would be startled at the buildings
and decoration erected in their names.
So I found the travels and discussions instructive, even fascinating,
but nothing led me to think I should abandon my current beliefs.
I was surprised to learn—I guess I
knew this but I'd completely forgotten it—that astrology is a big business in
India, at least among the circles who can afford to consult an astrologer. And the charts or reports aren't the ones you
see from "Omar the Astrologer" or whoever in the local newspaper;
these are serious reports requiring, ab initio, date, location, and time of
birth. I find it puzzling that
intelligent and well-educated people in India would give any credence to the
proposition that the position and movement of stars millions of light years
from the Earth affect their lives. "Among
other issues, there is no proposed mechanism of action by which the positions
and motions of stars and planets could affect people and events on Earth that
does not contradict well understood basic aspects of biology and physics." Nor have scientific tests borne out any of the
premises or predictions. There are
certain elements of magic to most religions, but those typically are claims not
subject to scientific analysis. In the
case of astrology, however, the claims have been tested and found wanting (to
say the least).
One of the most famous and fabulous
pieces of architecture combined with art (in the world) isn't a religious structure,
it's a mausoleum: the Taj Mahal, built
to house the remains of the third wife of one of the Moghul shahs. Like many places in the world, both natural
and human-built, pictures don't do it justice.
It is astonishing. We heard while
on the grounds that it has been estimated it would cost $70 billion to
replicate the Taj Mahal. I believe it,
given the enormous amount of inlaid marble (with onyx, jasper, jade, lapis,
sapphire, and so on) and the sheer size of the buildings.
Despite my grumbling about aspects
of our venture, I can say that we had a wonderful trip. Some of the architecture (and art associated
with it, primarily sculpture but also painting and inlaid gems) was absolutely
stunning. We were educated about the
differences between the local Hindu architecture and the later Moghul (Islamic)
architecture and the interplay between them.
We were extremely impressed with the accomplishments of a civilization
that is centuries older than virtually all in the west (except perhaps ancient
Egypt and its predecessors). I won't
recite a list of all the places we visited and the things we saw—pictures can
do a better job of that—but I can say we were impressed and informed. But a few highlights.
One of the many religious sites we
visited was the Bahá'í (Lotus) Temple in Delhi.
It bears a striking resemblance to the Sydney Opera House, but there's
no architectural connection between the two, as far as I could tell. The Bahá'í Temple was austere on the inside,
and even Kathy, who's not fond of overly-decorated interiors, thought it cold
and sterile. The woman who met us
outside the temple gave us a little spiel about the Bahá'í faith; in essence,
it seeks a blending of all the world's religions (according to her—I have not
gone and read independently about Bahá'í).
Both I and another guy in the tour group took exception (by question,
not accusation) to the possibility, pointing out that there are fundamentally
irreconcilable differences between many of the world's major religions. One simply cannot be a Hindu and a Christian,
for example. But apparently there are
people who believe that blending or merger can be accomplished. I suspect that devout Muslims, among others,
would find the creed objectionable.
Those who are scholars of religion, I am told, dismiss the notion that
there is an underlying unifying theme or core to all the world's
religions. That conclusion accords with
my limited personal exposure. So I'm
inclined to think the Bahá'í faith, as it was described to us, is implausible
in its premises.
I was struck again, as I have been
in other places in the world, by the discrepancy between (1) the opulence and
expense of palaces, forts, and religious buildings and (2) the squalid and
unhealthy living conditions of huge numbers of the population in which these
edifices are built. I guess it has
always been that way, starting with the Pyramids in Egypt and perhaps even
earlier and running through to the time of the construction of the enormous
cathedrals across much of Europe. We
seem to be moving in that same direction in the U.S. today, with a vast gap in
income and assets between the majority and the small minority on top; in our
case today, however, instead of religious monuments, we have banks and
financial institutions.
One of the more interesting
religious sites we visited was the Golden Temple of the Sikhs in Amritsar—a
city founded by the Sikhs. Sikhism, a
monotheistic religion, was founded in the early 1500s; while I'm not going to
try to explain it, I can say that it was originally led by 10 successive gurus
who wrote the precepts of the faith.
The last of the ten designated that text, the Guru Granth Sahib, the Sikh "bible" so to speak, as the
final guru. For Sikhs, the realization
of truth is more important than anything else and their teachings emphasize the
principle of equality of all humans and reject discrimination on the basis of
caste, creed, and gender. The Guru Granth Sahib has 1430 pages and is
a collection of hymns describing the qualities of God and the necessity
for meditation on God's name; the text is housed in the Golden
Temple, which is the holiest of all sites for Sikhs. The Golden Temple, completed in 1604, is
housed inside an impressive (white marble) walled complex. We visited on a cold, blustery, overcast day,
and since one must remove footwear to be in the complex, we all took off shoes
and socks (because it had rained recently, there was some water on the marble
floors, so most of us took off socks as well as shoes). As a result, we all had freezing feet for the
duration of the tour. At one point we
entered one of the other buildings in the complex and were given small
bracelets of colored string (for good luck) and the opportunity to pour holy
water on something. I declined to
participate because I think it hypocritical to participate in ceremonies to
which I give no credence.
The last two cities we visited before
returning to Delhi to depart for home were Varanasi and Bodhgaya. The former is the holiest city in Hinduism;
it is located on the banks of the Ganges (Ganga in India) and is one of the
oldest continually-inhabited cities in the world, since about the 11th
or 12th Century BC. Many
Hindus come to Varanasi to die in order that their ashes can be deposited in
the river, thus freeing them from the cycle of reincarnation. We took a boat ride on the Ganges one evening
and saw, from out on the water, the ceremonies closing the day; there was much
music, many hundreds of people on shore around the celebrations, and various
lights and torches. There were also many
boats; we were in a boat parking lot and people were walking from boat to
boat. From everything I'd read about how
polluted the Ganges is, I expected to see trash floating on the surface and
that it would smell. There wasn't any
trash nor did it smell. I don't know if
the lack of smell would be true in the summer as well as the winter.
Bodhgaya is the holiest city of
Buddhism (even though India has comparatively few Buddhists now—the population
is about 80% Hindu and 15% Muslim, with the remaining 5% consisting of
Buddhists, Christians, Sikhs, Jains, and others). It is the place where Siddhartha Gautama
meditated for six weeks, achieved enlightenment, and became Buddha. (He is now referred to as "Lord Buddha,"
but from what we learned while there, he would have repudiated the title.) There is a very large temple, Mahabodhi
Temple, that marks the location of his walks and meditation, surrounded by many
little plazas, I guess, where the Buddhist faithful and monks can pray. There is also the Bodhi tree, an enormous
tree (about the 5th or 6th one) that grew from a sapling
of the tree that was there when Buddha was alive (his birth and death dates are
unclear, but a number of historians suggest about 563 BC to about 483 BC,
although those dates remain contested).
The grounds are attractive and bedecked with flowers; it was a pleasant
place to spend time.
Someone in our tour group observed
that there are any number of Christians who are also engaged with Buddhism. Two of the guys in our group were Catholic
priests, so I asked one of them what he thought about that: can one simultaneously be a Christian and a
Buddhist? His view was that while there
are many practices in Buddhism that do not conflict with Christianity, when one
gets down to core values, one cannot be both, because there are irreconcilable
differences between them. As my friend
Rolf also observed, there are any number of scholars who question whether
Buddhism even qualifies as a "religion." The point is well taken, but on the other
hand, with Siddhartha Gautama now called "Lord Buddha," with statues
of Buddha everywhere, and with the praying that Buddhists do, while it may not
have been a religion as Buddha first outlined it in his sermons, it seems to me
to have evolved to become one over the past couple of millennia.
Bodhgaya was one of the more fun
places we visited in that we could get out of our hotel and walk around by
ourselves rather than as part of the whole tour group. So we walked along the street several times,
looking at what the dozens of vendors in their small stalls had to offer. Although in general Kathy didn't find any
particular bargains on the beads and gems she uses in making jewelry, nor was
there all that much that she couldn't get at the annual gem show in the Twin
Cities, she did find a number of very good deals and unique items in
Bodhgaya. I had to go to the ATM three
times within about two hours to get more money (we were trying, at that point,
to minimize the number of rupees we held because they are not convertible in
the U.S. to dollars—I have no idea why—inasmuch as Bodhgaya was the last stop
before returning to Delhi and then home).
I was worried that at some point the bank was going to refuse my
withdrawals because they were so frequent, suspecting fraud. But it didn't and I got the money for Kathy's
purchases. The one drawback to walking
around as we did is that we were also right in the middle of the dust and air
pollution, and both our eyes and throats reacted (not so severely that we
couldn't enjoy ourselves anyway, but enough that we certainly noticed).
Many on the trip, by the 13th
or 14th day, began to suffer from various ailments of the sinus or
intestinal tract. A number of people
thought their allergies were finally getting to them, after the prolonged
exposure to badly polluted air. When we
first arrived (at 1:00 a.m. local time in Delhi), Kathy commented that as soon
as we walked out of the airport terminal she could taste the air in her
mouth. (After we had returned, there
were news reports that Delhi is now the most polluted city in the world, worse
than Beijing.) Both Kathy and I got
colds, at different times during the trip, but Harsh had pills for us that
mitigated the worst effects, so they were more annoying than debilitating. For me, the greatest effect of the air
quality was in Varanasi and Bodhgaya, as I noted. I think that's where many of us may have felt
the impact of the air pollution the most.
Except for the fact that Kathy wants
very much to visit Thailand, I would say that our trip to India is my last trip
to Asia. After Japan, South Korea, and India, I don't think I need to
visit that part of the world any more. Part of it may just be that I'm
getting old and cranky about facing such enormous cultural, culinary, and
cleanliness differences. The Scandinavian-German tidiness of my
upbringing, along with their basic meat-and-potatoes diets, makes me resistant
to adapting to such dramatically different facets of life in Asian cultures,
especially when the dietary change can have such dramatic and unpleasant
effects. This is a non-judgmental chauvinism: I'm not saying Asian food or culture or
anything else is bad, although I'm willing to condemn the garbage everywhere,
only that it's so different that further exposure for me would be redundant.
The most remarkable and irritating
coincidence of the entire trip occurred when we had returned home. When
in India, visiting Americans drink only bottled water, and they use it for
brushing their teeth and anything other than washing. (In my case, I also
used it for rinsing my contact lenses, which was a trick.) The hotels all
provide 2 large bottles of water every day; all the restaurants provide it at
the tables. One eats no food that might have been rinsed in tap water
(e.g., lettuce). We were so looking forward to getting home and being
able to use, and drink, the tap water. We got home on a Wednesday
afternoon about 2:00. The first thing we both did was take a shower and
empty all the laundry out to be washed. Once showers were done, I was
going to start the laundry. When I did so, the water coming into the
washing machine was tinted brown and the water pressure was low. I
thought maybe it was just a rust buildup of some kind since we'd been gone so
long (although Krystin had stayed in the house during our absence). But
it was also brown coming out of the kitchen faucet and the bathroom.
Kathy called the city; it turns out that within an hour or so after we got
home, a water main a few houses down the street from us had burst, turning the
water brown and reducing the pressure. So we didn't have clean tap water
after all (until the next morning).
I learned something mundane as a byproduct
of our trip. While we were in Varanasi and Bodhgaya, we noticed that the
moon was nearly full—and getting fuller each day. I have never known the
phases of the moon—but Elliott did, I knew, from his astronomy class. So
I texted back and forth with him and finally learned them. When the moon
is more than half full, it's gibbous; when it's less than half full, it's
crescent. A gibbous or crescent moon is waxing or waning. So we saw
a waxing gibbous moon in India. What I really didn't know before is that
the light on the moon always starts on the right and increases to the point
when there is a full moon—after which the darkness starts from the right and
the moon eventually becomes new (when it is completely dark). I
had suggested to Elliott that one couldn't know if one was looking at a waxing
or waning crescent or gibbous moon; he said that was not right because you know
the light and dark both always start on the right. So there's the
astronomy lesson for the year (which most of you probably already had learned
long ago.)
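Elliott's rule can even be written down as a tiny function. This is just a sketch of the naming scheme as I understood it (the function and its names are my own invention, and it assumes the Northern Hemisphere view he and I were describing):

```python
def moon_phase(illuminated_fraction, lit_side):
    """Name the moon's phase from how much of the disk is lit and
    which side the light is on (Northern Hemisphere view).

    The rule: light grows from the right toward full, and after full
    the darkness also advances from the right—so a moon lit on the
    right is waxing, and one lit on the left is waning.
    """
    if illuminated_fraction == 0:
        return "new"
    if illuminated_fraction == 1:
        return "full"
    # More than half lit is "gibbous"; less than half is "crescent".
    shape = "gibbous" if illuminated_fraction > 0.5 else "crescent"
    trend = "waxing" if lit_side == "right" else "waning"
    return f"{trend} {shape}"

print(moon_phase(0.8, "right"))  # waxing gibbous — what we saw in India
print(moon_phase(0.3, "left"))   # waning crescent
```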
I suggested to Elliott that there
probably wasn't much reason for him to visit India. He wouldn't like the
food and all the religious sites would not be of interest. He said he
never really wanted to visit India, "and particularly now that I know it's
similar to Africa in the sense that you need a million preemptive medical
procedures." Neither Kathy
nor I had India high on our bucket list, either. But the trip was
reasonably priced, so when Rolf & Roberta asked us to join them, we said "sure."
What I continue to mull over is the opportunity cost: what trip won't we
take because we took this one (given finite resources, limits on vacation time,
and limited lifetime)? Maybe none—we'll still get to most places we want
to go, life and health permitting.
While emailing with Elliott from India, he
wrote at one point: "Now pause for a moment, step back, and realize
how strange it is that we can communicate instantaneously despite being almost
literally on opposite sides of the planet with an 11.5 hour time difference. Even having spent the majority of my life in
the 21st Century, that still sort of boggles the mind." I agreed with him and had been thinking the
exact same thing while we'd been exchanging messages. Even very early in his lifetime (the early
1990s), transoceanic communication was either by mail or bad and very expensive
telephone calls. Today it is virtually free (setting aside the billions
of dollars spent on infrastructure and equipment).
I emailed also with a faculty friend at
another Big Ten university, a guy who had retired from being a professor of
Chinese literature. After reading my
notes on the trip, he wrote back to say that "one of the great joys of
retirement has been retirement from visiting China. I basically have the same responses that you
do to uncleanliness, disrepair, and food that is always threatening to segue
into a desperate scramble." I did
have that problem in India; I was flattened for 36 hours by an intestinal
challenge that required I never be far from a toilet. Fortunately, Kathy had a z-pac from her
doctor that she gave to me, which solved the problem fairly quickly.
I opined to a faculty friend,
reflecting on the increasing income and asset inequality that shows up in these
magnificent religious monuments in India (as it does in such monuments and
buildings elsewhere around the world), that we will soon be in the same state
as pre-revolutionary France, with the peasants desperate and the nobility and
king decadent. She wrote back to say
that I think like she does. "I am
amazed that the populace have yet to grab their pitchforks and head for
Washington and Wall Street. But I suppose they are too engrossed in their
iPhones, iPads, and other tech devices to notice what is happening."
If you want to inspire confidence, give plenty of statistics – it does not matter that they should be
accurate, or even intelligible, so long as there is enough of them.
Various "laws," both amusing and
serious, that I've stumbled across in recent times. The first three have been around for quite
a while. These are a mixture of quotes and my paraphrases.
Clarke's three laws:
First law: When a distinguished but elderly scientist states that something is possible, he is almost certainly right. When he states that something is impossible, he is very probably wrong.
Second law: The only way of discovering the limits of the possible is to venture a little way past them into the impossible.
Third law: Any sufficiently advanced technology is indistinguishable from magic.
The first law has been proven accurate
repeatedly since it was first enunciated in 1962. The third will
certainly come into play if we ever encounter aliens.
Shermer's last law (2002) – A corollary of
Clarke's three laws, "any sufficiently advanced alien intelligence is
indistinguishable from God."
Cunningham's law: The best way to get the right answer on the
Internet is not to ask a question, it's to post the wrong answer.
Hanlon's razor, a play on Occam's razor,
normally taking the form "never attribute to malice that which can be
adequately explained by stupidity." Alternatively, "do not
invoke conspiracy as explanation when ignorance and incompetence will suffice,
as conspiracy implies intelligence."
Finagle's Law of Dynamic Negatives (also
known as Finagle's corollary to Murphy's law) is usually rendered "Anything
that can go wrong, will—at the worst possible moment."
Poe's law (religious fundamentalism): Without a winking smiley or other blatant display
of humor, it is impossible to create a parody of fundamentalism that someone
won't mistake for the real thing.
Littlewood's law states that an individual
can expect to experience "miracles" at the rate of about one per
month. The law was framed by Cambridge University Professor John Edensor
Littlewood, who defines a miracle as an exceptional event of special
significance occurring at a frequency of one in a million. He assumes that during
the hours in which a human is awake and alert, a human will see or hear one "event"
per second, which may be either exceptional or unexceptional.
Additionally, Littlewood supposes that a human is alert for about eight hours
per day. As a result, under these suppositions, a human will have experienced
about one million events in 35 days. Accepting this definition of a
miracle, one can expect to observe one miraculous event for every 35 days'
time, on average—and therefore, according to this reasoning, seemingly
miraculous events are actually commonplace.
The last one is an interesting case where,
if people understood statistics, they would not be so surprised at events that
seem "miraculous."
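Littlewood's arithmetic is easy to check for oneself. The sketch below is my own back-of-the-envelope calculation using the assumptions stated above (one "event" per second, eight alert hours per day, a "miracle" defined as a one-in-a-million event):

```python
# Littlewood's law: checking the back-of-the-envelope arithmetic.
events_per_second = 1        # one "event" noticed per waking, alert second
alert_hours_per_day = 8      # hours per day a person is alert
seconds_per_hour = 3600

# Events experienced per day while alert.
events_per_day = events_per_second * alert_hours_per_day * seconds_per_hour

# A "miracle" is defined as a one-in-a-million event.
miracle_threshold = 1_000_000

# Days needed to accumulate a million events.
days_per_miracle = miracle_threshold / events_per_day

print(events_per_day)              # 28800
print(round(days_per_miracle, 1))  # 34.7
```

About 35 days per million events, hence roughly one "miracle" a month.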
Lewis's law: the comments on any article about feminism
justify feminism. (This certainly conforms to my experience with reader
comments on articles.)
Shirky principle – Institutions will try
to preserve the problem to which they are the solution. (Fortunately, in
the case of universities, people always need to learn more, so they don't have
to preserve a problem!)
Sturgeon's law is commonly cited as "ninety
percent of everything is crap." It is derived from quotations by
Theodore Sturgeon, an American science fiction author and critic. . . . The phrase was derived from Sturgeon's
observation that while science fiction was often derided for its low quality by
critics, it could be noted that the majority of examples of works in other
fields could equally be seen to be of low quality and that science fiction was
thus no different in that regard from other art forms.
On a more serious note, and one that bears
on my musings about what I've done in life, here are the three laws of behavior
genetics:
First Law:
All human behavioral traits are heritable.
Second Law: The effect of being raised in the same family
is smaller than the effect of the genes.
Third Law:
A substantial portion of the variation in complex human behavioral
traits is not accounted for by the effects of genes or families.
There
seems not to be significant dissent among professionals in psychology and
genetics about the validity of the three "laws." Steven Pinker,
the psychologist and cognitive scientist, summarizes the results of the
research this way: "genes 40-50%, shared environment 0-10%, unique
environment 50%." ("Shared environment" refers primarily
to the decades of research on siblings and especially identical twins.)
It's too late to correct it: when you've once said a thing, that
fixes it, and you must take the consequences.
Another "law," which receives
its own paragraphs because I was a political science student, is Duverger's
law, named after Maurice Duverger, a French sociologist. The law is the very strong tendency of
winner-take-all single-member district electoral systems to encourage two-party
systems (although it is not inevitable; variations are usually due to
oddities in the laws or geography of voting).
Also referred to as "first past the post," it's the U.S.
electoral system, where voters have a single vote for one candidate in a
district where there is one seat available (e.g., U.S. Senate seat, seat in the
legislature, etc.) and the winner is elected by plurality vote. What happens is that even when a third party
receives a significant number of votes, if its candidates are always in
third-place (with the other two parties alternately winning and losing), that
third party will never have any representatives in government. Except for true believers (e.g., communist
voters, socialists, Green parties, libertarians), voters tend to abandon such
parties after a time. The weaker
parties, in other words, are eliminated from serious contention, as voters
decide to choose among candidates deemed more likely to be able to win.
With only one exception, that is what has
happened to the Independence Party in Minnesota. Jesse Ventura won a gubernatorial election,
but since then it has not had a winning candidate for any
position in government. It has,
predictably, faded to minor party status.
A
third party can only succeed if one of the two major parties begins to fail and
is replaced. The Republicans (at that
time the progressive party—how things change) replaced the Whigs, who didn't
take a strong stand on slavery, so southerners went for Democrats, who were
pro-slavery, and northern liberals went for Republicans, who were strongly
anti-slavery.
Conversely,
when the electoral rules provide for proportional representation, there is a
tendency to a multi-party system: a
party that receives 20% of the vote will receive approximately 20% of the seats
in the legislative body. In a "first
past the post" system, it gets none.
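The contrast can be made concrete with a toy calculation of my own (the party names, vote shares, and the simplifying assumption that every district mirrors the national vote are invented for illustration):

```python
# Toy comparison: the same vote shares under proportional representation
# versus "first past the post" single-member plurality districts.
vote_share = {"A": 0.45, "B": 0.35, "C": 0.20}
total_seats = 100

# Proportional representation: seats roughly track vote share.
pr_seats = {party: round(share * total_seats)
            for party, share in vote_share.items()}

# First past the post: assume each district mirrors the national shares,
# so the plurality party wins every district and takes every seat.
plurality_winner = max(vote_share, key=vote_share.get)
fptp_seats = {party: (total_seats if party == plurality_winner else 0)
              for party in vote_share}

print(pr_seats)    # {'A': 45, 'B': 35, 'C': 20}
print(fptp_seats)  # {'A': 100, 'B': 0, 'C': 0}
```

Under PR the 20% party gets about 20 seats; under first past the post, none.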
I
asked my political science colleague and friend Phil Shively about Duverger's
law. He tells me that "yes,
Duverger's 'law' is an old, well-established principle. It's probably as
close to a law as we have in political science. It's my main explanation
for why such a very diverse society as we have in America shoe-horns all of our
varying interests into just two parties. (Not only does a plurality
electoral system tend to two parties, but an elected presidency—in effect a
plurality system with just a single huge district—is Duverger on steroids.)"
The
moral of the story is that if you want your vote to have an effect on the
outcome of the race between the two major-party candidates, you don't throw it
away on a minor party.
(Actually,
there's a big caveat to be observed:
voting for a third-party candidate who siphons off enough votes from
major-party candidate A will lead to the election of major-party candidate B,
even though candidate B did not receive a majority of the votes cast. That is precisely what happened in Minnesota,
when the Independence Party, which received much of its
support from good-government lefties who usually vote Democratic, drew enough
votes from the Democratic candidate in 2002 and 2006 that the GOP candidate Tim Pawlenty won
twice, even though he never received more than about 40% of the votes
cast. In those two elections, it is
likely that a large number of Independence Party voters ended up with a
governor who was their LEAST preferred candidate. Dopes.)
As
I watch posts on Facebook and read news pieces here and there, I see—a full
year before the election!—Bernie Sanders supporters saying they won't vote for
Hillary Clinton if she wins the nomination.
There are two significant things wrong with that. First, if she wins primaries and caucuses,
the voters of the Democratic Party have selected her. Second, the Sanders supporters who abstain (I
assume virtually none of them would vote for any Republican nominee) increase
the likelihood that the Republican will win by depriving the Democratic
candidate of their votes. Dopes.
She felt a little nervous about this; 'for it
might end, you know,' said Alice to herself, 'in my going out altogether, like
a candle. I wonder what I should be like then?' And she tried to fancy what the
flame of a candle looks like after the candle is blown out, for she could not
remember ever having seen such a thing.
2015 is a big year for
anniversaries. Four of them caught my
attention. Just as 2014 was a significant
centennial year, for World War I, 2015 is equally or more important as (1) an
octocentenary, (2) two bicentenaries, and (3) a sesquicentenary. A couple of the events were important for
Anglo-American history and development, one was important for the world as a
whole, and one is simply a noteworthy literary anniversary.
First, it was the 800th anniversary of the
Magna Carta in June. Any number of observers during the year commented on
how strange it is that a treaty between feudal opponents in 13th Century
England is both celebrated and cited eight centuries later. It is
commemorated on one of the doors of the U.S. Supreme Court with a bronze panel.
It was one settlement in a long fight between the king, John (generally
considered to be one of the worst monarchs who ever sat on the English throne)
and his barons. I won't recite the full history of events; you can look
them up on Wikipedia or a thousand other places if you're interested.
But one must know a little bit about the
events. The king had squandered most of his resources—money and
political—trying to retain control of parts of France (which English kings
had ruled in part for a very long time, ultimately losing all of it). John
had also angered the pope by refusing to accept the papal nominee for
Archbishop of Canterbury and kicking the monks out of Canterbury. The
pope responded by excommunicating him—which, as anyone who knows about medieval
history knows, was a big deal, because it cut the person, and in this case the
nation, off from intercourse with all other Christian nations. He made nice
with the pope by humiliating himself with the Vatican, but he'd also gotten on
the bad side of the barons because he'd been fleecing them. As a consequence, they were rebelling and
putting his throne at risk. So John had
to reach a settlement with them; the result was the Magna Carta.
It wasn't novel—earlier kings had issued
charters—nor was it a success. In fact,
a month later, John appealed to his new friend the pope to annul the charter
because he didn't like it. Moreover,
most of it applied to its time and place:
to stop John from swindling them, the barons insisted on provisions
governing practices irrelevant to modern society. But there were notions in the document,
perhaps due to the work of the new Archbishop of Canterbury, Stephen Langton, that
set out broader considerations of how to organize (in this case a medieval,
monarchical) society.
Ultimately, however, it was an agreement
between elites. It didn't say much about
liberty.
Perhaps the reason the charter is
celebrated on the Supreme Court door is for the language in one of the sections,
chapter 39: "no free man shall be
seized or imprisoned, or stripped of his rights or possessions, or outlawed or
exiled, or deprived of his standing in any other way, nor will we proceed with
force against him, or send others to do so, except by the lawful judgment of
his equals or by the law of the land." But it was unclear what that meant—there was
no law of the land at the time. It wasn't
novel language; it had appeared in continental documents in the previous two
centuries. It disappeared from
continental law and documents but it stuck in Anglo-Saxon law.
What it did do is proclaim the rule of
law. The idea is that the charter
shapes how we perceive ourselves as freedom-loving, English-speaking people who've
had liberty under the Magna Carta for 800 years. As one British faculty member observed, "it's
a load of tripe, of course, but it's a very useful myth."
One reason it survived, historians joke,
may be that King John got sick and died shortly after backing out of the
deal. The rebellious barons had invited
the French king to England to take the throne; John went after the barons'
lands but died en route. His son was
young, so a regent governed—William Marshal sent the French packing and revived
the charter. At which point the barons
had no cause to fight, so stopped. When
the young king (Henry III) came to the throne, he re-issued the charter once
again, as did his successors.
The other reason may have been that it was
revived by Sir Edward Coke in the 1600s because it was politically helpful for
those who wished to topple the Stuart monarchy.
(Which they did, beheading Charles I in 1649.) The parliamentary opponents of the Stuarts
didn't want to seem to be revolutionaries, so they made up the idea of the "Norman
Yoke," the result of William the Conqueror's (and his Normans') victory in
1066. They created the myth that there had
been a "Saxon Eden" before the Normans that the barons had been
trying to revive with the Magna Carta—and that the Stuart parliamentary
opponents were also trying to revive.
The Magna Carta was the "charter of our liberties, . . . a
renovation and restitution of the ancient laws of this kingdom." (The story was a fabrication.)
Coke was the chief lawyer for the
parliamentarians. He argued that the
language of chapter 39 established limits on the monarch's power. After the revolution, and then the
restoration of the monarchy in 1660, the English weren't that interested in the
Magna Carta. Coke was also one of the
great jurists of the era; he wrote Institutes
of the Lawes of England and a number of Reports
that in many ways serve as the foundation of modern English and American
jurisprudence.
The English were no longer interested in
the Magna Carta, but the colonists were.
They were established just about the time that Coke was making the
arguments for the meaning of chapter 39.
Coke wrote the Virginia charter; William Penn published the charter in
America. Many of the founding fathers
were lawyers—and they read Coke. The
colonists used the language of chapter 39 in their protests against the alleged
depredations of the crown against the colonies.
Its spirit, if not its exact language, is apparent in the Bill of
Rights.
American courts still refer to the Magna
Carta. It was cited as a basis for not delaying the sexual-harassment
case against President Clinton; Chief Justice Roberts has claimed that American
constitutional law has given meaning to the language of the charter. He also cited the language of the charter in
a case this year.
So even though it failed when signed, it
still does matter because of how it was used in the further development of the
law.
The American Bar Association was
responsible for a memorial building in England on (or near) the site of the
signing at Runnymede, erected in 1957. The
Brits didn't care enough about it to build one themselves, although they did
have festive occasions on the octocentenary, including a visit to the site by
the Queen.
'You may look in front of you, and on both
sides, if you like,' said the sheep: 'but you can't look ALL round you - unless
you've got eyes at the back of your head.'
The second anniversary of note, which you
might not expect, is the eruption of Mount Tambora in Indonesia, on April
10-11, 1815. It was the most powerful
eruption in the last 500 years and spread ash over several hundred thousand
square miles. The blast was heard over
1200 miles away. (Imagine me in
Minneapolis hearing something taking place in New York City.) Somewhere between 60,000 and 120,000 people
were killed.
By comparison, the eruption of Mt.
Pinatubo in 1991 was about one-sixth the size in terms of volume of material
and about one-third the size in terms of sulphur.
But it had effects far beyond the date and
site. 1816 was the "year without a
summer" because of the effects that Tambora's explosion had on the
atmosphere (some of the sulphur dioxide ash turned to sulphate ions, which hung
in the atmosphere for months, and which reflect back more sunlight than the
atmosphere normally does, so cooled the earth).
There were effects both tragic and
amusing. The weather was so lousy in
Europe that besides small crops and soaring food prices, four people were
confined to the house they were staying in for the summer near Geneva. Forced to amuse themselves, they decided to
write stories. One of them was by Mary
Shelley, Frankenstein, and the other
by John William Polidori, The Vampyre,
which Bram Stoker later elaborated on for Dracula. So one byproduct of Tambora was two modern
horror genres. More seriously, there was
widespread starvation in parts of the world because of crop failures due to
cool weather.
There is speculation about other effects
of Tambora. For example, the increased
grain prices caused by hunger in Europe may have driven farmers across the
Appalachians to better weather and bigger crops that were then shipped to
Europe. It's also possible that the
summer frosts that resulted from the eruption drove farmer Joseph Smith from
Vermont to New York, where his son later found the golden tablets that serve as
the foundation of the Mormon Church.
Historians also wonder about the effects of the weather on European
politics following the Napoleonic wars.
These are tiny examples about the
potential effects of global climate change.
And that was just one volcano.
This anniversary coincides with some
speculation in the last year or so that the meteor impact that most likely
led to or contributed to the end of the dinosaurs (that
is, the non-avian dinosaurs; modern birds are descended from flying dinosaurs)
may have been preceded by massive volcanic eruptions in what is now
India lasting 200,000 years or more, eruptions that were changing the climate. Those climate changes were creating an
atmosphere that caused some extinctions, some argue, and the meteor hit may
have been the final straw. Or, as most
paleontologists maintain, it was primarily the meteor impact. In either case, about 75% of all living
creatures were wiped out. What must have
been the sudden impact of the meteor makes me wonder if the dinosaurs weren't
gone very quickly, almost in the blink of an eye, after the meteor impact.
Mid-November 2015, a report from Leeds
University in England concluded that "the role volcanic activity played in
mass extinction events in Earth's early history is likely to have been much
less severe than previously thought."
So more likely meteor.
If
everybody minded their own business, the world would go around a great deal
faster than it does.
The third
anniversary, also from 1815, is the 200th anniversary of Battle of Waterloo,
June 18, 1815, when British and Prussian forces defeated Napoleon for the
second time.
For
those of you who know your European history, you will recall that after a dozen
years of invading countries across Europe, Napoleon was defeated in 1814 by an
alliance of his enemies (Britain, Prussia, Russia, and Austria), imprisoned on
the Isle of Elba, escaped from Elba in 1815, and returned to France and became
its leader again, to the adulation of much of the French population. The restored Bourbon monarchy had reverted to
its autocratic ways, interrupted by the French Revolution in 1789, and fled
Napoleon's return without a whimper. The
allies, meeting in Vienna to attempt to restore Europe to its pre-Napoleonic order
and political boundaries, were determined to defeat Napoleon again, once and
for all time. The coup de
grâce for Napoleon came at the Battle of Waterloo, where on June 18, 1815,
forces led by the Duke of Wellington decisively defeated Napoleon's forces,
after which he fled and was forced into exile on the island of St. Helena, a
couple of thousand miles east of Rio de Janeiro in the middle of the South
Atlantic, far away from Europe's shores.
British
historian and journalist Andrew Roberts, writing in the Smithsonian, made an argument that I find intriguing. He pointed out that Napoleon, back in power
in France, had written to the allies that he was no longer interested in
conquest, and that "'from now on it will be more
pleasant to know no other rivalry than that of the benefits of peace, of no
other struggle than the holy conflict of the happiness of peoples.'" The allies, however, would have none of
it. Roberts reports that "the
foremost motive that the British, Austrians, Prussians, Russians and lesser
powers publicly gave for declaring war was that Napoleon couldn't be trusted to
keep the peace." Roberts dismisses
that concern because Napoleon had renounced his interest in empire, and he
recites a number of Napoleon's actions on returning to France.
He [Napoleon] refrained from taking measures against anyone who
had betrayed him the previous year. . . . He immediately set about instituting
a new liberal constitution incorporating trial by jury, freedom of speech and a
bicameral legislature that curtailed some of his own powers; it was written by
the former opposition politician Benjamin Constant, whom he had once sent into
internal exile. . . . Napoleon well knew
that after 23 years of almost constant war, the French people wanted no more of
it. His greatest hope was for a peaceful period like his days as first consul,
in which he could re-establish the legitimacy of his dynasty, return the nation's
battered economy to strength and restore the civil order the Bourbons had
disturbed.
I suspect that had I
been a member of the government in Britain or somewhere on the continent, I
would have been extremely nervous about Napoleon's return to power.
Roberts points out, however, that not only did they not
trust Napoleon, they had less virtuous motives for crushing Napoleon.
The autocratic rulers of Russia, Prussia and Austria wanted to
crush the revolutionary ideas for which Napoleon stood, including meritocracy,
equality before the law, anti-feudalism and religious toleration. Essentially, they wanted to turn the clock
back to a time when Europe was safe for aristocracy. At this they succeeded—until the outbreak of
the Great War a century later. [Britain
was already on the road to a broader democracy but had other reasons to want to
see Napoleon gone, including distrust of his ambition, Roberts says.]
This is a fascinating point on which one could
speculate. If Napoleon had been left
alone in France—and if Roberts is correct that Napoleon would have left Europe
alone—the actions Napoleon was taking would have moved the implementation of
what we think of as essential elements of liberal government ahead far faster
than what occurred in 19th Century Europe (largely, as Roberts
suggests, because of the settlements reached at the Congress of Vienna in
1815). The autocratic regimes in eastern
Europe would have remained in place but there would have been a model that
might have advanced western Europe farther along the path toward constitutional
government—a model for the people across Europe that the autocracies of the
east did not want to exist. As it was,
the Bourbons, Romanovs, Hohenzollerns, and Hapsburgs remained on their thrones
for decades longer. (Interestingly,
however, and probably not accidentally, not for as long in France; the Bourbons
were gone by mid-century. Napoleon's
nephew was elected president of the republic and then crowned himself emperor,
but got tossed out after being captured by the Prussians in the Franco-Prussian
war in 1870. The French government was,
from the time of the first Napoleon until recent decades, a basket case.)
A web commenter on a Wall
Street Journal article (about the Belgian commemorative euro coin—see
below) echoed Roberts' comments.
In
those countries he [Napoleon] conquered he brought a code of law which largely
overturned the royal order and brought much of the individual rights that the
French Revolution had spawned. In view
of all this, those who celebrate the defeat of Napoleon are in fact celebrating
the continuation of monarchic rule for the remainder of that century which then
spawned the often violent overthrows of these same monarchies and the bloody
revolutions that followed and still, in many ways, continue.
Another reader followed up.
While war is always bloody and
destructive, the Napoleonic code brought about equality for women. A surviving
wife couldn't inherit property before that and it had to go to a son.
Jews got their yellow bands removed. The metric system spread to
every part of the world except USA and Burma.
I'd always been of the view that it was good that
Napoleon was defeated and the British triumphed. Now I'm not so sure. The British, for better or worse, would
probably still have ruled the seas for the next century irrespective of whether
Napoleon returned to power in France, so whatever influence it had on global
development, for better or worse, would likely have been similar to what it did
in fact have. France wasn't in a
position to compete with the British Navy.
One can wonder if French enlightenment—assuming Napoleon did all the
things he was planning to do—beginning in 1815 would have had an effect on
American political development. Given
that we were still at odds with the British at that point (remember the War of
1812?) and friends with the French, it's hardly impossible that there could
have been a liberal French influence on the United States. It's such fun to play "what if"
with history.
The faint echoes and emotions of Waterloo continue to
reverberate. Perhaps you noticed that
the French were upset with the Belgians for issuing a (Belgian) euro to
commemorate the Battle of Waterloo and Wellington's victory. (One headline I read: "France's whining won't stop Belgium from
minting coins commemorating the Battle of Waterloo.") The
battle was actually fought on what is now Belgian land, then part of the
Kingdom of the Netherlands, so they were glad to see the French lose—and be
gone from Belgium.
Earlier in 2015, the French pressured the Belgians into
not minting a two-euro coin commemorating Waterloo (and, indeed, melting the
180,000 they had already minted).
According to one source I read, French officials said that the coins
would carry "a
negative symbol for a section of the European population," which would be "detrimental
at a time when euro zone governments are trying to build unity and cooperation
under the single currency." But
later in the year the Belgians minted a similar coin anyway—however, they made
them worth 2.5 euros, so they aren't legal tender outside of Belgium.
The French remained peeved. It never occurred to me, when I wrote a
couple of years ago about how Napoleon arguably had a direct effect on where I
am in life, that 2015 was the bicentennial of the Battle of Waterloo, but once
again one can observe how sensitive people—nations, groups—remain even after
the passage of two centuries.
(The Netherlands and Britain also
issued commemorative coins. The Dutch
coin was also a euro, but is, like the Belgian, only legal tender in the
Netherlands. Britain isn't one of the
countries that uses the euro, so it issued a £5 coin. Neither of these productions seems to have
aroused the ire of the French.)
Well, when one's lost, I suppose it's good
advice to stay where you are until someone finds you.
The fourth anniversary I'm noting is
the 150th anniversary of Alice's
Adventures in Wonderland, marked by publication of a note-laden
commemorative edition. The notes are
much longer than the text; it's almost too much information. But I've read the book 3-4 times in my life
and every time I find it delightful.
If you haven't figured it out yet,
the quotes I've scattered through this letter are from Lewis Carroll, Alice's Adventures in Wonderland and Through the Looking Glass, and other
Carroll works.
Alice laughed. 'There's no use trying,' she
said: 'one can't believe impossible things.'
'I daresay you haven't had much practice,' said the Queen. 'When I was your age, I always did it for half-an-hour a day. Why, sometimes I've believed as many as six impossible things before breakfast.'
It was interesting to watch the
gyrations of the Bavarian government last year as it contemplated how to deal
with the end of its copyright on Mein
Kampf by Adolf Hitler. (For those of
you who don't know, the 700-page book was written while Hitler was in prison in
1924 for participating in a putsch attempt against the Weimar Republic, the
German government; it is his viciously anti-Semitic, race-based manifesto of
Aryan triumphalism.) Under German law,
the Bavarian government inherited the copyright ownership (because it was
Hitler's last registered address), which expires at the end of 2015. Although it keeps the book locked up, there
are copies on the web and in used bookstores.
Initially, the Bavarian government had
supported (financially and publicly) an effort to publish a scholarly,
annotated version of the book, and provided nearly half a million euros to the
Institute for Contemporary History. The
project would be undertaken by distinguished historians and provide commentary
on all the crap in the book.
Representatives of the German Jewish community supported the effort. One reason for sponsorship of the
annotated—and intended to be definitive—version was to weaken the impact of any
other versions that might be published without analytical commentary (or by
neo-Nazi groups supporting the ideology of the book).
Then the Bavarian government did an
about-face and said it would not support the project—and would, moreover, sue
anyone or any organization that published the book after the expiration of the
copyright, on the grounds that it is prohibited hate speech. Apparently some Holocaust survivors demanded
the book remain prohibited, and one political leader argued that it was
inconsistent to ban far-right parties in Germany but then allow publication of Mein Kampf. The government decision raised academic
freedom questions around the world, and the Institute announced it would
proceed with the annotated edition without government funding. The head of the Institute also expressed
doubt that the government would actually file charges against them for
publishing once the copyright had expired.
The New York Times later in the year reported that the Bavarian
government had backed down, when even some German Jewish leaders protested,
arguing that there should be an annotated version that would demonstrate the
absurdity of the vision Hitler set out in the book. In the words of the NY Times commentator, "The inoculation of a younger generation
against the Nazi bacillus is better served by open confrontation with Hitler's
words than by keeping his reviled tract in the shadows of illegality." The idea, he said, is to "strip it of
its allegedly hypnotizing power."
Jewish leaders in Israel, however, largely and adamantly opposed
publication.
The Bavarian government finally
decided that it would leave the project funding intact but would no longer be a
sponsor for the publication. One might
call that an artful dance that tries to satisfy everyone. I wager it doesn't satisfy those critical of
the publishing effort. Although one can
understand the objections—and the pain—of the Holocaust survivors and their
families, my guess is that publication is a better way to prevent worship than
is suppression. If one believes "never
again," better to have idiocy and malignancy exposed and destroyed than
let it fester underground.
There was an article later in the
year reporting on the royalties from Mein Kampf, profits that publishers
generally are not keeping. There are efforts to donate the money to charity,
but no one wants to take it because it's seen as tainted. As I commented to a few friends at the time,
seems to me putting the money to use in a way Hitler would have hated is a
marvelous idea and the groups to which it might be directed could take the
money gladly, knowing they're defeating the ends he sought.
I give myself very good advice, but I very
seldom follow it.
Faced with the choice between
changing one's mind and proving there is no need to do so, almost everyone gets
busy on the proof.
-- John Kenneth Galbraith
Facts are meaningless. You could use facts to prove anything that's even remotely true.
-- Homer Simpson
If belief in human rationality were a scientific theory it would long since have been falsified and abandoned.
-- John Gray
This
is a disheartening take on human nature, but the research evidence for it is
reasonably firm. FiveThirtyEight had a science article on how we are primed to reach
false conclusions and cited several researchers who've approached the
subject. One point is evident: "simply exposing people to more information
doesn't help." There have been
studies that demonstrate people can increase the accuracy of their beliefs when
the subject is one they don't care too deeply about. "But the lesson of
controversial political, health and science issues is that people don't apply
their critical-thinking skills in the same way when they have a preference for
who's right." So there is much
evidence that "erroneous beliefs aren't easily overturned, and when they're
tinged with emotion, forget about it."
None of Mr. Trump's claims will do
him harm.
If you set to work to believe everything, you
will tire out the believing-muscles of your mind, and then you'll be so weak
you won't be able to believe the simplest true things.
Doreen Massey, a retired professor of Geography at
the Open University in England (recipient of multiple honorary doctorates and
other British and international awards, and inclined toward the Marxist end of
the spectrum), has studied globalization, uneven economic development, and how
space affects poverty and the divisions between rich and poor and between
social classes. She wrote a short piece about the economic
vocabulary used in western societies and how it concedes the ground on central
philosophical issues of economic and social organization before the debate has
even begun. Even though I'm no Marxist
by any stretch of the imagination, she made a number of points that resonate
with me.
Massey contests the "hegemonic 'common
sense'" of the market economy and its sidekicks, including competitive
individualism, the virtue of private gain, and downplaying the value of the
public. One can ask, instead, for
example, "what is an economy for?" and what "do we want an
economy to provide?" There are many
answers, and there would no doubt be widespread disagreement, she acknowledges,
but her suggestion is "the enabling of decent lives and the flourishing of
human potential for all." I think
that's not a bad start for a discussion.
Massey also argues that some of the
assumptions inherent in capitalist economics are not accurate, or at least not
accurate for all people all of the time.
We are not only or always competitive individuals; we also cooperate and
care for others. She also suggests a
need for a much stronger and more spirited defense of the public sector; it
provides jobs and services, of course, but more importantly it embodies "the
necessity for a public realm, a sense of 'the public' as being part of
what helps to build a good society."
Related to the public sector is the need for a defense of taxation;
while the claim is that everyone hates taxes, they build the society. I have long thought taxes in the U.S. are
significantly lower than they should be.
Private purchases of something one can also get through the public
sector are fine, but spending money on social investment and services is widely
resented. She also makes an observation
that parallels something I've wondered about as well over the years, the
ambiguous meaning of the word "investment": "When money is advanced for productive activity we call it
investment, and that is fine. But money
advanced to buy something that already exists (an 'asset'), so that it can be
sold later at a higher price, also gets called investment. But it is not; it is speculation. It doesn't produce anything new; what it does
do is redistribute existing wealth towards those who already own assets or can
afford to buy them." Much of the
spending in the economy is on speculation, not on creating value.
"And if you take one from three hundred and
sixty-five what remains?"
"Three hundred and sixty-four, of course."
Humpty Dumpty looked doubtful. "I'd rather see that done on paper," he said.
Angus
Deaton, Princeton, who won the Nobel Prize this year:
"The dominant
fact for me about most poor countries is that the social contract, the
contract between the government and the governed, is not there.
It's very weak and it's very unsatisfactory and you know in many cases the
government is preying on the people, not helping them. It's what you call an
extractive regime rather than a sort of helping, promoting regime."
"Oh, don't go on like that!" cried
the poor Queen, wringing her hands in despair. "Consider what a great girl
you are. Consider what a long way you've come today. Consider what o'clock it
is. Consider anything, only don't cry!"
Alice could not help laughing at this, even in the midst of her tears. "Can you keep from crying by considering things?" she asked.
"That's the way it's done," the Queen said with great decision: "nobody can do two things at once, you know." [Carroll knew 150 years ago that multi-tasking was a myth!]
As we approached Christmas 2014,
Krystin continued to have problems with esophageal constriction. Fortunately, her health had finally (three
months after the pancreas transplant) improved to the point that her only
trouble came from the esophagus. One of
the most difficult decisions for her was only indirectly related to her health
(if at all): she decided she had to give
up her beloved cats because she just wasn't able to take care of them properly
and had been in the hospital too much, leaving them alone.
We were profoundly grateful to Peggy Hinz,
the mother of Krystin's lifelong friend Christine Lenzen, who volunteered to
foster the two cats until such time as Krystin can properly care for
them. I told Peggy we owed her big time.
As the year progressed into spring and
then summer, Krystin returned to the hospital for a week because she could not
stay hydrated and threw up virtually all the food and liquid she consumed. One consequence was that she began losing
weight—and for someone of her small build, losing weight wasn't a good
idea. The upshot of the hospital stay
was that she's on a feeding tube in her stomach for the indefinite future
because of nerve damage to her stomach from diabetes that prevents her from
digesting more than small amounts of food.
Much of what she eats just sits in her stomach and is eventually
regurgitated, which in turn causes her esophagus to constrict. So she is on a feeding tube for 75% of her
nutrition (administered overnight, at least), and on one that also allows
drainage of fluids that cause esophagus problems. We're keeping our fingers crossed that this
will end the major medical events for quite a while, and Krystin was able to
join us for a pleasant Father's Day dinner.
She ate a small meal without incident (then or later).
Unfortunately, her status didn't get
better as Thanksgiving of 2015 approached.
She was in and out of the hospital much of late October and November
with severe abdominal pains that have mystified the medical establishment. The physicians cannot find anything wrong but
the pain continues. The esophageal
constriction continues as well, a phenomenon the physicians have been unable to
treat except in short-term and unsatisfying ways. So we continue to grapple with those as the
end of the year approaches.
'When _I_ use a word,' Humpty Dumpty said in
rather a scornful tone, 'it
means just what I choose it to mean—neither more nor less.'
'The question is,' said Alice, 'whether you CAN make words mean so many
different things.'
'The question is,' said Humpty Dumpty, 'which is to be master—that's
all.'
I've read off and on over the years
comments about how language shapes the way we think, and that people who speak
different languages—especially languages very different from English or western
European languages—also, thereby, think differently. I have sometimes wondered whether that claim
is true.
I learned this year that the
proposition is known as the Sapir-Whorf hypothesis,
named after Edward Sapir and Benjamin Whorf.
The claim is "that our thoughts are limited and shaped by the
specific words and grammar we use. Mayans don't just speak Mayan; they think Mayan,
and therefore they think differently from English speakers. According to
Sapir-Whorf, a person's view of the world is refracted through her language,
like a pair of spectacles." Among
linguists, Sapir-Whorf is deader than a doornail: the claim has not been supported by research.
A guy (writer,
pundit) named John McWhorter has been on a campaign to demolish
Sapir-Whorf. It's an hypothesis that has
been tested and failed, even though it is reflected in our culture in various
ways. Graeme Wood, books editor at
Pacific Standard (quoted above and here), wrote that
perhaps the most famous invocation of
Sapir-Whorf is the claim that because Eskimos have dozens of words for snow,
they have a mental apparatus that equips them differently—and, one assumes,
better—than, say, Arabs, to perceive snow. . . . To get a hint of why nearly
all modern linguists might reject this claim, consider the panoply of
snow-words in English (sleet, slush,
flurry, whiteout, drift, etc.), and the commonsense question of why we would
ever think to attribute Eskimos' sophisticated and nuanced understanding of
snow to their language, rather than the other way around. ("Can you think
of any
other reason why Eskimos might
pay attention to snow?" Harvard's Steven Pinker once asked.)
In a multitude of other cases, where one
language has more words for something than another, speakers of the language
with fewer words are nonetheless no worse at identifying the relevant
distinctions. (That
is, if one language has more words for different colors of blue than another
language, the latter speakers can discriminate between blues just as quickly as
the speakers of the language with more words for blue.) There have been more sophisticated tests for
Sapir-Whorf since it was originally developed 70 years ago, and while there
have been indications that there are differences among speakers of different
languages, the differences are so tiny that they amount to virtually nothing.
This is hardly a big deal, in the
larger scheme of life, but I do find it reassuring that in essence people seem
to think about the world in the same way despite using a multitude of words and
languages. Whatever the reason for
differences among us, it seems that different language—except at the
superficial level of communication—does not signal different perceptions.
'How is it you can all talk
so nicely?' Alice said, hoping to get it into a better temper by a compliment. 'I've
been in many gardens before, but none of the flowers could talk.'
'Put your hand down, and
feel the ground,' said the Tiger-lily. 'Then you'll know why.'
Alice did so. 'It's very
hard,' she said, 'but I don't see what that has to do with it.'
'In most gardens,' the
Tiger-lily said, 'they make the beds too soft—so that the flowers are always
asleep.'
This sounded a very good
reason, and Alice was quite pleased to know it.
I had confirmed this year, in a short abstract from Karen Armstrong's
latest book, Fields of Blood,
something I'd long suspected but never explored: in some sets of religious beliefs, the idea
of separation of church and state makes little sense. In fact, Armstrong argues that the
Jeffersonian construct of separation was novel at the time and that even
Christians from an earlier era would have been startled to learn of it.
Part of the issue arises with the definition of religion. There isn't any universal one.
In the
West we see 'religion' as a coherent system of obligatory beliefs,
institutions, and rituals, centering on a supernatural God, whose practice is
essentially private and hermetically sealed off from all 'secular' activities.
But words in other languages that we translate as 'religion' almost invariably
refer to something larger, vaguer, and more encompassing. The Arabic din signifies
an entire way of life. The Sanskrit dharma is also 'a "total"
concept, untranslatable, which covers law, justice, morals, and social life. . .
. The idea of religion as an essentially
personal and systematic pursuit was entirely absent from classical Greece, Japan,
Egypt, Mesopotamia, Iran, China, and India. Nor does the Hebrew Bible have any
abstract concept of religion.
The meaning of the word as understood in modern
America evolved in the late Renaissance.
Armstrong maintains that
"the only faith tradition that does fit the modern Western notion of
religion as something codified and private is Protestant Christianity, which,
like religion in this sense of the word, is also a product of the early modern
period." This modern understanding did
not fit with the presumptions of the Catholic Church, and roughly at the time
of the American Revolution or slightly before, "Europeans and Americans
had begun to separate religion and politics, because they assumed, not altogether
accurately, that the theological squabbles of the Reformation had been entirely
responsible for the Thirty Years' War."
The conviction that
religion must be rigorously excluded from political life has been called the
charter myth of the sovereign nation-state. The philosophers and statesmen who
pioneered this dogma believed that they were returning to a more satisfactory
state of affairs that had existed before ambitious Catholic clerics had
confused two utterly distinct realms. But in fact their secular ideology was as
radical an innovation as the modern market economy that the West was
concurrently devising. To non-Westerners, who had not been through this
particular modernizing process, both these innovations would seem unnatural and
even incomprehensible. The habit of separating religion and politics is now so
routine in the West that it is difficult for us to appreciate how thoroughly
the two co-inhered in the past. It was never simply a question of the state 'using'
religion; the two were indivisible. Dissociating them would have seemed like
trying to extract the gin from a cocktail.
As
a secular humanist, of course, I want the state as far away from religion as
possible. Allowing religious precepts to
drive public policy permits going places that it seems to me are
unacceptable. How much of Leviticus does
one wish to see enacted into law? Or the
Koran? The American/Western European
divorce of the two is central to the right to choose any or no religious
belief; the state and the power it possesses are not thrown behind one
particular religion.
If
Armstrong is correct—and I have no reason to doubt her history, although I have
doubts about some of the other elements of her book—the separation not only
makes no sense to other beliefs, in some cases it is anathema. The state and the religious must be bound
together in a theocracy in the eyes of followers of more aggressively
evangelizing faiths (e.g., elements of Islam, elements of fundamentalist
Christianity). It is for that reason,
more than any other, that I fear the election of "fundamentalist"
believers of any kind to political office.
What
I think is an interesting question is how believers in those faiths that can't
make sense of the wall of separation between religion and government manage to
do so anyway. My U.S. House
representative is the only Muslim in Congress (I think), but I've never had any
indication that Mr. Ellison supports the intrusion of religion—his own or
anyone else's—into public policy. At
least theoretically ("theoretically" because there are none now), a
Hindu elected to Congress could face the same dilemma. As could a Jew—although my understanding from
reading history is that Jews are among the LAST people who want religion
enshrined in law (at least outside Israel, which is a different case) because
they have so often over the centuries been on the receiving end of
discriminatory religious statutes. In
the case of Buddhism, a few years ago the Dalai Lama endorsed the separation of
church and state, so to the extent he speaks for Buddhists, that's one view;
there are a couple of small places where Buddhism has been established as the
state religion.
"But I don't want to go among mad people,"
Alice remarked.
"Oh, you can't help that," said the Cat: "we're all mad here. I'm mad. You're mad."
"How do you know I'm mad?" said Alice.
"You must be," said the Cat, "or you wouldn't have come here."
Elliott wrote to me that "I still find it amusing that people
readily deride the playing of video games for a few hours a day, yet they
wouldn't feel anywhere near the same sentiment towards someone who professed to
reading for the same amount of time. Both are objectively useless in terms of
their benefit to anyone other than the user, and the jury is still out on which
one produces more mental stimulation."
I didn't leave that assertion alone.
"You will never get me (or likely any educator) to agree that books
may equal video games. Given a reasonably wide range of books, a reader
is going to have a vastly greater breadth of experience of things (humans,
nature, science, romance, whatever) than any video game player."
Elliott maintained that "that's because you haven't played a video
game in the last 20 years. Only read books. I've done both."
I thought and think he's just completely wrong, and told him so. "There is no way you can persuade me
that you get more out of video games than I have reading a landmark history of
WWI, a book about the development of economics, the evolution of computers from
Turing on, and high-quality novels set in various times and places. Video
games cannot possibly teach as much as good books."
"Tis so,' said the Duchess: 'and the moral
of that is- "Oh, 'tis love, 'tis love, that makes the world go round!"'
'Somebody said,' Alice whispered, 'that it's done by everybody minding their own business!'
'Ah, well! It means much the same thing,' said the Duchess."
It caught my attention that
ibuprofen may extend the lifespan as well as make one healthier. At least it has for yeast, worms, and
flies. I'll be waiting to see if there
are any findings about people. But in
the meantime, one wonders about taking a regular dose of ibuprofen!
She had quite forgotten the Duchess by this
time, and was a little startled when she heard her voice close to her ear.
'You're thinking about something, my dear, and
that makes you forget to talk. I can't tell you just now what the moral of that
is, but I shall remember it in a bit.' 'Perhaps it hasn't one,' Alice ventured
to remark.
'Tut, tut, child!' said the Duchess. 'Everything's
got a moral, if only you can find it.'
Elliott: "In my random internet wanderings I
stumbled across a list someone had compiled of 'the most heavy metal sounding
classical music pieces.' So I've been
slowly going through it. Lots of Russian
composers on there. Been listening to a
good amount of Tchaikovsky." I
asked if he meant the original classical music or a heavy metal version. He said original, "'most heavy metal' in
terms of its structure and instrumentation.
And lots of Bach." I
surmised that Messrs. Tchaikovsky and Bach would be surprised at the
categorization. Elliott said the "Toccata and Fugue in D Minor is
basically one long guitar solo. I
imagine had Bach been born 200 years later he would have been a pioneering
blues/heavy metal guitarist. Listen to
the Toccata and Fugue in D Minor. Then listen to Eruption by Van Halen. Not
hard to see the influence."
Unfamiliar
as I am with heavy metal music (that is, completely unfamiliar), this came as
quite a surprise.
"If you knew Time as well as I do," said
the Hatter, "you wouldn't talk about wasting it. It's him."
"I don't know what you mean," said Alice.
"Of course you don't!" the Hatter said, tossing his head contemptuously. "I dare say you never even spoke to Time!"
"Perhaps not," Alice cautiously replied, "but I know I have to beat time when I learn music."
"Ah! That accounts for it," said the Hatter. "He won't stand beating. Now, if you only kept on good terms with him, he'd do almost anything you liked with the clock."
Elliott in January: "Did people 200+ years ago say "um",
"like", and "you know," or some other equivalent? Or were people back then better trained as
orators at a young age such that they simply learned to speak more slowly and
thoughtfully? I don't picture anyone at
the Continental Congress saying 'So, like, independence, you know?'"
I said I had no idea—but bet that there
were "pause words" in use then as well as now, although perhaps not
as much among the Founding Fathers, who were well educated. I asked my friend Professor Tom Clayton. He commented that the question gave him
pause—and he pointed out it isn't something studied very much in English
literature. He wrote that "the kind of expletives you give might be expected
wherever there is limited vocabulary and corresponding pauses in expression due
to straining thought." They are
also part of the way people of a certain age and education speak, "so that
not to use them would distinguish one as not quite belonging and perhaps
putting on airs ("he goes" for "he says/said" belongs to
the same province)." Tom also reminded
me that the idea that "people back then were better trained" "would
apply only to a class of person, where the educational lines between classes
were sharper than they are now. The major class distinction now is money.
That didn't use to be the case, in Britain, anyhow, where impoverishment and
gentility were not mutually exclusive."
'I quite agree with you,' said the Duchess; 'and
the moral of that is—Be what you would seem to be—or, if you'd like it put more
simply—Never imagine yourself not to be otherwise than what it might appear to
others that what you were or might have been was not otherwise than what you
had been would have appeared to them to be otherwise.'
Here's
an interesting thought experiment:
what would you think, and how would you live, if you were told by an omniscient
authority—so there is absolutely no doubt—that you will live out your normal
lifespan but the day after you die all humanity will end? [Of course there's no
such authority, but this is a thought experiment!]
One
variable: (1) only you know; (2) the world knows (and also has no doubt,
otherwise you're just another nut case predicting the end of the world).
In
either case, if the "you" is me, and I live out my
genetically-suggested lifespan, the world would have about 20 years left. If the "you" instead were Elliott, the
world would probably have about 60 years left.
In both cases, no medical advances that extend our lives are assumed.
In
1, could you keep it a secret from your spouse/loved ones?
In
2, one suspects civilization would collapse—who'd want to go to work? This collapse could go on for decades, if the
end of human life weren't coming for 5 or 6 decades. It is possible you'd suddenly be the focus of
multiple assassination attempts by the crazies who'd want to bring about the
end or who wanted to find out if the prediction were really true. (Elliott's observation.) So you'd rather no one knew, and the world
continued on its normal pattern, than live out your last days in a world
where society has disintegrated.
In
2, perhaps all of the world's research would suddenly be directed to making you
immortal (also Elliott's observation).
That's an outcome that wouldn't be all that attractive to me. Would the world go on more or less as normal
while the research was taking place, holding its collective breath? (Obviously, the chances of making someone
immortal improve with the youth of the subject; there's a lot more time for
research to make Elliott immortal than there is for me.) This probably isn't as farfetched as it might
sound; there's certainly enough work going on in the biological sciences on
this front that if the efforts were multiplied a thousand-fold, discoveries
might be made.
Another
variable: presumably those who believe
in heaven/everlasting life would be less disturbed than those who are certain
there's no such thing. Those who look
forward to The Rapture or redemption might even welcome the end of human life. Discussion of this variable can descend into
a theological discussion, a place I do not wish to go.
Elliott's
first response was to guess that he would want to party 24/7. One can surmise that would be the response of
many or most people: there's little time
left, so enjoy it while you can. One
potential problem, of course, is that if everyone decided they wanted to party,
and no one wanted to work for 20 or 60 years, there would be a food and goods
shortage because perhaps farmers would produce what they needed (but no more)
and most of the world's industry would shut down. Instead of partying, people would be desperately
short of food (and water, if all the municipal water plants went out of
business). One can imagine society would
descend into the chaos of hungry mobs.
The
major point (in the article I read about this hypothetical but can't now find)
is not what would happen in the world but how you think about your own life.
For me, I think I would be incredibly depressed. Kathy and I talked about this one night last
winter, and the more I thought about it, the more I realized that I parse my
life into two pieces. One piece is the
hedonistic enjoyment of life because as a secular humanist I'm certain there's
no afterlife, no eternal bliss, so I should enjoy my time alive because it's
the only time I'll have. The other
piece, however, is future-oriented: in
whatever small ways I can, through my job, political and other contributions,
life in general, I hope I can make life slightly better for my kids and
humanity. (In 2, the work of the
University—any university or college—would cease, because its activities are
almost entirely future-oriented:
research on improving human life in all its variety and teaching people
to be thinking human beings, to find something to do in life, and to contribute
to humanity's future.)
To
remove entirely the second piece of what I do in life would leave me
morose. I would not in the least be
inclined to party. Just knowing my kids'
lives would come to an end would be bad enough; knowing that everyone would die
would be overwhelming. I confess I was
surprised at my reaction; I didn't realize that's what it would be nor, on
reflection, how much of what I do and think is oriented to the future, even
beyond my own expected lifespan.
Kathy
posed the question that one sees all through life: setting aside this hypothetical, why don't we
live our lives to the fullest now (especially those who don't believe there's
any heaven or eternal bliss)? It's not
possible to do so in the purely hedonist sense (parties, dinners, games,
whatever one enjoys) because the vast majority of us have to earn a living in
order to pay for food, housing, transportation, plus whatever play activities in
which we choose to indulge. In addition,
one has to think that the endless parties and games would, in short order,
become stale; many of those in this world who are fabulously wealthy do find
useful ways to spend their time, one guesses for the very reason that unending
hedonism becomes unending boredom.
Oliver
Sacks's reflections bear on this, it seems to me. The writer had a piece in the New York Times in February reflecting
on his diagnosis of terminal cancer and how he intends to live life to the
fullest in the few months he has remaining. (He died on August 30.) "I feel a
sudden clear focus and perspective.
There is no time for anything inessential. I must focus on myself, my work and my
friends. I shall no longer look at 'NewsHour'
every night. I shall no longer pay any
attention to politics or arguments about global warming. This is not indifference but detachment—I
still care deeply about the Middle East, about global warming, about growing
inequality, but these are no longer my business; they belong to the future." Even if we aren't under the gun of a terminal
cancer diagnosis, those of us who can see the end of the tunnel—well, I do,
anyway—begin to count the years and months left and start to think about how we
want to allocate our time and with whom we wish to spend it. (But I still have to do the damn laundry.)
At
the same time I was mulling over what my personal reaction would be to the end
of humanity after my death, Elliott sent me an anonymous quote: "The fact that there is a Highway to
Hell but only a Stairway to Heaven says something about the anticipated traffic
flow." There are variations on it
all over the web. As far as I can tell,
it's a witticism someone derived by combining the titles of two songs from the
1970s, "Highway to Hell" by AC/DC and "Stairway to Heaven"
by Led Zeppelin (that knowledge comes courtesy of Kathy). The witticism, however, did lead me to ponder
how those who believe in an afterlife (in heaven, in particular) should
realistically assess their chances of actually going to heaven.
This was a wild goose chase, but I
browsed the web for awhile trying to learn how various religious denominations
view the likelihood of people getting into heaven. None of the adherents knows, of course, and for
Christians the Bible's guidance is mixed.
My summary is this: some
evangelicals believe very few get to heaven and that even fewer
non-evangelicals do; Catholics believe a fair number go to hell; liberal
theologians reject the idea of salvation and hell; Baptists don't believe many
get there; Islam seems to have a range as well, with both minimal and maximal
views on the number who achieve paradise; and so on. For Hindus, who believe in reincarnation and
not always as a human, presumably life could continue in another form if the
entire planet isn't obliterated with the end of humanity. For Buddhism, a non-theistic religion, there's
no anticipated afterlife so the question doesn't arise. Given that it appears that the more
conservative sects, particularly Protestant, believe that few get to heaven,
those who might be awaiting the rapture, or who decide to party 24/7 while
awaiting the end of time once the end of humanity has been announced, might want
to reconsider their odds. They don't
seem too high.
We are but older children, dear,
Who fret to find our bedtime near.
Indirectly
(very!) related to my hypothetical about life ending when you die is
interesting research on the relationship between money and happiness. It seems that "both material and
experiential wealth tends to reduce people's ability to savor simple joys and
experiences. Wealth and abundance may
undermine appreciation and reduce the positive emotions associated with
everyday experiences." As one would
expect, the converse is that experiencing adversity (or having experienced it in
the past) increases one's appreciation of everyday moments and
experiences. So "consistently
indulging in pleasure and abundance may not be the most productive route to
happiness." Partying 24/7 in
anticipation of end times might not be all that much fun. Especially if it might go on for 20 years.
A
related piece of research reported last winter made me think about my own
experiences—and I concluded that I thought the findings were probably
accurate. "Experiential purchases—money
spent on doing—may provide more enduring happiness than material purchases
(money spent on having). Participants reported that waiting for an experience
elicits significantly more happiness, pleasantness and excitement than waiting
for a material good."
With
only rare exceptions, moreover, I suspect that the memories of experiences are
longer-lasting and provide more enjoyment than do memories of material
purchases. Maybe if I purchased a Renoir
or Matisse, I'd really enjoy it for a long time, but most of our purchases in
life are more pedestrian—new clothes, new furniture, a new car, and so on. I cannot say that any of those purchases made
earlier in my life now give me any great pleasure, but I can certainly say that
experiences with family and friends, even from long ago, still provide
pleasure, even when one recalls both the good and the bad. (Of course, we all tend to suppress or forget
the bad, as time passes, and look back more fondly on many events than we
looked upon them at the time.) So, "the
researchers suggest that it may make sense to delay consumption of some
purchases, and shift spending away from material goods to more experiences. In
short—start planning for vacations, dinner parties and concerts ahead of time
to reap more benefits from anticipation."
And, in my opinion, to reap more benefits from happy memories.
"Can
you do Addition?" the White Queen said. "What's one and one and one
and one and one and one and one and one and one and one?"
"I don't know," said Alice. "I lost count."
"She can't do Addition," the Red Queen interrupted.
While
it is true that the Scots speak English (more or less, depending on where you
are in Scotland, and some do speak Gaelic), there are also a number of words
that are unique to Scots-English. One of
them I learned late last winter, and it is so appropriate for us as well as
Scotland: dreich: "dull, damp, miserable, grey—but all four
together" (pronounced dreech, with a guttural "ch"). (With thanks to my Scots friend Bruce Nelson
for helping me get this right!)
Another one, also readily seasonal
in use: thole (verb): to suffer, to endure, to persevere—with the
promise that in doing these things, you will win through and grow. My Scot friends tell me this one isn't used
all that commonly. But we tholed the
winter (and especially the winter of 2013-14).
Anybody, who is spoken about at all, is sure to
be spoken against by somebody: and any action, however innocent in itself, is
liable, and not at all unlikely, to be blamed by somebody. If you limit your
actions in life to things that nobody can possibly find fault with, you will
not do much!
While
browsing the Huffington Post last spring, I came across a piece that claimed to
identify the factors that determine whether one is "affair ready"—that is, in a state of mind to have an affair outside one's marriage. Just for the heck of it, I looked at the list to
see if I was, by their 10 criteria, "affair ready."
1.
You often think that you "love but you're not 'in love'" with your
spouse.
2. You've been unhappy with your spouse and/or the relationship for quite a while (more than one year).
3. You're bored.
4. You want out, but you don't want to hurt your mate.
5. You don't have the guts to ask for a divorce.
6. You've tried (or think you've tried) to tell your spouse that you're unhappy, but these complaints fall on deaf ears or are met with verbal or physical harassment.
7. You begin to spend more time with other people doing extra-curricular activities (perhaps you golf every weekend now, or you take up a new pastime such as biking, photography or the school auction).
8. You don't feel appreciation, respect or admiration by or for your spouse.
9. Your sex life isn't what you'd like it to be.
10. Other people you know have had, or are having, affairs.
I was quite pleased—although not the
least bit surprised—that none of the 10 factors were true of me. I sent them to Kathy and told her. She told me that none of them were true for
her, either. So I guess the situation
here is just hunky-dory!
'I'm very brave generally,' he went on in a low
voice, 'only today I happen to have a headache.'
One
of my faculty friends has been reading and reflecting on human consciousness,
deemed the "hard problem" by Australian philosopher and cognitive
scientist David Chalmers in 1995. The
descriptor has stuck, although not all agree with its accuracy. The issue is the extent to which entirely
physical processes determine consciousness or sensation; is there something
beyond the interactions of neurons that creates our consciousness? Some argue that consciousness plays tricks on
us so that it appears non-physical. Will
neuroscience eventually answer the question of what consciousness is?
I
certainly don't know. But I did conclude
that I agreed with Sam Harris:
Even
if one thinks the human mind is entirely the product of physics, the reality of
consciousness becomes no less wondrous, and the difference between happiness
and suffering no less important. Nor does
such a view suggest that we'll ever find the emergence of mind from matter
fully intelligible; consciousness may always seem like a miracle. In philosophical circles, this is known as "the
hard problem of consciousness"—some of us agree that this problem exists,
some of us don't. Should consciousness
prove conceptually irreducible, remaining the mysterious ground for all we can
conceivably experience or value, the rest of the scientific worldview would
remain perfectly intact.
And
I agree with Susan Blackmore:
Consciousness
is not some weird and wonderful product of some brain processes but not others.
Rather, it's an illusion constructed by a clever brain and body in a complex
social world.
While people are awake they must always be conscious of something or other. And that leads along the slippery path to the idea that if we knew what to look for, we could peer inside someone's brain and find out which processes were the conscious ones and which the unconscious ones. But this is all nonsense. All we'll ever find are the neural correlates of thoughts, perceptions, memories, and the verbal and attentional processes that lead us to think we're conscious.
When
we finally have a better theory of consciousness to replace these popular
delusions, we'll see that there's no hard problem, no magic difference, and no
NCCs.
I don't think Harris and Blackmore
contradict each other.
'The
horror of that moment,' the King went on, 'I shall never, never forget!'
'You will, though,' the Queen said, 'if you don't make a memorandum of it.' [The story of my life, now.]
In my opinion, two of the more aversive commercial transactions that the U.S. has created for its citizens who have any amount of money are purchasing an automobile and buying or selling real estate.
Unfortunately, Kathy and I had to undertake both of them last spring.
My
nephew-in-law in Wisconsin, who's an expert on cars, advised me that it was
time to get rid of my 2000 minivan.
Inasmuch as I take advice from experts, Kathy and I went car-shopping in
late March. To our surprise, it was not
as obnoxious a process as it can be, perhaps because of the dealership we dealt
with. In the past, whenever I've
purchased a car from a dealer, I've come home feeling that I needed to wash my
hands because the salesmen (always men) were so slimy and the process so
distasteful. Not this time; there was no
pressure and no negotiation: the guy
showed us the cars we were interested in and left us alone to talk about what
we wanted to do—and told us they didn't bargain because they priced their cars
reasonably.
Kathy
and I brought a different approach to the process. She was very good about gathering information
from the web about the reliability and quality of various makes and models and
compiled a list of vehicles we should consider.
She likes to see different cars, go to several dealers, and think about
what she's learned. I, on the other
hand, want to get the whole thing done as quickly as possible. (When I bought the minivan in 2001, I drove into
the dealer lot, looked at the minivans, picked one out, and bought it. In that case it was relatively easy because I
knew exactly what I wanted.) In this
case, we went to a dealer that several friends had recommended—people whose
judgment we trusted—and immediately found a car that we liked. It was one of the ones on Kathy's list.
Kathy
observed later that it was easier to reach a conclusion than it had been the
last time she or I bought a car, 14 years earlier; as she wrote on her Facebook
page, "I double-checked the reliability of our
new car, checked Blue Book prices to make sure we were getting a good deal,
then scoped out the same car being offered by other dealers in the Twin Cities
to make sure we were getting the best deal in town. All on my phone." She was satisfied with the results of her
investigations. So after hesitating a bit about making a big purchase so
quickly, she agreed that we should just go ahead and buy it. The whole venture took less than three hours,
including driving time to and from the dealership. I was pleased to have it done with.
All
that having been said, I still hate spending money on a car. It is the same as ripping up $1000 bills and
throwing them out the window. We bought
a 2014 Volvo (that only had 500 miles on it, so a "new car" in
effect), and given the quality of Volvos, I decided—hope—that that will be the
last car I have to buy. Well taken care
of, it ought to last me the rest of my driving life. Until, of course, self-driving cars become the
rule, and driver-operated vehicles are no longer legal, but I'm assuming that
may be more years off than the expected life of the Volvo.
(I
wrote that last sentence in March, and given what I have read in the news in
the succeeding months about the advances in the technology of self-driving
cars, the Volvo may not outlive their advent.
But I suspect that there won't be any overnight switch; there are over
250 million cars registered in the U.S., and it won't be possible to simply one
day declare them all illegal.)
One
of the inescapable aspects of modern life, with its complicated technology, is
the owner's manual. The manual for the
Volvo is 380 pages long. It took us
several months to figure out all the bells and whistles and gizmos and gadgets
in the car, and I came to realize that it is in essence a computer that has
four tires and a chassis and a steering wheel.
It also resembles Microsoft Word:
it has FAR more features than I ever want or need (such as four
different ways of displaying varied information on the dashboard, with
different color schemes for each). Some
advances in technology are wonderful and useful; some are merely piling on.
I have decidedly mixed emotions about the
car. Nice to have a new car; not nice to spend the money on it. But
even I can't completely escape that middle/upper-middle class fascination with
having a new car from time to time. Kathy asked me later if I had buyer's remorse. I said my only "remorse" was
whether I could have made my minivan last another 2-3 years if I'd put a couple
of thousand dollars into it. It's always
the case, however, that with an older vehicle one never knows when one is
throwing good money after bad, so I decided I just had to forget it and be
content with the purchase. So after
driving a minivan for over 30 years, I'm back to a sedan. The one thing I'll
miss is being higher up in my seat when driving—but I didn't any longer want to
own a van or one of its relatives, an SUV or a crossover.
This car purchase reminds me of a family incident from many years ago (funny the things
one remembers, as we all know). I was
perhaps in my late teens or early 20s and joined my parents, grandmother, and
great aunt on a visit to the home of one of my mother's cousins and her
husband. The cousin was among the first
of the total of 24 born to my grandmother and her 13 siblings; my mother was
the last of the 24, so she and her cousin were almost a full generation apart. As was true for all my relatives of that
generation—and just about everyone else—they were not college grads. Anyway, so we're visiting, and my mother,
grandmother, great aunt, and cousin start talking about whatever it was that
women of that generation talked about.
The cousin's husband quickly became bored with whatever the topic of
conversation was and said "come on, Gary, let's go look at the car." I was slightly interested in what my mother
and the other women were talking about and I had absolutely no interest
whatever in going to "look at the car." I was dismayed by his suggestion. But I went with him down to the garage, along
with my father, and found that it was an utterly ordinary car of some kind,
Chevy or Ford. Inasmuch as I knew
nothing about cars other than how to operate them, and didn't give two hoots
about them, I had nothing to say. But he
didn't have anything else he could talk about.
I
am amused by what I wrote to my late former father-in-law in January, 1985,
after buying a car:
The one thing about buying a car, apart from my general
dislike of spending money that way, is having to deal with car salesmen. They seem to me to be among the slimiest
people around, and I always come away from a dealership feeling slightly soiled. This guy we dealt with fit the stereotype in
almost every way. The other part of the
process I dislike is that I always come away feeling like I've been
cheated—that if I'd hung on long enough, they would have reduced the
price. My whole reaction is to want to
come home and wash my hands.
At
the same time as we were getting a new car, we decided to put our (Kathy's)
townhouse on the market. We'd been
unwilling landlords for over three years—unwilling because we'd have sold it
when we got married but the real estate market was so lousy that we'd have had
to take a significant loss on it. (We'd
used a property management firm for renting it, and while the rent didn't quite
cover the costs, the operating loss was minimal and we probably came out even
in the wash after the tax benefits were factored in. But it's still just a pain to be a landlord
because suddenly one is confronted with the need to buy a new clothes dryer or
water heater or whatever.) Because the
market has recovered, we put it up for sale and sold it.
In a Wonderland they lie, Dreaming as the days
go by, Dreaming as the summers die
Ever drifting down the stream-- Lingering in
the golden gleam-- Life, what is it but a dream?
There
have been a number of studies that found that people who are politically
conservative are happier than those who are politically liberal. That conclusion was based on self
reports: people were asked how happy
they were. A more recent study,
published this year, overturned that finding. In a paper appearing in Science, University of California researchers decided that rather
than asking people how happy they felt, which was the basis for the conclusions
in the earlier studies, they would analyze text and photographs. They looked at 432 million words in the
Congressional record, over 47,000 Twitter updates, and photos from LinkedIn. Given the known political affiliation of the
subjects, they concluded that "liberals more frequently employed positive
language in their speech and writing and smiled more intensely and genuinely in
photographs."
Interestingly,
the authors also found that "people tend to report all kinds of traits and
abilities in an overly favorable way. . . .
If you ask people to rate themselves across almost any set of positive
traits—intelligence, social skills, even driving ability—most will rate
themselves above average. We observed
that effect to be stronger among conservatives than liberals." All the
world, it seems, is Lake Wobegon.
I
had known of those earlier studies and wondered about them. One question that comes to mind is whether
one is happier if one's public policy views prevail (more often than not): to what extent is happiness linked to public
policy outcomes? For part of the sample
they drew on, members of Congress, it is conceivable that happiness is indeed
directly linked to public policy outcomes (local, state, national, or
international). For tweets on Twitter,
or LinkedIn photographs, however, the nexus is not clear. For a large chunk of Americans, those who pay
little or no attention to politics at all, public policy preferences surely
have no effect on happiness levels. I
would think that for all groups of Americans, no matter how they are sliced and
diced, happiness will depend far more on one's relationship with a partner, the
health and well-being of one's children and other family members, the state and
breadth of one's friendships, one's physical and mental health, one's economic
security, one's job, and so on. Even for
me, a lifelong political junkie, public policy and electoral outcomes play a
much smaller role in my life satisfaction/happiness than my professional life,
my wife's and my children's status, and so on.
If
the foregoing is generally true, then the difference in happiness levels
between liberals and conservatives is largely unrelated to electoral and policy
outcomes. It must be a difference in
personality or character traits.
(For those who are
affected by politics, it is true that liberals can be extremely critical even
of politicians and public figures whom one might say are on the left. Many of them even do dumb things like vote
for third-party candidates (because the conventional—Democratic—standard-bearer
isn't liberal enough) and thus elect the conservative they most disagreed
with. Until recently, those on the
political right haven't been so discordant about their own, but now with multiple
candidates for the GOP presidential nomination, there is more conservative
backbiting as well. It may be that
liberals were less happy than conservatives because their expectations for
change were always higher and thus their disappointment greater when things did
not change faster (or perhaps did not even change in the direction they
desired). To the extent the happiness of someone of a conservative bent is affected by the political sphere, the success of the Affordable Care Act, the rapid change in opinion about gay marriage, and the marked decline in church attendance and religious belief among younger Americans (as three examples) would argue that the conservative may be unhappier than the liberal.)
The authors
conclude, it seems to me correctly:
The questions raised by
this research are important because of a growing interest in using self-report
measures of happiness to inform public policy.
Our research supports those recommending caution about promoting any
particular ideology or policy as a road to happiness. Research investigating self-report-based
happiness differences between cultures, nations, and religious groups may
inadvertently capture differences in self-reporting styles rather than actual
differences in emotional experience.
One finding I've been told about is
that evangelicals tend to be happier people.
Perhaps. A guy named B. A.
Ritzenthaler put together an interesting collection of national data from
around the world that compared happiness levels and religiosity. (In this case, he used the UN Happiness
Index: "based on life expectancy,
GDP per capita, social support, corruption, generosity, and the freedom to make
life choices. On a scale of 1 to 10
Denmark scored the highest at 7.693 and Togo was the lowest at 2.936. The
United States ranks 17th with a happiness index of 7.082.") Religiosity was based on international Gallup
Poll data. He found a general trend
toward increased happiness with decreased religiosity, but it isn't a strong
correlation (R² = 0.2896, and although it wasn't provided with the
graph, the standard deviation looks to be fairly large). So at least at the gross statistical level,
being religious doesn't seem to mean being happier. The fact that the U.S. ranks as high as it
does may reflect the fact that it has a lot of happy evangelicals.
I
would have bet all along that liberals would be happier than conservatives, for
two reasons. One, liberals by definition
are more accepting of change in the world, and it does change all the time;
conservatives are not (or they wouldn't be conservative). Two, the international literature on
happiness levels around the world demonstrates repeatedly that it is the most
liberal societies that are the happiest; the Scandinavians consistently show up
as the happiest people on earth. If the
most liberal societies are also the happiest, it only makes sense that liberals
will be happier than conservatives.
'Contrariwise,' continued Tweedledee, 'if it was so, it might be; and if it were so, it would be; but as it isn't, it ain't. That's logic.'
Sometime
in the spring of 2014 I was looking at a photograph from the University's
archives, one I'd seen before, of the first four presidents: William Watts Folwell, Cyrus Northrop, George
Vincent, and Marion Burton. All four
were in academic robes, so it must have been some celebratory occasion; my
guess is that it was Burton's inauguration in 1917.
The
photo got me to thinking: how many times has there been a sitting president plus three of his predecessors, and even
further, what was the highest number of predecessors living at any point in the
University's history? A quick scan of
the presidential terms and death dates made me realize that the current
president, Eric Kaler, had five living predecessors—something that had never
happened before in the University's history.
So
I suggested to President Kaler that it might be worthwhile to bring all six of
them together for a public conversation or some kind of event. He thought it was an interesting idea but
said he didn't want to foot the bill for an event like that.
Not
one to give up, I wrote to former president Nils Hasselmo, a friend and
colleague with whom I'd stayed in touch since he left the presidency in
1997. He, too, thought it was
interesting but said it would likely just bring up past troubles and would not
be a good idea. Former president Bob
Bruininks had the same reaction.
So
I dropped the idea. Then I heard from
another friend and colleague, Shirley Clark, to whom I'd mentioned the
idea. (Shirley was a professor of higher
education and my first doctoral adviser, someone I knew would at least have a
passing interest in the idea.) She wrote
back and told me of a similar event that had been held at the University of
Illinois many years ago and described the focus of the discussions. I forwarded her comments to Nils, at which
point he reconsidered and said that if a conversation were focused on the
future of universities and the risks and opportunities for public higher
education, then it might be worthwhile.
I passed Nils's observation to former presidents Bruininks and
Keller. There still appeared to be
little enthusiasm for pursuing the event, so I didn't.
A
month or so later I happened to encounter President Kaler at a meeting, at
which time he said he'd thought more about the 6 presidents idea and decided
maybe it was worth pursuing. I sent him
my email exchanges with Nils, and at that point President Kaler decided to go
forward and asked me to spearhead the effort.
So I did, and on May 4—the ONLY day in several months that worked for
all 6 of them—the six presidents convened on the stage of Northrop Auditorium
and had a conversation in front of a nearly-full house. Everyone thought it went marvelously,
including me.
What
started out as a one-off public conversation morphed into an entire series of
events celebrating the six presidents:
they and their spouses all had dinner at Eastcliff (the University
president's home) the night before, they had a session with students in the
morning, a session with senior faculty over lunch, the afternoon public
conversation, and then a reception and dinner that night. It was a very long day that tired me out—and
three of the presidents are over 80 (but in better physical and probably mental
shape than I am!).
President
Kaler afterwards congratulated me on my perseverance in the face of initial
resistance and on getting it organized.
All 5 of his predecessors wrote to me after the fact and said they'd
very much enjoyed the events and thought they all went extremely well. So I got a feather in my cap. No salary reward, of course, but a feather. Not that I'm looking for many feathers at my
stage of the game! But I was glad to
have had the idea of an event that celebrated a part of the University's
history in a unique way.
"Come back!" the Caterpillar called
after her. "I've something important to say."
This sounded promising, certainly. Alice turned and came back again.
"Keep your temper," said the Caterpillar.
At
the advanced age of 63 I tip-toed into the professoriate last spring: as a favor to a faculty friend and colleague
who was overwhelmed with more responsibilities than any two people could handle
well, I agreed to take on teaching a number of the classes for her graduate
course. The first session I was nervous
as can be, but after that I felt more confident. One problem was that I hadn't given much
thought to the course material since I'd taken it myself 25 years earlier (no,
it wasn't all the same materials as I'd had, although some were—the literature
on organizational problems and development doesn't change that fast). Another problem was that it wasn't a course I
even liked all that much. I really didn't
feel competent to teach this course. But
I re-read the literature and brought my own experiences to the class sessions,
and I guess I did acceptably because the written student comments after the
course was done were very positive.
Some
in the college now have visions of me teaching more. The only drawback to this idea is that
teaching is really hard work! I haven't
had to scramble and work as hard as I did for those classes for many
years—because, I suppose, it was something entirely new for me. One does want to be more informed about the
subject matter than the students in the course, and that's more of a challenge
with graduate students than with undergraduates. Graduate students, especially in this
program, are often adult professionals who are not wet behind the ears, so one
has to be more on top of one's game than with undergraduates.
Thus
my intensive labors on the course I'm going to teach this coming spring
semester.
I don't see how he can ever finish, if he doesn't
begin.
As
seems to happen as we age, I had intersections with the medical establishment
with greater frequency than earlier (and than I care for). I learned that at one point I likely had a "low-pressure
headache" due to a probable pinpoint leak of spinal fluid. But it fixed itself, and the neurologist, whom
I saw two months after the fact, closed his discussion with me by saying he
hoped never to see me again. He meant it
kindly and I agreed with him.
I
knew I'd have to contend with it at some point, and finally did at the end of
May. I have had for a dozen years or
more something known as Dupuytren's contracture, a bulking up of the tendons in
my right pinkie finger extending into the palm, a condition that also pulls the
pinkie and fourth finger curving into the palm.
The contracture had progressed only very slowly, but my physician said it was time
to deal with it. So I had surgery.
The surgery went fine but the week after
it was an annoyance because I had a cast on my right hand. It was amazing to me how hobbled I was with
the cast (I am one of those people who is "severely" right-handed—I
can't do any fine motor activity with my left hand). I could not write, I could only barely use a
computer, I had trouble eating, and a shower was a challenge. I was SO relieved when the cast came off
after a week. I still had to do PT for a month, and had to wear a
removable brace when sleeping or not otherwise using my fingers, but the docs
said at the time that the surgery was a complete success from their standpoint.
(My hand was really ugly, with a swath of black stitches from the first
joint on my right pinkie finger to about an inch and a half into my palm.) Fortunately, I never at any point had any
pain. They gave me enough oxycodone for a month, I think--I took it twice
the first day of the surgery and never again. Elliott jokingly wondered
what the street value of the oxycodone was. I had an unexpected and
unwelcome "vacation" from my office because I was so non-functional.
I was startled at the preparation for the
procedure--had to remove all my clothes (and my wedding ring--on the other
hand!), put on a surgical gown, and be wheeled into their surgery room.
Just for a few cuts on one hand! What was really strange about the
procedure was that I was to be given a sedative and local anesthetic—except
that when they wheeled me in, I was talking with the anesthetist and the
surgical assistant, thinking I'd be sedated but could watch if I wanted to, and
suddenly I was awake later with a cast wrapped around my hand. I never
knew when I went under—it just happened. I never even saw them put in the
IV for the anesthesia—I was just out and then awoke, all done. I was
more surprised than anything when I came to. (Also fortunately, there was
no anaesthesia "hangover" because it wasn't full anaesthesia, just a
powerful sedative. I woke up feeling completely normal and without any
pain.)
I was amused, and initially taken aback,
when I first met with the surgeon for an exam and discussion of options. The first question she asked me was where I
was from. I just looked at her, and
finally said I had grown up in Minneapolis.
"No, no, that's not what I meant.
Where are your ancestors from?"
I looked at her again; that was the oddest question I'd ever been asked
in a medical consult. So I told her my
forebears came from Sweden, Denmark, Germany, and the British Isles. She said, "That's what I expected." It turns out, of course, that her question
was not illogical: Dupuytren's
contracture predominantly affects males over 50 whose background is northern
European. She said she would have been
surprised if I'd said I had Italian or Spanish ancestry, and she said she
almost never sees a Native American, an African-American, or an Asian with
Dupuytren's. The medical establishment
does not know the etiology of Dupuytren's nor why it mostly affects northern
European males. Weird.
"Just look down the road and tell me if
you can see either of them."
"I see nobody on the road," said Alice.
"I only wish I had such eyes," the King remarked in a fretful tone. "To be able to see Nobody! And at such a distance too!"
As
friends of ours move out of the houses they've lived in for a very long time,
they of course get rid of many possessions—they get rid of the stuff that
accumulates in a house when one has lived in it for two or three decades. Watching that process, and talking about it
with them, led us to look around nervously at our own house. We have no plans to move in the immediate
future and figure that we might be here another ten years or so, depending on
our health and activity levels. It isn't
such a big house that we wander around unused rooms now that my kids have moved
out. (Well, Elliott presumably will move
out after he graduates in December 2016.)
There
is, however, a large amount of "stuff" scattered around. One big chunk of it is about a dozen Xerox
boxes filled with Legos; Elliott absolutely does not want me to sell them
because he will take them when he moves.
Unfortunately, that likely must await his moving into a house, not an
apartment, because he'd have nowhere to store all those boxes in an
apartment. Another big chunk of "stuff"
is the eight sets of china we possess.
That wasn't planned, by any means, but Kathy's inherited three sets from
her mother in the last couple of years—as her mother prepares to sell the house
in Nebraska and move to a seniors' place in the Twin Cities. I already had five sets of china, also not
really planned. So our kids are going to
inherit enough china to last them and their grandchildren (if ever there are
any) a lifetime. Compounding the china
accumulation is a lot of other glassware—bowls, platters, glasses for various
purposes. Both Kathy and I have a
weakness for buying glassware and pottery (going to all the glass factories in
Sweden didn't help). And then there are
rows and rows of that antique technology, books.
We
took a baby step toward reducing the volume.
I had to move a bookshelf for a plumber to get access to the bathtub
plumbing, so I took the opportunity to at least get rid of about three bags of
books. But it's hard to give up those
old friends, even if I won't ever re-read the vast majority of them. (Even Elliott admires the bookshelves as
interior decoration but he won't want all my history and biography and
fiction.) With the introduction of the
eighth set of china from Kathy's mother after a visit Kathy made to Nebraska in
May, we rearranged all the hutch and buffet and other shelves and managed to
get rid of two bags of glassware. We
also (stupidly) put a bunch of stuff in boxes and put the boxes in the
basement; I said later we should just have donated all of it because once
something goes in a box in the basement, it surely won't ever be used
again. My defense (to myself) for
keeping most of the glassware and china is that we use most of it from time to
time during the year. Much of it has
sentimental or family value as well, and I am constitutionally unable to toss
anything that came from my parents, grandparents, or great-grandparents.
I
suppose the real pressure will come when we actually decide to move. I tell the kids they'll be getting a lot of
stuff at that point. Elliott rolls his
eyes and says "EBay." But then
he (and Krystin) start to look around and say "well, I want that" and
that and that.
The
reason we got into the (very modest) downsizing was that we spent
approximately 2% of our gross annual income on plumbing and bathroom costs in
May. I had to move the bookshelf to
accommodate the plumber because the bathtub had had a tiny faucet leak for
decades and had ruined the tub. We never
use the tub so I hadn't really paid much attention. So we had an acrylic tub liner installed and
of course fixed the damn plumbing at the same time; the access panel for the
bathtub plumbing was behind the bookshelf.
At the same time, some of the remaining steel plumbing pipes in the
basement suddenly began to spring leaks, so we had to replace all of
those. I've used the same plumber for
over 20 years; he does superb work and he's quite expensive. He has essentially re-plumbed this entire
house. Nothing he's ever done, however,
has had to be fixed again later.
Later
in the summer, as we were selling Kathy's townhouse (we've been landlords for
the last four years), we had more plumbing expenses, to put the unit in tip-top
shape for a buyer. So it went somewhat
over 2% of our gross income.
We
were writing out rather large checks for several weeks just to pay for
effective distribution of water in the house.
Not my idea of a fun way to spend money.
We can but stand aside, and let them Rush upon
their Fate! There is scarcely anything of yours, upon which it is so dangerous
to Rush, as your Fate. You may Rush upon your Potato-beds, or your
Strawberry-beds, without doing much harm: you may even Rush upon your Balcony
and may survive the foolhardy enterprise: but if you once Rush upon your
FATE--why, you must take the consequences!
Concurrent
with minor medical and plumbing fun, Kathy and I put in a native wildflower
garden. My geriatric raspberry bushes
were not producing many new canes and the berries were getting smaller and
smaller as the years passed. By the end
of the autumn of 2014 the patch had turned into a large weed bed—it was a mess.
We
hired Elliott and a friend to rip out all the raspberry bushes and everything
else that had been growing in the plot (about 10' by 20'), added some new dirt,
and then went out and spent about $300 on weeds. Or what looked like weeds. We wanted a garden with plants that are bee-
and butterfly-friendly and we wanted one that we would plant once and that
would take care of itself thereafter.
Most northern native wildflowers either resemble weeds or are indeed
plants that many of us have for years considered weeds (e.g., goldenrod,
milkweed, wild columbine). So we planted
about 50 plants, neatly spaced and grouped.
I
assumed that next summer it would be a wild profusion of greenery and flowers
because all of these plants spread. It
didn't even take that long; by the end of this past summer the plants had already
grown large and spread considerably. When they
all come back next spring it will be a mess!
We have been rewarded, however, by suddenly seeing many
butterflies—including monarchs—and bumblebees in the yard. We'd already had bees because of the hosta
flowers, I think, but now we had more (and different kinds of) bees, as we were
told to expect.
Although
nothing was said when I returned to the doctor to have the cast removed, the
cast was rather dirty. I gardened even with the
cast, although I was a little hobbled by it.
Kathy did much more of the work than I did, but I helped out as I
could. The surgeon wasn't involved in
the cast removal but I imagine she might have scolded me had she known. I have sometimes not been all that good about
following medical instructions.
'Speak
when you're spoken to!' The Queen sharply interrupted her.
'But if everybody obeyed that rule,' said Alice, who was always ready for a little argument, 'and if you only spoke when you were spoken to, and the other person always waited for you to begin, you see nobody would ever say anything, so that -- '
'Ridiculous!' cried the Queen. 'Why, don't you see, child -- ' here she broke off with a frown, and, after thinking for a minute, suddenly changed the subject of the conversation.
Elliott
painted this (20" by 28" oil) portrait of me last April, from a
photograph. It's a big picture. One friend of mine who saw it exclaimed "Wow! I'm
greatly impressed. Hans Holbein reincarnated with a contemporary twist."
A couple of other people (including
Elliott's instructor for the class in which he painted it) describe him as
painting in the Van Gogh style. My
friend maintained that Elliott won't have any trouble making a living. Perhaps.
My college roommate and good friend
Gary Voegele has a neighbor who is a professional artist who does commissioned
portraits. Elliott and I looked at his
website; for a portrait (head and bust) he charges $8,000 and for a full-length
portrait the fee is $27,500. They
require 2-3 sittings and multiple photos and take anywhere from six months to a
year to complete. Elliott's for me was
free and took him about three weeks, but as a result of my posting a copy of it
on Facebook, he got two (paid) commissions.
Inasmuch as Gary's neighbor lives in a rather nice home in Lakeville,
one presumes he's reasonably successful financially as an artist.
Elliott tells me that he plans to have a
day job, at least to start with. His
ultimate objective is to use his skills and talents in video game design; if he
achieves success in that adventure, he'll do just fine in life. He can buy a condo in Italy overlooking the
Mediterranean for Kathy and me.
Elliott
later sent me a text of another picture he'd painted, a model in one of his
classes. I asked him if he planned on
selling it or if it was one that I could have.
He wrote "what makes you think default is [the picture] goes to you
for free?" and added an emoji with its tongue sticking out. I pointed out that I'd written "is it
one I get for a while," because
even if I have them, he'll inherit them all (back) in any case. Unless, I added, "I can sell one for
$100K."
To
which Elliott responded "if you can get someone to buy one of my paintings
for $100K, I'll let you keep half of it."
A deal.
"She
can't do Subtraction," said the White Queen. "Can you do Division?
Divide a loaf by a knife—what's the answer to that?"
"I suppose-" Alice was beginning, but the Red Queen answered for her.
"Bread-and-butter, of course."
Here's
an odd piece related to World War I.
The
capital of Serbia, Belgrade, acquired a new monument last June—to Gavrilo Princip,
the Bosnian-Serb nationalist who assassinated Archduke Franz Ferdinand on June
28, 1914, the act that set off the chain of events that led to WWI. Princip is being described as someone who "became
part of Serbian history and sacrificed himself for freedom." Princip died in an Austrian prison four years
later. Some consider him a hero, some a
terrorist. Guess it depends on where you
stood. Or stand.
"It
is a very inconvenient habit of kittens (Alice had once made the remark) that
whatever you say to them, they always purr.
"If
they would only purr for 'yes,' and mew for 'no'; or any rule of that sort,"
she had said, "so that one could keep up a conversation! But how can you
talk with a person if they always say the same thing?"
One
of the more interesting research reports I read this year, confirming what many
of us understand from sometimes unhappy experience, was titled "Why we blow up when we argue
about politics." The authors report
on research from the University of California, Santa Barbara, which found that
people can discuss many topics, have lively conversations, but not get
angry. Turn to politics, however, and "lively disagreements can get downright ugly"
and, if pursued, even threaten relationships.
Why does this happen, they ask?
After a series of experiments, they
concluded that political party affiliation works like membership in a gang
or clique: parties are rival groups. That
similarity, in turn, arose because of evolution: when we were hunter-gatherers, it was important
for humans to know who was on their side and who wasn't when there were
disputes. What's surprising is that the
political characterization of others transcends race; the major concern is
coalition. People will remember someone
else's political affiliation or beliefs but they will ignore race in
determining coalition membership; race does not necessarily predict
alliances. (That is why, they argue,
ethnically different does not preclude success in coalition leadership; they
point to Benjamin Disraeli—a Jew in Anglican England—Arnold Schwarzenegger in
California, and Obama in the U.S. Not
sure that Disraeli and Schwarzenegger were different "races," but I
guess they were using the term broadly.)
"When people express opinions that reflect the views of different
political parties, our minds automatically and spontaneously assign them to
rival coalitions."
I have two reactions to the findings
and their conclusions, reactions that don't entirely cohere with each other.
One, I'd like to see more evidence before I'd be willing to ascribe the
potential hostility in political disagreements to evolution and what is
essentially clan affiliation. Two, their
general observation that discussions centered on politics, as opposed to other
topics, can turn nasty accords with my personal experience. But I'd put religion in the same category;
religious disagreements can turn as vehement (and bloody) as political
disagreements, if one considers the outcomes of disagreements in human history
(e.g., the Inquisition and the battles between Christians and Ottoman Muslims
in the Balkans, to say nothing of ISIS and what's going on in the Middle East). We have all learned that unless one is in a
politically homogenous group, wisecracks or disputatious comments about
politics are out of bounds; so also with religion.
Shortly after the research was
published, there appeared in the Public
Library of Science a report from Aarhus University in Denmark on
physiological testing conducted to determine people's reactions to proposals
that came from their own political party and from opposing parties. They attached electrodes to skin and measured
physiological reactions to political party symbols and public policy
propositions. Pretty interesting what
they learned: people who had a
physiological reaction to a party symbol were biased toward the proposals from
their own party and against those from other parties—even when the proposals
were identical! Simply identifying with
a party was not sufficient to induce the bias, however; only those who
exhibited the physiological reaction demonstrated the bias. They concluded that "our partiality
seemingly stems from instinctive emotional reactions."
The authors went on to comment that
their finding does not render political debates irrelevant. Such events signal party directions; it's
just that people are differentially affected by the presentations and
arguments. Those with strong party
affiliation, reflected in the physiological reactions, will support their party's
proposals almost irrespective of the quality of arguments. We all know of people who will support their
preferred party's proposals regardless of the merits (or lack thereof).
"I
could tell you my adventures—beginning from this morning," said Alice a
little timidly; "but it's no use going back to yesterday, because I was a
different person then."
I had a disappointing sense of déjà
vu in June. When I took my first "real"
job at the University, in June, 1975, I worked for one of the University's vice
presidents, a guy named Walter Bruning (whom virtually no one at the University
now remembers because he was only there two years and he had come to Minnesota
from Nebraska when Peter Magrath was appointed president; Magrath had worked
with him in Nebraska and brought him to be a vice president/"chief of
staff"). Although my position was
technically an "administrative internship," it was nearly full time
and evolved fairly quickly into a regular staff position. Anyway, effective January 1976, President
Magrath rearranged some of the vice presidential reporting lines and
responsibilities, and among other things he moved the reporting line for
athletics from another vice president to Walt Bruning.
In 1976 the University's support for
women in college athletics was minimal, to put it charitably (as was the case
all over the country). The men's budget
was probably in the several millions of dollars; the women were provided less
than $100,000.
To back up a bit, in 1972 Congress
passed Title IX (and Richard Nixon signed!), an amendment to the Higher
Education Act of 1965; the amendment provided that "No person in the United States shall, on
the basis of sex, be excluded from participation in, be denied the benefits of,
or be subjected to discrimination under any education program or activity
receiving federal financial assistance."
This was a huge step forward for women in education at all levels, but
little attention was paid to the effect it would have on college athletics;
the discussion at the time focused on employment, admissions, and so on.
That changed very quickly.
By 1976,
when President Magrath changed the administrative responsibilities at the
University, institutions of higher education (and K-12, for that matter) were
coming to realize that Title IX had far-reaching implications. Minnesota, like many of its peers, hired a
person to serve as Title IX coordinator, to review admissions, curriculum,
hiring and employment rules, and a host of other institutional policies and
practices that could or did have an adverse impact on women. (That person, Ann Pflaum, had an office right
next to mine and we've been colleagues ever since; Ann much later became the
University historian.) Walt Bruning
charged Ann with responsibility for everything at the University related to
Title IX except intercollegiate athletics; athletics fell to me, because Walt
assigned general staff oversight responsibility for athletics to me in addition
to analyzing the Title IX implications.
It was an
ironic choice. I had never played
sports, had never been interested in either college or professional sports, and
rarely attended athletic events. I knew
how football and hockey and basketball were played, but that was about it. A number of people (mostly in women's
athletics) said that in one way I was an ideal person to take on the charge of
the Title IX analysis: I knew squat
about athletics and didn't give a rat's behind about it, so I brought an
entirely dispassionate view to the problems.
Although it
didn't have a significant effect on what the University was doing, the student
government on campus had filed a complaint with the federal Office for Civil
Rights (in the Department of Health, Education, and Welfare) alleging that the
University was not in compliance with Title IX in athletics. (I had been in student government at the
time, and while I had little interest in athletics, I supported the idea of the
complaint.) One reason the complaint
didn't have much impact is because the Office for Civil Rights was slow about
investigating complaints and finding violations, to put it mildly. The threat of a finding of a violation was
the withholding of all federal funds to an institution; there were few in
higher education who believed that the feds were going to stop all the funding
going into bio-medical and other scientific research at a university because it
wasn't providing enough gymnasium time and hockey sticks to a few women (to put
it cynically).
The Office
for Civil Rights was so slow in enforcement, and seemed so unable to figure out
what to do, that Vice President Bruning at one point paid for me to fly to
Washington to meet with the head of OCR to explain what Title IX compliance
should look like. As I look back on it,
that was an interesting event: me, in my
late 20s, going to tell the head of a federal agency how he should do his
job. (He did meet with me and he said he
appreciated the analyses and explanation of how to consider compliance.)
In any case,
for the next several years, I worked with the people in athletics and developed
a series of analyses that looked at participation rates and opportunities
(e.g., number of sports offered for men and women), coaching positions and
salaries, facilities availability for practice and competition, funding for
athletic "scholarships" (grants-in-aid is the technical term, and
more accurate), funding for equipment, team travel, recruiting, publicity, and
administrative support, and so on. The
financial implications were significant; if it were to be in compliance with Title
IX in athletics, the University would have to substantially increase the
financial support for women's sports. Or
cut the budget for men's sports, but no one gave that alternative much thought. One of the great problems for colleges and
universities is that only a handful [literally; no more than perhaps 8-10
programs consistently make money] of athletic programs actually generate a
surplus; a couple of years ago, the average NCAA Division I school was putting
$10 million per year from student fees or institutional resources into
athletics. So the existing men's
programs—then and now—were not a cash cow that could be used to support women's
programs.
Over a
number of years, Minnesota (and most of its peers) gradually achieved something
approaching compliance, patching together funding as they were able, and by the
turn of the millennium there was probably reasonable parity in opportunity and
resources for men and women in college sport.
In recent years, however, the balance seems to have tipped again in
favor of the men, primarily because of the increasing revenue-generating
capacity of men's basketball and football (and, in the latter case, because of
the growth of the "power five" conferences, schools in which are
spending more and more money).
Now, my déjà
vu: the University of Minnesota is again
being investigated by the feds for Title IX compliance in intercollegiate
athletics. My guess is that quite a few
of our peer institutions could be subject to the same problems: the charge is that the department is devoting
a disproportionate share of the resources to the men's programs. Sigh.
I thought, by the time I left the vice president's office in 1986, that
the problem had largely been dealt with.
I am reassured, however, that the University will deal with this
promptly.
'My NAME is Alice, but—'
'It's a stupid enough name!' Humpty Dumpty
interrupted impatiently. 'What does it mean?'
'MUST a name mean something?' Alice asked
doubtfully.
'Of course it must,' Humpty Dumpty said with a
short laugh: 'MY name means the shape I am—and a good handsome shape it is,
too. With a name like yours, you might be any shape, almost.'
I was
extremely pleased in the middle of the year that Kathy and I were invited to
attend a wedding celebration of a faculty colleague. I am happy to pat myself on the back in this
case because we (my faculty friend and I) began talking about being divorced
several years ago (when we both were). I
was somewhat farther along in the relationship-building process than she was; I'd
met Kathy but she hadn't met anyone. I
urged her to try match.com. She did so
and met a guy to whom she got married last June. She's happy to acknowledge that I gave her
the nudge that got her going in the right direction! I'm always glad to take credit for a happy
ending.
I don't think...
Then you shouldn't talk, said the Hatter.
Kathy and I had great fun sitting on our deck in the back
yard for the cocktail hour. We put up
two bird feeders, each with multiple feeding spots, and watched with amusement
the social patterns and pecking orders (the pun is intended) of sparrows (and a
few other kinds of birds). We would have
a dozen or more birds fluttering around the feeders and others hanging out in
the local shrubbery waiting their turn.
Or, perhaps more accurately, waiting to zoom in and push another bird
off the feeder.
In
the meantime, our "pet" chipmunk would come on to the deck and wait
for the food she knew we would toss to her—grains, cashews, other nuts. We did this so often with her, and last
summer as well, that she would come up and sit right next to our feet while
eating. The other chipmunk who seemed to
have resided nearby, no dummy, sat beneath the bird feeders and snarfed up all
the seeds and corn that the birds dropped.
On occasion one of the chipmunks would climb up the bird feeder to get
at the seeds—and scare the birds away. I
fixed her, however, by rubbing Vaseline all over the bottom half of the pole
holding the feeder; as soon as she touched it, she'd back away. Ha!
Then a
rabbit or two would hop through the fence and find a couple of the garden
plants it enjoyed. I put a stop to that,
first by chasing it away and then ringing the plants with powdered fox urine. Ugh, that stuff smells, but it did keep the
rabbits away from the plants.
Later in
the fall, the squirrels re-appeared (we didn't see them much in the
summer). They can't get food from the
bird feeders—the people who design these things have finally come up with a couple
that really do defeat the squirrels. But
they, like the chipmunks, are happy to sit on the ground and eat what
falls. They get plenty of food because
birds are among the messiest eaters on the planet. Even in November we could look out the window
and see 2-3 squirrels nosing around on the ground while 3-4 birds were
eating and tossing seeds all over.
Chipmunks must hibernate early because they disappeared early in
October.
So that's
the wildlife news for the year. The 3
cats are doing fine, doing what cats do, which is not much.
Alice never could quite
make out, in thinking it over afterwards, how it was that they began: all she
remembers is, that they were running hand in hand, and the Queen went so fast
that it was all she could do to keep up with her: and still the Queen kept
crying 'Faster! Faster!' but Alice felt she COULD NOT go faster, though she had
not breath left to say so. . . .
'Now! Now!' cried the
Queen. 'Faster! Faster!' And they went so fast that at last they seemed to skim
through the air, hardly touching the ground with their feet, till suddenly,
just as Alice was getting quite exhausted, they stopped, and she found herself
sitting on the ground, breathless and giddy.
The Queen propped her up
against a tree, and said kindly, 'You may rest a little now.'
Alice looked round her
in great surprise. 'Why, I do believe we've been under this tree the whole
time! Everything's just as it was!'
'Of course it is,' said
the Queen, 'what would you have it?'
'Well, in OUR country,'
said Alice, still panting a little, 'you'd generally get to somewhere else--if
you ran very fast for a long time, as we've been doing.'
'A slow sort of country!'
said the Queen. 'Now, HERE, you see, it takes all the running YOU can do, to
keep in the same place. If you want to get somewhere else, you must run at
least twice as fast as that!'
A question
that had occurred to me from time to time, as I see all the data confirming
anthropogenic climate change, is whether the changes in the atmosphere will
affect the ability of human beings to breathe it. We know human activity is increasing the
amount of CO2 (among other things) in the atmosphere; would that
increase make it harder to breathe?
My consulting
meteorologist/climatologist tells me no.
He notes that while the atmosphere is very sensitive to the amount of CO2
it contains, CO2 is only a tiny part of the total atmosphere. The risks to health, because of breathing
difficulties, will come from tropospheric ozone—which comes from use of fossil
fuels—which becomes smog. The byproducts
of the use of other chemicals pose other health threats. But not CO2 itself. I guess I'm glad to know.
Alice:
How long is forever?
White Rabbit:
Sometimes, just one second.
Aeon had a
piece by journalist and author Donald Lazare about how the American
constitution is seen, in some quarters, as a nearly sacred document. Lazare wonders why the U.S., unlike any other
developed nation, considers its founding document so authoritative. As he points out, it was crafted in the days of "silk
knee britches and powdered wigs" and it's not clear why we'd consider
ourselves bound by the views of society and government prevalent at that
time. Yes, the constitution can be amended,
but the hurdles to doing so are high (a two-thirds vote in both houses of
Congress and approval by three-quarters of the states)—so high, he points out,
that "13 states representing as little as 4.4% of the population can veto
any change sought by the remaining 95.6% of the population." In political practice, I doubt that's the way
the vetoes actually occur, with only the smallest-population states banding
together to block an amendment—although I could see them doing so if an
amendment proposed to apportion Senate seats on the basis of population rather
than two per state.
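The blocking arithmetic behind Lazare's claim is easy to check: ratification requires three-quarters of the 50 states, so 38 must approve, which means 13 states are enough to deny ratification. A minimal sketch (the function name and parameters are my own; the 4.4% population figure is Lazare's and is not computed here):

```python
import math

def blocking_states(n_states=50, ratify_fraction=3/4):
    """Smallest number of states that can block a constitutional amendment."""
    needed = math.ceil(n_states * ratify_fraction)  # states required to ratify
    return n_states - needed + 1                    # one more than the shortfall

print(blocking_states())  # -> 13
```

With 50 states, 38 are needed to ratify, so the 13 smallest states acting together suffice to veto any amendment, whatever share of the population they hold.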
Lazare makes
a point that has occurred to me before and notes statistics I hadn't seen. "In the 114th US Congress,
67.8 million people voted for senators who caucus with the Democratic
Party, while 47.1 million voted for senators who caucus with the
Republican Party. Yet those 67.8 million votes elected 46 senators
while the 47.1 million votes elected 54 senators." This is not a representative system.
Federal
Judge Richard Posner (7th Circuit), in a review article in the Yale Law Journal, commented on the
question of the constitution as governing document:
Federal
constitutional law is the most amorphous body of American law because most of
the Constitution is very old, cryptic, or vague. The notion that the
twenty-first century can be ruled by documents authored in the eighteenth and
mid-nineteenth centuries is nonsense.
Kathy
and I attended a talk/interview with Justice Scalia, who is one of those who
argue for a strict construction of the constitution. Posner, in effect, clearly believes that to
be silly. So do I.
Do you hear the snow against the window-panes,
Kitty? How nice and soft it sounds! Just as if someone was kissing the window
all over outside. I wonder if the snow LOVES the trees and fields, that it
kisses them so gently? And then it covers them up snug, you know, with a white
quilt; and perhaps it says, "Go to sleep, darlings, till the summer comes
again."
I
don't know if other parts of the country—or other countries—have indices for
the livability of their weather. For
some time we've had the Winter Misery Index, which measures exactly what the
name suggests. It "assigns points
for daily counts of maximum temperatures 10 degrees or colder and daily minimums
of 0 or colder. . . . For snowfall, one
inch is assigned a point per calendar day. . . . The duration of a winter is noted by the
number of days the snow depth is 12 inches or greater." [Minnesota
Department of Natural Resources]
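Out of curiosity, the scoring the DNR describes is simple enough to sketch. Here's a minimal Python version; the point values follow the description quoted above, while the daily-record format and the choice to award one point per deep-snow day are my own assumptions, not the DNR's exact method:

```python
def winter_misery_index(days):
    """Tally Winter Misery Index points over a season.

    days: list of dicts with keys 'tmax' and 'tmin' (degrees F),
    'snowfall' and 'snow_depth' (inches), one dict per calendar day.
    """
    points = 0
    for d in days:
        if d['tmax'] <= 10:        # frigid daytime high
            points += 1
        if d['tmin'] <= 0:         # subzero overnight low
            points += 1
        if d['snowfall'] >= 1:     # an inch or more of snow that day
            points += 1
        if d['snow_depth'] >= 12:  # deep snowpack marks a long winter
            points += 1
    return points

# Example: one miserable day and one mild (but still snow-covered) day
sample = [
    {'tmax': 5, 'tmin': -10, 'snowfall': 2, 'snow_depth': 14},
    {'tmax': 25, 'tmin': 12, 'snowfall': 0, 'snow_depth': 14},
]
print(winter_misery_index(sample))  # day 1 scores 4, day 2 scores 1 -> 5
```

A real season's index would just run this over all days from, say, November through April.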
The
DNR has now developed the Summer Glory Index (SGI), which "tells us
approximately how wonderful our June through August meteorological summer has
been. It combines the effects of temperature, humidity, and precipitation." [Also DNR]
The summer of 2015, we learn, was the third most glorious for the period
since records exist, from the early 1900s.
Only 1922 and 2008 had nicer summers.
My
memory is so bad. I can't remember the
summers of 1983 and 1988 well enough to know that they were "wretched,"
although I do recall 1988 as being very hot and burning out many lawns. There are another half dozen in the last 30
years that were NOT glorious that I also can't recall. I'd only be able to remember if I'd kept a
weather diary all my life—and I just don't care enough to have done so or to
start now. I'll take the DNR's word for
it that they were crappy.
'So
here's a question for you. How old did you say you were?'
Alice made a short calculation, and said 'Seven years and six months.'
'Wrong!' Humpty Dumpty exclaimed triumphantly. 'You never said a word like it!'
'I thought you meant "How old ARE you?"' Alice explained.
'If I'd meant that, I'd have said it,' said Humpty Dumpty.
British
travel journalist and three-book author Richard Grant (born in Malaysia, and a
sometime resident of Kuwait and Arizona) asked in an article headline "Q: Is travel a
necessary component of the good life?"
He answered his question thus: "No,
but go anyway and bring an open mind."
His books are about nomadic life in America, lawlessness in the Sierra
Madre mountains in Mexico, and dangerous situations in Africa that he
encountered.
Grant
reports that "people who live in deeply rooted, pre-modern cultures are usually
calmer, happier and more fulfilled than we are, despite their relative lack of
comfort, convenience and material possessions. Very rarely do they feel the
desire to travel, and it certainly plays no part in their idea of a life well
lived." We in 21st
Century America, especially those who are reasonably affluent, aren't
pre-modern; "modern consumer living comes with a built-in restlessness and
dissatisfaction." Or if not
dissatisfaction, at least an urge to see other parts of the world.
I've long
maintained that there are two categories of people: those who travel and those who vacation. We usually travel, and then have to come home
and go back to work to rest. Grant makes
a different distinction, which I like: "we
must make a distinction between travel, which takes the traveler out of his or
her comfort zone, and tourism, which strives to maximize comfort and
familiarity in a foreign setting. A good travel experience is not relaxing, but
stimulating and taxing. The senses are on full alert. The mind struggles to
keep up with the bombardment of unfamiliar data, the linguistic difficulties,
the puzzles and queries and possible threats." I confess, sometimes I like
to be a traveler, sometimes to be a tourist.
Going to Asia was being a traveler.
I suppose going to Europe is partly travel, partly tourism, for
Americans; Europe isn't quite as different as Asia.
"Travel
enables us to see our own culture more clearly, by contrasting it against
others." I agree, and it seems to
me that the most rabid American "patriots" on the right are often
those who've never traveled, never seen another culture, never understood that
we aren't the only ones who feel "exceptional."
Here the
Red Queen began again. 'Can you answer useful questions?' she said. 'How is
bread made?'
'I know THAT!' Alice cried eagerly. 'You take some flour—'
'Where do you pick the flower?' the White Queen asked. 'In a garden, or in the hedges?'
'Well, it isn't PICKED at all,' Alice explained: 'it's GROUND—'
'How many acres of ground?' said the White Queen. 'You mustn't leave out so many things.'
Elliott
and I had a brief exchange last summer, one that reflects news articles I'd
seen over several months. Elliott doesn't
have a driver's license—a permit, but not a license. He told me he hates to drive. I told him that I thought he should just get
the license so that if an emergency arises, now or in the future, he'd be able
to drive. His response was not
illogical: why should he get a license,
and thus start incurring the cost of car insurance (I have never had to pay car
insurance for him because, with only a permit, he isn't counted as a driver)? If there's truly an emergency, he said, he'd
drive without a license and pay the fine, which would be a lot cheaper than
paying hundreds of dollars per year in car insurance.
He
occasionally points out to me that he's saved me thousands of dollars by not
having him on my car insurance for the last 9 years.
I
point out to him that if and when he's married and has a kid or kids, his wife
may not be too keen on being the only driver.
A sick kid needs medicine while mom is nursing,
but dad can't drive to the drugstore to get it? I think mom would get really tired of that in
a hurry. But we shall see—and if the day
comes, I certainly won't be in the middle of it.
The sun was shining on the sea,
Shining with all his might:
He did his very best to make
The billows smooth and bright--
And this was odd, because it was
The middle of the night.
I
sort of liked this little excerpt from a Financial Times piece:
J.M. Keynes was right to predict global
prosperity, wrong to predict that this would mean lives of leisure for all.
Where did his analysis break down? "The
noble answer is that we rather like some kinds of work. We enjoy spending time
with our colleagues, intellectual stimulation or the feeling of a job well
done. The ignoble answer is that we work
hard because there is no end to our desire to outspend each other."
I can truthfully say that the first
answer is more applicable to me. I don't
want to outspend others, I just want to have enough to spend on what I
want. I suppose it's true that "what
I want" is to a certain extent driven by what I see that others have, but
I think only to a modest extent. And in
those cases when it is true, it's not because I want to outspend anyone, it's because I happen to like whatever the others
are spending money on!
And I do like spending time with the
vast majority of the people with whom I work—and the stimulation I receive from
them.
'He said he would come in,' the White Queen
went on, 'because he was looking for a hippopotamus. Now, as it happened, there
wasn't such a thing in the house, that morning.'
'Is there generally?' Alice asked in an astonished tone.
'Well, only on Thursdays,' said the Queen.
These are among my favorite people,
especially at dinner parties. I am not
one of them, unfortunately. Elliott
comes close.
Well, it's no use your talking about waking
him, said Tweedledum, when you're only one of the things in his dream. You know
very well you're not real.
I am real! said Alice, and began to cry.
You won't make yourself a bit realer by crying, Tweedledee remarked: there's nothing to cry about.
If I wasn't real, Alice said–half laughing through her tears, it all seemed so ridiculous–I shouldn't be able to cry.
I hope you don't think those are real tears? Tweedledee interrupted in a tone of great contempt.
Six
times to the dentist's chair in 4 weeks is not my idea of a good time. First I have a molar on one side—which one I
cannot tell—that aches slightly; I see the dentist and he (and I) conclude it's
likely from clenching my teeth (which I do from time to time for no reason that
I can ascertain). That seemed likely,
given that I couldn't even pinpoint which tooth ached.
A
few days later, a chunk breaks off from a molar on the other side of my mouth
and leaves exposed a filling that's probably 50+ years old. Fortunately, there's no pain, but in to the
dentist I go. He says I need a crown on
that molar. I'm not thrilled but not
surprised. I schedule an appointment for
a week or so later, and another two weeks after that.
At
the same time, my other molar has continued to bother me, so the dentist drags
in the endodontist who's on her way out the door for the day. She taps teeth with a metal gizmo (you always
love the feel of metal on teeth), puts cold and hot against them; I jump when
she puts the heat on one of them. She
examines further and concludes I have an infected abscess underneath one of my
existing crowns that requires a root canal; she puts me on antibiotics and
ibuprofen—and squeezes me into her schedule the next day to get the first half
of the root canal done.
Which
she does, a process that involves drilling through a crown after giving me
enough novocaine to take out a battalion (three injections). I just love the noise and smell coming from
my mouth during drilling. But she can't
finish because of the infection; blood and pus came out of the crown once she'd
drilled through it, she told me, and the infection needed to be gone before she
could finish. Even with all the
novocaine, she kept hitting sensitive spots that made me jump in the chair.
Four
days later, back for the first appointment for the new crown. More novocaine,
more drilling. Get a temporary crown and I'm to come back two weeks later to
have the permanent crown put in. The
next night the temporary crown breaks and it feels like part of it disappeared
(I think I swallowed it while eating dinner.
Yuck.)
The
next morning is the second root canal appointment. First she confirms that I
indeed lost half the temporary crown.
Massive novocaine again, more drilling.
This time heat and burning smells as well, as she injects the inert material
into the canal. But she finishes, says
it went perfectly.
From
the endodontist (in a different location) I drive back to the dentist (at the U
Student Health Service) and get a new temporary crown. Fortunately, no novocaine, just a new
temporary. Two weeks later, back to get
the new permanent crown. That went
without incident, fortunately.
The
bad part is that my dentist tells me I need two additional crowns (because the
fillings from my childhood are slowly loosening/deteriorating and the molars
will split); he suggests I get them before he and I retire, because the dental
insurance I have while working is better than what I'll have in retirement.
'Which
reminds me—' the White Queen said, looking down and nervously clasping and
unclasping her hands, 'we had SUCH a thunderstorm last Tuesday—I mean one of
the last set of Tuesdays, you know.'
Alice was puzzled. 'In OUR country,' she remarked, 'there's only one day at a time.'
The Red Queen said, 'That's a poor thin way of doing things. Now HERE, we mostly have days and nights two or three at a time, and sometimes in the winter we take as many as five nights together—for warmth, you know.'
'Are five nights warmer than one night, then?' Alice ventured to ask.
'Five times as warm, of course.'
'But they should be five times as COLD, by the same rule—'
'Just so!' cried the Red Queen. 'Five times as warm, AND five times as cold—just as I'm five times as rich as you are, AND five times as clever!'
MIT
psychologist Professor Sherry Turkle, about whom I had a few comments last year
because of her conclusions that electronic instruments of communication are reducing—degrading—the
amount and quality of live conversation that people have, with troubling consequences
for human relationships, has come out with a book on the subject, Reclaiming Conversation: The Power of
Talk in a Digital Age. Reader
reviews on Amazon are mostly extremely positive, as were those in the New
York Times, the Boston Globe, and the Atlantic, among many
places. I am limited in how much I can
say about the book because I haven't read it, although I noted with interest
one critic on Amazon who claimed the book was nothing more than a large set of
anecdotes rather than a work that synthesizes and seeks to create theory about
human interactions. If true, that's a
telling critique.
I'm obviously a crank on this
subject because I still don't agree with much of her argument (as I understand
it from numerous book reviews). It may also
be, however, that my relationships with friends and family are outside the norm
that Turkle sees in the world. I do talk
at length with my spouse and my kids (in addition to texting and
emailing with them). I talk with
friends. I see my kids talk to their
friends (in addition to texting and emailing with them).
I'm inclined to concur with a reader
comment in the Atlantic, in response to a series of photos of people together
but glued to their cell phones.
There were no glory days when family members,
lovers, or friends paid attention to each other during every hour spent
together. People read books and newspapers, knitted, listened to the radio,
watched TV, darned socks, put together jigsaw puzzles.
So let's just let go of this notion that
cell phones are isolating people. More often than not, they are conversing with
others on the other end of that cell phone conversation, people who in the past
would have been out of their lives because it was too expensive to call long
distance and letters took ages to get there.
I
agree. Sometimes Kathy and I will be on
our cell phones while out to dinner—and more often than not it's looking up the
answer to a question that's come up over dinner conversation!
'Are we
nearly there?' Alice managed to pant out at last.
'Nearly there!' the Queen repeated. 'Why, we passed it ten minutes ago! Faster!'
Elliott texted me a question that made me
laugh out loud—I really did LOL (which I don't do very often).
So I'm having this discussion with some
classmates. Thought I'd pass on the
question to someone who might have an answer:
was there a day when you woke up and thought "gee, I'm officially
an adult now and I have a purpose" or do you just sort of continue to
wander through life never quite sure exactly what's going on or what you're
supposed to be doing? I'm guessing from
a psychology standpoint that that moment probably doesn't exist.
I told him that my flip response is to say
that I still don't know what I'm going to do when I grow up. But I said that his last comment was probably
correct: there's no such epiphany.
I haven't seen any research on the
question. But, of course, I poked around
a little on the web.
Psychology Today had a narrative
piece, nothing approaching "research," but with observations that are
aligned with what I've seen and read about.
After WWII and even into the 1960s, it was a clear path for the middle
class: go to school, get married, buy
house, have kids. You were on your own,
and probably no more than 25 years old (with the caveat that the veterans,
immediately after the war, were a bit older).
It may be that many who are just coming out of college now would find
that path implausible, especially if they are not paired (the age of marriage
is increasing), don't have a job, or have debt from school loans. Or by two or
even all three of those.
One indicator of change is the significant
increase in the number of 18-24-year-olds still living at home with their
parents—now over half of them (56% in 2012 according to the Pew Research
Center). It is the case that the number
of people in that age range who are in college has increased (living in a dorm
is counted as living at home). Many
students also take longer to complete their degrees. So "adulting" gets pushed back.
The Psychology
Today author suggests that "adulthood is more subjective" with
the Millennials. I'm not so sure that's
a new attitude. I haven't asked them,
but it will be interesting dinner conversation to ask my peers when they
thought—knew—they were "adults."
Psychology
Today also had a blog by a psychologist at Rutgers, more
research-based. She related that the
question of "who is an adult" is not merely idle curiosity because
having some kind of an answer can help in talking to adolescents and those in
their early 20s about having a solid base in life to become adults. Her answer is
not quite what I'd expected.
Research in the field, survey
research that has been replicated a number of times, has found that "very
few young and very few old consider the things that adults 'do'—having
a job, buying a home, getting married, or having a child—indicators of
adulthood. Rather, it became apparent that becoming adult was about, well,
becoming. . . . Accordingly, an adult is
someone who—accepts responsibility, makes independent
decisions, and becomes financially independent." The majority of 18-29-year-olds feel in
between being an adult and not. This is
the school of "emerging adulthood."
There is another segment of the
academic community, however, that argues we simply have an extended adolescence
that postpones adulthood. These scholars
maintain that adolescence is the stage in life when we prepare to become
adults, that schooling, careers, and marriage are still appropriate markers.
So it's an argument within the
community of scholars about whether it's "getting there" or "being
there." There are elements of life
where this makes a difference: policies
and programs based on age (juvenile court, car insurance, etc.). The debate also comes into play when parents
have "adult" or "near-adult" children at home—visiting or
living. To what extent is there
independence, to what extent is support provided? When there's no longer any legal obligation
on the part of parents to the children?
There's no set of guidelines or
rules to help the youngsters or their parents, certainly nothing like what's
available for small children. So I guess
we all just muddle along.
For me, if I thought about it at all—which
I don't remember doing, although I may have—it was probably when I got engaged
at age 29. Yes, I was certainly an adult long before that—I had not lived
at home for several years, I had a real job, bought food, did laundry,
etc. But I was also single, partied with friends, and certainly didn't
know that I was going to spend the rest of my life at the University. I
didn't know what I was going to do.
I told Elliott that "some of my
friends knew what they were going to do, mostly the ones who went to law school
or another professional school. Your cousin Ben knew what he was going to
do when he got his degree in business. Krystin didn't know what she was
going to do—getting into grant administration probably wasn't on her list of
likely careers. Kathy, with an art history degree, certainly didn't know
she was going to fall into web and technology work."
Maybe some people realize they're an adult
early--especially those, I suppose, who get married early and have kids (late
teens, early 20s).
I asked Elliott if he thought he was an
adult. I wrote to him: "I would say so. What's unclear is
what the measures are. Independence from parents? Married, with or
without kids? Professional or job responsibilities? Instead of a bright line, I suspect it's a
gradual transition. What do you think?"
I didn't get the answer I'd expected from
him. [The terseness of the language is because it was a text, not an email.] "I'd say overall mostly no. Yes, I spend more time living away from home
than not right now and I have been legally adult for several years. But I not financially independent, my current 'job'
is just that one I have until something better comes along, and I have no
responsibility to anyone other than myself. Imagine that last bit changes
considerably when you married/have a kid."
I said he was absolutely right on the last
comment. But he implicitly responded to the
question I'd posed: what are the
criteria? Intellectually he's an adult,
even if he might revise some of his views as he gets older. There are many baby boomer offspring who
still live at home, but they're all adults. Well, most are. I told him that "your job now is 'student.'
That does not mean you are not an adult."
Elliott concluded that he might
never feel like an adult, depending on the criteria used to make a
judgment. I told him he resembled me in
that respect. "Not anymore, but
there were times well into adult life when I just wanted, for a few seconds, to
go back home and sleep in my bed at my parents' house. Couldn't, of course, and didn't really want
to, but the thought crossed my mind."
I added that he didn't actually have
to decide "I'm an adult now."
People just go about doing things in life—get a job and work, maybe
marry/get involved with someone, maybe have kids, develop friendships, and so
on—and then you look around and realize you must have become an adult when you
weren't looking.
How puzzling all these changes are! I'm never
sure what I'm going to be, from one minute to another.
One of my cousins, on a side of the
family that I don't see very much, died this year at age 79. She died on my brother's birthday, October
11. That recalled for me the fact that
either my grandfather Theodore or that cousin's father (my uncle William) also
died on my brother's birthday (in 1960).
I couldn't remember which one so I Googled both my grandfather and
uncle, just to see what would turn up. I
did find the death dates (it was my grandfather Theodore, who died 9 days after
my uncle William, his son). But my father (brother to William) thinks his father
never knew that his son William had died, because he—my grandfather Theodore—was
quite ill when it happened, and died shortly thereafter.
Too long a prologue to what I discovered
about my grandfather on the web. I came
upon a 1905 source, Commemorative Biographical Record of the Upper Lake
Region. It's one of the more vivid descriptions I've
seen of someone from more than a century ago who is also a relative (excerpts):
Theodore Engstrand, the popular and genial
manager of the Brule Store Co., at Brule, Wis. . . . [I only
knew him when he was elderly, in his late 70s and early 80s, playing solitaire
in his rocking chair and smoking his pipe, mostly quiet, certainly not as the
genial and popular man he might have been six decades earlier.]
In the public schools of Price County, Theodore Engstrand received his elementary education. The schools then in vogue in that locality, were primitive in the extreme, the curriculum embracing only reading, writing and arithmetic. However, meager as were his opportunities, he succeeded by dint of effort in becoming well grounded in the preliminaries of an English education. Supplementary to this he took a year's commercial course at Bryant & Stratton's Business College, Chicago, Ill., where by diligence he became thoroughly familiar with the principles of a technical business education. [This upends a story I've been relating about my family, that my father was the first one to go to college. I've been wrong all these years! I also learned that Bryant & Stratton is still in existence—and now that I looked it up on the web, I get advertisements for it on my Facebook page all the time!]
In 1898 he abandoned his school duties
within two months of graduation, to assume the responsibility of managing the
general mercantile establishment of the Brule Store Co., at Brule. . . . Since accepting this responsible position,
Mr. Engstrand has continuously devoted himself to conducting the affairs of the
establishment, which, under his wise guidance, has been eminently prosperous. [My dad
told me that this affluence came to an end a few years later, when the store wasn't
doing well, so my grandparents moved to Rib Lake, WI, where he had another
store. At some point they came to
Minneapolis—my father speculated that it was because his mother left his father
for some reason, and they ended the separation by him coming to join her in
Minneapolis. My grandfather worked
selling shoes for different companies, then started another store in 1931, in
Minneapolis, on 44th and Beard, but it burned down a couple of years
later. After being ill for a period, my
grandfather finally got a job: he worked at the University; it was on a
government program, and he was even in a little movie they made, where he was a
guy at the desk, supposedly back in the early 1900s. Eventually he went
to work for Donaldson's selling suits.
So I had another "fact" wrong in my history: I was NOT the first person in the family to
work for the University!]
Socially and fraternally, Mr. Engstrand
occupies a conspicuous place in the community, being an honored member of West
Superior Lodge No. 236, F. & A.M., and of the Knights of the Maccabees,
Brule Tent No. 34. Mr. Engstrand's political affiliation is with the Republican
Party. He takes an active interest in local affairs, and has been honored by
his party as delegate to Douglas county and Congressional conventions, serving
three years in the former. [So I learned to my surprise that he was
active in civic affairs and politics.
Interesting that he was a Republican—in the day when Theodore Roosevelt
was the leader, an environmentalist and trust-buster. My grandparents later became Democrats.]
"Try
another Subtraction sum. Take a bone from a dog: what remains?" asked the
Red Queen.
Alice considered. "The bone wouldn't remain, of course, if I took it—and the dog wouldn't remain; it would come to bite me—and I'm sure I shouldn't remain!"
"Then you think nothing would remain?" said the Red Queen.
"I think that's the answer."
"Wrong, as usual," said the Red Queen, "the dog's temper would remain."
I suppose, for the locals, there's one
other anniversary of (mostly) amusing note:
the Armistice Day blizzard of 1940 "is the defining blizzard of the
20th century in Minnesota and remains the storm against which all other
blizzards in this state are compared."
About the same amount of snow fell in the 1940 blizzard as in the
Halloween blizzard of 1991. The
temperatures mid-day were in the low 60s but they dropped by over 50 degrees as
the day wore on. There were 145 deaths
attributed to the storm (across the Upper Midwest)—so to those families the
blizzard was anything but amusing—and 1.5 million turkeys also died.
The only story I know of from the 1940
blizzard is one my long-time neighbor Haldis Jezusko told (the Jezuskos lived
next door to my great aunt and uncle for 30 years; I used to play with the
Jezusko kids when I'd come to visit my great aunt and uncle; Haldis still lived
there when we moved into my great aunt and uncle's house in 1989). Haldis worked downtown at Donaldson's
department store and related that since it was such a nice morning, she wore
sandals and a light coat to work. But it
took her two hours to get home on the streetcars, her feet were soaked, and she
was freezing.
I imagine that for the generations that recall
it (not so many of those generations left alive now), it was one of those
moments in life when almost everyone could remember where they were and what
they were doing. We had dinner guests in
1991; the drapes were closed and we were paying no attention to the weather
outside. I could barely get the front
storm door open when it came time for them to leave; they ended up staying
overnight, even though they only lived 8 blocks away.
The other dates that stick in our
memories, of course, are John Kennedy's assassination, the attack on the World
Trade Center on 9/11, and (perhaps) the explosion of the Challenger space
shuttle. We had a dinner conversation
about these dates and where we were; Kathy and I had forgotten the last one,
but our friend Carolyn Collins mentioned it and we agreed with her that it was
one of those dates.
Alice: I simply must get through!
Doorknob: Sorry, you're much too big. Simply impassible.
Alice: You mean impossible?
Doorknob: No, impassible. Nothing's impossible.
A friend of mine sent me a clipping
from the Wall Street Journal titled "Science
Increasingly Makes the Case for God."
The author considered the null SETI findings (Search for Extraterrestrial
Intelligence) and the physics of the universe—the values of the four
fundamental forces must be what they are for the universe to exist—and
concluded that in combination they suggest that life on Earth may be the only
life in the universe, that life in the universe—and the universe itself—is
miraculous, and thus a strong argument for a Creator. (The author, Eric Metaxas, has written two
noted biographies, William Wilberforce and Dietrich Bonhoeffer, numerous
children's books, hosts "Socrates
in the City: Conversations on the Examined Life" events in New York City, and does other
things. So he's no intellectual slouch.)
Metaxas's article prompted a
response in the New Yorker from
Lawrence Krauss, professor of theoretical physics, Foundation Professor of the
School of Earth and Space Exploration at Arizona State, and chair of the board
of sponsors of the Bulletin of the Atomic
Scientists. Krauss took exception to
Metaxas's article. Krauss points out,
naturally, that Metaxas doesn't understand the science and that his claim
simply revives the ages-old argument for intelligent design.
Krauss maintains that it is too
early to dismiss the possibility of life elsewhere in the universe, and that
the opportunities for life elsewhere, in scientific assessment, have increased
dramatically. He also says that Metaxas
doesn't understand statistics or how evolution works (physical processes are
not random, natural selection drives systems in a certain way, and evolution is
inevitable; if something had not wiped out the dinosaurs 65 million years ago,
allowing mammals to develop, the planet might be populated by intelligent
lizards). As for the fundamental forces,
Krauss says Metaxas confuses cause and effect:
life evolved in the universe as it exists, and we manage to stay on the
planet because gravity stops us from floating away.
In my judgment, any time a lay and
popular writer chooses to tangle with a scientist in the scientist's own
field—and especially when the scientist in question advocates for public understanding
of science, supports policy based on empirical data, works on behalf of science
education, and seeks to reduce the impact of superstition—the lay writer isn't
going to come out looking good. If you
read the two articles one right after the other, Metaxas doesn't end up looking
like he knows what he's talking about.
(There are a number of reasons why
we might not detect alien life, including the fact that radio waves spread and
dissipate over long distances. I don't
believe the SETI searches can reach more than a few dozen or hundred light
years away from Earth—but the observable universe is more than 90 billion light years across.)
"What
impertinence!" said the Pudding. "I wonder how you'd like it, if I
were to cut a slice out of you, you creature!"
It spoke in a thick, suety sort of voice, and Alice hadn't a word to say in reply: she could only sit and look at it and gasp.
"Make a remark," said the Red Queen: "It's ridiculous to leave all conversation to the pudding!"
Don't bother eating banana peels, if
you've given thought to doing so. There
have been web postings about the nutritional benefits of the
peels, including curing insomnia, treating depression, lowering cholesterol,
protecting your heart, and helping your eyes.
Alas, as with many claims about the miraculous values of different food
items, there's no research evidence to support the claims. There may be a few nutrients in the peel, it
appears, but even then, our bodies might not be able to absorb them.
Moreover, the peels just don't taste
good. One would need a high-powered
blender and some additional ingredients for a banana-peel smoothie.
Some guy, who's written a book on bananas,
says "Go ahead and google 'monkey
eating a banana,' and you'll see that even most monkeys are peeling the banana
before eating it. If monkeys are smart enough to figure this out, we should be,
too."
But four
young Oysters hurried up,
All eager for the treat;
Their coats were brushed, their faces washed;
Their shoes were clean and neat --
And this was odd, because, you know,
They hadn't any feet.
If
you aren't already paying attention to CRISPR ("clustered regularly interspaced short
palindromic repeats"), you should be. It's a
technique that allows easy and precise editing of plant and animal DNA. CRISPR will not only allow manipulation of
plants (e.g., to resist disease) but it will permit editing human DNA—and thus
human heredity. (If one imagines DNA as
the machinery that produces us, CRISPR represents the ability to change what
the machine is producing—or not.)
That will raise
all kinds of challenging questions. It
will likely be possible, for example, to edit the DNA of the human egg or sperm
to eliminate inherited diseases, such as Huntington's or sickle-cell anemia, or
eliminate cystic fibrosis in the fertilized egg. So parents could be assured their child wouldn't
have those or any other identifiable afflictions. Or it may be possible to insert genes that
protect against aging or Alzheimer's.
At
some point it might also mean parents can select for intelligence, and the
worry is that such ability to select would lead to "super people" and
designer babies, for those who could afford it—and that everyone else would be
left behind. A dystopian vision, some
believe, or "positive eugenics."
Scientists report, however, that it is becoming easy to manipulate the
genes of an embryo, although it isn't done, and the process isn't perfect. But it almost certainly will get better. (One leading scientist estimates 10-20 years
for a gene-edited human.)
Some
countries (not including the U.S.) have banned what is called "germ line"
(embryonic) engineering, but that was before CRISPR technology had been
developed. The technology has evolved
faster than scientists imagined.
What
do Americans think about this? In 2014
about half thought gene editing would be appropriate to reduce or eliminate the
risk of a serious disease; about half did not.
An overwhelming majority thought it would not be appropriate to edit for
intelligence.
A
well-known UCLA law professor, Eugene Volokh (who teaches on free speech,
religious freedom, church-state relations, etc.), argues that if genetic
modification to make a baby smarter becomes available, all objections will
become irrelevant—because people will clamor for it and get it. In general, he points out, society is better
off with more smart people. If parents
could choose the IQ of their children, 85 or 100 or 130, many, perhaps most, will
choose 130. Even if Congress bans the
procedure, it won't matter. (Set aside
all religious, ethical, moral objections, just look at the practical
implications of the development of the procedure.)
It
won't matter because the Chinese or Russians or Germans may not ban it, and we
(Americans) will come to realize that the Chinese and Russians and Germans will
suddenly have cohorts of much smarter children.
The Economist reported earlier
in 2015 that Chinese scientists had already tried to use
CRISPR to edit human embryos. The
embryos couldn't develop full term—yet.
Besides
international competition, the wealthy will want the procedure available for
their children, in any case, so will go overseas to get it done if the U.S. won't
permit it. Then the upper middle class
will protest, and then the middle class, and eventually public policy will have
to change.
Volokh
also observes that if the procedure were to become successful and widely used,
suddenly there might be a lot of jobs that need to be done that smart people
won't want to do. But maybe all those
smart people will find other ways to get those jobs done.
The
Economist writer does, however, note
one hurdle to designer babies: even with
huge amounts of data, "biology still has a tenuous grip on the origins of
almost all the interesting and complex traits in humanity. Very few are likely
to be easily enhanced with a quick cut-and-paste. There will often be
trade-offs between some capabilities and others. An à la carte menu of
attributes seems a long way off. Yet science makes progress—indeed, as gene
sequencing shows, it sometimes does so remarkably quickly." My good friend the late Burt Shapiro, who
taught genetics to incoming medical and dental students at the University of
Minnesota for decades, cautioned me many times over lunch to be extremely
suspicious of any claims that such-and-such a disease or affliction has been
linked to a single gene. Most human
traits, to the extent they are genetic, are extremely complex. Designing for smart may happen—but not very
soon.
Michael
Specter, writing in the New Yorker,
contends that "to use CRISPR properly will require far more than a dialogue among
scientists, however, no matter how public or well-intended. Many of the people
I interviewed told me that, if the public is left out of the discussion about
CRISPR, the technology will almost certainly end up with what they described as
a 'Monsanto problem.' When Monsanto introduced G.M.O.s, in the
nineteen-nineties, it failed to engage people. Nobody knew whether these new
products—seeds created in a lab—were safe, or for whom they were most valuable.
The result was a suspicion (and, in many cases, an overt hostility) that has
lingered long after the technology has been proven both safe and useful."
The Economist reporter suggested that editing human embryos "is a
Rubicon some will not want to cross. Many scientists, including one of CRISPR's
inventors, want a moratorium on editing 'germ line' cells—those that give rise
to subsequent generations."
I
think it likely that both Specter and the Economist
are wrong. If it can be demonstrated
that CRISPR can be used to edit embryonic genes to produce smarter kids with no
side effects (which there should not be, from what I can understand, if the
technique is perfected), Volokh will be right:
at the least, those with money will demand it, even if they have to go
abroad.
So
it's not for us or our children, but depending on how quickly the scientists
achieve perfection in the process, our grandchildren may face a decision about "improving"
their children.
In any case, my
advice is that if you see a headline that refers to CRISPR, you might want to
take a glance at it.
The day after I
composed these few paragraphs, this appeared in the Scientist:
"Engineering Virus-Resistant Plants
"Researchers use CRISPR to create plants that resist infection by DNA viruses."
"Researchers use CRISPR to create plants that resist infection by DNA viruses."
Nearly simultaneously, Nature reported research (using CRISPR)
at UC Irvine that could re-engineer the genes of the mosquito that carries
malaria so that the disease could essentially be wiped out. CRISPR allows changes so the mosquito's
disease-resistant trait shows up in all its offspring (which hasn't been
possible before), which would spread rapidly.
The
Nature report also included comments
from an MIT political scientist who studies technology, who made a point
similar to the one Specter did: advances
in science are moving much faster than policy discussions and regulations,
especially when it comes to re-engineering populations in the wild. To change a critter in the wild, even the
mosquito, could affect entire ecosystems.
(The scientists in California chose to do their work on a non-native
mosquito that cannot survive in California.)
The UC researcher predicted that they could have mosquitos for tests in
nature within a year—but they don't plan to release them until there have been
appropriate public and scientific discussions.
We'll
go slow on bugs—but I'm skeptical we'll go quite so slow when it comes to
eliminating human disease and improving babies.
Just
before I sent this letter off to be copied:
(1) Scientific American: (11/30/15)
This Week, World Summit On Altering Human Genes Explores Ethical Limits
Academies in the US, China and the UK
jointly organized the gathering
The popularity of
the genome-editing tool CRISPR–Cas9, which uses bacterial enzymes to cut
genomes in precise spots to disrupt or repair troublesome genes, has sparked an
ethical debate—and many believe that the time is ripe for an international
discussion.
In January,
Baltimore and a small group of scientists gathered in Napa, California, to
discuss issues surrounding genome editing, including rumours that researchers
had already edited human embryos. Some consider the editing of any reproductive
cell as contentious because the changes could be passed to future generations.
(2) MIT Technology Review:
(11/5/15)
The biotechnology
startup Editas Medicine intends to begin tests of a powerful new form of gene
repair in humans within two years.
Speaking this week at the EmTech conference in Cambridge,
Massachusetts, Editas CEO Katrine Bosley said the company hopes to start a
clinical trial in 2017 to treat a rare form of blindness using CRISPR, a
groundbreaking gene-editing technology.
If Editas’s plans move forward, the study would likely be the
first to use CRISPR to edit the DNA of a person.
CRISPR technology was invented just three years ago but is so
precise and cheap to use it has quickly spread through biology laboratories.
Already, scientists have used it to generate genetically engineered monkeys,
and the technique has stirred debate over whether modified humans are
next.
(3) The New Yorker: (11/16/15)
"When I’m ninety, will I look back and be glad about what we have accomplished with this technology? Or will I wish I’d never discovered how it works?" —University of California, Berkeley, molecular biologist Jennifer Doudna, who helped develop CRISPR technology
(4) SciTechDaily: 12/3/15
Scientists Overcome Key CRISPR-Cas9 Genome Editing Hurdle
Researchers at the
Broad Institute of MIT and Harvard and the McGovern Institute for Brain
Research at MIT have engineered changes to the revolutionary CRISPR-Cas9 genome
editing system that significantly cut down on “off-target” editing errors. The
refined technique addresses one of the major technical issues in the use of
genome editing.
From what I'm seeing now on a daily
basis, I could keep on adding these citations until this letter is 500 pages
long.
"You couldn't deny that, even if
you tried with both hands."
"I don't deny things with my hands," Alice objected.
"Nobody said you did," said the Red Queen. "I said you couldn't if you tried."
"I don't deny things with my hands," Alice objected.
"Nobody said you did," said the Red Queen. "I said you couldn't if you tried."
I reached the conclusion that any set of political or
religious beliefs, any ideology, when implemented in public policy, that leaves
significant numbers of the most economically and socially vulnerable people
worse off is simply evil. I can no
longer stomach the proposition that ideology, "principles," trump the
well-being of the members of a society. I
do not use "evil" in any theological sense, I use it as a standard by
which to judge how well we treat those who, for whatever reason, are only
barely hanging on to a life one might call civilized.
Shortly
after my epiphany—I use the word lightly—I stumbled across an article in which
the author created a neologism:
sociocide, the killing of a society. Societies are killed in various ways; examples include Iraq, Syria,
Afghanistan, and certain places in Africa, where societies "barely exist as a
positive agent of social action in the lives of [their] members." Violence and brute force are the order of the
day; people cannot be murdered because there is no state that establishes a
legal standard for murder; people are simply killed.
Margaret
Thatcher famously quipped that "there is no
such thing as society. There are individual men and women, and there are
families." I think she is utterly
and completely wrong, and carried to its logical public policy end, it is an
approach that kills a society. There is
an argument in anthropology that society preceded individuals in the sense that
individuals cannot exist without society; "interdependence and interconnectedness
are elements of who we are." Rather
than live in constant combat with other human beings, we (try to) cease the use
of violence and substitute civil society.
I don't know if anyone else has said or written this—they probably
have—but all the evidence we have, from history and around the world, is that
we are all better off when we are all better off.
In my view, the standard for
evaluating a society is how well off everyone is and the extent to which
everyone has a reasonable means to achieve the fundamental necessities of life
(food, shelter, health, safety) as well as the aspects of life that make it
worth living (education, occupation, friends, fun). I don't believe there is much reason to worry
about the smart and the rich; they have done well in every human society since
the dawn of recorded history and they will surely continue to do well. It is everyone else—the bottom 80-90%—who I
am concerned about.
There's no possibility that income
and wealth will be exactly evenly distributed among human beings. The differential distribution of abilities
and motivation alone makes that impossible.
Nor, one can argue, would such a flat distribution be particularly
desirable; it would be boring and no one would have any incentive to do better
or be innovative. I've always recoiled
at Marx's "from each according to his ability, to each according to his
need." What needs attention is
simply the floor: no one, in a wealthy
country, should be without access to basic needs. Some parts of the world have done much to
achieve that state. We aren't even
close.
It disappoints me that nearly every
single policy proposal I have seen from Republican candidates for office goes
in exactly the opposite direction from the one I find most conducive to a
society I'd care to live in. Some
proposals damage people slightly; some put those at risk at even greater risk
(the poor, children, the elderly, people of color, those with medical
issues). I keep finding myself appalled.
"You have no mind to be unkind,"
Said echo in her ear:
"No mind to bring a living thing
To suffering or fear.
For all that's bad, or mean or sad, you have no mind, my dear."
A
topic that reflects my age, alas. There
was amusing speculation in the Huffington Post about when "old age"
begins. Supposedly if you could get to
21 in the Middle Ages, you had a good chance to reach your 60s as long as you
avoided disease and medical treatments.
No teeth and lousy eyesight, but you could be alive. Then in the last
century the expected life span went from about 46 years to about 79 years now
(U.S.). This advance in lifespan "allowed
more people to aspire to saying things like, 'This getting old stuff sucks.'"
I am certainly guilty as charged.
It has been the
Baby Boomers who led the attack on "old age," by inventing "Spanx,
cosmetic surgery, Viagra, and the belief that they looked 10 years younger than
they really were." Age keeps
advancing anyway, so the question remains:
when does "old age" begin?
It depends on who you ask.
In
a 2015 survey of 2,000 Brits, the vote was that old age begins at 68 (that is,
68 is the end of middle age). People in
their 60s, however, classify old age as beginning at 77. An earlier survey reported that 75-year-olds
say old age never begins (and please go away) and that people believe old age
starts at 80. Then there is the
distinction between the 'young old,' the 'middle old,' and the 'very old,' with
little agreement on what ages fit in which group.
I
prefer the response of the 75-year-olds:
it never comes and please go away.
My personal preference, of course, is that I'm physically and mentally
healthy well into my 80s or 90s and then just fade away in the night. A hope I presume many share.
Christopher Hitchens made an
observation about getting older that is true no matter how one defines "old
age": "A melancholy lesson of
advancing years is the realization that you can't make old friends." So I guess whatever old friends we who have
crossed the 50 mark (for example) have are going to be it. However, that certainly hasn't prevented
us—or most of my friends, as far as I can tell—from making new friends. And we intend to keep on doing so as long as
we have an ounce of sociability left in us.
(Here are some of
the countries with the same or longer life expectancies than in the U.S. [World
Health Organization, 2013, overall, so both sexes combined]; [obviously, many
tied for same rank]; countries in bold have universal health care in some form:
Rank 1: Japan (84 years)
Rank 2: Spain, Australia, Switzerland, Italy (83)
Rank 9: Canada, France, Iceland, Israel, New Zealand, Norway, Sweden (82)
Rank 19: Republic of Korea, Finland, Portugal, Ireland, UK, Austria, Germany, Greece (81)
Rank 29: Belgium, Chile, Denmark (80)
Rank 34: Colombia, Costa Rica, Cuba, United States (79 years))
Is all our Life, then, but a dream
Seen faintly in the golden gleam
Athwart Time's dark resistless stream?

Bowed to the earth with bitter woe
Or laughing at some raree-show
We flutter idly to and fro.

Man's little Day in haste we spend,
And, from its merry noontide, send
No glance to meet the silent end.
The foregoing reminds me of a
thought that has crossed my mind when reading about the medieval origins of
modern Western universities. Contrary to
what many in the academy would like to assert—that the liberal arts are the
core of a university—the initial universities, the collections of masters and
students in various European cities—had three faculties: law (canon and civil), theology, and
medicine. The artes liberales faculty developed with them, but the primary
function of those early gatherings of masters was to train lawyers and
officials for the Church and civil service.
As the institutions evolved, a degree in the liberal arts became a
prerequisite for admission to one of the three advanced faculties, but that was
later.
What I cannot figure out is why it
took several years to study medicine.
They didn't know anything, and as has been noted, seeing a doctor before
the 20th Century was as likely to kill you as to heal you. I believe they studied Galen and other ancient
Greek and Roman texts, but in light of what modern biology has uncovered, they
knew little, and even less about how to treat the afflictions they were confronted
with. (You may recall that the
physicians bled George Washington in 1799; the massive blood loss almost surely
hurried him to death.)
"But," said Alice, "if the world has absolutely no
sense, who's stopping us from inventing one?"
Shortly after my rant about political ideology,
I came across an article reporting on recent research in the social sciences
(economists, political scientists) on "the political economy of wellbeing,"
otherwise known as "happiness research." But this isn't psychological; it's work
predicated on the assumption that there are objective conditions that can be
measured that contribute to individual happiness. Happiness, that is, is as much social as it
is psychological.
Certainly not to my surprise, the public policies most
conducive to wellbeing are those adopted by social democracies, and a 2014
research review by a Rutgers political scientist found that leftist or liberal
governments have the greatest levels of satisfaction; "the more generous
and universalistic the welfare state, the greater the level of human happiness." That term is one that gets many Americans all
riled up, of course, because welfare state implies a bunch of freeloaders. What it means at its simplest, however, is
insuring people against risk as well as providing benefits (the usual: health
care, pensions, unemployment, but also sick days, parental leave, vacation,
minimum earnings). It also means
addressing basic human needs such as food and shelter.
The element of any society most important to fulfilling
human needs is the economy. The market
economy has succeeded far more than any of the alternatives. The difficulty is that a market
economy—capitalism—contains a logic that undermines human wellbeing in that it
treats labor as a commodity. People must
sell their labor to survive in a complex society. Commodities are subject to market forces,
which can be uncertain, to say nothing of cruel. When work is a commodity, employees are
treated as a factor of production to be kept as low-cost as possible. Human beings are devalued when
they are considered merely as labor costs; dignity and wellbeing become irrelevant.
If capitalism—the market economy—contributes so much to
human wellbeing at large, but has such demeaning effects on people, the
solution seems to be to de-commodify labor by allowing people to maintain a
decent standard of living even if they cannot sell their labor (because of
illness, old age, disability, inability, caring for family, getting an
education, or unemployment in a bad economy).
In other words, a society that creates a social safety net will provide
wellbeing to its members. Doing so does
not mean one repudiates capitalism.
Is such a safety net too expensive, both for
the society's members and for society as a whole in terms of
reducing the ability of markets to function optimally? There are several studies that find a
positive relationship between level of taxes paid and social satisfaction—in
effect, the greater the de-commodification of labor, the higher the levels of
social satisfaction. The greater the
amount of "social spending," the greater the government's share of
GDP, the higher the levels of wellbeing.
Not only do the cross-national studies reach that conclusion, so do the
cross-state comparisons in the U.S.:
life satisfaction is higher in states that have higher levels of welfare
spending, a more regulated economy, and with a longer history of policy
liberalism (which in effect translates in most cases to Democratic Party
rule). The Nordic countries, among
others, have economies that function well even though they have high levels of
public spending, so it is not the case that social welfare undermines healthy
capitalism.
The puzzle is why the vision of a better life for
everyone seems to be in retreat. The
empirical evidence is clear, but "welfare state" has a terrible
reputation in the U.S., and we lurch toward it and then back away from it. (We moved toward more social spending with
the Affordable Care Act, but then we hear proposals to reduce Social Security
benefits, one of the most spectacularly successful programs for preventing
poverty in American history.) Why
especially do those who would most benefit from higher social spending
nonetheless consistently vote for candidates who are committed to unrestricted
market competition (or nearly so) and decreased social spending? There isn't a clear answer to that question,
it seems. Part of it may be that people
can't easily judge the effect of political programs on their own levels of
happiness and wellbeing. Another part
may be that what "the people" want isn't always translated into
public policy. The decline of labor
unions and spread of "right to work" laws have, the evidence makes
clear, weakened one of the strongest forces for public policies that increase
social wellbeing. It may also be that
those who possess the most political power are not favorably inclined to
welfare state spending. In any case, it
has been and no doubt will continue to be an uphill struggle in the U.S. to
achieve anything like the levels of social spending that create the higher
levels of social satisfaction we see elsewhere in the developed world. Which is too bad, particularly for those
whose purchase on wellbeing is by their fingernails.
I stumbled on this interesting little chart from Linda
Yueh, from Oxford and the London School of Economics.
Average government expenditure (% of GDP), 2000-2010:
China: 19.4
Developing Asia: 21.4
ASEAN-5: 22.5
Sub-Saharan Africa: 27.0
Emerging economies: 27.7
Latin America: 29.9
Middle East/North Africa: 31.5
United States: 37.8
Central and Eastern Europe: 39.8
European Union: 46.5
What is misleading about
the figure for the United States, I suspect, is that such a huge percentage of
the federal budget goes for military purposes, which does not generally add
much to social wellbeing. Those who push
to lower social welfare spending in this country apparently prefer that we
mimic Asia and sub-Saharan Africa.
Yes, I'd prefer to see significantly higher taxes, which
I'd be glad to pay, if the money went toward increasing social satisfaction and
not multi-billion-dollar weapons systems.
Say, whence is the voice that, when anger is burning,
Bids the whirl of the tempest to cease?
That stirs the vexed soul with an aching — a yearning
For the brotherly hand-grip of peace?
The year ending in 15 seems to have a lot of
significant-year anniversaries. I
started out intending to note two: the
Magna Carta and the Battle of Waterloo.
Then I added Alice's Adventures in
Wonderland and the 1940 Armistice Day blizzard. As we drew near the end of the year, I came
across two more that are significant, one primarily for those of us of Western
European descent (as with Waterloo) and one of universal importance (but the
implications of which I'm not able to fully understand).
The Battle of Agincourt took place October 25, 1415. A smaller English army under Henry V defeated
a larger French army on a field in northern France. Henry was claiming the throne of France (to
which he had some reasonable claim), but ended up marrying the French king's
daughter instead and keeping title to his French domains. (There remains a dispute about how much
larger the French army was, but there's no dispute that it was significantly
larger.) There are seven contemporary
accounts of the battle that survive, three of which were written by
eyewitnesses. The primary factor in the
battle was the introduction of new weaponry on the part of the English, the
longbow, which resulted in the slaughter of the French troops and a couple of
dozen French political and military leaders.
The accounts suggest it was a brutal battle, both before
and in the aftermath. And even though it's
a conflict that is hardly on anyone's mind now, in 2010 there was a
mock trial of Henry V for war crimes over which U.S. Supreme Court justices Alito
and Ginsburg presided. The issues
included whether there was "just cause" for the war in addition to
the charge of killing the French prisoners of war (which did happen); the
audience vote apparently was very close but the two "judges" found
the English king guilty.
So the question you might pose is, "who cares?" Good point, and it's not clear that there
were any long-term implications of great importance. The king and the battle were immortalized by
Shakespeare in Henry V, and the
English victory, against the odds, was recalled during World War I (the 500th
anniversary in 1915) and again during World War II (Laurence Olivier
entertained British troops with rousing parts of Henry V's speeches). It became a part of English national
mythology.
Agincourt Day is, however, no longer celebrated in England—wasn't
by the turn of the 17th Century—and, for rather obvious reasons, it
was never celebrated in France. Nor is
the play performed much in France. The
French recovered their role in Europe within a few decades and the English were
mostly kicked out of France in 1453 and lost their final foothold in 1558.
There is, I read, an excellent museum at the site of the
battle. We may have to go there if we go to
France in the next few years.
Sentence first; verdict afterwards.
Finally, it is the 100th anniversary
of the theory of general relativity, published by Einstein in 1915. "The equation that rules the universe," says
the New York Times (and probably a lot of other texts as well). General relativity has a great deal more
import for our lives than does the Battle of Agincourt. I've found a few ways that it matters. I'm not going to delve into the scientific
aspects of general relativity because I can't.
I don't understand them. But we
can understand what it means as we float through our days.
Gravity warps space and time (don't ask me, I don't
know), which means the clocks on the satellites that send and receive signals
for our GPS run about 38 microseconds per day fast compared to
the clocks on earth and in your car (weaker gravity in orbit speeds them up by
about 45 microseconds a day; their orbital speed slows them by about 7)—and if that difference were
not adjusted for, your GPS would be off by several miles within one day. Relativity explains why electric generators
and transformers work. Relativity
explains why gold doesn't corrode. When
we used cathode ray tubes for TVs and monitors, the manufacturers had to adjust
for the speed at which electrons were traveling when they hit the screen. Finally, that the theory of relativity is
valid is demonstrated by the fact that nuclear power plants work. (And why atomic weapons are possible.)
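Out of curiosity, the size of the GPS clock drift can be checked with a little back-of-the-envelope arithmetic. The sketch below uses standard textbook values for Earth's gravity and the GPS orbit (my own assumed figures, not anything from the articles I've cited), computing both the gravitational speed-up and the roughly 7-microseconds-a-day slowdown from orbital speed:

```python
# Rough check of the relativistic clock drift on GPS satellites.
# All constants are standard textbook values (assumed, approximate).

GM = 3.986e14        # Earth's gravitational parameter, m^3/s^2
c = 2.998e8          # speed of light, m/s
R_EARTH = 6.371e6    # mean Earth radius, m
R_SAT = 2.656e7      # GPS orbit radius (~20,200 km altitude), m
DAY = 86400          # seconds per day

# General relativity: weaker gravity at orbit makes satellite clocks tick faster.
gravitational = (GM / c**2) * (1 / R_EARTH - 1 / R_SAT)   # fractional rate change

# Special relativity: orbital speed makes them tick slower.
v_orbit = (GM / R_SAT) ** 0.5                             # circular orbit speed, ~3.9 km/s
kinematic = v_orbit**2 / (2 * c**2)                       # fractional rate change

net_us_per_day = (gravitational - kinematic) * DAY * 1e6  # net drift, microseconds/day
error_km_per_day = (net_us_per_day * 1e-6) * c / 1000     # ranging error if uncorrected

print(f"gravity speeds clocks up: +{gravitational * DAY * 1e6:.1f} microseconds/day")
print(f"orbital speed slows them: -{kinematic * DAY * 1e6:.1f} microseconds/day")
print(f"net drift: {net_us_per_day:.1f} microseconds/day, "
      f"roughly {error_km_per_day:.0f} km of ranging error per day")
```

The net comes out to tens of microseconds a day, which at the speed of light is miles of position error per day, consistent with the point above: without the relativistic correction, GPS would be useless within hours.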
Relatedly, it's the 150th anniversary of James Clerk
Maxwell's theory of electromagnetism—which, in interaction with the knowledge
of general relativity, allows us to have cell phones, television, and
radio. It's also what led to the development
of x-rays and MRIs as well as the radio telescopes with which astronomers scan the universe. (Maxwell also took the world's first color
photograph.)
Einstein
recognized the value of his predecessor's work:
"One scientific era ended and another began with James
Clerk Maxwell."
"And at last we've got to the end of this ideal racecourse! Now that you accept A and B and C and D, of course you accept Z."
"Do I?" said the Tortoise innocently. "Let's make that quite clear. I accept A and B and C and D. Suppose I still refused to accept Z?"
"Then Logic would take you by the throat, and force you to do it!" Achilles triumphantly replied. "Logic would tell you, 'You can't help yourself. Now that you've accepted A and B and C and D, you must accept Z!' So you've no choice, you see."
"Whatever Logic is good enough to tell me is worth writing down," said the Tortoise. "So enter it in your notebook, please. We will call it (E). If A and B and C and D are true, Z must be true. Until I've granted that, of course I needn't grant Z. So it's quite a necessary step, you see?"
"I see," said Achilles; and there was a touch of sadness in his tone.
As I write in late
November, we (the U.S.) seem to be entering another
period in our episodic cycle of xenophobia/hysteria over some group of "others."
It happened in the 1790s with the Alien and Sedition Acts to restrict
immigration, in the 1840s with the Know Nothings' promise to end the influence
of Irish-Catholics and other immigrants, with the Chinese Exclusion acts after
the Civil War, with the Immigration Restriction League founded in the 1890s in
the belief that those coming from southern and
eastern Europe were racially inferior to Anglo-Saxons, with the 1905
establishment of the Japanese and Korean Exclusion League,
and in the years right after World War I, when anti-radical hysteria about a
Bolshevik revolution here included anti-immigrant sentiment, because
radicals were often European immigrants.
It wasn't particularly anti-immigrant, but the hysteria of the McCarthy
era falls in the same category: a national fit that thoughtful citizens
resisted and many later deeply regretted.
And as is often the case, frenzy is
being whipped up by a demagogue, with mini- and would-be and reluctant
demagogues trying to keep up with the chief demagogue. (Many people forget, I think, that we've had
these characters in the U.S. before but none of them got close to the White
House. The one who might have presented
the most significant challenge was felled by an assassin before he could engage
in a national election: Huey Long.)
I
have seen on Facebook pages the cry that America is in bad shape and sinking,
and that it is going in the wrong direction, and that it needs to be made great
again. Several questions occur to me, of
course, and I've posed them. As for "in
bad shape and going the wrong direction," let's see, the economy is in the
best shape it's been in since the Great Recession, steaming right along;
millions more people have health care; the price of gas is the lowest it's been
in decades, people are finally beginning to buy homes again, unemployment is
down.  Far fewer members of the
military are being killed in wars that never should have been started. Some of us
think the nationalizing of gay marriage was a huge step in the right
direction. No one I've encountered who
says we're going the wrong direction is willing to engage the question "what
exactly is going wrong?" Mostly I
hear ad hominem attacks on the president.
Nor
is it clear what halcyon days we are to return to. People tend to forget that at the end of
World War II we were sort of the "last man standing." So of course we were on top of the world;
ours was the only undamaged and functioning (large) economy, one that had been
hugely stimulated by the war. So it
appeared that it was going to be an American century or Pax Americana. It can hardly be surprising that industrious
folks around the world—Japan, Germany, the UK, Russia, China—might eventually
recover and begin to contest American economic supremacy. That we would become one of many competitors
is not startling. The vision of America
atop the world should have faded long ago, the accidental byproduct of a global
conflagration.
It
may also be that it was a white America; there were certainly American people
of color, but they were far less visible and most of them "knew their
place." That was certainly the
world I grew up in.
So
what's ailing the country? The fact that
we have a Black president? Is it made
worse by the fact that the leading contender for the Democratic nomination in
2016 is a woman?  The turmoil over police
forces around the country?  (Many who are convinced we are going in the wrong
direction think the police are always the good guys and that all these demonstrators,
often people of color, should be jailed,
or worse.)  The fact that the GLBT
community is being acknowledged and accepted in mainstream culture and gay
marriage is legal? The rise of a Black
celebrity culture, one as pervasive as White celebrity culture?  The apparent decline in religiosity in the
American public, particularly among Millennials, and the parallel perception
that Christianity is under attack (by the forces of modernity, the courts, and perhaps
even the schools)? The increased numbers
of people of color, so different from the 1950s that many of the disenchanted
grew up in? Probably some combination of
all of these, varying with the individual and the locale.
The
ones who seem to be most upset, as the news media have it and as far as I can
ascertain personally, are the White Christian conservatives, which I suppose is
not surprising because their imagined country is vanishing. The upset seems to me to be combined with a
longing for a leader who will restore the lost nation. A faculty member in a seminar reminded us
many years ago that you should be careful what you wish for when you want a
strong leader; remember the German word for leader is Führer. I have recalled that admonition several times
recently as I have listened to the Trump campaign. (I've also recalled the truism that if a
demagogue repeats something often enough, and loudly enough, some part of the
population comes to believe it, no matter how baldly ridiculous and erroneous
the claim is. Cf. Joseph Goebbels.)
All
we can do is hope there's minimal damage from the current xenophobia and that
we can repair that damage later, as happened in the earlier episodes.
As
Elliott commented as the brouhaha was engaged, "[I'm] not looking forward
to hearing about all this for the next year."  A sentiment that, I suspect, is widely shared,
even among us political junkies.
Less Bread! More Taxes!--and then all the
people cheered again, and one man, who was more excited than the rest, flung
his hat high into the air, and shouted (as well as I could make out) "Who
roar for the Sub-Warden?" Everybody roared, but whether it was for the
Sub-Warden, or not, did not clearly appear: some were shouting "Bread!"
and some "Taxes!", but no one seemed to know what it was they really
wanted.
It turned
out to be a year of recovering contact with friends from long ago. First I reconnected with a guy who was a
graduate student the same time I was in 1974-75; we met in the Council of
Graduate Students at the University.
Denny was a Ph.D. student in Animal Science, and as he joked at the
time, he was writing a chicken shit dissertation—because it was about
recycling chicken poop for fertilizer
and other uses. He went on to become a
long-time faculty member at the University of Wisconsin-River Falls. We now meet every couple of months for a beer
and chat.  Since he's someone from my
professional community, we can always exchange stories about the academy; we've
also had a common subject in divorce, unfortunately.
Then I got
together with a high school acquaintance, primarily because we exchanged a few
Facebook messages. We'd known each other
in high school, although we were not close, and we were in school PTAs at the same
time but didn't connect. I'd seen him at
a friend's pre-high-school-reunion cocktail hour a few years ago.  I just decided to invite Jeremy and his wife
over for dinner and had a delightful time—at which time I told him that I
thought I'd seen him at PTA meetings, but I wasn't sure it was him, and I'm not
the kind of person who walks up to someone else and asks "are you
so-and-so?"
Finally,
when Kathy and I were eating dinner in downtown St. Paul before an opera
performance, I happened to see a guy with whom I'd worked 30 years ago but hadn't
seen since. Amazingly, we both
remembered each other's name and exclaimed them aloud when we saw each
other—even more remarkably, Kevin remembered I had a daughter with diabetes (he
didn't know about Elliott because I hadn't seen him since Elliott was
born). I've gotten together with him
since and will do so again.
It has
sometimes puzzled me why friendships wither for no apparent
reason.  In some cases, friendships can
fade because of diverging interests or careers or spouses who may not be
interested. But in other cases there's
no good explanation other than lack of effort to maintain the
relationship. Maybe that says they weren't
all that strong in the first place.
People
change, as we all know, and it appears that at least in these three cases in my
life, they (and I suppose I) changed over the years, and now we can resume a
friendship that had no good (or at least visible) reason for languishing. Inasmuch as I am one who believes one can
never have too many friends, I am pleased to have these resumptions occur.
As you have invited me, I cannot come, for I have made a rule to
decline all invitations; but I will come the next
day.
As we always do, on the day after
Thanksgiving the kids and I drove 45 minutes north to a tree farm to cut the
Christmas tree. Part of the ritual is
that we listen to the Fiddler on the Roof
soundtrack on the way. I invariably
get a little bit teary-eyed when listening to one of the songs: "Sunrise, Sunset." Many who are parents of grown or nearly-grown
children probably do; when DID those babies grow up? I don't remember getting older.
I NEVER loved a dear Gazelle –
Nor anything that cost me much:
High prices profit those who sell,
But why should I be fond of such?
One reason I write, captured in a
brief paragraph in brainpickings.org:
Recently, while packing to move, I came
upon a stack of letters from my Bulgarian grandmother. During my time in
college, we wrote each other long, beautiful letters about once a month. Then
she discovered the internet. The letters became emails and, invariably, their
nature changed. The slow mutual beholding of sentiment and feeling that co-respondence implies became the quick mutual reaction to
information under the pressure of immediacy, which often bled into the banal —
daily errands, travel plans, the weather.
Yep.
Emails are largely mindless.
Letters require time and effort to write. I admire Krystin in this regard because she
and a couple of her friends still write letters to one another. One personal exception to the bland
triviality of emails is the exchanges Kathy and I had when we were first
courting: of course we talked when we
were together, but during the evenings during the week, long before we ever
lived together, we would often spend an hour or more sending each other
messages.  Perhaps because both of
us can be somewhat reticent, the medium of email allowed us to explore topics
more openly than we might have had we been sitting together talking. (Both of us saved the emails, and they will
constitute an entire chapter of the autobiography that may never be written—but
that chapter is! A conversation over
several months saved because of the medium.)
If I had a world of my own, everything would be
nonsense. Nothing would be what it is, because everything would be what it
isn't. And contrary wise, what is, it wouldn't be. And what it wouldn't be, it
would. You see?
An odd and sad experience the day
before I sent this letter off to be duplicated.
My dad's older brother (by 11 years) had 7 children, 5 of whom were older
than I am, and by well over a decade in the case of his oldest children. All 7, of course, are my first cousins—and
they are my only first cousins because my mother was an only child. We were closer to my mother's side of the
family than my father's, so I didn't really see these cousins all that much
while I was growing up or during my adult years. I perhaps see/saw them once every few years
at a family gathering or when traveling and happened to be in the city where
some of them lived.
As I noted earlier in this letter, Kathy
and I went to the funeral of one of those cousins in October.  She was the 4th of the 7 children
to die; she was nearly 80, so not a youngster.
I learned in early December that another of those cousins, Donna, the
next oldest, was murdered by her grandson (whom she had raised); the grandson
also murdered one of Donna's children.
He stabbed them both to death.
This happened in San Diego, and the local CBS affiliate had three
different articles about the crime. (I
should probably say, to be technically correct, that the grandson is the alleged killer, but I infer from the
news accounts and from the relatives in California that there's no doubt.) It seems that the grandson had severe mental
problems, so this may not have been a crime committed by a fully rational
person. I extended my condolences to my
cousins, including ones that I have never met (other than, now, by email).
Even though this event is fairly remote
from my life—I last saw Donna about two or three years ago at a family
gathering in Wisconsin, and it had probably been a decade since I'd seen her
before that—it still gave me the creeps to think that I now knew someone who
had been murdered, and that it was a member of my own family. For most of us, something like that happens
to other people, not me. When I learned
about the events, I also recalled that Krystin played with the killer when they
were both very young, perhaps 3-4 years old, at a family gathering. This was all on my mind for a couple of days
afterwards. The events did, however,
connect me with Donna's daughter, my cousin, about family history, in which she
has an interest, so I was at least glad that happened.
And never, never, dear madam, put 'Wednesday'
simply as the date! That way madness
lies!
A friend of mine, upon reading the 2014
letter, wrote back to me with an Oscar Wilde quote of his own: "Oscar Wilde said 'Every portrait that
is painted with feeling is a portrait of the artist not of the sitter.' Your writings are windows through which you
are clearly visible." I suppose
that's inevitable, but it isn't my intent.
With best wishes
for 2016!