December 1, 2013
Warm greetings.
I hope your lead-up to the holidays
is proceeding famously and you're looking forward to good times with friends
and family through the beginning of 2014 (and, of course, thereafter).
Although this letter got a bit long,
even by my standards, I cannot in truth beg Pascal's famous 1657 apology ("I have made this longer than usual
because I have not had time to make it shorter") because it would be a
lie; I had the whole year to make it shorter.
I chose not to because I had a great deal of fun with it
this year. In addition to the usual news
and family notes, I found myself taking up a series of (what for me were) diverting
topics from the byways of human behavior and thought. I hope you enjoy them as well.
I'm
continuing my practice of the last two years of using quotations from some of
my favorite writers as subject dividers.
This year's sources are Dorothy Parker (1893-1967), the poet, short
story writer, critic, and wit, and Ambrose Bierce (1842-sometime after 1913),
the journalist and critic who also wrote The
Devil's Dictionary, a few selections from which I include (the ones that
begin with the word in CAPS). Bierce, like Mencken, can tend to be on the
dark side (he was known as "Bitter Bierce"); Parker's quips are
mostly on the lighter side. She was
perhaps most well known for her writing for the New Yorker and for being a founding member of the Algonquin Round
Table, a group of New York City writers, actors, critics, and others who met
for lunch at the Algonquin Hotel for roughly a decade, from 1919 to 1929; she
was also nominated for two Academy Awards for scriptwriting and later
blacklisted during the McCarthy era. (One
of Parker's friends in the group was the actor and essayist Robert Benchley, a
few of whose wisecracks I'll also toss in.)
Bierce and Parker never met but it is clear, from a textual analysis of
Parker's work that I read, that Parker was influenced by Bierce.
INFANCY, n. The period of our
lives when, according to Wordsworth, 'Heaven lies about us.' The world begins
lying about us pretty soon afterward.
Years are only garments, and
you either wear them with style all your life, or else you go dowdy to the
grave. (Parker)
2013 began with the coldest day of the
winter thus far. It was -5 degrees when
I awoke on January 1. Nowadays we call
that extremely cold. When I was growing
up, -30 was extremely cold and -5 was merely a bit cool.
You just have to love living in the
northern plains. Kathy and I were
driving to work in mid-March and the radio announcer on Minnesota Public Radio
said "tomorrow is the first day of spring.
The wind chill will be 10 below zero" (it was actually 15 below the
next morning). It did seem to many of us
that the cold and snow lasted longer than usual this past spring. However, I received commentary on the subject
from a University adjunct faculty member, climate scientist Kenny Blumenfeld, via
a periodic email he sends out to those of us on his mailing list; this came
March 21:
As for our actual weather, some readers
apparently have found it to be noxious indeed. The winter has hung on a
bit longer than what we have gotten used to in recent years, and March has been
legitimately cold relative to what's normal. Still, this is really
nothing. The coldest temperature of the winter in Minneapolis was -13,
and we went to zero or below 12 times. Before 1983, we averaged over 30
such instances per year. Since then, we've done it 22 times, on
average. So this winter clearly continued the unmistakable trend of not being
able to produce substantial and lasting cold weather. It went easy on us
yet again. And do I need to remind us all that it has rained close to ten
times this winter? Not exactly the signature of a healthy, old-fashioned
winter.
I always find these
reminders from people who have the data to be useful. And Kenny confirms my snarky statement about
what winter was like when I was growing up. :)
If I didn't care for fun and
such,
I'd probably amount to much.
But I shall stay the way I am,
Because I do not give a damn. (Parker)
MAYONNAISE, n. One of the sauces which serve the French in place of a state religion.
Milder or not, there was still snow
and cold, so Kathy and I decided to get away from them for a short time, a
prescription I have always found makes it easier psychologically to deal with winter: being somewhere warm and green for even a week or so cuts up the bleakness.
So we spent ten days in Florida in mid-January. The trip also provided an opportunity to
catch up with a high-school classmate and his wife, Bill and Debbie Merriman, with
whom I've remained friends since high school/college, and the University's
former director of women's intercollegiate athletics, Merrily Baker, who's been
a friend since I met her at the national meetings of the Association for
Intercollegiate Athletics for Women in the late 1970s. We had a delightful time visiting both, in
St. Petersburg and Boynton Beach, respectively, and it was good to renew the
friendships after more than a decade.
Inasmuch as I am married to an art
history major who loves the works of Salvador Dali, the one item on the trip
itinerary that was non-negotiable was a visit to the Dali museum in St.
Petersburg. I had visited it the last
time I had been in Florida, with Pat and the kids over the New Year (2002 into
2003), and it was in a somewhat nondescript building. It has since moved into a spectacular
facility, part of which, the docent informed us, is built to withstand a
category 5 hurricane. So the staff no
longer must move the collection to safe ground every time there is a major
storm—they just move it into the "bunker" (that is also the primary
display area of the museum). We and the
Merrimans agreed afterwards that our docent's tour of the museum and the major
Dali works was one of the most outstanding guided tours we had ever had. I'm a so-so Dali fan but I do find his work
interesting.
It's an accident that the U.S. has
such a large collection of Dali pieces; the Dali museum has the largest
collection outside Europe and one of the largest, period. A guy who went into the plastics business
right after World War II, and made a fortune, also (with his wife) became friends with Dali—and Dali was at the dedication of the first Dali museum in Cleveland. They collected his works, and eventually put the collection in Florida after a local attorney in St. Petersburg saw an article about the collection and, with financial support from
the city and the state, persuaded the couple to put it in St. Petersburg. It is well worth the visit, even if one only
has lukewarm opinions of Dali's work.
We also went to (as it turns out, a
very small) exhibit of the work of Dale Chihuly, who does rather interesting
things with glass. I figured that with
the admission charge, we paid about $1 to see each piece on display. Even though the items on display were fun and
interesting, I'm not sure that was the right ratio between dollars paid and
items viewed.
We are suckers for local arts and
crafts fairs, and found one in Dunedin.
We of course came away with a couple of items for the house. Just what we needed.
On the way from the Merrimans' to
Merrily's, which was a diagonal trip across the state from the northwest corner
of the peninsula to down the southeast coast a bit (although north of Miami), we
stopped at a place we'd never heard of.
We thought we'd driven past the exit, so kept on going, then got a
second chance, so decided "what the heck" and drove in. The Bok Tower Gardens is one of those fascinating
little sites that doesn't make it on to a "must do" list but that
rewards the time spent.
Mr. Bok, who made a pile of money as
editor of the Ladies' Home Journal,
decided to create, in the middle of Florida near their winter home at Lake
Wales, a botanical garden and bird sanctuary that included a bell tower with a
very large carillon. (Bok's wife was the daughter of the founder of
Curtis Publishing, which published the Ladies'
Home Journal and the Saturday Evening
Post.) There are now several hundred
acres of garden designed by Frederick Law Olmsted as well as a carillon tower
over 200 feet tall and about 50 feet on a side, built of light pink and tan stone, that is
an art deco masterpiece (dedicated by President Coolidge). The same guy who did the art deco sculptures
on Rockefeller Center did the ones on the Bok Tower. We spent a very peaceful couple of hours
walking in the gardens being amazed by the enormous diversity of exotic plants
and flowers. (One of his grandsons was
the long-time president of Harvard—1971 to 1991—Derek Bok.)
The
tower has bronze sculpted doors by Samuel Yellin, who, we are informed, was
America's premier metalworker at the time. The sculptures (which one cannot get
close to because they are fenced off) depict events from the Book of Genesis. I was reminded immediately of the bronze doors
with relief sculptures by Lorenzo Ghiberti on the Florence Baptistery (one pair
of which was declared "the Gates of Paradise" by Michelangelo).
The drive on "Alligator Alley"
from Miami to Naples is not that thrilling.
As Kathy observed, much of the trip is like driving "across a wet
Nebraska."
We once again owed our friends the
Dixons a big "thank you." They
were not using their condo in Naples, so they gave us the keys. We mostly poked around the area and enjoyed
the warmth and water. We picked the
wrong day to visit Sanibel Island, a place Kathy had visited with her parents in the early 1970s; unfortunately, the weather turned cool, very windy, and overcast, so we couldn't
wander the beaches. We managed only a
tram ride through part of the Everglades.
Back in Naples the next day, however, even though windy, we had a long
narrated boat ride in the mangrove jungles of the Everglades (narrated by a
recent college grad who grew up in La Crosse, Wisconsin). Those jungles are such alien places—but they
are also home to some of the most interesting and beautiful birds in North
America. We saw quite a number of them.
Our last night in Florida we took a
dinner cruise out into the Gulf of Mexico from Naples. What catches one's attention on the way out
are the enormous houses built on the waterfront. Some of them surely exceed 20,000 square feet
and illustrate what one might call exhibitionist architecture. (Kathy commented aptly that the homes on
Summit Avenue in St. Paul and around Lake of the Isles in Minneapolis are
rather modest shacks compared to these waterfront homes. I can only add that her comparison was not
hyperbole.) I describe them as obscene
displays of wealth. I would not use that
term if the majority of those who build and own such homes did not also throw
substantial political support to those who do the most economic harm to those
at the lower end of the socio-economic scale—by fighting increased minimum
wages, opposing good universal health care, opposing taxes to support good schools, pay decent salaries to teachers, and fund programs in desperately poor communities, and seeking to increase and solidify the vast disparities in wealth and income in this country, and so on.
Many (not all, I am aware) of those homes were built on the backs of
people laboring under conditions none of us would like.
But being with Kathy, I enjoyed the
cruise anyway, and we saw a spectacular sunset.
Which was the point.
During
our Florida stay I was reminded of my friend Scott Eller's advice when I asked
him, in advance of going to Paris some years ago, what he'd recommend we see
and do. He gave me several suggestions
but said that the best thing to do was to just wander along the Seine and enjoy
being in Paris. It was good advice and
it applied equally to south Florida, although for very different reasons. As Merrily observed when we were visiting,
there are not a lot of major cultural, architectural, or other kinds of sight-seeing
places in south Florida—the point, in the winter, is simply to be in south
Florida and enjoy the weather and the beaches.
So several times we simply walked the beaches and reveled in not being
in Minnesota in January.
This
trip was a departure from our usual approach.
As I have commented before, I divide the world into two groups of
people, those who travel and those who vacation. Of course there is a middle ground. Kathy and I usually travel. This time we pretty much just
vacationed. I am more inclined to
vacation when I have not spent a fortune on transportation and lodging; when I
do, I want to get my money's worth and see everything there is to see!
In America there are two
classes of travel - first class, and with children. (Benchley)
PRESENT, n. That part of eternity dividing the domain of disappointment from the realm of hope.
We had hoped that the worst of the
cold weather would have been over in Minnesota by the time we returned. But no, we got punished for going to Florida. The Sunday night we returned the temperature
went down to -12, the coldest of the year, and the week after we got back was the
coldest week of the year, with almost every nighttime temperature below zero, and
sometimes well below zero. What a
shock. Of course, we could read the
Minneapolis weather forecast while we were in Naples, so it made the return to
Minnesota all the more depressing. I
really have never minded the winter, but spending time in a tropical climate—as
I was informed south Florida is—in part of the winter months does make one wish
that winter were shorter. (I know,
however, that I want to be nowhere near Florida in the summer.)
To
add insult to injury, I woke up the Thursday morning after we got back (on the
previous Sunday) and thought I was a little chilly. I was; the temperature in the house was at 57
and heading down. So at 6:00 in the
morning I was on the phone to plumbing and heating places that have emergency
service.
I had meetings at work so had to
leave; fortunately, Kathy could rearrange her schedule. So she met with the service guy who, after much testing of parts, and after seeing that the furnace (which is supposed to be serviced every year) hadn't been serviced in over a decade, concluded that it had to be replaced. (I did not know one is
supposed to service a furnace every year.
I do now.) Thursday night the
house was heated with space heaters (which the furnace company kindly provided),
which at least kept the temperature at about 60 degrees (and it didn't get
below zero on Thursday night, thank heavens).
So a guy was here all day on Friday and, $4200 later, we had a new
furnace. This is one of the less
exciting purchases one can make in life, right up there with the washing
machine and the water heater.
Kathy and I have, mostly, prided
ourselves on being hardy Minnesotans of good Scandinavian stock. I have to say, however, that after spending
some time in south Florida in January, I'm rethinking how hardy I want to claim
to be. Perhaps it is simply advancing
age, but the idea of getting away from winter for a period is becoming more and
more attractive. Maybe I'm just now
figuring out what the Midwest snowbirds figured out a long time ago.
I've never been a millionaire
but I just know I'd be darling at it.
(Parker)
MONEY,
n. A blessing that is of no advantage to us excepting when we part with
it. An evidence of culture and a passport to polite society.
Supportable property.
Clothing is something I rarely
consciously plan to purchase. I've
almost never set out to go to Dayton's or Jos. Bank or wherever to buy
anything; most of the time it's "gee, we're at Rosedale, let's see if there's
anything on sale at Bank." So I was
wandering around Macy's at Southdale last spring, waiting for Krystin to get done
with the doc giving her shots in her eyes, and I found a clearance tie table, with ties at $10-$15 each. That's in the price
range I like for a tie. There were a
couple that jumped out at me that I really liked—and then I saw that they were "Donald
Trump Executive" or something like that.
Even though they were actually only $9 each, I decided I just couldn't
do it. I didn't want the inventory of
ties with that name on it to shrink by even one bit because of me.
My father always thought one of the
more puzzling questions in life is why there are more horses' asses in the
world than there are horses.
IMPUNITY, n. Wealth.
PRESENTABLE,
adj. Hideously appareled after the manner of the time and place.
There
was another vexing incident upon our return from Florida. I have an old Glenwood water bottle into
which I have been depositing my odd change for years and years. It had, in recent months, finally gotten
almost full. I noticed when dumping some
change in it after we got back, however, that the level of coins was down about
4-5 inches.
My first and only thought was
that one of my kids had taken money. I
asked them both about it and they vigorously denied it. I was heartsick, because it seemed apparent
that it could only have been one of the two of them—and one of them clearly had
to be lying. It didn't make sense to
me—Krystin had a reputation when she was younger of taking things that were not
hers but that was a long time ago—and she has a full-time job with a decent
salary and enough money to do what she wants.
Why she would steal coins from my jar was a puzzle. And Elliott, I had always thought, simply
would never do such a thing—besides, he'd been frugal about his modest income
from his job and various birthday and Christmas gifts and had at least a couple
of thousand dollars in his savings account.
But as I asked them both, what was I to think? As I said, I was just sick about the black
cloud over my relationship with the kids.
(It wasn't that the money made any difference—change taken from a jar
was hardly going to affect the household economy—it was the principle and the
loss of trust.)
Then Elliott thought about it a
little more, and confessed he had had a lapse in judgment (unusual for him): He had had friends over to play video games,
and he and others had left a friend of a friend in the house while they went
out. He said he would pursue the issue. He did so with his good friend Jake, who was
a friend of the other guy; Jake apparently agreed with Elliott that this other
kid was certainly the suspect and said that he—Jake—was also responsible. Jake followed up—and got a copy of the
coin-machine deposit slip from the kid!
$305.72.
I was astonished; I asked Elliott if
Jake had threatened the kid with cement galoshes or something. All the kid would have had to do was flatly
deny taking the money and that would have been that—although Jake and Elliott
were firmly convinced he did it. In any
event, as soon as Elliott learned how much had been taken, he promptly
transferred $305.72 from his checking account into mine. I told him that even though he had made a
mistake, I wasn't necessarily prepared to see him receive that kind of penalty,
given that he works for $8 per hour and his money is hard come by—I would let
him pay me back some of it but I'd just forget the rest. He wouldn't hear of that and said he would
get some of the money back via Jake. I
don't know if he got any more, or if Jake paid him some of it, but Elliott
wouldn't hear of my receiving less than what had been taken and indicated the
matter was closed as far as he and I were concerned. (It was interesting to me that when I told
Krystin that Elliott had transferred the money, she was dismayed that he had to
pay it all back from his own money.)
Krystin was also very relieved that
the culprit was identified, and apparently (I was told later) there were some
rather heated text messages between Elliott and Krystin. She was worried she'd be permanently under
suspicion and she blamed it on Elliott, and she told him so (when she knew she
hadn't taken the money so it could only have been him). I was just plain relieved that the shadow
over my relationship with the kids was gone.
This is not a novel to be
tossed aside lightly. It should be thrown with great force. (Parker, maybe)
SELF-ESTEEM,
n. An erroneous appraisement.
I
had to return to Tampa 11 days after Kathy and I came home from Naples: I serve on the Steering Committee of the
Coalition on Intercollegiate Athletics (COIA), which is a collection of faculty
leaders from about 60 of the universities with major athletic programs. COIA's goal is to seek reforms in college
sports to lessen the threat such major programs can pose to the academic integrity of the universities. (Some of our colleagues' institutions do not always admit students who are prepared for college, may from time to time provide them illicit benefits, and may give them passing grades for work that does not warrant them, to name but a few things that have been known to occur, to put it mildly.)
Many
inside higher education, and many outside it, believe (with good reason) that college sports is out of control in a number of ways. The salaries paid to some coaches are grossly disproportionate to those in the rest of higher education (including the salaries of the presidents, much less the faculty), the role that alumni and boosters play in some places distorts the academic mission, the practices of some coaches are contrary to the educational mission of the institution, and so on. The NCAA office, I think, shares that view but it has been unable to find a path back to a more acceptable set of practices that would embody amateurism and the collegiate model for
sports. University presidents were
polled a few years ago (by a distinguished national body, the Knight Commission
on Intercollegiate Athletics) and they essentially said they had no control
over athletics—they just hoped that any scandal at their school happened under
the next president. The NCAA itself
cannot exercise effective control because to do so would violate anti-trust
laws.
We
think—hope—that the NCAA has finally come around to the point that COIA has
been making all along: The only group
that can really exercise any control is the tenured faculty, because they are
not (as) vulnerable to boosters or to being fired for taking controversial or
contrary positions—but they have never been empowered to exercise that
control. So at this COIA meeting, unlike
the previous nine, the NCAA asked COIA for advice and recommendations on how to
create effective faculty control. We
proceeded to try to develop some advice to give them. Asking 50-60 faculty members to agree on
advice about anything as complicated as the national, conference, and campus
governing structures for college sports—and the relationship between the three
levels—is usually an exercise in futility.
(The common wisdom in higher education is that asking or directing
faculty members to do anything is like herding cats.) In this instance, however, recognizing that
they had to do something, the group actually cohered around a conceptual
proposal (which, I am glad to say, I had a role in developing).
One
of the topics that we heard about at the COIA meeting was concussions—and we
were all shocked to learn that on average there are about 10,000 sports-related
concussions PER DAY in this country.
When one acquires that fact, and then learns that the medical evidence
is rapidly accumulating that repeated concussions almost certainly lead to
early-onset dementia, one begins to ask questions about something like, say,
football. (And about boxing, obviously,
but also about hockey and soccer, among others.)
A COIA colleague, a neurobiologist, told a group of us at dinner that
what happens to brain cells as a result of repeated concussions appears to be
identical to what happens to them with the onset of Alzheimer's. Others at the dinner table surmised that
football will be a very different game in a few years; the litigation alone
will likely change how it is played.
I am
so glad that Elliott had no interest in playing football or soccer or
hockey. I am so glad that *I* had no
interest in playing football or hockey (soccer, of course, was not really an
option 50 years ago). But this
presentation at the COIA meeting brought to mind something I hadn't thought of
for years: I had two concussions when I
was in elementary school, not from sports but from horsing around on small icy
hills on the way home from school. (I do
not actually know if a doctor determined they were concussions or if that's what
my mother decided; I think a doctor decided at least on one of them). I slipped and fell down and bonked my head; I
remember that in one case I was lying in bed afterwards and it felt like the
bed was tipping from side to side.
So I
asked my neurobiologist colleague if I was at risk for early-onset
dementia. He was pretty sure not because
I was so young and the concussions were not "repeated" in the way
that they are in a sport. I suppose I
could ask my doctor, but I haven't.
ACADEME,
n. An ancient school where morality and philosophy were taught.
ACADEMY, n. [from ACADEME] A
modern school where football is taught.
Misfortune,
and recited misfortune especially, may be prolonged to that point where it
ceases to excite pity and arouses only irritation. (Parker)
The
most interesting event at the COIA meeting was a fascinating after-dinner talk
by John Carroll, the former editor of the Lexington
Herald-Leader, the Baltimore Sun,
and the Los Angeles Times.
I
didn't even want to attend his talk because this meeting had the most brutal
schedule I've ever run into at a professional gathering: Between the time I arrived at the meeting at
3:00 on Friday afternoon and the time I left for the airport on Sunday morning
at 10:00, I was in 21 hours of meetings/sessions. You can do the calculations to figure out how
much free time there was. By Saturday
night, when Carroll spoke, I was ready to just go sit in my hotel room. But I'm glad I went.
Anyway,
Carroll related the events surrounding the Herald-Leader's
Pulitzer-Prize-winning investigative reporting of rules violations in the
University of Kentucky men's basketball program in 1985. The most recent entry in Wikipedia informs me
that "Kentucky has both the most all-time wins (2107) and the highest all-time
winning percentage in the history of college basketball (.763)." As a result of the investigative reports,
among other things, a bullet (probably) was fired into the newspaper's press
room windows, a delivery guy in the process of delivering a paper was chased
away by the homeowner, and a woman driving a truck for the newspaper had her
window spat upon. Carroll said they were
deluged with letters, mostly vituperative; one of them, from Crab Orchard, Ky., made
a point that I and others at the meeting found striking: The author asked the newspaper not to do such
reporting in the future because "It's all we got" (that is, Kentucky
basketball).
Carroll
related another story in an email to me.
He had a friend
from
Inez, Ky., whose high school was a winner of multiple state basketball
championships. High school basketball in
Kentucky is a lot like the movie Hoosiers.
My friend, being from the mountains, was
self-conscious about his accent and about the Appalachian image (and reality,
in some cases) of poverty and substandard education. When he went off to Morehead State University,
he said, he felt more self-confident because of his hometown's basketball
success. "I didn't have to be
ashamed," he said. That, I think,
reflects an element of the thinking that makes University of Kentucky
basketball so emotionally gripping to much of the population.
Carroll,
in the process of writing a book about the events in Kentucky, told us that
many in Kentucky feel that in the course of its history outsiders have come in
and taken advantage of them. Some of the
chief villains in the story have been the coal-mining companies and the way they
obtained mineral rights from people. Many
in the state feel oppressed and angry toward outsiders and defensive about
things they value, such as U of Kentucky basketball. That wrath was directed toward the newspaper
when it uncovered major violations in the program.
I
and several others came to a realization, latent up to then for some of us, of how
different the cultures and attitudes towards athletics are. I have had season tickets to U of M Gopher
football for over 3 decades and enjoy the games, and Kathy and I go
intermittently to other athletic events.
I can say without hesitation, however, that I have never lost a moment
of sleep because of the outcome of an athletic event. They are games, mostly fun to watch, but of
no moment in the larger scheme of our lives.
But that plea from the local writer to the editor of the newspaper, "it's
all we got," brought into sharp relief for me the circumscribed nature of
the lives of people who were perhaps both geographically and intellectually
isolated (this was before the age of email and the Internet), perhaps less well
educated, who had not been treated well by the world at large, and the
emotional attachment they have to local sports teams. I know that there are rabid fans of
professional sports teams as well, but are they so attached to their teams that
they would plead that "it's all we got"? (I don't know. While I know plenty of Vikings and Twins
fans, they're all lawyers and professionals who would never describe the role
of the teams in their lives that way.
But perhaps there are folks even in the metropolitan Twin Cities who
live and die with the Vikings.)
After
I got home and told her the story, Kathy's instinctive reaction to the response "it's all we got" was that it was pathetic. So was mine.
But as we talked about it, we also came to have a sense of dismay for
people whose lives are so entwined with a college (or professional) sports
team—and we were startled by the phenomenon as well.
Carroll
and a couple of others of us at the meeting met in the bar after his talk. We were speculating on why those in rural
Kentucky—and Alabama and Tennessee and other places—develop such devotion to a
college sports team. One factor might be
the South; it may be that adoration of sports is more widespread there than in
other parts of the U.S. (For college
football fans, one can look at the dominance of the Southeastern Conference as
further evidence of the proposition about athletics in the South.) Maybe I'm wrong, but I can't see the same
kinds of things happening if there were a major investigation of the Iowa
Hawkeyes or the Minnesota Gophers (more likely, in the Minnesota case,
embarrassment and anger at the rules violations than anger at the press).
Another
factor may be the urban-rural divide; Carroll told a story about how entire
small towns in Kentucky close up and go to the state high school basketball
championship if their team makes it into the tournament. If there are none of the distractions of
urban life—and Kentucky is hardly a heavily urban state, and would have been
even less so nearly 30 years ago—it is easy to see that the local sports teams
acquire more importance in the social scheme.
One has a hard time imagining the people of Los Angeles or Seattle
reacting so viciously if the newspaper or other media found athletic program
violations.
So
although I didn't expect to do so, I came away from this meeting with a greater
appreciation of how much different a world I live in, when it comes to college
athletics, compared to those in some other parts of the country.
I
also think Carroll was right in one of his comments toward the end of his
talk: We in this country made a bad
bargain when we combined major spectator athletics with higher education. Not that it was ever a conscious decision,
because the combination simply evolved from student games in the late 1800s and
grew like topsy, but at best the evidence is mixed about whether they have been
boon or bane for colleges and universities.
My own view is that they have been more bane than boon; at this point,
however, that's irrelevant because major colleges and universities are not
going to eliminate their athletic programs.
So I participate in COIA in the faint hope that we might achieve some
modest changes in order to corral athletic pressures on the academic side of
the house that can distort the institution's values and mission. Many would say that I/we are tilting at
windmills.
(Carroll
read these paragraphs and wrote to me that I had presented the sentiments
faithfully.)
No
sooner had I written these paragraphs than came an article in the New York
Times (3/8/13) reporting that
Passionate sports fans can
experience a "cheap high," but it only becomes a problem if they shut
out other things in life, researchers said.
. . . In the 1990s, researchers began finding that "highly
identified" fans experienced higher levels of arousal — measured by heart
rate, brain waves and perspiration — and had fewer bouts of depression and
alienation than nonfans. Contrary to popular belief, this research suggests,
hard-core fans even have higher self-esteem.
"It's a source of validation for their self-conception," said
Susan Krauss Whitbourne, a psychology professor at the University of
Massachusetts Amherst, who has written extensively about the mental states of
sports fans. "For the really true fan, it can become a problem if they
shut out other things in life. But the positive side is that it's fun and
exciting. It's a cheap high. It's not that they don't have a life."
It
was interesting that the examples they used (in the article, not included here)
were all from college sports. I wonder
if the same is true of fans of professional sports, and whether there is a difference across sports:
are rabid football fans different from rabid hockey fans? One suspects that the crux of the matter
comes in the "it only becomes a problem if they shut out other things in
life" qualifier. Or if they have
other things in life, period. The fans
that Carroll described—and perhaps others who are rabid followers of a team—may
have lives that don't have much else in them.
DISTANCE,
n. The only thing that the rich are willing for the poor to call theirs,
and keep.
If
all the girls who attended the Yale prom were laid end to end, I wouldn't be a
bit surprised. (Parker)
As has sometimes been the case with
certain friends of mine, they (a couple) in their annual letter last year raised
the question of "intentionality."
They will, I hope, forgive me if I take their use of the term and turn
it to a line of thought they didn't necessarily intend. (As one of them wrote, "as for
intentionality, I have to confess some ambivalence. I spend much of my days and nights enthralled
by my own to-do lists, and my medium-term future feels pre-ordained by post-it
notes and planner entries." To
which Kathy responded, when I read this aloud to her, "just like the rest
of us!")
I chose to take their introduction
of the notion of intentionality and think about how I want to spend the rest of
my life: What do I want to accomplish,
how do I want to spend my time? As Kathy
and I talked about it, I came to draw a distinction between "what do I
want to do/how do I want to spend my time?" and "what do I want to
accomplish" in the sense of "what legacy do I want to leave?"
It
goes without saying that I—and surely almost all parents—want to leave behind
children who are happy and educated and thoughtful citizens. Assuming something approaching a normal
lifespan, however, my ability to affect those outcomes has long ago
dissipated. To whatever extent I played
a role in that—and it's not clear to me how much of a role any parent plays in
the "happy" and "thoughtful" parts—I think Elliott and
Krystin are thoughtful and happy, and they are certainly educated.
A story I told at a February dinner that
my colleagues were so kind as to have for me (more on that later) revealed one
legacy I leave whether intended or not:
Several years ago I was walking across campus with a faculty colleague,
and she commented to me (I paraphrase) that no matter how much a faculty member
may publish during a long career at the University, I will undoubtedly hold the
record for the most words written under the University's auspices. I suppose that's a legacy, of a sort.
I am quite clear in my mind about
how to spend my time on a day-to-day basis:
Inasmuch as I am not convinced that there is any hereafter, what I do
with my remaining time should be mostly spent in activities in which I enjoy
engaging. If I'm not going to a heaven,
I better make the best of what time I have.
(And certainly if I'm going to a hell, I damn sure better make the best
of the time I have!)
It is, of course, a little vain to
think about one's "legacy" if the sense of the term means something
for which one will be remembered. One
can leave a legacy for one's children, or larger family, or friends, or city,
or country, or the whole darn world.
Picasso and Plato and Einstein left a legacy for all of us, to some
extent or another. It is silly for most
of us to think about leaving a legacy of that magnitude and I certainly don't. But I think about leaving something behind
that makes the world a little better off or that provides something of value,
trivial or great. (For example, at least
for some period, P. D. James and Frank Herbert have left behind interesting
works of fiction that others find enjoyable.
Obviously, great authors such as Austen, Brontë, Dickens, Dostoyevsky, Twain, etc., have left
behind more momentous works.)
It is fun to toy with the idea of
writing a detective story (or stories) set in higher education (at the U of
Minnesota, although that's not
important), and I even have the opening plot events for several in mind: the head football coach dies in a mysterious
car crash in a snowstorm in Michigan; several people die from the malicious
release of organisms from a bio-safety level 4 lab; the university president is
found dead in one of the steam tunnels; a sculpture professor is found beaten
to death with one of her own sculptures; a biomedical engineering professor is
found dead in a corporate lab with diagrams of some new device—the list is
endless at a major research university. Unfortunately,
I have no idea what to do with those great opening scenes because I lack the
creative imagination to develop a plot that is both complicated and
interesting. Besides, I don't like to
know the end of a murder mystery before I read it, and if I wrote it, I'd know
the ending, so it wouldn't be interesting to me.
Several people have urged me to
write a lively history of the University of Minnesota (covering, say, the last
50 years or so). I could do that, but
the problem with writing an interesting history that is both factual and covers
the range of human emotions that were involved in events (for example, the
tumultuous departure of the president in 1988, or the removal of the athletic
director in 1989, or the controversy over the Najarian case, or the heated
tenure debate in the mid-1990s) is that living people would be hurt. Good history, in my view, has an assessment
of events from various perspectives, and I know that the perspectives on events
would not be positive toward some of the individuals who were involved. I'm not sure what the gain for posterity
would be other than titillating reading about events at a university. I also suspect the book would sell about 25
copies, at most.
Whatever it is I decide to leave
behind that I hope is of some benefit will probably have to be written. I write.
One of my dad's legacies is a
sandwich, peanut butter and raw onion slices on toast. I love them; Kathy and the kids screw up
their faces in distaste whenever I say I'm going to have one. One of my grandmother's legacies is fried
Spam slices (with a dash of garlic salt) on toast. As you can tell, I come from a long line of
fine food connoisseurs.
I
hate writing, I love having written.
(Parker) (Me,
too)
IMPROVIDENCE,
n. Provision for the needs of today from the revenues of tomorrow.
So it ever has been, is, and ever
will be. One winter Sunday afternoon
Kathy and I went out to look for a new sofa for our living room. The furniture in the front half of our house
mostly dates from the spring of 1997, when Pat and I finished
expanding/remodeling the house. When we
were done, we had half a floor of new social-space rooms and nothing to put in
them—and we had a dilemma. We had two
kids, 6 and 12, so we could either (1) get decent quality furniture and then
constantly worry about it and nag the kids to be careful/not spill/tell their
friends to not use it, or (2) get less-expensive furniture and not
worry too much about it. Of the many
things parents can nag kids about, furniture, in my opinion, shouldn't be one
of them, so we opted for the Slumberland tent sale furniture (which we actually
did get at a Slumberland tent sale). It
was functional but not fashionable.
That furniture was not intended to
be permanent, but for the most part it hadn't been replaced, and now it was
well past its expiration date, so to speak.
So Kathy and I went to look for a sofa (which is what we needed
most). We ended up with a new sofa, a
new chair, a new rug, and new drapes, completely changing the color scheme
of the room. After all that, we had to
get new table lamps. How does this
happen? We know—we sort of played off one
another while browsing the furniture and one thing led to another. I'm still not sure we ended up with stylish
and fashionable, but it is at least different.
Look at
him, a rhinestone in the rough. (Parker)
CABBAGE, n. A familiar kitchen-garden vegetable about as large
and wise as a man's head.
Nothing is more responsible for
the good old days than a bad memory.
(Benchley)
When we were in Scotland, I noted at the time
that I had had this reverie (half dream, half daydream, while half asleep in
bed) that I had been transported back to the constitutional convention in
Philadelphia in the summer of 1787. I've
thought about that ever since, in idle moments, and more specifically what it
would be like to explain aspects of life in the 21st Century.
I suspect most of us do
not think much about it—to the extent anyone thinks about life in the late 18th
Century at all!—but this year I realized with somewhat of a start that "the
material realities of daily life changed little for most people in Western
societies between the 12th century and 1787" (in the words of an historian
friend of mine in the History Department).
In both medieval times (and before) and in 1787, the darkness was lit by
candles, transportation was by horse or foot (or sail), there was no indoor
plumbing and no electricity, no central heating (or air-conditioning, which I
would have been unhappy about not having in the hot summer of 1787), no easy
access to clean water, no paved roads, no scientifically-based medicine and no
antibiotics or vaccinations, and so on.
The size of cities grew in that 500-year period, but daily amenities
were not much different.
Most
of the major improvements in our daily lives did not come until the 19th
Century, with the invention and growth of the railroads, steam engines, and so
on, and the introduction of the major conveniences we now enjoy didn't fully
evolve until the 20th Century. So
finding oneself in Philadelphia would be a shock to us—and to the Founding
Fathers. Imagine trying to explain a
wristwatch run by a battery, indoor lighting and plumbing, a dishwasher and a
vacuum cleaner, paved roads and automobiles, much less a cell phone, a laptop,
airplanes, Facebook and Twitter, or contact lenses. Or weather satellites or landing a man on the
moon.
Another modern "convenience"
that the 18th Century lacked was anti-perspirants. By all accounts, the summer of 1787 was hot;
the temperatures were in the mid-90s.
The paintings from the era, of the constitutional convention, have all
of those men sitting around fully and formally dressed; it doesn't appear they
were writing a constitution in t-shirts and shorts. Can you imagine the BO in that room? Bathing was infrequent, so perfumes were used
to disguise the smell, and that was an affluent group that would have had
access to perfumes/colognes, but even so, with the windows purportedly closed
during the proceedings, it must have been pretty ripe.
I had an exchange of emails with
Karie Diethorn, the Chief Curator at Independence National Historical Park, who
was kind enough to provide me with details about the daily conduct of the
constitutional convention and the clothing the Founding Fathers wore. I wrote to her to ask if the convention
members were actually dressed as they are portrayed in the portraits and whether they really kept the windows closed.
Clothing
is a cultural construct as well as means of keeping warm/cool. In the context
of the 18th century, social convention dictated that the genteel (i.e. those
who didn't perform physical labor to make a living) when in public wore very
specific attire regardless of the weather. In the summers of 1776 and 1787, the
men who attended the Continental Congress and the Constitutional Convention
would have worn shirts, waistcoats, coats, britches, stockings and shoes at all
times as was their culture's social custom.
To
deal with heat and humidity, those meeting in Independence Hall would block
sunlight from their rooms using window shades or Venetian blinds (there are no
interior or exterior shutters on Independence Hall). They could fan themselves,
and they could adjourn regularly for refreshment.
The
tradition that the Independence Hall windows were closed during the Declaration
and Constitution debates to protect the confidentiality of the proceeding is, I
think, rather problematic. The first floor windows of Independence Hall are ten
feet above the ground surface, so no one could lean on the sills to listen in.
Also, the street in front of Independence Hall would have been quite busy with
pedestrian and horse-drawn vehicle traffic which would have made a lot of
noise. And, in the case of 1787, the delegates to the Convention weren't
committing treason, so being overheard wasn't an issue (actually, I doubt
whether in 1787 the public was much concerned with what the Convention was
doing). In the summer of 1776, there was concern about secrecy, but I suspect law
enforcement personnel (e.g., sheriff's officers) could patrol the Independence
Hall perimeter to shoo off members of the public who might be loitering there.
I think Jefferson's remarks about keeping the windows closed during the
Declaration debates to shut out the flies meant that the windows were cracked
for air circulation but not open wide to allow in large numbers of insects.
That response from Ms. Diethorn
prompted questions from Kathy. Did they
change clothes when it was that hot? What
fabrics were their clothes made of?
(This question from someone who used to make most of her own clothes and
who, obviously, knows fabrics.) Wool
(and scratchy wool at that)? And did
they wear the wigs all the time? Ms.
Diethorn again:
It's
worth noting that our present day cultural standards are not readily applicable
to past eras (or even contemporary foreign ones). A genteel appearance in the
18th century included a clean face, hands, and "linen" (shirts for
men, chemises for women). Again, such a cultural context represented one's
non-manual labor status. On the other hand, personal odor wasn't an issue.
Women could use scent to mask odor (theirs and that of others, if you hold your
handkerchief near your face); men could smoke and/or chew tobacco which had the
added benefit of preoccupying the senses.
Certainly,
someone could wear the same clothing repeatedly. I do think, however, that in
the heat of summer the gentry rotated their outer clothing (i.e., men's coats,
waistcoats, britches; women's gowns) in order to give servants a chance to
brush, air, and/or mend garments.
Regarding
fabrics, it depended on the occasion (including season, activities) as to what
someone wore. Wool, linen, silk, and cotton were all available to the
18th-century gentry. Woolen fabrics like broadcloth can be actually quite
light, and they do breathe. Cotton was very expensive, but women did have gowns
made of it. Linen, a very breathable fabric, was extensively used in the
southern colonies during the warmest weather. Silk, a very lightweight fabric,
stains easily and so was generally reserved for fancy dress, but it could also
comprise a waistcoat worn nearest the body over a shirt.
Regarding
men's wigs, these received meticulous care from servants. The big problem with
wigs was lice. Servants combed and styled wigs regularly to prevent lice
problems. As for wearing wigs, it's something men in the Revolutionary period
were used to and tolerated for gentility's sake. By the end of the war,
however, wig wearing had dropped off considerably, particularly among younger
men who saw it as old fashioned and politically conservative (i.e., it
suggested the old order of courtly dress). The replacement for wig wearing was
powdering one's hair white to emulate the French fashion. Powdering one's hair
brought a whole host of problems—it was messy, time consuming, and required
regular touch ups—but at least it didn't foster lice.
Benchley wrote that the good old
days are due to a bad memory—but his comment only speaks to changes in a single
lifetime. This musing has led me to
realize, more than ever, that I have no desire whatever to travel back in time,
unless I could perhaps visit for an hour and then return home. (I remember that when I had that reverie
about finding myself in 1787, after the novelty of explaining 21st Century life
had worn off, I was suddenly and overwhelmingly depressed that I had no way to
get back to modern life or my kids and family and friends.) Ms. Diethorn at Independence Hall, in one of
her messages to me, wrote that "all this [the clothing, heat, lice, lack
of amenities] (plus the question of 18th-century dental care) informs the
answer I give when people ask me 'Wouldn't you love to live in the 18th
century?' I say, not on your life!" I laughed when I read that because it echoed
my sentiments exactly.
What
had changed dramatically between the
12th and 18th Centuries were ideas and their spread, advanced by Mr. Gutenberg
and his moveable type. The Church no
longer dominated intellectual life (virtually every one of the leading Founding
Fathers was skeptical about or openly hostile to organized religion) and
Enlightenment notions about society and the individual's role in it had upended
medieval thought. What I find striking
is that while the Founding Fathers were laboring—with modest but not complete
success—to create a new nation founded on the ideals they had come to know,
they were simultaneously living their daily lives in conditions that had not
changed appreciably in centuries.
Although
a much more condensed span of time, the accoutrements of daily life for most of
us have not changed much in 50 years, either.
My parents had a dishwasher, vacuum cleaner, washer, dryer,
forced-air heat, air conditioning, and of course an automobile; those existed in
1964 in pretty much the same form as they do today, with improvements on the
technical margins. Go back 50 more years
and that wasn't true for most people (and in some cases for none, since no one
had home dishwashers or vacuum cleaners in 1914). One wonders what technological improvements
can be made in daily life that would represent dramatic change; a little robot
that goes around vacuuming quietly all the time would be an improvement, as
would self-cleaning clothes and dishes.
The former could be on the horizon as robotics improve, but the latter
seem like a stretch. Exchanging
automobiles for some non-polluting, energy self-sufficient way to get around
would be a terrific advance, especially if one simply plugged in an address and
it went there on its own.
ACCIDENT, n. An
inevitable occurrence due to the action of immutable natural laws.
COMPULSION,
n. The eloquence of power.
Related
to the foregoing contemplation of the constitutional convention is news about
the president of Emory University. One
of the lectures I would have given the Founding Fathers would have been about the
terrible mistake they made in countenancing the continuation of slavery. (I know, anything else would have likely been
politically impossible and ended all hopes of a union—but then, I've also come
to think that perhaps a union of north and south wasn't such a great idea after
all, but I've commented on that elsewhere.)
One of the provisions of the constitution, one of the compromises
between slave and free states, was that slaves would count as three-fifths of a
person for the purpose of the census and determining the allocation of seats in
the House of Representatives.
The
president of Emory University, James Wagner, wrote a column in Emory Magazine about the challenges of
governing when politics are so polarized, as they are today.
One instance of
constitutional compromise was the agreement to count three-fifths of the slave
population for purposes of state representation in Congress. Southern delegates
wanted to count the whole slave population, which would have given the South
greater influence over national policy. Northern delegates argued that slaves
should not be counted at all, because they had no vote. As the price for
achieving the ultimate aim of the Constitution—"to form a more perfect
union"—the two sides compromised on this immediate issue of how to count
slaves in the new nation. Pragmatic half-victories kept in view the higher
aspiration of drawing the country more closely together.
Some might suggest that
the constitutional compromise reached for the lowest common denominator—for the
barest minimum value on which both sides could agree. I rather think something
different happened. Both sides found a way to temper ideology and continue
working toward the highest aspiration they both shared—the aspiration to form a
more perfect union. They set their sights higher, not lower, in order to
identify their common goal and keep moving toward it.
The New York
Times reported that "Leslie Harris, a history professor and the
director of a series of campus events that for five years examined issues of
race at Emory, said she was more troubled by the intellectual holes in Dr.
Wagner's argument. . . . [The comments are] 'a deep misunderstanding of
history,' Dr. Harris said. 'The three-fifths compromise is one of the greatest
failed compromises in U.S. history,' she said. 'Its goal was to keep the union
together, but the Civil War broke out anyway.'"
The faculty of the Emory College of Arts and Sciences
censured Wagner (but tabled a motion for a vote of no confidence). President Wagner apologized at length for the
comments. The reader comments in various
publications took widely varying views.
A retired professor of National Security Studies, at the National
Defense University, wrote a letter to the Chronicle
of Higher Education criticizing the critics. He (a white guy) had taught at a
predominantly Black university for a number of years, claimed he knew racism
when he saw it, and argued that Wagner
was not being racist when
he pointed out that the "Three-Fifths Compromise" at the
Constitutional Convention was an example of a useful bridging of differences on
political opinion. . . . In times like ours when the word "compromise"
seems to have become anathema to many, I thought Mr. Wagner's position was well
taken. What his critics don't seem to realize is that it was the slaveholders
in the South who wanted to count slaves as full persons—not three-fifths of a
person. It was delegates from the free states who insisted on three-fifths as
the measure of a person, as they did not want the South to gain additional
members of the House of Representatives. Where the three-fifths fraction came
from has always been a mystery. Why not four-sevenths or two-thirds? The answer
seems to be that three-fifths had been settled on as the amount of work a slave
did when compared with a free person. Those who measured a slave as
three-fifths of a person were the folks in the North—not the (obviously) racist
slaveholders in the South. The fraction was not about racism as much as it was
about political power.
The view of a number who wrote was
that Wagner wasn't a racist but he sure picked a bad example. I'm not going anywhere with this other than
to note that the compromises and agreements the Founding Fathers made continue
to reverberate, sometimes in odd ways, 226 years later. (Despite calls by some, Wagner did not lose
his job. And the faculty leadership at
Emory, the faculty members who would be most knowledgeable about his performance
as president, were wise enough not to want to lose a good president as a result
of a mistake: "We acknowledge the
hurt to our community caused by President James Wagner's use of the
three-fifths compromise clause in his column in the Winter, 2013, issue of the Emory Magazine. He has sincerely apologized for this mistake
in multiple venues, and he has held many listening sessions to hear concerns
from the community. We as the University
Faculty Council accept his apology. While
his words were insensitive, they were not malicious in intent, and discussion
of them has revealed failures throughout our community to live up to the
diverse and inclusive ideal to which we aspire.")
POLITICS, n. Strife of interests masquerading as a contest
of principles.
That woman speaks eighteen
languages and can't say "No" in any of them. (Parker)
At a time when the 150th
anniversary of Lincoln's Gettysburg Address is being celebrated, one viewer who
saw "12 Years a Slave" wrote a stinging commentary. I find myself in sympathy with his views.
Anyone who acquires the narrative of 12 Years A Slave and
finds it within his shrunken heart to continue any argument for the sanctity
and perfection of our Founding Fathers, for the moral wisdom of their
compromised document of national ideal that begins the American experience, or
for their anachronistic, or understandable tolerance of slavery is arguing
from a desolate, amoral corner.
If original intent included the sadism and degradation of human slavery, then original intent is a legal and moral standard that can be consigned to the ash heap of human history. And for hardcore conservatives and libertarians who continue to parse the origins of the Constitutions under the guise of returning to a more perfect American union are on a fool's journey to decay and dishonor.
For anyone to stand in sight of this film and pretend to the
infallibility or perfect intellectual or moral grandeur of a Washington, a
Jefferson, or a Madison is to invite ignominy from anyone else sensate.
Slavery was abomination, and we, in our birth of liberty, codified it and
nurtured it.
It took Lincoln, and a great war, to hijack the American experiment from its original, cold intentions by falsely claiming, a century and a half ago, that the nation was founded on the proposition that all men are created equal. It was founded on no such thing. It required blood, a new birth of honor and a continuing battle for civil rights that is still being fought for this nation to be so founded.
In the echo of this film, the call for a strict construction of our national codes and a devotion to the original ideas of the long-dead men who crafted those codes in another human age, rings hollow and sick and shameful.
I am struck by a remark by Michael Sandel
about adhering literally to the values of the past: "If any past civilization had succeeded in
protecting its values, we'd be stuck with values that we would find horrible." (Per Wikipedia, Sandel, a political
philosopher and professor at Harvard University, is a Fellow of the American
Academy of Arts and Sciences, was born and raised in Minneapolis (!), and has
taught the famous "Justice" course at Harvard for two decades, one of
the most highly-attended courses in Harvard's history.)
The
author of the commentary did make the point that there is much to admire in the
constitution, which is true, and that it has been improved enormously by the
addition of (most of) the later amendments and by Supreme Court decisions that
have ever-so-slowly grafted modern standards of justice onto an 18th-Century
document. (Some Supreme Court decisions
go the opposite way, of course; it is always some steps forward and some steps
backward. I wouldn't count the 18th
Amendment as positive, since I like to have the occasional wine or beer or
scotch, and I'm not enamored of the 2nd Amendment, either.)
Love is like
quicksilver in the hand. Leave the fingers open and it stays. Clutch it and it
darts away. (Parker)
POLITENESS, n. The
most acceptable hypocrisy.
I am sometimes envious when reading the annual holiday letters from
friends. Some of them report on such
interesting things, like skippering boats in the Mediterranean or playing in a
band, or about offspring competing in ironman competitions around the world,
becoming tenured professors, and so on.
Knowing the people who wrote about these adventures, I know it's not
boasting, it's passing along family news.
Our lives seem so dull by comparison.
IMAGINATION,
n. A warehouse of facts, with poet and liar in joint ownership.
CONGRATULATION,
n. The civility of envy.
A few years back I commented on a
mini-controversy about the (feared by some) disappearance of cursive
writing. There was a somewhat extended eruption
in the news in mid-winter about the topic again. Neither this nor a couple of the other commentaries that follow are issues of great moment; they are simply diversionary fun to think about when there are vexing and momentous matters
before the country and the world.
The Wall Street Journal published an article on January 30 titled "The
New Script for Teaching Handwriting Is No Script at All: Cursive Goes the Way of 'See Spot Run' In
Many Classrooms, Delighting Students."
Bylined in Raleigh, North Carolina, the article reports that with the
adoption of a "common core" of math and English standards in 45
states, teaching cursive writing is optional and few teachers are requiring
it. (But some of the states are
reconsidering, at the level of the state legislature. There's a great way for a legislature to
spend its time.)
(Keyboarding
proficiency by 4th grade, however, is required, and in my judgment that
requirement—distinct from teaching cursive—makes perfectly good sense. When Krystin was a little girl, in the
mid-1980s, before she was in school and before the advent of computers and
their progeny, I took the position that she should not learn typing because it
would mean she would be slotted into some kind of secretarial position—and that
her aspirations and possibilities should be larger. Little did I realize that the employee
category of "secretary" would largely disappear, that we would all,
professionals or otherwise, become our own secretaries in many respects, or
that the computer and its keyboard would become so ubiquitous that those who
cannot type/keyboard would be at a disadvantage in just about every endeavor of
life.)
The
article quotes a school board member as saying that they have to find room for
many subjects, including skills that children need, and some things just have
to go. The article has some tear-jerking
prose, such as the observation that "no matter that children will no
longer be able to read the Declaration of Independence or birthday cards from
their grandparents" if they cannot write (and presumably thus read)
cursive writing. (When was the last time
you had to read the Declaration of Independence in the original script? When was the last time you read the
Declaration of Independence? Or the
constitution?)
With
students, however, eliminating it from the curriculum is apparently extremely
popular. "When asked whether they
should have to learn cursive, 3,000 of 3,900 middle-school students surveyed by
Junior Scholastic magazine in 2010
said it should be erased. 'NO! OMG, 4get
cursive, it's dead!'" (It's hard to
know what I would have thought had I
been asked this question in 3rd grade, which is when we learned cursive
writing. In fact, one of the highlights
of my entire academic career was being the first student in my class to get
through all of the cursive writing cards we had to complete. I would have been second, but the guy who was
ahead of me got sick for a week, so I beat him.
:-)
Always odd what bits and pieces of life we can remember.)
In line with my experience, at least, and that of Kathy and other friends, the article correctly reports that most adults don't use exclusively cursive writing or exclusively manuscript (print) lettering.
Most of us who write much at all make up our own combination, drawing
from both, including in our signatures.
I haven't used a cursive "G" to start my signature for years
and my capital "E" isn't cursive, either. Krystin and Kathy and I were talking about
this one night when these news articles came out and determined that none of us
use cursive letters to start our signatures—first or last name. Nor do we use exclusively cursive for the
interior letters. In fact, "it is
becoming increasingly rare to even have to sign your name. By 2016, nearly half of all home loans could
be closed electronically, meaning that thousands of people will buy homes
without having to physically sign their names." In a conversation shortly thereafter, Elliott
came down emphatically on abandoning cursive writing in the curriculum as a
complete waste of time. He added that it
was particularly irritating to learn it in grade school and then get to high
school, only to be told that teachers wanted nothing in cursive, they wanted it
printed (if it wasn't from a computer and printer). (He also said that he'd be glad to see
cursive gone because he hates writing "Engstrand."  I agreed with him that it's an awkward combination of letters to write in cursive.)
The
article goes on to quote a handwriting analyst who opposes elimination of
instruction in cursive writing. "'It's
a very natural process to take a crayon or a rock and make symbols with your
hand,' Ms. Dresbold said. 'It's just
bringing down things from your brain.' Without that, 'children are not thinking as thoroughly.'"  With
all due respect to the Journal, I don't
consider a handwriting analyst an expert on the subject of cognition and
learning.
Without any reference to the Wall Street Journal article, the
newsmagazine Prospect, in England,
three weeks later published an article titled "Curse of cursive
handwriting: It's absurd to teach
children this way" by Philip Ball.
In this case, the author went to people I'd consider experts. First of all, he said it is just peculiar
that we teach children first how to print—and then, 3-4 years later, teach them
a mostly new way of writing—"oops, forget what you just learned, now start
over." "We tend to forget,"
Ball wrote, "unless we have small children, that learning to write isn't
easy. It would make sense, then, to keep
it as simple as possible. If we are
going to teach our children two different ways of writing in their early years,
you'd think we'd have a very good reason for doing so." But we don't.
Part
of it, he wrote, is just sophistication; it's adult to use cursive and childish
to print. Ball asked his daughter's
teachers why they were teaching cursive.
Once they realized he wasn't kidding, they said because they'd always
done it and because it allowed the children to write faster.
"Surely, though, in something as fundamental to
education as writing, there must be scientific evidence that will settle this
matter? Let's dispatch the most obvious red herring straight away: you will not
write faster in cursive than in print. Once you need to write fast (which you
don't at primary school), you'll join up anyhow if and when that helps. . . .
On the merits of learning cursive versus manuscript, Steve Graham, a leading
expert in writing development at Arizona State University, avers that, 'I don't
think the research suggests an advantage for one over the other.'"  One U.S. survey in 1960 found the same rationale as that given by Ball's daughter's teachers: tradition, wide usage, and public expectations; and one superintendent said they didn't really see any reason to keep cursive.  A reading expert at Missouri State told Ball that "the reasons to reject cursive handwriting as a formal part of the curriculum far outweigh the reasons to keep it."
It's the teaching of two systems that is one
problem. But Ball also commented that "were
there to be a choice between cursive and manuscript, one can't help wondering
why we would demand that five-year-olds master all those curlicues and tails,
and why we would want to make them form letters so different from those in
their reading books."
There is reason to be adept with a pen; our writing and signature are as much a part of us as our clothing, our conversation, and other attributes, Ball said.  But there's no rule written anywhere that it has to be cursive.  Many of us can admire the Spencerian script so
widespread in the U.S. from the middle of the 19th Century until the first
quarter of the 20th. But no one writes
like that any longer and our cursive writing no longer has the artistic beauty
of earlier styles. Those of us born
after WWII were taught the Palmer Method of penmanship, with much plainer letters
and faster writing that could compete with the typewriter. But most modern cursive writing is execrable
and often illegible (sometimes to the person who wrote it).  What is worth preserving there?  Writing like the old Spencerian script would be worth preserving—but no one, in the sped-up life that we live, will take the time to write that way.
A few days later, because the Wall Street Journal article was written from Charlotte, the Charlotte Observer had to follow up with
an article of its own. It seems that a
Republican legislator introduced a bill in the North Carolina legislature to
require teaching cursive handwriting in the state's elementary schools.
The Observer went to its own local expert, a
retired faculty member and director of literary studies at the University of
North Carolina, who "said teaching manuscript – or print
– handwriting makes more sense for the modern world.  'The research says that adults who write
manuscript, they write just as quickly as adults who write everything in
cursive, but it's more legible,' Cunningham said. 'It's just a simple matter that there aren't
any advantages to cursive handwriting.'"
The legislator disagreed, apparently an expert on psychomotor
development and cognition: "She
said learning cursive helps children with their brain development and motor
skills. And she thinks it aids students in reading documents such as the
Declaration of Independence or simply letters from an older relative."  The former is an assertion of fact that the experts in the field don't have any evidence for; the latter may be true—or may
not be. Another of the (Republican)
sponsors of the bill "said teaching cursive makes students more
well-rounded, both in terms of disciplining them to learn it and in helping
them express their creative side. 'It
lends to our humanity to know cursive.'"
That strikes me as a lot of unsupported malarkey; I wonder if there is
evidence for those propositions.
After the question of teaching kids
two systems, the retired UNC professor may have made the most logical point, as
had Ball: "'I think manuscript
handwriting is superior as a system for teaching children because then the
letters they write look more like letters in books that they learn to read,' he
said. 'If you pick up a book to try to
read it, it is almost never written in anything vaguely similar to cursive
handwriting.'"
Just for fun, I wrote to one of the
faculty members in our composition program and asked her what she thought.
My opinions about cursive writing
are rather subjective (since it is not often discussed among college
composition instructors). . . . I
think cursive writing is important as a form of personal expression. I
would agree that printing / manuscript makes the most sense, but I'm just not
sure cursive writing isn't used anymore. I think about it less in terms
of efficiency (is it faster or slower to write in cursive?) than in terms of
expression. Handwriting is unique to an individual. . . . I think being able to write in an expressive
way is an important motor skill and an important literacy skill (but that
should not surprise you!). Whether or not it deserves a unit in
elementary school? Good question. I am a PTA member of my kids'
elementary school and I am always surprised at the new requirements and
subjects they must now teach. Those teachers are busy and are under a lot
of pressure. I can see how cursive writing falls off the list, but personally
I'm glad our school still teaches it!
She
went on to ask what happens when someone cannot afford all the latest
electronic equipment (computer, smart phone, etc.); "Would that person
only want to write in manuscript form? I don't think so. . . . Printing would be very tiresome, wouldn't
it? Also I was thinking that learning cursive to some degree is a
literacy activity: we must be able to read cursive as well as write
it. I don't know of many people who hand write using manuscript style
only."
Good
question about reading cursive if you don't write it yourself—that, presumably,
is a question of fact that someone could ascertain. One of the arguments
Kathy and my son made is that it is extremely rare to even run into anything
written in cursive any more—and good riddance, in their view. I don't
know—is printing tiresome? Might it not be easier to teach kids,
especially those who are non-native English speakers, manuscript writing from
the get-go, rather than teaching them both styles, when they're having a hard
enough time mastering the spoken tongue, much less the written?
This issue got more press in the New York Times, in a book review of The Missing Ink by British novelist,
columnist, and art critic Philip Hensher.
Hensher argues that handwriting "'involves us in a relationship
with the written word which is sensuous, immediate and individual. It opens our personality out to the world, and
gives us a means of reading other people.'" The reviewer reports that it wasn't
individuality that led to widespread use of handwriting, it was
uniformity: business needed writing for
purposes of communication, so along came Messrs. Spencer and then Palmer to
make writing uniform. Hensher maintains
that "'To continue to diminish the place of the handwritten in our lives
is to diminish, in a small but real way, our humanity.'" As the reviewer concludes, however, while the
book "succeeds in making a strong case for the renaissance of handwriting.
. . . when it comes to putting pen to paper nowadays, most people would prefer
not to."
The debate continued into the
spring, with pieces on "All Things Considered" on NPR, in the New York Times, and in the Chronicle of Higher Education.
We are now up to April 7, when New York Times critic Edward Rothstein
wrote "Cursive,
Foiled Again: Mourning the Demise of Penmanship." He confessed that he can hardly read his own
handwriting and that when he receives a handwritten letter, it seems "awkward,
barely fluent." Cursive is
disappearing, he concludes. However, he
argues, its disappearance is "more evidence of precipitous cultural
decline" and he takes a look at a book by an historian on handwriting in
this country. Once writing was a fine
craft and, moreover, one's position in life was reflected in the style of one's
writing (gentleman, lawyer, secretary, and so on). Benjamin Franklin thought the regularity of
mechanically-produced type distracted from meaning and that handwriting was
more personal and more revealing. In the
U.S., gradually everyone learned to write, so there was no social or role
differentiation associated with it. Then
the argument was made to do away with script as too confining, a barrier to
expression, so the movement was to teach children to print. So, "a craft that began by defining
social roles and manners and went on to become one of the arts of
self-presentation and self-expression was eventually seen as an obstacle to
self-expression."
Then Rothstein reaches what seems to
me a rather odd conclusion. "The
problem is that something is sacrificed in this renunciation of the hand: the idea that the appearance of words is part
of their meaning, that script is a public presentation of the private self;
that surface is part of substance. The
main point of display becomes the rejection of display." I've never thought that how a word is made
to appear on paper (or a screen, for that matter) is part of the meaning,
barring the use of italics or bold or upper case for emphasis.
The Times
continued on the subject with four op-ed pieces at the end of April.
One author (Zezima), citing various academic
authorities, made several points: printing is easier to forge, the abandonment
of cursive means a lost link with archival sources (coming generations won't be
able to read them because they never learned to write in cursive), schools aren't
teaching it because the standardized tests don't test for cursive (and what isn't
being tested isn't taught), and cursive writing helps with the development of
fine motor skills ("'it's the dexterity, the fluidity, the right amount of
pressure to put with pen and pencil on paper.'") I don't find these contentions to be
particularly persuasive. But the
comments of one of the author's grief-stricken authorities appealed to
something in me: "Richard S.
Christen, a
professor of education at the University of Portland in Oregon, said,
practically, cursive can easily be replaced with printed handwriting or word
processing. But he worries that students
will lose an artistic skill. 'These kids
are losing time where they create beauty every day,' Professor Christen said. 'But
it's hard for me to make a practical argument for it. I'm not one who's mourning it because of that;
I'm mourning the beauty, the aesthetics.'"
A young
assistant professor at the U of Southern California's distinguished College of
Education (Polikoff) was blunt: "Cursive
should be allowed to die. In fact, it's
already dying, despite having been taught for decades." He maintained that few people use it any more
and that much is done on keyboards and the remainder with print. Schools shouldn't be required to teach it
because there are more important skills for students to learn. "Additionally, there is little compelling
research to suggest the teaching of cursive positively affects other student
skills enough to merit its teaching. . . . As we have done
with the abacus and the slide rule, it is time to retire the teaching of
cursive. The writing
is on the wall."
A
counterpoint came from an occupational therapist in the Beverly Hills school
district (Asherson), who contended that "putting pen to paper stimulates
the brain like nothing else, even in this age of e-mails, texts and tweets. In
fact, learning to write in cursive is shown to improve brain development in the
areas of thinking, language and working memory. Cursive handwriting stimulates brain synapses
and synchronicity between the left and right hemispheres, something absent from
printing and typing." She noted that "students who wrote in
cursive for the essay portion of the SAT scored slightly higher than those who
printed." One might ask whether the
difference was statistically significant and whether there were intervening
variables—did the students who used cursive come from higher socio-economic
status, with a better education that included cursive writing, so they did
better? Her conclusion, "as a
result, the physical act of writing in cursive leads to increased comprehension
and participation," isn't warranted by citation of one study showing a
weak correlation. Interestingly, she
maintained that cursive does not "need to be fancy with slants, loops and
curls." So she tosses out Christen's
plaint on its ear: forget the artistic
element, make it simple.
The
director of archives and special collections at the University of Central
Arkansas (Bryant) maintained that our society is "losing its ability to
communicate via cursive writing" and that we "are becoming completely
dependent on machines to communicate with others." (Eliding the fact that people do print.) He argues for the beauty of handwritten
letters from 50-100 years ago and that people once took great pride in
penmanship. He echoes Christen a
little: "a case could be made that
some of the finer examples of cursive writing are actually a form of art." He also observes that email messages (and
surely tweets and online chats) are deleted while people tend to preserve
handwritten letters (and thus, I add, preserve small bits of social
history). I'm not sure moaning about a
failing tradition is useful (doubtless many mourned the passing of the horse
and buggy, too), but I agree with his point about the loss of the personal historical
record.  Actually, there are great losses at an institutional level as well:
what I once would have done with a handwritten or typed note to someone,
whether the University president or a faculty colleague, I now do with an
email—and those, as we all know, are evanescent. I know of no one who is depositing email
records in the archives. (Who'd want to
sort through all those millions of messages?
One guesses that even the most diligent historian would surely find that
a daunting task unless there were a way to sort electronically using key
words.) The other reality about the
artistic beauty of older cursive that Bryant ignores is that everything today
has to be fast, fast, fast. Spencerian
script takes a heck of a lot longer than an email or text message. (And with text messages, one is licensed to
relax the normal rules of composition, making the communication of the
substance even quicker; some think that relaxation extends to email, an opinion
I do not share.)
Finally (at
least for the purposes of this disquisition), a professor of journalism at Duke
also (favorably) reviewed Hensher's book The
Missing Ink for the Chronicle of
Higher Education. Hensher's view, in
his report, is dramatic: "we're at a cultural tipping point: This is 'a moment when, it seems, handwriting
is about to vanish from our lives altogether,' he writes. He wonders whether that vanishing act will
point to a more profound phenomenon.  "'Is anything going to be lost apart from the habit of writing with pen on paper?  Will some part of our humanity, as we have always understood it, disappear as well?'"
What's happening is that handwriting as a manner of self-expression is
disappearing and that we have substituted a more mechanical and less human way
to use the printed word. The review
author confirms Hensher's observation about the use of cursive: "Writing by hand, for this generation,
is somewhat akin to speaking by phone over a landline. Students are perfectly able to perform the
act. But interfering as it does with the
imperative of efficiency, they see little reason for it." So they don't (either use cursive or
landlines).
I'm
not sure I buy the proposition that we lose part of our humanity if we lose
cursive; that seems overblown. What
about the millions of people around the world who don't know how to write, much
less in cursive? Or the millions before
us, in centuries past, who never learned to write? To imply that they were not fully human seems
insulting.
The reviewer reported that Hensher
claims that handwriting reveals individual characteristics. "People who don't join up their letters
are often creative; alternatively, they may be 'a little bit slow.'  People whose handwriting leans forward are
often 'conventional' thinkers; people whose handwriting leans backward are
often 'withdrawn.' And anyone who draws
a circle or a heart over his or her i's is 'a moron.'" I'm skeptical about the relationship between
characteristics and handwriting but I am inclined to agree with his last point.
I
think where I come down (without any further study, and I do not intend to
pursue it beyond what I've already read) is that cursive writing probably needs
to be optional in schools—but I say that with regret for the lost artistry. "Optional" means it surely will
disappear in a few years.
His voice was as intimate as the rustle of sheets.
Brevity
is the soul of lingerie. (Both Parker)
How do you
refer to the person to whom you are attached by marriage? When I took up this question, I didn't
realize that it would intrigue as many people as it did. So this section of the letter turned out to
be longer than I had expected.
Early in the year I listened to a
colleague (Friend A) over lunch and heard her refer to (a person I presumed to
be) her spouse as her "partner."
She wears a wedding ring and they have children; she is perhaps about
40. I wrote to her asking why she used
the term "partner" instead of the more commonly-used term "husband." I told her I speculated that she used the
term partner instead of husband (or wife) as a courtesy to our gay and lesbian
friends, who can't legally use husband/wife in most states, or perhaps to avoid
terms that identify sex. That's
defensible. It seemed to me an
interesting evolution in the language but I told her I wasn't sure I could make
that change, and begin referring to my wife as my partner, in part because, in
the present moment anyway, "partner" at least hints that you're not
married but perhaps living together, although in our case that is belied by the
fact that we both wear wedding rings. (I
suppose we could be "partners" even though each married to someone
else, which would be, at least in my world, a rather peculiar social
arrangement.)
My colleague (Friend A) wrote back
the following:
I
firmly believe in equality regardless of one's sexual orientation, and just can't
see the logic of granting certain privileges to some people based on their
sexual orientation while holding that back from others. In my mind, people seek
each other for multiple reasons and form "partnerships" in so many
different ways. So, until there is true equality in that regard, I do want to
emphasize that some people have the choice to have a "wife" or "husband"
while others don't.
I
also want to introduce some gender confusion into people's minds. I rely on
both masculine and feminine behaviors, as well as on
behaviors/attitudes/beliefs that some would equate with heterosexual and
homosexual people. If people see this strange confluence and wonder "what"
I am, then I want them to wonder why they are wondering and what difference it
makes.
Finally,
I just don't see myself as a wife. There are too many historical connotations
to that word -- ownership, obedience, deferring behaviors -- that I struggle
with. I know that one way to reconstruct
that word is to use it but mean something else. I need to leave that agenda for someone else,
though. . . . I just don't have the
stomach for it.
Friend B (also a woman, early 40s, divorced with children, who has someone in her life with whom she lives—I labor to avoid using any of
the terms under scrutiny here!) responded to this view: "I fully agree here. I have a hard time with 'wife.' Perhaps this is part of the reason I have not
married the lovely man I live with. No
doubt it is due to the fact that I have been a wife in a rather negative
embodiment of the word. This is not a
unique issue with the women I run with. Oddly, the term 'husband' doesn't denote the
same trigger for strong independent women, I think. I don't think it is necessarily an issue
(meaning you can still be strong and independent, and not react to this word),
but this brings up an issue I have seen kicking around in my world."
I wrote that neither I nor, I would
wager a lot of money, my professional and faculty friends attach the baggage
that Friend A noted to the term "wife." She was of course
correct about the historical meaning, but at least in the circles in which I
move, that baggage is ancient history. My friends, it seems to me, have balanced
marriages with mutual respect and (differently for each marriage) obligations
and responsibilities that are divided up pretty evenly.  As far as I can tell,
ownership and obedience and deference never cross their minds! But they
still use the term wife (and husband).
Friend B wrote in response to this
that "you'd lose that bet," although wondered if responses might vary
by age or personal experience. Given
that challenge, I polled a few friends. I
outlined the opinions about "wife" and told them that I thought my friends don't attach the baggage
to the term and have balanced relationships without any thought of ownership or
obedience or deference. Here are the responses, some abbreviated (identified
by gender and my guess at approximate age; all respondents are highly-educated
professionals of some sort):
Woman, faculty member, 60s: I
am a wife and a partner. No problem with
it. Had a colleague who always referred
to the person she had lived with for years as her partner so I assumed they
were not married. They were. Ditto some friends who just got married after
25 years of living together. They now
happily refer to each other as husband/wife. In other words, there are no rules anymore,
but if people want to signal that they are married and of a particular gender, that
is great. That is, to me, the only
remaining purpose of wife/husband.
Male, attorney, 50s:
Neither I nor my wife object to the term. . . . To me, "partner" still carries an
ambiguity to it. It is a term that
signifies a close relationship, perhaps even committed or long term, but does
not say whether there is a formal legal or religious/sacramental bond. (He checked with his legally-bound person
before responding.)
Male, 60s, attorney:
Neither xxxx nor I have found this to be an issue. We generally use the term "spouse."
Male, faculty member,
60s: I have no difficulty with the terms wife and
husband, though I'm pretty much always aware of the issue. I find "my spouse" old fashioned and
sort of officious-sounding. Xxxx typically
refers to me as her husband.
Female, faculty
member, 60s: I must disagree with you, Gary. As a matter of fact, I strongly dislike both
words, "husband" and "wife," and I avoid using either term.
Female, scholar, 60s: I
have given the term "wife" very little thought as to its
historical/political connotations and I suppose I think of it as refining the
term 'spouse' to indicate the female half of the marriage. For gay couples, this could be murky, I
realize. By the way, I don't think using
the term "partner" to indicate one's spouse is very appropriate,
either, as it usually seems to indicate a non-married situation or a business
relationship. [She also had some
alternative suggestions: "Sprous-ner"
maybe? Or "spart-ner"? We'll
surely come up with something that fits everyone's requirements!"]
Male, attorney, 50s:
"I refer to her as my wife and she
refers to me as her husband. My
subjective view is that neither term carries baggage."
Female, psychologist, 60s: I think that the term "wife" does
carry a lot of baggage. Sometimes I
celebrate my husband as having great wifely skills. If I think of myself as "wife" it
is with a semi-conscious correction from the biased, icky connotations of past.
BUT when I verbalize my relationship to
my husband, it is likely to be "spousal unit"! I am not saying that spousal unit is a term
to promote. While I don't tend to say "partner"
- this word has the spirit of our marriage.
Female, business executive, 60s: I
generally use "spouse" without having really thought about it. I agree that "wife" has some very
heavy baggage, but among my friends who share a lifestyle that I enjoy, the
baggage disappears.
Female, attorney, ~60:
I am an uber feminist [and on the far left of
the political spectrum] and this is my opinion: if you are a woman who marries someone, you
are a wife. Period. END of sentence. Look it up in the dictionary.
I don't care what the connotations of the term "wife"
are: You bought them, lock stock and
barrel when you entered into that relationship. Re-define away, but don't you dare try to
weasel out. Do you know what this whole "I
just don't see myself as a wife" routine smacks of? It sounds like someone who wants all of the
inherent rights and privileges (legal, societal, moral) of that status, with
none of the negative implications. Well,
toots—NO. It just doesn't work that way.
I also happen to think that the term "wife" has plenty
of lovely connotations, by the way. In
an era when men (and the culture at large) treat women like disposable sexual
objects, it connotes someone special, someone valued and esteemed enough to
have been worth entering into a long-term commitment with. These same connotations apply to the word "husband"
in my opinion.
Upon
being introduced to someone's heterosexual "partner," I am always
tempted to inquire acidly, "And what business are you in?" Granted, in an era when homosexuals are denied
the right to marry, the term "partner" does have a use within that
limited context. When heterosexuals use
this term, I just wish they would be honest and say, "We're fucking, but
we don't have a commitment" because that is almost always the case.
I
also loved being married and I loved saying the word "husband,"
because I thought (and still think) that it indicates being part of a
relationship where two people have decided to cherish each other. I also found that if a husband doesn't treat
you as an equal within that relationship, then it's time to divorce.
Female, faculty member, 60ish: For
convention, it is easier to identify myself as "wife" and xxxx as "husband"
because everyone understands those terms.
Do they carry baggage? Some . . .
but I'm comfortable enough in my independence that it doesn't bother me. Oddly but currently, using the word "partner"
implies either a sexual orientation implication (e.g., in gay or lesbian
couples) or a strident feminism railing against the status quo (e.g., in
married couples.) What we need is
marriage to be legal for all. Then toss "wife"
and "husband" and just go with "partner" for all. But if you call me Mrs. Xxxx—duck! (Because that does imply ownership to me.)
Female, faculty
member, over 60: Your friend's point of view makes sense, and
I would not really argue with her. For
myself, however, I don't have that emotional take on the word. I've been working for 45 years to make women
and men equal both within and outside of marriage, which means transforming how
we understand the nature of the relationship. Gay marriage is very helpful, actually. When you can have wife-and-wife and
husband-and-husband as well as wife-and-husband, that "I now pronounce you"
moment loses some of its hierarchical baggage, don't you think?
Friend
B observed that there are problems for the non-married as well. "I struggle to give ____ a label,
particularly when telling a story to people who don't know him. But the story requires one to say something. It could be, 'my live-in lover, father figure
to my children, partner, significant other person and I went to a movie. . . .' It is just so f-ing awkward. I settled on 'partner' as 'significant other'
is just too many syllables and seems to be trying too hard."
I concluded some years ago that in
my opinion, the state (the government) should be in the business of recognizing
civil unions--of whoever wants to have that status--and "marriage"
should be left to priests, rabbis, ministers, and so on. Marriage can be a religious union, sanctified
by whatever church one belongs to, but it should be distinct from a union
recognized for tax laws, inheritances, hospital visitations, etc. We all
know, however, that that's a political non-starter, so the gay/lesbian
community logically seeks the same recognition for their unions that the state
gives to heterosexual marriages. I would do the same in their position
and I fully support their efforts to achieve that legal recognition.
Friend B took issue with this.
I
quite disagree on this. At first blush
it makes perfect sense, but by introducing different classes of "unions"
one introduces the ability of anyone to discriminate. We have not done well with "separate but
equal." In my mind there is
absolutely nothing that is acceptable except complete equality. Already one can have any sort of ceremony one
wants—hell, I just left LAX, where you can do a drive-up wedding with Elvis
hanging out the window officiating. But
it is still a marriage and has the tax, benefits, inheritance, etc., benefits. Marriage was a term adopted ages ago and it
can't go back. You can add religious layers upon it if you are so inclined, but
"separate but equal" will not work.
She makes an interesting point. If we are to continue to use the term "marriage,"
then I suppose everyone should have the option for it. Sigh, another good idea shot down.
Given the state of the language, I wrote
to Friend A that I am not comfortable referring to Kathy as my "partner."
That means, in my experience, a relationship that's committed, mostly, and may
or may not be considered "permanent" by the partners; as I suggested
earlier, it often signals "not married" as terms are construed
today. The comments from some of those who responded to my informal
survey expressed the same view. Kathy
and I talked about this one night during the time I was exchanging emails; she
suggested I introduce her as "I'm her sex slave" or "the one who
cooks" or "the person I ride to work with every day."
Among other things. (Those were not
serious, in case that doesn't come across!
We were laughing about them.) We both agreed that one of the more
reprehensible introductions we've both run into is the introduction as "the
mother of my children."
Friend B, however, likes "partner"
better—as with Friend A—and finds the word more affirming. "Indeed,
we need (due to social norms) terms for people [that provide] context. I would challenge the term 'partner' not
meaning permanent and 'marriage' meaning it is—I can name a dozen marriages
(without trying) that are significantly less than permanent either in terms of
commitment or duration. To me, a partner
is more about a choice I make every day.
Each day xxxx and I decide that we are still together. There is no legal
tangle that binds it."
As Kathy and I were talking, the one
thought I had difficulty articulating is that I am old-fashioned enough that I
believe the term "marriage" (which it seems we are stuck with, like
it or not) means something different from partner or significant other or any
other modern descriptors of relationships. One way for me to explain it—poorly—is
that one of the extremely trivial difficulties I had, when suddenly confronted
in 2007 with the fact that I was being divorced after nearly 25 years of
marriage, was that I could no longer wear a wedding ring. Yes, dealing
with the children and the house and all the accumulated "stuff" from
a long marriage and the relationships with friends and family were far larger
issues, but I was also saddened by the fact that I had to remove my wedding
ring. I like being married! I want to have a wife/spouse/lover/life
partner/whatever, and at least for old-fashioned me, wearing a wedding ring is
a message to myself and the world that I'm in a relationship that is—as much as
can be the case—permanent.
I inquired: what's wrong with "spouse" as an introduction/reference?  That term doesn't have the same baggage, it seems to me, and on a quick search of dictionary.com, I found that it means "to
bind oneself, promise solemnly" or "literally, pledged (man, woman)."
That pretty well captures the sentiment I have about Kathy. (One of the
sources notes that the biblical definition of "spouse" only refers to
the wife, but that doesn't show up in the other definitions, and I don't think
anyone now attaches a specific sex to the word.)
She (Friend A) wrote back, with a
view very similar to that of Friend B:
The
term "spouse" just doesn't work for me. I'm actually not quite sure
why.
. . . It seems not to have a deeper
meaning that implies all that partner does. I think of "partner" as someone with
whom I'm choosing to intertwine my life—actively choosing. I think of "spouse" as someone to
whom one is legally tied—there may or may not be active choice and/or affection
involved. I agree with you that spouse
implies more permanency than partner. But for me, that's a good thing. Spouse implies that you're stuck with someone,
like it or not. Partner implies that you
want to be with someone. It seems more
affirmative and positive.
In
terms of wife . . . perhaps significant in this conversation is that I've done
a lot of feminist reading and was raised Catholic. . . . I am very sensitive to anything that
identifies gender. (The Catholic church
has very specific gender roles, as you might know.) I think there are times to signify gender,
certainly, but before I do that, I ask—what's the function of doing that? For example, when we say "girls and boys"
in elementary classrooms, we are reminding/signaling to the kids that their gender
is somehow meaningful—even if they didn't think it was. We are teaching them to separate themselves,
or at least take on an identity in particular ways that is dichotomously
positioned in relation to the other. I
think of gender as much more fluid than that and don't really like taking a
gendered position—unless there is a reason to do so. So, I don't want to use "wife" as it
is a gendered term.
With
the married piece . . . I don't want to
claim privilege, though, that gay and lesbian couples don't have. So, I come down right in the middle ---
wedding ring and "partner." The
best I can do at this moment!
Friend
B agreed that "spouse is very neutral—but it also is just that, neutral;
it is almost sterile. I think if I ever
get married [again] I could get my mind around 'spouse.' I like it because it achieves the gender- and
power-neutral term that would accurately reflect the relationship, but it would
also be accurate, meaning, 'we are married.'
It also sort of has this 'thud' to it—not a lot of spark, more duty,
obligation." I agree about "thud."
Because
all of these exchanges were going on with people who are at least 40, and
mostly in their 50s and 60s, I asked Krystin to contact her married friends and
ask them what terms they use. I was
surprised; of the dozen or so who responded, all but one said "husband"
and "wife" (the one reported that she and her spouse use "partner"). Krystin said that in her opinion, she and
her friends don't see the baggage that history attaches to "wife." They didn't grow up with it—her friends are
largely from professional families—and the parents had relationships that
seemed fairly equal, so "wife" doesn't mean anything more than the
female half of a marriage.
It may be that usage will evolve in
the direction of "partner" (although the anecdotal evidence from
Krystin hints that may not be happening). But at least in the way the
world uses language now, the term "partner" used by itself just does
not seem satisfactory to me. I like the solution described by my
friend Karen Seashore: wife and
partner/husband and partner. That picks
up the "married" concept, the legal/religious commitment, as well as
the active, affirming sense of "partner" that my two 40ish women
friends say is a better descriptor of the relationship. As Kathy points out, the term "partner"
also implies equality more than the traditional terms do. Now, I'll have to see whether I'll actually
start to introduce Kathy in social settings as "my wife and partner." I can imagine that it could elicit a
surprised, or amused, or quizzical look from the person to whom I am
introducing her. I do know for certain,
however, that after this, whenever I use the term "wife" to introduce
her, I'll trip over the word for a microsecond before I utter it!
I can also imagine the reaction of
the more conservative evangelical/fundamentalist religious denominations to a
message from the media or from the ether that they should not use the terms
husband and wife any longer but instead should use partner. Yikes!
A postscript: A few months after I wrote the foregoing, a
columnist for Slate wrote about her
objection to the term "wife" as she approached marriage (and her
betrothed's objection to "husband").
She recited the history of the terms, implying ownership and master of a
household, and opinions from several that reflect the ones I reported
here. She took a look at other
possibilities—partner, spouse, etc.—and found them wanting. She concluded, one might imagine mournfully, "as
my future — let me give this a shot – husband says, 'I expect
to use them [the terms husband and wife], because I just don't want to sound
like a weirdo.'"
MISS,
n. A title with which we brand unmarried women to indicate they are in the
market. Miss, Misses (Mrs.) and Mister (Mr.) are the three most distinctly
disagreeable words in the language, in sound and sense. Two are corruptions of
Mistress, the other of Master. In the general abolition of social titles in
this our country they miraculously escaped to plague us. If we must have them
let us be consistent and give one to the unmarried man. I venture to suggest
Mush, abbreviated to Mh. (One suspects Bierce would be
pleased at the substitution of Ms. in later usage.)
The
foregoing friendly debate is indirectly related to the question of the children's
last name. Even though women in American
society, when they marry, often retain the last name they grew up with, as Gail
Collins (I think) pointed out in a column in the New York Times a few years back, it's really a choice between two
men's last names: their husband's or
their father's.  Even in Iceland, which uses both patronymics and metronymics, the pattern holds: in Oddný Ósk Sverrisdóttir, for example, Sverri was her father.  (That is a real name; Sverrisdottir was one
of the researchers working on ancient bones in Sweden to look at the evolution
of lactase tolerance, research that Professor Marlene Zuk reported on, to which
I will turn later.)
There are rare instances of parents
giving the children altogether new last names.
I knew a couple who did so: The
mother was xxx Yoshikawa, the father was xxx Christiansen; the last name of
their two children was Christikawa. In
the case of most of the couples with whom I am familiar, however, where the
woman kept her last name, the children carry the father's last name.
From the little I've thought about this,
there doesn't seem to be any easy solution barring the Christikawa
example. In the U.S. and Western Europe
and their progeny around the globe, any last family name a woman takes will, at
this point, be a patronymic—because we've only used the male's name for the
family for centuries. The Icelandic
practice could be altered in the U.S. to be both patronymic and
metronymic: my daughter Krystin could be
Krystin Patsdottir and Elliott could be Elliott Garyson (I believe this
practice is used by some in Iceland). If
Krystin had a daughter Alice she would be Alice Krystinsdottir, and if Elliott
had a son Albert he would be Albert Elliottson.
If Krystin were married to Alfred, and they had a son Alan, he would be
Alan Alfredson. Likewise, if Elliott
were married to Augusta and they had a daughter Ann, she would be Ann
Augustasdottir. If the "Patsdottir"
were treated as a last name, however, the practice would be a departure from
Icelandic naming, because (with a few exceptions) in Iceland people do not have
last names and are addressed by what we consider the "first"
name. Krystin Patsdottir would never be
addressed as "Ms. Patsdottir," she would always be addressed as "Krystin"
or "Krystin Patsdottir."
I suppose that because names are
given for purposes of conversation and communication, and identity and taxes,
any system works. Some of them, however,
sure could screw up genealogical research.
I wonder if, except as a check on identity, the Social Security system
and the IRS rely on the Social Security Number more than the name. Maybe in a few decades we'll all just have long
numbers and no names, and use a short version of the numbers to address people. "Hi, 46.
How's your daughter 57?" And
if you happen to know two or more 46s, what would be the problem? We all probably know more than one Bob or
Kristen or Eric; unless they're both in your presence at the same time, saying "hi,
Bob" would be no different from saying "hi, 46." And if they were both in your presence, you'd
just go to the next digit! Or children
can have one of their parent's numbers plus a digit; their children would have
that number plus one digit, and so on, and in 100 generations the identifying
number would be over 100 digits long.
That system might not work.
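(For anyone who wants to see just how unwieldy that scheme would get, here is a small toy sketch in Python; the starting number and the appended digit are entirely made up for illustration. It simply works out the arithmetic: one extra digit per generation, so a two-digit ancestor yields a 102-digit identifier after 100 generations.)

# A toy sketch of the tongue-in-cheek numbering scheme described above:
# each child's identifier is a parent's identifier with one digit added,
# so identifiers grow by one digit per generation.

def child_id(parent_id: str, digit: str) -> str:
    """Form a child's identifier by appending one digit to a parent's."""
    return parent_id + digit

person = "46"                        # a hypothetical two-digit ancestor
for generation in range(100):        # one hundred generations later...
    person = child_id(person, "7")   # each child tacks on one more digit

print(len(person))                   # prints 102: over 100 digits, as feared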
This wasn't just plain
terrible, this was fancy terrible. This was terrible with raisins in it. (Parker)
BELLADONNA:
In Italian, a beautiful lady; in English a deadly poison. A striking
example of the essential identity of the two tongues.
I
have sometimes wondered how the biblical literalists handle some of the laws
laid down in Leviticus.  Paul Kedrosky,
technology entrepreneur, venture capitalist, and academic, has commented that "Leviticus is
best understood as Judge Judy for desert tribes." That may be a good way to approach it.
There's a debate in Christianity about whether the laws of Leviticus
and other parts of the Old Testament still apply. Some argue that the coming of Christ and the
pronouncements that appear in the New Testament supersede the provisions of Old
Testament/Mosaic law (or at least some of it).
Others maintain that the provisions of both still all apply. I don't have a dog in that fight, but I
confess that it's a puzzle for me to figure out how those who don't believe the
New supersedes the Old can follow some of the laws.
One
question that has also been raised is whether these laws apply to anyone other
than Levites, the group that generally performed priestly functions.
There are
a number of prohibitions in Leviticus that surely many if not most of even the
most devout and literalist Christians do not follow (this is not a complete
list): eating fat, eating blood (of
which there is considerable in steak, for example), reaping to the very edges
of a field, picking up grapes that have
fallen in your vineyard, holding back the wages of an employee
overnight, mixing fabrics in clothing, cross-breeding animals,
planting different seeds in the same field, practicing divination or seeking
omens (one might think that this would include astrology), trimming your beard,
cutting your
hair at the sides, getting tattoos, and not
standing in the presence of the elderly. Boy, there are a lot of people in trouble
nowadays just because of the ban on tattoos, forget the others! And another large group is in trouble for
wearing cotton-poly or wool-poly clothing.
I personally am in trouble if only because I do trim my beard.
Some prohibitions, of course, Christians and most
everyone else on the planet generally believe should be honored: having sex with your
mother, having sex with your father's wife, having sex with your sister or granddaughter or half-sister or biological
aunt or daughter-in-law or sister-in-law, marrying your wife's sister while
your wife still lives (I think we call this bigamy), stealing, lying,
defrauding a neighbor, spreading slander, and so on. Those strictures, however, aren't necessarily
limited to Christendom or places governed by Judaic law; seems to me they're
pretty common to much of human civilization.
Presumably
most thoughtful Christians take Leviticus—and other parts of the bible—as
allegory and metaphor intended to provide guidance for leading a life that
subsequently makes one worthy of entry into heaven. One can inquire, however, whether those who
claim to be literalists have a fully-functioning brain, because not only do
some of these laws make no sense, in some cases there are outright
contradictions in different places in the bible, so one literally cannot follow
the literal language of the book.
In most
developed nations with a largely Judeo-Christian history these matters would
not be worthy of worry, but in the U.S. people who claim to be literalists
seem to get elected to office in some places and bring their wingnut views into
the development of American public policy.
To the disadvantage of all of us.
One of
the other challenges to literalist interpretations, according to Professor
Timothy Beale at Case Western Reserve University, is that "'Accurate' is a
problematic word when talking about biblical translations. We don't have a single original Hebrew or
Greek source for any of the books of the Bible, let alone the Bible as a
whole. There are many different
manuscripts. The people who produce
these translations work in large committees of language and biblical scholars. Some are Christian, some are Jewish and some
aren't religious at all. They sort through the manuscript evidence and decide
which texts to use and translate."
Beale, who teaches bible studies at a secular university and who's
written a number of books about the bible, also points out that there are now
about 6000 editions of the bible printed each year. Beale, in an article in Five Books, noted an example of some of the difficulties these
scholars encounter when translating the ancient Greek to modern English: "It becomes extremely hard to know what's
behind it. 'My cup runneth over' becomes
'you blow me away.'"
See also Sandel's comment, cited
previously.
By the time you swear you're
his,
Shivering and sighing.
And he vows his passion is,
Infinite, undying.
Lady make note of this --
One of you is lying. (Parker)
PRAY, v. To ask that the laws
of the universe be annulled in behalf of a single petitioner confessedly
unworthy.
Although
I am usually not comfortable being the center of attention (I suppose one can
argue that this annual letter is all about me, but I try to make it not so), I
cannot let go by without comment the February dinner hosted by my faculty
colleagues celebrating my 25 years in my position. I will not ever do this again in any of these
letters, but this year, only, I'm going to include a few of the extraordinarily
kind comments that people made at (or on) the occasion—because some of them are
funny (or at least worth a chuckle) as well as complimentary. I try actively to avoid tooting my own horn
because I think doing so is bad form and bad taste (and in the case of many who
do it, also inaccurate—the people doing so often have significantly less to
toot about than they think they do).
Before I get there, however--
Starting
down this line of thought led me to contemplation of Lake Wobegon, because I
read recently that the "Lake Wobegon effect," per Wikipedia (although
I actually first read about it in some national news journal), is "a
natural human tendency to overestimate one's capabilities, [and it] is named
after the town. The characterization of
the fictional location, where 'all the women are strong, all the men are good
looking, and all the children are above average,' has been used to describe a
real and pervasive human tendency to overestimate one's achievements and
capabilities in relation to others."
I realized, when I read that, that I have apparently been
misinterpreting Garrison Keillor's message.
As I've listened to A Prairie Home
Companion and the "news from Lake Wobegon" over the years, it
seemed to me that Keillor often spoke of his characters with their Scandinavian
backgrounds as shy and retiring, not ones to brag or boast, and in fact people
who are embarrassed if they are complimented too much. So really almost the opposite of the Lake Wobegon
effect described in Wikipedia (and, apparently, elsewhere).
I
tried out my interpretation of Keillor's message on a few friends. They were as surprised as I was at the
characterization of the "Lake Wobegon effect" and largely had the
same interpretation of Keillor's stories that I do.
One
of my long-time friends (and a very thoughtful faculty member) wrote to me: "For what it's worth, I think, being
Scandinavian myself (and surrounded with such in family, friends, and
acquaintances my entire life), I do believe that Keillor's characterization
that Scandinavians are shy and retiring and embarrassed by compliments is
correct. That comment pertains to attributes of themselves."
Another
wrote: "Now
it's more a mantra than a specific piece of dialog, so I suppose the question
should be, 'How did Garrison Keillor first employ this line?' I suppose that I could imagine that it was
NOT used by any individual Lake Wobegoner to describe him or herself (which
would indeed be taking on airs) but, rather, might reflect the polite manner in
which they would tend to talk about neighbors to an outsider. It IS something of an anomaly within the Lake
Wobegon gestalt."
One faculty friend wrote that "what's wrong here (Wikipedia) is to use (1) the comically hyperbolic and gender-reversed description of the town for (2) 'a real and pervasive human tendency to overestimate one's achievements and capabilities in relation to others.' This is incoherent at best. Someone (even other than the author of the article, though I suspect the author invented the misnomer) may have named a universal (?) individual tendency after the fictional town, but does it make sense to try to force such a fit?"
Another
long-time friend on the faculty wrote to me that "the first two phrases
are, to me, about self-deprecation: the
women in rural Minnesota are probably stronger than they would admit, and the
men are probably more handsome than they would state. (So, here, you are right--Wikipedia has got it
wrong.) But, of course, the beauty of
those two phrases is that they are the inverse of the usual claims. So maybe it is a message about gender
equality.
"The
third phrase, about the children, can be taken several ways--which was probably
intentional. It could be the (correct) statement that Minnesota children score
better on national tests than children from most other states. Or it could be that all of the parents believe
that their children are above average (which would seem to be in accord with
the Wikipedia view)."
Another long-time friend, who is also Scandinavian and knows a lot about Scandinavian culture and history, agreed with me:
I'm
with you! The main thrust in Lake
Wobegon is not, I think, overestimation of one's capabilities or
achievements—in spite of the standard description about men, women, and
children. In fact, that description
seems a little misleading when you think about the people Garrison describes. It's interesting to contrast this supposed
spirit with a spirit that has a Scandinavian name, and is supposed to be
characteristic of Scandinavians. It is
called "The Law of Jante." "The
Law of Jante" is named after a fictitious town, Jante, in Denmark
described in a novel by Aksel Sandemose, A Fugitive Crosses His Tracks.
The "Law" has ten parts - not coincidentally, I'm sure.
Here is a list from Wikipedia under
Sandemose:
The ten rules state:
1. You're not to think you are anything special.
2. You're not to think you are as good as us.
3. You're not to think you are smarter than us.
4. You're not to convince yourself that you are better than us.
5. You're not to think you know more than us.
6. You're not to think you are more important than us.
7. You're not to think you are good at anything.
8. You're not to laugh at us.
9. You're not to think anyone cares about you.
10. You're not to think you can teach us anything.
But,
I don't think that Garrison is trying to represent the "Jante Law"
either. There is a kind of jaunty, innocent self-assurance in Lake Wobegon that
seems quite alien to the "Jante Law," but hidden behind shyness and
(maybe false) humility. Lake Wobegon
really has its own ethos, it seems. And
you have to think of it in terms of an underlying ethos of innocent confidence
and a surface ethos of shyness and—as I said—(maybe false) humility.
One
Norwegian friend commented puckishly to me that "I always interpreted it
the same way you did. And I think it is
Keillor's intent. Of course, I might be
falling prey to the human tendency to overestimate my power of interpretation. Query: Is the proponent of that assertion about human
behavior (overestimating one's abilities) someone who is unjustifiably
confident about the validity of the assertion?"
A
couple of other comments from friends were that "I have always thought
about it like you although I'd have to put much more energy into it to think
about why" and "I think that Garrison is having it both ways in Lake
W.: irrationally modest folks being
irrationally proud of their distinction—ingredients, actually of Minnesota
Nice."
A
view came from a faculty friend, not Scandinavian and not Lutheran, and a long
way from Minnesota. "It is not at
all surprising that Scandinavians in the upper midwest would take umbrage
at the near universal definition of Keillor's "Lake Wobegon" effect
as correctly stated in Wikipedia. As you
and your friends clearly state, Scandinavians are reserved, understated and
humble, not at all like Keillor's characterization of Lake Wobegon's men, women
and children. I believe that Keillor was
pulling the proverbial pant legs of Scandinavian culture by his statement about
Lake Wobegon's inhabitants which was and continues to be meant in jest. . . . I have listened to 'the news from Lake Wobegon'
many times and you are absolutely correct—the stories are completely contrary
to the "Lake Wobegon effect" we have been discussing. [I] contend that Keillor is pulling your legs
with the statement about the people of Lake Wobegon that always ends the 'news'
segment."
The upshot, in my judgment, is that
while there wasn't
exactly consensus, there was near-universal agreement that the "Lake
Wobegon effect" is not reflective of the personalities of the characters
in the stories. What is misleading, I think, is Keillor's phrase to describe
the town, which is very different from the stories he tells. I suspect the
phrase was simply good humor while the stories reflect his understanding of the
culture. Or in the words of a colleague,
"the name CAN be appropriated, as it has been, for the purpose, and
apparently by professionals of a sort, but one's chauvinism rises to say that the
effect owes nothing to the people of the town and everything to the comic
characterization."
Another
example of this "misuse" of the "Lake Wobegon effect"
appeared on the Huffington Post in
September. A colleague at the University
of Southern California, Bill Tierney, wrote that "Garrison Keillor has
long told stories about Lake Wobegon, his mythical home out there on the edge
of the prairie 'where all the women are strong, all the men are good looking,
and all the children are above average.'
California State University is inventing its own Lake Wobegon in dealing
with entering freshmen who need to take remedial classes in math and English." The point he was making (CSU decided to lower
the SAT score that identified a student as needing remedial work) is beside the
point here, and one that can be legitimately debated by experts in
education. I wrote to Bill and teased
him about the use of the phrase, and told him that the locals here didn't think
it was accurate.
BIRTH, n. The first and direst
of all disasters. As to the nature of it there appears to be no uniformity.
Castor and Pollux were born from the egg. Pallas came out of a skull. Galatea
was once a block of stone. Peresilis, who wrote in the tenth century, avers
that he grew up out of the ground where a priest had spilled holy water. It is
known that Arimaxus was derived from a hole in the earth, made by a stroke of
lightning. Leucomedon was the son of a cavern in Mount Aetna, and I have myself
seen a man come out of a wine cellar.
In
any case, it is Keillor's characterization of the vaguely Scandinavian,
self-deprecatory culture that I grew up in and around, which I think he
captures accurately in his monologues, that leads me to eschew
self-congratulation and too much self-esteem, and hence my apology for
departing from my own practice. So back
to the dinner, for a bit.
Several
of my long-time colleagues stood up and spoke for a few minutes after the
social hour and dinner. They all made
remarks that were humorous but also warm and more congratulatory than I
deserved. One of them, a talented singer
as well as a distinguished professor of Chemistry, revised the words to
Lancelot's song "C'est Moi" in Camelot and sang his version—new
verses for the entire song. (Some of
this is a little inside baseball, but you get the idea.)
Governance! Governance! [Camelot! Camelot! and so on]
From far away you heard its call
Governance! Governance!
And here you came to give your all
You know in your soul what's best for every chair
Whether SCaFA, Research, SCEP, or FCC
. . .
But where in the world
Is there in the world
A man so *extraordineer*?
C'est toi, c'est toi, Monsieur Gary E.
'Tis you, I freely declaim
With wit and verve, with humor and nerve
C'est toi, 'tis you, the same
And so on.
The same Professor
Cramer tweeted about the event:
"Sneezes policy,"
huh? Never knew that before. Maybe that can be the epitaph on my
tombstone.
One
of my long-time colleagues, a former boss and friend, now retired and living
out of state (who obviously was not going to come to Minnesota for a dinner),
did write to me that "my tribute can come only by a simple e-mail
message! Gary, you have been a
tremendous asset and inspiration to me and generations of faculty and
administrators. You are a reincarnation,
of course, of the Renaissance Man!
Nobody but a Renaissance Man could write Christmas letters like yours." (And I can see Kathy rolling her eyes, "no,
no, don't encourage him!") I'm not
quite sure I live up to his description—in fact, quite sure that I do not—but
it is nice to know I've been an asset to somebody. As for being an inspiration, I might respond to him that I'm an inspiration to others toward brevity in writing, I being the example of the opposite, a model not to be mimicked.
One
faculty member, whom I've not even known that well, told me that "few
institutions have an individual who combines competence, reliability,
efficiency, diplomacy, humility and a masterful understanding of organizational
history in the service of the organization as you give to us. . . . I have come
to more fully appreciate all you do for the University in general and the
faculty in particular." After the
dinner, one colleague wrote to me that "it was a great pleasure to honor
someone who has done so much for our university."
One
thing these unexpected accolades did was reconfirm for me how little we
sometimes know about the role we play in life.
I frankly was startled at all these kind words, some of which were
entirely unsolicited. (Yes, of course I
know that people have to say nice things at a celebratory event, and even
exaggerate their compliments for such rituals.
Even through the social niceties, however, there came a glint of genuine
approbation, which I appreciated.) But I
really had no idea that people thought these things.
So
much for patting myself on the back. It
was back to real life after the dinner and so it is in this letter as well.
OK,
a few more lines. Every year the senior
faculty committee of the University has dinner with the members of the
University's governing body, the Board of Regents; I also attend. At this year's dinner, the president gave a
few words of welcome and then singled me out for part of my work; he said
something to the effect that none of those present will ever work at any other organization where the records of committee meetings are so interestingly written, so detailed, and so informative. Or something like that. The room gave me a round of applause. I was startled; I think of the committee
minutes as a rather routine and hum-drum part of my job, really more a source
of news for the academic community than a formal record. I have known, from past conversations with
Board members, that they do read and learn from them, but this applause
business was a little embarrassing.
After all, I live in the land of the real Lake Wobegon, not that of the
so-called Lake Wobegon effect.
BAIT, n. A preparation that
renders the hook more palatable. The best kind is beauty.
Beauty is only skin deep, but
ugly goes clean to the bone. (Parker)
Krystin announced one Friday night
that she was going over to a friend's place on Sunday; the young woman friend
was involved in partners.com. We thought
this sounded like another online dating site, but no, it is akin to Avon or
Pampered Chef, where one has items for sale at a "party" in one's
home and invites friends. What was for
sale was "marital aids" or "relationship aids" such as
creams, jellies, handcuffs, and whatever.
Krystin said she was just going to provide support to her friend and she
certainly wasn't going to buy anything.
Kathy and I suggested that maybe we
should come along and find a few items for ourselves. Krystin blanched and said she'd stay home
then. Kathy thought the handcuffs might
be fun. I think Krystin wanted to leave
the room. Kathy finally told her that
everyone her age (and younger) just can't imagine their parents in bed, not to
worry about it, and no, we weren't going to attend the event. But we did have a little fun at Krystin's
expense :-)
APOLOGIZE,
v.i. To lay the foundation for a future offence.
Gratitude—the meanest and most
snivelling attribute in the world.
(Parker)
Because Kathy had never seen me
without it, I shaved off my beard last spring.
Doing so reminded me once again how much I hate shaving and how nicked
and cut my face gets because my beard is like a Brillo pad. I let it immediately grow back. Elliott wouldn't go along with the plan to shave
his off so we could see what he looks like without a beard.
What was funny was the way people
reacted. Three or four of my friends and colleagues asked if I had new glasses. Only one
immediately recognized that the beard was gone.
Everyone agreed, alas, that I looked considerably younger without the
beard, which is not surprising because my beard, unlike my hair, is almost
entirely white. But Kathy and I
agreed—other opinions are irrelevant!—that even so, I look better with the
beard than without.
Every year, back comes Spring,
with nasty little birds yapping their fool heads off and the ground all mucked
up with plants. (Parker)
WEATHER, n. The climate of the
hour. A permanent topic of conversation among persons whom it does not
interest, but who have inherited the tendency to chatter about it from naked
arboreal ancestors whom it keenly concerned. The setting up of official weather
bureaus and their maintenance in mendacity prove that even governments are
accessible to suasion by the rude forefathers of the jungle.
April 19, 2013
Thoughts for a Sunshiny Morning
It costs me never a stab nor squirm
To tread by chance upon a worm.
"Aha, my little dear," I say,
"Your clan will pay me back some day." (Parker)
EDIBLE,
adj. Good to eat, and wholesome to digest, as a worm to a toad, a toad to
a snake, a snake to a pig, a pig to a man, and a man to a worm.
I
would say that I continued to be puzzled by the gun debate after the shootings
in Newtown, Connecticut, but then, silly me, I continuously forget that people
are uninterested in the facts. (Cf.
Buffett, "what the human being is best at doing is interpreting all new
information so that their prior conclusions remain intact.") Australia had a tragic shooting in 1996; the
Australian parliament shortly thereafter passed the first sharply restrictive
gun laws in that country's history—and the homicide rate from firearms fell
through the floor.
The Brady Center statistics tell the
tale: "In a single year guns killed 17 in
Finland, 35 in Australia, 39 in Britain, 60 in Spain, 194 in Germany, 200 in
Canada and 9,484 in the United States."
Even taking into account the population differences, the gap is
enormous. In 2010, the U.S. had 10.2
firearms homicides per 100,000 population.
Australia had 1.05. As for the rest
of the developed world, a sampling: Argentina, 5.65; Switzerland, 3.84; Costa Rica, 3.32; France, 3.00; Canada, 2.13; Greece, 1.5; Denmark, 1.45; Italy, 1.28; Germany, 1.1; India, 0.93; Netherlands, 0.46; UK, 0.25; Japan, 0.07. Most of them, I believe, have strict
gun-control laws. There are countries
with more firearms homicides, not ones we'd like to be compared with.
There is the argument that citizens need to have guns to fight in case
the federal government runs amok. Yeah,
right, their little arms caches are going to be effective against all the
weaponry of the U.S. armed forces. If we're facing a military takeover in this country, I'd say we'll have a lot bigger problems than fighting back with a few assault weapons and handguns. I would say that the American cause would already be lost if the military were attacking U.S. citizens.
There's a hell of a distance
between wise-cracking and wit. Wit has truth in it; wisecracking is simply
calisthenics with words. (Parker)
ACCOUNTABILITY,
n. The mother of caution.
A
friend of mine at the University commented at a social event that when he
retired he was going to do two things:
become an historian and figure out his relationship with God and the
universe. That comment led me to realize
that for over 40 years I've basically been thinking about two questions: (1) the optimal organization of a society and
(2) religion. The second one is not for
this letter; my conclusion about the first is that no one has the right answer
but the best solution that imperfect human beings have come up with is the
Scandinavian model. Ample experience
demonstrates that the society that puts a high value on the common good is the
one that is best for all its citizens.
The experiment has been done, repeatedly, and the evidence is available,
even though many in American life simply deny it.
We have sometimes mused that if it
were not for the fact that doing so would take us too far from our children and
our friends, Kathy and I would think about retiring to Denmark or Sweden. A friend of mine on the far liberal end of
the political spectrum wrote to me last winter that "I'd love to move to Ireland, Costa Rica, etc. I feel so much less affinity for the US than
I did when I was young and that makes me sad." As with my friend, sometimes my frustration
with this country leads me to think I'd rather live elsewhere. These sentiments reminded me of one of Mencken's
assertions that I included in my letter last year, one that struck home with
me: "The notion that a radical is
one who hates his country is naïve and usually idiotic. He is, more likely, one who likes his country
more than the rest of us, and is thus more disturbed than the rest of us when
he sees it debauched." Thus is
raised the question of loyalty to my country and "patriotism," the
latter of which for many has a justifiable malodor about it given the abuse to
which it has been put in service of such things as wars in Iraq and attacks on
civil liberties.
Bierce's Devil's Dictionary contains this definition: "Patriotism, n. Combustible rubbish ready to the torch of any
one ambitious to illuminate his name. In
Dr. Johnson's famous dictionary patriotism is defined as the last resort of a
scoundrel. With all due respect to an
enlightened but inferior lexicographer I beg to submit it is the first."
Many of us perhaps recall making up
our really full address when we were young; for me, now, for example it would
be
Longfellow Neighborhood
Minneapolis
Hennepin County
Minnesota
United States
Earth
Universe
The last entry is of
course silly and the penultimate one is unnecessary because it's true for every
living human being. The question is
which of those addresses commands loyalty and which commands none. I suspect most of us have a certain "home
towner" attitude about our neighborhood and city. I have no attachment whatever to Hennepin
County but I do to Minnesota and to the United States. Where things get dicey, in the eyes of some,
is when one also expresses attachment to the planet or humanity in general,
especially when that commitment may exceed or compete with loyalty to one's
country.
Many of us are familiar with Edward
Everett Hale's short story "The Man Without a Country," "the
story of American Army lieutenant Philip Nolan, who renounces his country
during a trial for treason and is consequently sentenced to spend the rest of
his days at sea without so much as a word of news about the United States"
and who later came to regret his decision.
(The story was written in 1863, published in the Atlantic, and intended to boost support at home for the Union
cause.) Quite apart from sentimental
reasons, one needs to have some country to call home, at least in the developed
world, for the purpose of health care, retirement benefits, a place to have a
residence, and so on. The more powerful
reasons for most of us, I suspect, are emotional, a commitment to the place of
one's birth and upbringing. Some move,
some renounce their country of birth, but the vast majority do not. One can ask, however, from a coldly rational
point of view, whether the political configuration on a particular piece of
geography should have one's loyalty simply because one happened to be born
there.
Conor Cruise O'Brien, writing about
the rise of nationalism during the French Revolution, included this interesting
footnote in his chapter of the book I was reading (The Permanent Revolution: The
French Revolution and its Legacy 1789-1989): "It may be held of course that, if
belief in a personal God is gone, the logical object of human devotion should be
humanity at large, not a particular nation or state. Unfortunately, humanity at large does not
seem capable of evoking such emotions, in most humans. The nation/state does evoke them, more
dynamically, and also more destructively, than anything else has yet done."
One
doesn't have to do much reading in history to know how effectively political
figures use nationalism. One need only think of Hitler as the starting example. But appeals to nationalism are used across the board, by all political systems, to justify all sorts of actions, frequently actions that subsequently turn out to be ill-founded or unwise and that lead to battles or wars in which many innocent people are killed.
So while I am probably as guilty as many of being a nationalist (because
I have to believe that by and large the United States wants to do right in the
world, even though it doesn't always do so), I temper my nationalism with a
large grain of salt. Moreover, when
humanity faces the global threat of climate change, for example, nationalism
doesn't seem to be a very effective response.
INNATE, adj. Natural,
inherent—as innate ideas, that is to say, ideas that we are born with, having
had them previously imparted to us. The doctrine of innate ideas is one of the
most admirable faiths of philosophy, being itself an innate idea and therefore
inaccessible to disproof, though Locke foolishly supposed himself to have given
it "a black eye." Among innate ideas may be mentioned the belief in
one's ability to conduct a newspaper, in the greatness of one's country, in the
superiority of one's civilization, in the importance of one's personal affairs
and in the interesting nature of one's diseases.
"Coolidge is dead." How could they tell? (Parker)
I
have decided that I am opposed to serving copious hors d'oeuvres (except when
the invitation is for cocktails and hors d'oeuvres). After hosting several dinner parties at which the guests indulged in the hors d'oeuvres like they hadn't eaten in a month, and were then too full to eat much of the dinner that we (i.e., Kathy) had prepared, I concluded that we should offer minimal food before the meal. Do people starve themselves before a dinner party, and then risk fainting if they don't get food quickly?
On
the other hand, I have also noticed that frequently there are large portions of
the hors d'oeuvres left over—as well as large portions of the entrée and other
elements of a dinner. Shortly after I
wrote the preceding paragraph I coincidentally happened to read an estimate (I
have no idea where—New York Times, Washington Post, Atlantic, who knows) that
half of the prepared food in the U.S. is thrown away. I thought to myself when I read the statistic
that it was probably close to right—all I have to do is think about all the
restaurant food I've not eaten and that wasn't suitable to bring home. I think I've been responsible for the
discarding of more French fries than most people in the Western Hemisphere.
So
my solution is to serve darn few hors d'oeuvres and small portions to guests. We'll just offer seconds. If there are leftovers, we can eat them the
next day. It just gripes me that (1)
there's so much food waste (and money wasted) and (2) some guests don't have
the good sense to eat lightly before a meal that took much time and thought to
prepare. (This dismay is one reason I
felt terrible when Kathy and I went to dinner at our friends the Collinses
after I'd eaten a too-large meal mid-day at a funeral. There was a superb meal and I could barely
eat any of it. I am still mad at myself
for that.) One caveat to my gripe, I suppose, is that (2) doesn't obtain in the case of people with
large-capacity stomachs who can eat many hors d'oeuvres AND a full meal. I do know a couple of such people—and they're
skinny.
EAT, v.i. To perform
successively (and successfully) the functions of mastication, humectation, and
deglutition. "I was in the drawing-room, enjoying my dinner," said
Brillat-Savarin, beginning an anecdote. "What!" interrupted
Rochebriant; "eating dinner in a drawing-room?" "I must beg you
to observe, monsieur," explained the great gastronome, "that I did
not say I was eating my dinner, but enjoying it. I had dined an hour before."
OVEREAT, v. To dine.
There
was a serious omission from last year's letter.
While I do not intend to let my annual missive become a collection of
obituaries, the death of some of the people who have played a significant role
in my life warrants notice. I neglected
to note the passing of Elmer Kemppainen, one of my dear friends.
I
first met Elmer, at the time principal of Robbinsdale-Cooper High School, when
I subbed in a men's bridge group that originated in the University's athletic
department in the early 1970s. I later
joined the group permanently (which still plays, although with considerable
turnover in membership), and shortly afterwards, in the summer of 1977, invited
Elmer and his wife Ginny to an evening's bridge game with four other couples
that evolved into a 6-couple bridge group (that also still plays). Elmer retired from the men's group a few
years ago, and Elmer and Ginny retired from the couples group as well, but I
played bridge a couple of dozen times per year with Elmer for over 30 years. We sorely miss them in the couples group and
the men's group—but we have been delighted that Ginny moves to the Twin Cities
for the winter and substitutes in our couples bridge group that she and Elmer helped
start and in which they played for so many years.
Perhaps as important, Elmer introduced me
(and several others) to the game of, and the finer points of, bocce ball, that
delightful game invented by Italians and played widely by Finns on the Iron
Range in Minnesota.
Elmer
passed away in the summer of 2012.
Several of us from the bridge group drove up to Virginia, MN, for the
funeral. Elmer was one of the fiercest but also most affable bridge competitors I knew, sometimes provoking mildly withering
glances or amusedly skeptical comments from Ginny when he was perhaps competing
a bit too much in the couples group. But
he was at the same time a man of great good humor and extraordinarily
considerate of others. The world is
always a little worse off when people like Elmer leave it.
MULTITUDE, n. A crowd; the
source of political wisdom and virtue. In a republic, the object of the statesman's
adoration. "In a multitude of counsellors there is wisdom," saith the
proverb. If many men of equal individual wisdom are wiser than any one of them,
it must be that they acquire the excess of wisdom by the mere act of getting
together. Whence comes it? Obviously from nowhere—as well say that a range of
mountains is higher than the single mountains composing it. A multitude is as
wise as its wisest member if it obey him; if not, it is no wiser than its most
foolish.
I went to a convent in New York
and was fired finally for my insistence that the Immaculate Conception was
spontaneous combustion. (Parker)
Like
all of us in the information age, I have more passwords for websites than I
could ever possibly remember. So I have
them all written down, cryptically, on a Word document that I have to go to at
least once per week. A good friend of
mine posted this on Facebook; it's about right.
Anyone can do any amount of
work, provided it isn't the work he is supposed to be doing at that moment. (Benchley)
A
little bad taste is like a nice dash of paprika. (Parker)
Every
year Kathy and I have a few folks over for meatball soup, basically a vegetable
soup with meatballs spiced following directions from my great-aunt Inez, who
got the recipe from her mother, my great-grandmother Inger Kjestine Larsen
(Jensen). Last year I felt virtuous because
I bought all the vegetables at one of the local co-ops, so they were all
organic. On the whole, however, I am
skeptical about the virtue of organic produce because there's no good evidence
that it tastes better or is any better for you or even that there is any
significant difference in pesticide levels and the corresponding hypothesized
(but not demonstrated) threat to one's health.
Every man owes it to himself
(and to his friends) to get away entirely alone in an isolated shack every so
often, if only to find out just what bad company he can be. (Benchley)
ALONE,
adj. In bad company.
My general observation—about which
there are probably piles of research that I haven't looked for—is that people
prefer green to the "built environment" (as the Brits put it). We rush to our lake places to be surrounded
by green (and water), we prefer larger suburban tracts with trees and bushes
and gardens, we like parks and parkways in our towns and cities, we plant trees
and bushes, and so on. The highest-priced homes in any metropolitan area are those on water and/or facing or adjacent to green space. There are a few who really like to live in the center city, in the high rises or the townhomes, and
obviously millions do so—willingly or not—in New York and Chicago and lots of
other places in the U.S. and around the world (Seoul, Tokyo, to name a couple). But on the whole, it seems to me, most of us
prefer to be surrounded by greenery. I
reflect on this every once in a while when I am walking to work: I walk across campus, up the University's Mall
(about two acres of green space and trees surrounded by buildings) while a
number of my friends who work downtown never get away from buildings or
concrete/asphalt once they get to their parking spot. I never object to going downtown for whatever
purpose—to eat, theater, etc.—but it's never as comfortable as our green and
flower-laden backyard or at a lake.
The unconscious thought behind that musing about green space was that human beings evolved in an environment of plants and animals and water and green and that we somehow still long for those kinds of settings because they are part of our genetic heritage. That is, of course, a mishmash of assumptions for which I had no theory or data. I found myself at least indirectly rebutted by an article that appeared about the same time I wrote the foregoing paragraph, "Misguided Nostalgia for Our Paleo Past" in the Chronicle of Higher Education in February. The author, Marlene Zuk, a professor of evolutionary biology—I only learned at the end of the article that she's at the University of Minnesota—reports on her experiences working in a lab in Sweden with biologists analyzing the bones of humans and animals from thousands of years ago to look for changes in genes that allowed those early humans to digest lactose (the sugar in milk) even after weaning, despite the fact that all other mammals lose that ability and dairy products cause stomach upset. A few groups of humans, however, especially those from northern Europe and some in Africa, continue to be able to consume dairy products all their lives. (Heavens, what would life be like without being able to eat cheese??? And for those of us who love it and have drunk it all our lives, milk???)
And why, Professor Zuk asked, do we care? What the research is showing is that the evolutionary change that allows some humans to consume dairy only occurred about 4,000 years ago. And what that means, she writes, is that the research "is revolutionizing our ideas about the speed at which our evolution has occurred, and has made us question the idea that we are stuck with ancient genes, and ancient bodies, in a modern environment." She goes on to comment that the long-standing assumption that evolutionary changes take place over eons means we may "feel that humans, who have gone from savanna to asphalt in a mere few thousand years, must be caught out by the pace of modern life." We don't think we fit in modern society because we aren't that far removed from early hominids in Africa; in evolutionary terms as commonly understood, it's only been a few seconds of time since we moved from a nomadic life to the age of cell phones and landing on the moon. We assume we are afflicted in ways that our distant ancestors never were (AIDS, obesity, PTSD, high blood pressure, etc.) and that we'd all be better off if we lived more like those ancestors did, in more "natural" settings. "Our bodies and minds evolved under a particular set of circumstances, so the reasoning goes, and in changing those circumstances without allowing our bodies time to evolve in response, we have wreaked the havoc that is modern life."
That, Professor Zuk says, is balderdash, a "paleofantasy." First, our environments have been constantly changing as we have evolved, so to a certain extent we've always been out of synch with them. Second, she points out that humans have evolved only recently to function at high altitudes and developed a resistance to malaria. Evolution can work (relatively) fast, or slow, or in between, and one cannot assume that it has suddenly stopped working on human beings. (As Elliott comments every once in a while, when he sees or learns of a young person—or animal—doing something that leads to them dying by reason of their own stupidity, that takes them out of the gene pool.) Third, what makes us think that we're more out of tune with our environment than our predecessors were? Zuk asks: "Would our cave-dwelling forebears have felt nostalgia for the days before they were bipedal, when life was good and the trees were a comfort zone? . . . Were, then, those early hunter-gatherers convinced that swiping a gazelle from the lion that caught it was superior to that new-fangled business of running it down yourself?"
Zuk goes on to point out that biologists are learning that many human genes have changed quickly—over the last few thousand years—while some have remained the same for millions of years (genes that we share in common with yeast and earthworms, for example). In some species, evolutionary changes are occurring in a matter of years (where a generation is only a year or so).
Perhaps some of us have evolved into homo urbanus, city dwellers who live entirely surrounded by glass, concrete, and steel and who walk on nothing but carpeting and tile. So, Gary concludes about himself in dismay, he may be evolutionarily behind his friends and colleagues who thrive in the concrete jungle, a homo sapiens who's been left behind by homo urbanus. Now I can't walk up the Mall without feeling like I'm in a genetic backwater :-(
I cannot forbear from reporting one
of the reader comments in response to the Zuk article: "It has been argued with a fair degree
of plausibility that evolutionary inheritance helps explain why men have spent
such vast sums building golf courses, which tend to look like ideal Paleolithic
hunting grounds." I don't golf but
my affection for greenery extends into the house, as those of you who have been
to it know: I like to have plants
everywhere. This is the subject of mild
tension between Kathy and me because she thinks I have far too many, and too
many big, plants. She keeps threatening to put toxic substances in the pots; she's refrained thus far, but I worry. . . .
(I note that the Star-Tribune finally picked up this
story in late July, five months after the Chronicle
of Higher Education story. Swift
reporting. In the reader comments to
both the Chronicle and Strib
articles, the entire focus turned to heated debate about the paleo diet, which
was only a small part of the point Zuk was making.)
IDIOT,
n. A member of a large and powerful tribe whose influence in human
affairs has always been dominant and controlling. The Idiot's activity is
not confined to any special field of thought or action, but "pervades and
regulates the whole." He has the last word in everything; his
decision is unappealable. He sets the fashions and opinion of taste,
dictates the limitations of speech and circumscribes conduct with a deadline.
MONKEY, n. An arboreal animal
which makes itself at home in genealogical trees.
I was a major dope in January. I scheduled a couples bridge game on the
first anniversary of our marriage. Kathy
pointed that fact out to me, fortunately only with amusement. It was a Sunday; I promptly made dinner
reservations for Monday night at a restaurant we had gone to on our second or
third date (neither one of us can recall for sure which it was) and at least
bought a card. Sigh. I try to be attentive to significant dates,
and I had thought about the anniversary a couple of times while we were in Florida,
but then it simply vanished from my brain when we returned to Minnesota. I guess I was so distraught about the furnace
or overcome by the Florida heat that I forgot :-(
I felt a little better when I gave
Kathy my card at dinner on Monday and she confessed that she had forgotten to
get me a card.
Fortunately, neither of these
omissions has had any effect on the marriage :-)
Anyone will be glad to admit
that he knows nothing about beagling, or the Chinese stock market, or
ballistics, but there is not a man or woman alive who does not claim to know
how to cure hiccoughs. (Benchley)
ACCORDION, n. An instrument in harmony
with the sentiments of an assassin.
Apropos of the preceding Benchley
quip, I laughed out loud one night when Kathy and I were talking over a glass
of wine. I got the hiccups and she
assured me that she knew a cure. I laughed
and told her of Benchley's comment. My
dad also had a sure-fire cure. Benchley
was right. (I do have a sure-fire cure
that works for me: I just wait them out
and they go away within a minute or two.
But Kathy tells me she used to have them for hours when she was young,
so she had to find some way to deal effectively with them.)
Salary is no object; I want
only enough to keep body and soul apart.
(Parker)
EXPERIENCE, n. The wisdom that
enables us to recognize as an undesirable old acquaintance the folly that we
have already embraced.
Early in the year I wrote an email
to Krystin pointing out that the fiscal deal reached between the White House
and Congress meant that the payroll tax would go back up 2%, so she'd see a
slight decrease in her take-home pay.
My,
sometimes tax increases do test one's political allegiances. She wrote back to me that "Yes,
I did notice on my last paycheck - almost $60 less! Ugh. I
know this sounds very republican of me, but I want my money! If I didn't have
any debt, it wouldn't be a big deal and I'd be happy to pay higher taxes for
the good of everyone, but I DO have bills to pay. :/"
But that was the only grousing I heard,
and she seems to have fared alright financially even with the tax increase. (And it turned out to be a lot less than
$60.)
The
first thing I do in the morning is brush my teeth and sharpen my tongue. (Parker)
ECCENTRICITY, n. A method of
distinction so cheap that fools employ it to accentuate their incapacity.
I am one of those people who is
averse to having needles stuck in me. I
put up with the pricks (mostly with my eyes closed) for vaccinations and the
occasional blood draw that accompanies a physical exam, but would never think
for a second about getting any part of my body pierced for decorative
purposes. The idea of a tattoo is beyond
the pale, apart from the fact I'd never get one because I don't like them and
don't want dye under my skin.
I bring this up because one of the
side effects of not paying attention to her diabetes for many years is damage
to Krystin's eyes. At one point early in
the year she was close to losing some or much of her vision. It is a mark of how far eye treatments have
advanced, however, that with cataract surgery on both eyes (the implantation of
synthetic lenses) and shots, her vision was so much improved that she no longer
needs corrective lenses except for reading; it's now 20/25. (The synthetic lenses are not flexible, much like our biological lenses, which lose flexibility as we get older; that loss of flexibility is why almost all of us need reading glasses even if our vision is otherwise fine.) Krystin has exclaimed about how well she can
now see.
Part of the treatment is shots
directly into the eyes. Krystin has
learned to tolerate them because she first receives a sedative and a local
anaesthetic, but the idea that someone would be sticking needles directly into
my eyes sends shivers down my spine. But
I'm glad she can handle the shots because it means her eyesight is fully
restored and protected (as long as she continues to maintain tight control over
her blood sugar levels).
On that general subject, I sent
Krystin an email message last spring.
I have told you this over the last couple of months, in bits
and pieces, but I want to tell you in writing as well as orally.
When I came home from work one day last January and told you
about my lunch conversation with Andy, who had related that he had been struck
down with the flu from roughly Christmas to mid-January and constantly fatigued
that entire period, you responded something like "welcome to my world
every day."
I have been appalled ever since you told me that because I
think I would have been in bed ever since. I think you are heroic to get up and
go to work every day and just get from one bedtime to the next. Combined with
the eye treatments, and the occasional pain they cause, and the dietary
problems you've had, I am simply astonished at how well you manage to lead a
life. I cannot tell you how much I admire you for your perseverance and
determination. So even though I occasionally grouse at you about leaving
clothes and shoes and stuff all around the house (you really ARE messy!), you
should know that I cannot say enough positive about your stamina, your
commitment to doing your job well, and getting to the surgery in June. I have
said as much to a number of my friends and colleagues.
If you want to berate yourself for sleeping too much, that's
up to you. Under the circumstances, I certainly will not.
We
must race to June!
Krystin wrote back:
Thanks, dad. I have been meaning to get back to you on this
for a while. I really appreciate it. One of the things I'll be talking with
Nancy Blume [Krystin's therapist, whom she started seeing shortly before she
wrote this] about is how I can express verbally and in writing how I feel these
days, but nobody REALLY understands. But your warm comments offer me the
support and recognition that you understand that even though you can't feel
what I'm going through, you know that I'm having a hard time. I appreciate that
a lot, so again, thank you. Like you said, let's just hope the next couple
months go by fast and June hurries up and gets here!
The idea of living
and going to work every day while feeling like I had the flu is so aversive to
me that I had a hard time imagining how Krystin did it. She showed stamina I wouldn't have guessed
she had.
RESPONSIBILITY, n. A detachable
burden easily shifted to the shoulders of God, Fate, Fortune, Luck or one's
neighbor. In the days of astrology it was customary to unload it upon a star.
You
can't teach an old dogma new tricks.
(Parker)
We got to June and Krystin had the
transplant surgery. The donor was a
high-school classmate, who volunteered a kidney, and all the tests demonstrated
that she was a match. So the morning of
June 4 it happened—and it went exactly as the doctors expected, extremely
well. Krystin was eating heartily on
June 5 and most of the sluggishness from the general anaesthesia had worn off.
But there were bumps in the
road. She ended up back in the hospital
(ER) on June 17 with nausea and vomiting.
It turns out she had an infection (not related to the transplant) and
she had a reaction to one of the post-surgery drugs. That stay ended up lasting 5 days, to her
dismay, but the medical folks finally got her back to feeling good. Then, however, she had to contend for some
time with low blood pressure, the opposite of what many Americans deal with;
hers was running ~80 over ~40, so she'd be dizzy and light-headed when she
stood up from sitting or lying down. She
tried to go back to work part-time on July 1, less than a month after the June
4 surgery; her surgeon made it clear she was NOT to do that. So she stayed home longer and then returned
part-time on July 16. That worked
better.
Krystin's long-delayed plans to
finally move into a place of her own came to fruition; she got an efficiency
apartment on September 1. We were glad
for her. She needed to be out of daddy's
house, at age 28, and we needed her to be out—it was fine to have Krystin
around but it was not so fine to also have the enormous amount of stuff she
has, which constantly spilled into the dining room and living room and other social
places in the house. I am not obsessive
about neatness, but this debris everywhere was getting on my nerves. We were all better off with Krystin in a
place of her own.
A
couple of weeks after Krystin moved to her new apartment, she invited Kathy and
me to dinner. She said we would have
spaghetti but that we needed to bring the meat, because she doesn't like any
meat in spaghetti sauce. We think this
is a hoot. We may have to try this out
on some of our friends: "You're
invited for steaks on the grill and x and y and z—and bring your own steaks." I teased Krystin about this; I accepted her
explanation that she wouldn't do that with anyone other than us.
One
of Krystin's Facebook posts, however, seems to have summed up nicely where she
stood after the travails of ailments before surgery, then surgery, then the ups
and downs of recovery: "This year I
have been fortunate to get a new kidney, a new car, a new apartment, and now
two new kitties to greet me when I get home. And the year isn't even over yet!
Life is good." (Quoted with Krystin's
permission. I got rid of one kid and one
car on my automobile insurance!)
The
ventures to the hospital were not over, unfortunately. Krystin needed to have a gastric pacer
implanted, in order to help her digestion, which kept her in the hospital for
about a week, and then later in the fall she came down with Epstein-Barr virus,
otherwise known as mono, and suffered from two additional opportunistic
infections (not uncommon with immuno-suppressed patients). The mono and friends kept her in the hospital
for a week and then at home for another week, and she was in the hospital twice within a 2-week period in October
because of an abscess in her throat that had to be cut and drained.
Fortunately, her employer (the University) is extremely considerate of people
afflicted with medical difficulties, especially those whom it also happens to
value, and the people with and for whom Krystin works (I happen to know from my
many discreet sources) have a very high regard for Krystin's performance. Her infectious disease physician told Krystin
she should plan on taking an extended period of time off; Krystin said "no"
and went back to work after being out for the two weeks. As of the moment I compose this, after she
just returned to work, she's doing well and feeling better.
Part
of the long-term plan is that Krystin will also have a pancreas transplant,
thus eliminating the diabetes. Because
she's had a kidney transplant, her name goes to the top of the list for a
pancreas transplant—but with all
the infections, she is now on hold on the national donor list until
mid-February. So it goes on and on.
I
sometimes think that "being sick" is enough to drive some of us
crazy, and just going back to work—OK as long as we're not contagious—is a
cure. I know that has been true for me
in the past, and it seems to be true for Krystin as well.
You might think that after
thousands of years of coming up too soon and getting frozen, the crocus family
would have had a little sense knocked into it.
(Benchley)
FINANCE, n. The art or science
of managing revenues and resources for the best advantage of the manager. The
pronunciation of this word with the i long and the accent on the first syllable
is one of America's most precious discoveries and possessions.
Many of us have, in the course of work or other activities, taken the Myers-Briggs Type Indicator (MBTI), which classifies people on four scales: Extraversion (E) - Introversion (I), Sensing (S) - Intuition (N), Thinking (T) - Feeling (F), and Judging (J) - Perception (P). So one can be an INTJ or ESTP or whatever.
Kathy has maintained, for as long as
I have known her, that she is an Introvert.
Spending time with others, no matter how well she knows them, is a drain
on her energies and she needs time mostly alone to recover (being only with me
counts as recovery time, I was glad to learn).
She hastens to assure me that that doesn't mean she doesn't want to
spend time with friends, quite to the contrary, but that she just needs quiet
time at some point to recoup. Although I
have taken the Myers-Briggs, and come up as an INTJ—and thus an Introvert—I
nonetheless very much enjoy social interactions (well, at least in small
groups—I am indifferent to parties of 20-30 or more people). I wouldn't go so far as to say I thrive on
them, but I do find getting together with friends almost invariably stimulating
and fun. I suppose that if I had a job
like the president of the University, working with groups and individuals all
day and then participating in University events many evenings and weekends, I'd
probably be desperate for quiet time, but since I fortunately do not have such
a horrible schedule, I look forward to our intermittent social interactions.
The Myers-Briggs is extremely widely used—but there is also considerable scholarly criticism of its accuracy and validity. People can take the MBTI and then retake it sometime later and get a different result. Even if the scales measure legitimate
constructs—which is open to question—the results of the MBTI depend on the
honesty of the individual taking it or the ability of the individual accurately
to respond (that is, even if the person is trying to be honest, he or she might
not be accurate in the self-assessment in responding). Moreover, if these really are types as described, the distribution of responses should be bimodal, with people clustering at one end or the other, but they don't; they cluster around the middle, in a normal distribution, so one can be classified as an Introvert while being only barely different in responses from an Extravert.
Bob Carroll, at the Skeptic's Dictionary, observes that parts of the
profiles of types—e.g., INTJ—can apply to anyone and suggests that "they
read like something from Omar the astrologer."
A faculty friend of mine in the
Department of Psychology whose career has been devoted to studying interest
measurement (e.g., the Strong Vocational Interest Blank, Leisure Interest
Questionnaire) said the danger of the Myers-Briggs is when employers use it to
make hire/don't hire (i.e., selection) decisions since there is no evidence of
criterion-related validity and the reliability of the scores is suspect. My friend wrote: "The best that can be said about it is
that it should start a conversation if the scales suggest issues related to the
job in question—but it should not be used to make decisions. The
primary use of the MBTI is in development programs, team building, or
coaching. But the danger even in these instances is slapping a label on a
person that implies that this is the way the person always
behaves without considering the situational demand which does change some
people's behavior."
Interestingly, the one scale on the MBTI that a study by the National Academy of Sciences found correlates with other assessment instruments is the Extraversion – Introversion scale. And I don't doubt Kathy's self-assessment of
her internal state during and after social interactions, irrespective of
whether it's because of where she falls on an extraversion/introversion scale
or because it's just how she feels.
(Even I can grow weary of too much social activity/interaction; I
learned that back in the days when I attended the two national collegiate
athletic conventions, which were a week each and back to back. After two weeks of constant meetings, I just
needed to go home and be by myself. For
a couple of hours, anyway.)
As a general proposition, however, I
think it's hard to avoid the conclusion that the MBTI is pretty much junk
psychology.
MYTHOLOGY, n. The body of a
primitive people's beliefs concerning its origin, early history, heroes,
deities and so forth, as distinguished from the true accounts which it invents
later.
Four be the things I am wiser
to know:
Idleness, sorrow, a friend, and a foe.
Four be the things I'd been better without:
Love, curiosity, freckles, and doubt.
Three be the things I shall never attain:
Envy, content, and sufficient champagne.
Three be the things I shall have till I die:
Laughter and hope and a sock in the eye. (Parker)
I read the third volume of William
Manchester's magisterial biography of Winston Churchill (each volume is about
900 pages). The series had an
interesting publishing history. The
first volume was published in 1983, the second in 1988, and the third in 2013,
25 years later. Manchester died before
the last volume was even close to completion, but in his final illness he
worked with a journalist whom he asked to finish the book. The reviews of it were almost entirely
positive, even though it wasn't entirely Manchester's work.
Krystin and Kathy both looked at it,
grimaced, and said they wouldn't be reading it (it's over 1200 pages). "TMI" is what I think went through
their minds.
I have a colleague and friend from
the Coalition on Intercollegiate Athletics (which I mentioned earlier) with
whom I was emailing (about head trauma) as I was reading the book. My colleague spends half the year in his
homes in Rome and Tuscany—mostly in Rome at his lab—and he had just arrived in
Rome. I told him I was at the point in
the Churchill biography when the Brits and Americans are driving up Italy from
the south to take Rome from the Germans.
He wrote back to me: "It is
my understanding that the Brits and Americans arrived in Rome from Southern
Italy faster than one can do it now due to the traffic jams on the Italian
highways! And they were driving tanks
and half-tracks!"
Even though I know how WWII turned
out, it was nonetheless painful to read about the events of 1940-41, when
England stood alone against Hitler and London was being bombed daily by the
Germans. One also becomes numb to the
casualty statistics: only 2,800 killed
in this attack, 25,800 killed in that one, 16,500 killed landing on this beach,
and so on. It really is true that these
are statistics, not events we can understand as human deaths, but I can imagine
what my reaction would be if either of my children were killed in battle (or
any other way). I would be devastated
and doubt that I would ever, the rest of my life, fully recover. (Maybe parents and families reacted a little
differently during a world war, when combat deaths by the thousands were a
daily occurrence. But even with "only"
2,800 killed in a small skirmish, that's 2,800 families devastated by a
loss. It is impossible for human beings to imagine that writ large to tens of millions of such deaths.)
Reading the Churchill biography also
led me to ask two mostly-but-not-completely-unrelated questions: (1) "in terms of daily life needs, is
there any reason to read history?" and (2) "what reverberations, however
faint, can we still see today from other dramatic events in history?"
INK, n. A villainous compound
of tannogallate of iron, gum-arabic and water, chiefly used to facilitate the
infection of idiocy and promote intellectual crime. The properties of ink are
peculiar and contradictory: it may be used to make reputations and unmake them;
to blacken them and to make them white; but it is most generally and acceptably
employed as a mortar to bind together the stones of an edifice of fame, and as
a whitewash to conceal afterward the rascal quality of the material.
They sicken of the calm, who knew
the storm. (Parker)
With respect to question (1), Kathy
and Krystin goggled at me—me, who has been reading history all his life—when I
suggested that the answer, in terms of daily life, is "no." I have tried to identify activities in a
normal day where one needs to know history in some fashion—but I can't. You don't need it to drive your car, or eat
at a restaurant, or do laundry, or mow the grass, or garden, or fly in an
airplane, or do your job, or for any of the million quotidian tasks in which we
all engage every day. On a purely
practical level, one simply does not need history. At that level, Henry Ford was correct: "History is bunk."
There
is debate about what Ford actually meant, but as I explored the context of the
quote, I concluded that in an odd way Ford's view aligned with mine. One web writer—whom I otherwise consider a socio-political nut job—retrieved an article by Canadian reporter Robert
Fulford that drew on an essay by a distinguished historian, Eugen Weber:
Ford thought that
devotion to the past prevents us from grappling with the present. . . . In 1914
all the European leaders knew history, Ford said, yet they blundered into the
worst war ever.
On another occasion Ford
recalled looking in American history books "to learn how our forefathers
harrowed the land"; he discovered that historians barely mentioned
harrows, the iron-toothed rakes essential to modern farming. Harrows, Ford argued,
meant more in history than guns and speeches. When history "excludes
harrows, and all the rest of daily life," then history is bunk.
Maybe Ford felt strongly
about harrows because he manufactured them, Weber says; even so, he was right
when he argued that history should tell how ordinary people lived.
Ford's observation about European leaders and World
War I is simply a limited version of Aldous Huxley's more general
proposition: "That men do not learn
very much from history is the most important of all the lessons that history
has to teach."
Ford wanted more history of daily life; I suggest that
daily life doesn't need history.
Of course I reject out of hand the
proposition that any well-educated, cultivated, thoughtful, aware, and
responsible citizen should not read history.
No one can understand most of the world that surrounds them unless they
have a passing knowledge of history.
They can live and work and play in it, but they won't understand it. They certainly can't have a decent dinner party
conversation. More importantly, they can
hardly have any idea of what their own socio-political beliefs are.
Oddly
enough, I found one of the most lucid, valuable, and well-ordered statements on
the value of learning history on the web pages of a faculty member at the
University of North Carolina at Pembroke, which is not one of the most renowned
institutions of higher education in the country. But Professor Brown did an admirable job of
summarizing the reasons to read history.
I do not agree with all of them; here they are, with my commentary.
Brown writes
that history "preserves and celebrates the memory
of great men and noble deeds"
(hard to believe he wrote only "men" in the 21st Century). Yes, but always, only, and inevitably with
the interpretation of the historian.
With history as with current news, one must always triangulate in order
to believe one is glimpsing anything resembling the truth. (Or, "the past
actually happened. History is what
someone took the time to write down."
A. Whitney Brown.) Pieter Geyl
maintained that "all historians are influenced by the present when writing
history and thus all historical writing is transitory. . . . There never can be a definitive account for
all ages because every age has a different view of the past." He argued that "the best that historians
could do was to critically examine their beliefs and urge their readers to do
likewise." "Great men and
noble deeds" may sometimes thus be in the eye of the beholder. (Viz., the Portuguese and Native American
views of Christopher Columbus and those who followed him.)
"Writing history allows the judgment and
punishment (vicarious) of the guilty, as, for example, in some histories of the
Third Reich." There is a dilemma
here, of course: I think historians will
say that the closer the events are in time, the more difficult it is to write
about them with anything even approaching objectivity. There is ancient precedent to this point of
view; attributed to Tacitus: "This
I hold to be the chief office of history, to rescue virtuous actions from the
oblivion to which a want of records would consign them, and that men should
feel a dread of being considered infamous in the opinions of posterity, from
their depraved expressions and base actions." There is, however, a long line of people in
the 20th Century alone who weren't too worried about what posterity had to say
about them, so history-as-deterrent is a thin reed on which to rest the
argument.
I like the view of one of the great living historians,
whose work I greatly admire, Simon Schama.
There is a sense in which, when you're a historian, you really oughtn't
to be knocking on the doors of power; your job is to keep the powerful awake at
night, to turn them into insomniacs. The
historian in me, the sceptic, the grandchild of Orwell, always hears the
inflation of rhetoric in it. History is
a tragic muse. One of its great founding
moments is the Peloponnesian War and the whole majestic, terrifying drama that builds up to the expedition against Syracuse that sees Athens sailing into
massive hubris. That is good, honest,
western history. It should never be
self-congratulation; it should keep people awake at night.
Professor Brown writes: "Writing history uncovers general truths
(or laws) about human nature and behavior" and "writing about history
reveals lessons for the future; this idea prompted the philosopher George
Santayana to exclaim, 'Those who cannot remember the past are condemned to
repeat it.'" I think this is just
baloney. Most of us are familiar with
the Santayana claim, which has become conventional wisdom. I think Mark Twain is closer to the mark,
however; he's alleged to have said that "history doesn't repeat itself, but it does rhyme." That seems to me right—and the variance
between repetition and rhyming makes it treacherous to try to draw lessons from
history. The Dutch historian Pieter Geyl
seems to me right: "History is not
to be searched for practical lessons, the applicability of which will always be
doubtful in view of the inexhaustible novelty of circumstances and combinations
of causes."
Professor Brown, continued: "Writing about history helps you learn
about yourself and helps in the creation of a personal and/or cultural
identity. Each of us is a social
creature, and that means we are at least partially the product of every
experience we have had and of all that we have inherited from our families, our
communities, our nation, and our spiritual, intellectual, and cultural
heritage. Study of history allows you to
situate yourself in time and place, and it helps you understand who you are and
how you came to be." I think he has
this exactly right. I am a
Danish-German-Swedish-English-Irish-Scot American who has a very confused
cultural identity.
Professor Brown: "Writing about history helps you expand
your horizons, allowing you to understand the values, attitudes, and motives of
other people, whether of different nationalities or racial groups or religious
orientations." Again, I think he
has this right—but this is a virtue hardly limited to history. One's horizons can be expanded equally well
by studying anthropology or international politics or sociology or theater or
any of a number of other fields.
Professor Brown then cites the
Bradley Commission on History, a group of historians funded by the Lynde and Harry Bradley Foundation
to evaluate the poor status of history in American schools. The reasons he draws from the Commission
report must send shivers down the spines of the Texas GOP: "Studying history . . . helps
[individuals] to develop a sense of 'shared humanity'; to understand themselves
and 'otherness,' by learning how they resemble and how they differ from other
people, over time and space; to question stereotypes of others, and of themselves;
to discern the difference between fact and conjecture; to grasp the complexity
of historical cause; to distrust the simple answer and the dismissive
explanation" and so on. Brown
concludes that there is one word that describes the value of history: judgment.
It is required for "the profession of citizen" (I like that
phrase) that should be universally shared in an educated democracy. "It takes a sense of the tragic and of
the comic to make a citizen of good judgment. It takes a bone-deep understanding
of how hard it is to preserve civilization or to better human life, and of how
these have nonetheless been done repeatedly in the past."
(In case you forgot, the Texas GOP
party platform in 2012 included this language:
"We oppose the teaching of Higher Order Thinking
Skills (HOTS) (values clarification), critical thinking skills and similar
programs . . .which focus on behavior modification and have the purpose of
challenging the student's fixed beliefs and undermining parental authority."
According to news reports, the Communications Director of the
Texas Republican Party said that inclusion of "critical thinking skills"
was a mistake, an oversight, but that it couldn't be fixed until 2014 at the
next state party convention because it was part of the language approved at the
2012 convention.)
So draw your own conclusions about
whether reading history has value.
Intellectually, I think, surely.
Practically, in my view, none.
FAITH,
n. Belief without evidence in what is told by one who speaks without knowledge,
of things without parallel.
Well,
Aimee Semple McPherson has written a book. And were you to call it a little
peach, you would not be so much as scratching its surface. It is the story of
her life, and it is called In the Service of the King, which title is perhaps a
bit dangerously suggestive of a romantic novel. It may be that this
autobiography is set down in sincerity, frankness and simple effort. It may be,
too, that the Statue of Liberty is situated in Lake Ontario. (Parker)
With
respect to question (2), I start from the premise that, although it is growing
more attenuated, we still to a significant extent live in the world that Adolf Hitler bequeathed us. Until the Iron Curtain fell, Eastern Europe was under communist rule because Russian
troops marched as far as they did in beating back the Germans. The architecture of much of London and
virtually all of Berlin is a result of the bombings during the war that
required rebuilding. Obviously, the
small number of Jews now living in central Europe reflects the Holocaust. And on and on.
I decided, as an example, out of
curiosity, to explore the extent to which we can still see reverberations from
the French Revolution and the Napoleonic wars.
The revolution started in 1789, so is 224 years distant, and the
Napoleonic wars ended in 1815, so are 198 years ago; what are the noticeable
effects from those tumultuous events, if any, that we can still detect?
In
the course of exploring that question, I recalled to mind my friend Regents
Professor Ellen Berscheid's grousing a number of years ago that she had spent
12 hours one day preparing for a one-hour lecture the next day—and that it was
unfortunate that some members of the general public think that teaching two
courses per semester, or about 6 hours per week, is so easy. (She also observed that the 12 hours did
not cover all the time she spent over her career learning about the subject—it
took 12 hours of work just figuring out how to present it to students in a way
that it might sink in.) I put in far
more time reading books and articles about the modern-day consequences of the
French Revolution and the Napoleonic wars than the volume of text in this
letter justifies.
One historian observed, quite
correctly I believe—although I'd never thought about it before—that the
Napoleonic wars were really the first world war; they extended into the Middle East, and Napoleon at one point even had designs on Australia.
The consequences, however, are
significant, even 224 years after the revolution started. Here are some of them, gleaned, collated, and
integrated from a variety of sources. A
few of these effects are trivial, most are not.
-- The
modern terms "left" and "right" to describe places on the
political spectrum come from the seating locations of groups in the National
Assembly that arose during the revolution.
The practice remains in place in the National Assembly today, the lower house of France's bicameral parliament.
-- Because
he needed money more than he needed land in North America, Napoleon sold the
Louisiana territory to the United States for a total of about $15 million. Can you imagine how difficult it would have
been for the young United States to have obtained all that land had Napoleon
not sold it to us in one big chunk? The
U.S. might never have obtained all of it, and there could have been long and
bitter piecemeal struggles over it.
-- Two new nouns entered the language in
1794: terrorisme (the policy of the
Terror) and terroriste (those who implemented it).
-- The Napoleonic wars initiated the era
of total and mass warfare. Up until
then, wars were fought by small groups of professional or mercenary troops and did not involve massive numbers of either troops or civilians. In the course of the revolution, the ground
was laid for state control of industries and mobilization of experts for
weapons production as well as conscription and large armies. These developments reached full flower in
World Wars I and II.
-- Baruch
Spinoza, in 1670 with Tractatus
Theologico-Politicus, developed the idea of a link between religion and
politics and argued that the Old Testament is a collection of stories to govern
a state. Spinoza "formulates the
basic concept of nationalism in general
terms: 'There is no doubt that devotion
to country is the highest form of piety a man can show, for once the State is
destroyed nothing good can survive.'"
Spinoza's analysis undermined the structures of Christianity and
replaced them with nationalism. (O'Brien,
in The Permanent Revolution: The French Revolution and its Legacy
1789-1989.) While this was 119 years
before the French revolution, Spinoza's idea of nationalism was given life by
the revolution. The chief publicist for
the Third Estate (the commoners), Abbe Sieyes, wrote that "the nation
exists before all, it is the origin of everything. Its will is always legal, it is the law
itself."
O'Brien maintains that while the
revolution did not create nationalism, it "accelerated the growth of
nationalism, secularized it and thereby helped to set it above all other
values. That tendency seems to have been
at work . . . from [Napoleon's 1815 defeat at] Waterloo on, throughout the
nineteenth and into the twentieth century." Withal, it is a legitimate contention that "the
Revolution's most permanent big legacy—to the French themselves no less than to
the other peoples of the planet—has been the apotheosis of the nation-state
within the modern theory and practice of nationalism." That nationalism "remained for all takers
precisely where the French Revolution put it:
in the divinized nation-state and the compulsions to national uniformity
of thought and feeling." It showed
up with a vengeance in the 20th Century in Nazi Germany.
One place it showed up, with a quite
different tone, was in post-Napoleonic Germany.
Before Napoleon, Germany was a bunch of little states and
principalities, the largest of which was Prussia. Napoleon, when he conquered Germany,
consolidated the states into a much smaller number. In combination with the idea of nationalism
that emanated from the revolution—and was taken up by the people in countries
that Napoleon conquered—the wars were a major impetus to the unification of
Germany in 1871 under Bismarck. Bismarck
might have had a much greater challenge if there were still 350 independent
German-speaking states in 1870.
-- "The
French Revolution . . . thus became a major cause for the creation of America's
early two-party political system and the most important issue in the emergence
of an active public culture. . . . The
clearest ideological differences between Federalists and Republicans emerged in
their interpretations of the French Revolution." One study of American newspapers in the 1790s
found that they gave more coverage to the French Revolution than to U.S.
domestic politics, and the dispute forced the two parties to define their views
of a republican society. One historian
concluded that the "French Revolution was thus the inescapable event that
forced the two emerging parties to define their explicit political differences
and to enter a heated battle for American public opinion." (Kramer, The
Global Ramifications of the French Revolution)
-- The events of 1789 and thereafter were,
in the view of many, the first actual "revolution" in a state. The word is overused now (the industrial
revolution, the scientific revolution, the computer revolution, etc.) but "the
heart of its meaning remains where it was put between 1789 and 1795: violent transformation of the political and
the supporting social arrangements of a country, replacing one species of ruler by another and giving itself and its people something of a new look.
The French Revolution did much that went beyond politics . . . but it
was above all as a political event that it exploded upon the Europe of its day." There had not been such a tumultuous
overturning of a society in modern western political history ever before.
-- America and France were drifting toward
conflict in the mid-1790s; President Adams sent envoys to France to try to
resolve the problems; the envoys were told there would be no negotiations unless they paid bribes, and they were essentially sent packing (the "XYZ Affair").
Americans took umbrage, supported an increased military budget, and
because of the patriotic fervor, Congress adopted the Alien and Sedition Acts
to "define what it meant to be an American, and the definition tended to
exclude precisely those people who showed the greatest inclination to support 'French'
ideas or Jeffersonian republicanism."
Thus began the first of the xenophobic/scare-tactics episodes in
American history, repeated during the 19th Century and in the Red scares of the
20th Century under Presidents Wilson and Eisenhower.
-- Finally,
for the U.S., and somewhat more difficult to summarize, is the reaction to the
anti-religious, deistic views of the French revolutionary leaders as events
unfolded and the reaction to Thomas Paine's Age
of Reason (which opposed organized religion, attacked the Bible and
promoted deism following the events of the revolution, and sought to advance
the ideas of the Enlightenment). The
reaction was the Second Great Awakening, a backlash against skepticism and deism and a rejection of Enlightenment ideas, which led to widespread evangelical religion. The U.S. has suffered from the effects of the
Second Great Awakening ever since. ("The
Enlightenment," in the broadest terms, covered roughly the late 17th and
much of the 18th Century, was led by or engendered by intellectuals such as
Hume, Locke, Isaac Newton, Adam Smith, Spinoza, and Voltaire, and was characterized
by the use of reason and reliance on the scientific method, challenges to
beliefs founded on tradition and faith, skepticism, and in general a "desire
for human affairs to be guided by rationality rather than by faith,
superstition, or revelation." It
was to these ideas, many advanced vigorously during phases of the French Revolution, that the American public, and American clergy in particular, eventually reacted with revulsion.)
-- There
is a congeries of issues, mixed together, that emerge from a reading of the
literature. One is the power of the
ideas encapsulated in the rallying cry of the revolution: "Liberty, Equality, and Fraternity
(Brotherhood)." "The power of these three ideas would quickly
spread to the rest of Europe, especially after Napoleon's final defeat in 1815,
and eventually across the globe."
They showed up in new precedents for democratic institutions such as elections, representative government, constitutions, and civil law codes. While many of those institutions did not survive Napoleon, the ideals did, and they affected events throughout Europe for the following century and beyond.
There are other ramifications one could
add to the list. I looked primarily at
Europe and the U.S., but there were effects elsewhere in the world. There were also "soft" effects that
are difficult for even the best historians to track (see the preceding
paragraph). But there is little doubt
among historians, as far as I can tell, that there were impacts, often
indirect, that showed up in revolts and rebellions and revolutions during the
19th and into the 20th Century (e.g., the Russian Revolution in 1917).
If one accepts the claim that the French
Revolution and Napoleonic wars were in significant part responsible for setting
the stage for the unification of Germany in 1871 under Bismarck's hand, then I
can see a logical connection of events between the French Revolution and my
current job. Without a unified Germany, or
a Germany unified later or differently, both WWI and WWII are less probable,
and without WWII there would probably not have been a G. I. Bill, and then my
father would not have gone to college.
He once told me that without the G. I. Bill he would never have even
thought about college—they had no money and there was no history of higher
education in his family. If my father
hadn't gone to college, it's possible that I would not have, and likely wouldn't
have come from a parental background that insisted on college, and thus wouldn't
have had the job I've had for the last 25 years. So thank you Napoleon Bonaparte! (There are a lot of steps in that sequence of
events that I've omitted. . . .)
One
could repeat this exercise for the fall of the Roman Empire, the conquest of
England by William the Conqueror, the effects of Confucius on Chinese cultural
development, and so on ad infinitum, and in doing so come to realize ever more vividly the extent to which events, even those centuries ago, affect the shape of the world today. At least the exercise did that for me, with respect to the French Revolution and Napoleonic wars; I
learned a considerable amount. That, of
course, is justification in itself for the exercise.
Speaking of the effects of past events on
the present: "Khwarezm and Persia were crisscrossed with an
elaborate underground irrigation system that since antiquity had sustained a
thriving culture; the Mongols destroyed them. Arabic scholars contend
that the region's economy has yet to recover fully from this devastation."
Which destruction took place in 1221 or 1222.
(I think I read this in an airline magazine.)
HISTORY, n. An account mostly
false, of events mostly unimportant, which are brought about by rulers mostly
knaves, and soldiers mostly fools.
CYNIC, n. A blackguard whose
faulty vision sees things as they are, not as they ought to be.
Trying
to imagine the effect of thousands of combat deaths on families and survivors
brought to mind an episode I hadn't thought about for decades. It was the fall of 1972 or the early winter
of 1973 that I got into the only really vehement argument I ever had with my
mother. This was the time when the
Vietnam War was slowly winding down and the vast majority of the American
public had come to realize the war was a mistake. There was a military draft in place and males
graduating from high school each year were assigned a number from 1 to 365, based on the order in which birthdates were drawn randomly. If you were
born June 13, and June 13 was the first number drawn in the draft lottery, you
were #1. If June 13 was the last number
drawn, you were #365. As I recall, in
the late 1960s and early 1970s the military each year was going as high as
#200-250, so if you had a number in that range, you were likely to be
drafted. If your number was in the 300s,
you were almost certainly safe from being drafted.
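(For the technically inclined, here is a tiny sketch of how that birthday lottery worked, written as a little Python program. It is my own illustration, nothing official: the 250 call-up cutoff and the June 13 birthday are simply the examples from the paragraph above.)

import random

# Illustrative sketch of the draft-lottery mechanics described above:
# all 365 birthdates are shuffled into a random order, and the position
# of your birthdate in that order is your draft number.
birthdates = [(month, day)
              for month, days in enumerate(
                  [31, 28, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31], start=1)
              for day in range(1, days + 1)]      # 365 dates (no Feb 29 here)

random.shuffle(birthdates)                        # the "drawing"
draft_number = {date: i for i, date in enumerate(birthdates, start=1)}

CUTOFF = 250                 # hypothetical call-up ceiling, the upper end of the range cited above
my_birthday = (6, 13)        # June 13, as in the example
print(draft_number[my_birthday],
      "-> likely drafted" if draft_number[my_birthday] <= CUTOFF else "-> likely safe")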
My number, when we graduated from
high school in 1969, was 69, so I was pretty sure I would be drafted. Students were being given draft deferments
for up to four years if they were in college.
I would have gone to college in any event, but the draft deferment
provided an additional impetus.
Coming on to college graduation in
the spring of 1973, I was alarmed that I was going to be drafted and sent to
Vietnam. I had already determined that
was not going to happen—I was not going to get killed by some Vietcong booby
trap or sniper in the cause of a senseless war.
I couldn't claim conscientious objector—I wasn't then and am not now, 40
years later—so either I was going to jail as a protestor or I was going to
Canada or somewhere else. That's not
quite as brave as it sounds; at that point there were a lot of young males my
age who were protesting the war and the draft.
I would have gone to jail; I wouldn't have fled.
This topic came up with my parents
at their home on Dupont one late evening.
I said I would not under any circumstances be drafted and sent to
Vietnam. My mother got extremely angry
(Americans fought on behalf of their country, and so on). We argued back and forth; my father said
nothing. Finally I asked my dad—someone
who came out of WWII a wounded veteran who'd been awarded the Purple Heart—what he
thought. He didn't really side with my
mother; his rather laconic comment was "if you don't go in the military,
you won't miss anything." That was
the end of the "discussion."
We all went to bed, and I don't recall that the subject was ever
mentioned again. With the exception of
this one eruption, my parents remained good friends of mine to the end of their
lives.
The military draft ended in June,
1973, about the same time that my college deferment ended. So I never faced being drafted. And all through college I never really
contemplated the possibility that I would end up in the military. Fortunately, the turn of opinion in the country
against continuation of the war made it entirely hypothetical for me. But I had determined that I, for that
misbegotten war, would not be a combat fatality, one of those numbers that
appears in Churchill's biography and the thousands of other histories about war
that have been written.
WAR, n. A by-product of the
arts of peace. The most menacing political condition is a period of
international amity. The student of history who has not been taught to expect
the unexpected may justly boast himself inaccessible to the light. "In
time of peace prepare for war" has a deeper meaning than is commonly
discerned; it means, not merely that all things earthly have an end—that change
is the one immutable and eternal law—but that the soil of peace is thickly sown
with the seeds of war and singularly suited to their germination and growth.
(If you don't give a darn about the
start of World War I, you can skip this section.)
A few weeks after I had written musings in this letter about the value of learning and reading history, I happened to discover that three new books about the events leading up to World War I had recently been published. They struck my fancy, and the writing appeared to be both accessible and lively, so of course I bought them and began reading them. One was July, 1914: Countdown to War by Sean McMeekin, a professor of history at Koc University in Istanbul; one was The Sleepwalkers: How Europe Went to War in 1914, by Cambridge professor of history Christopher Clark; and the third was The War That Ended Peace: The Road to 1914 by Margaret MacMillan, Oxford professor of international history. (At first I thought it interesting that three books dealing with the same events and time period happened to be published nearly simultaneously, then realized we are approaching the centennial of the start of the war, and then concluded that it was unlikely the three authors all started to work on their books just because of the upcoming centennial. In any case, all three authors have also written previously on WWI.) For me this reading is one of life's delights; I don't claim it will help me in dealing with daily life or make me a better person. Some people like to go bowling, or paint, or go to movies. I like to read history.
As you might imagine, I was astonished when I started to read McMeekin's book on June 28, 2013 and realized that it was exactly 99 years to the day, a very long century ago, that the heir to the throne of Austria-Hungary, Archduke Franz Ferdinand, and his wife Sophie were assassinated in Sarajevo, Bosnia (then part of Austria-Hungary), which set off a series of events that culminated a few weeks later in the start of World War I. McMeekin's new volume is a very neat bookend to Paris 1919: Six Months That Changed the World, also by Margaret MacMillan, in which she described the events leading up to the Treaty of Versailles, ending the war, signed under coercion by Germany—on June 28, 1919, exactly five years after Franz Ferdinand and Sophie were assassinated.
So 2014 will be the centenary of the start of WWI and there will doubtless be a significant number of articles in both the scholarly and popular press about how it began and its effect on the world of today. My only counsel, to those who may read those articles, is that it was extremely complicated, much more so than one might have been led to believe from a high-school or college survey history course.
Bismarck is famously quoted as saying, in 1878, that "Europe today is a powder keg and the leaders are like men smoking in an arsenal. A single spark will set off an explosion that will consume us all. I cannot tell you when that explosion will occur, but I can tell you where. Some damned foolish thing in the Balkans will set it off." (There is some dispute about whether he actually said it.) What I learned also from these books, although I vaguely knew it before, was that the Balkans were a hotbed of nationalist, religious, and ethnic fervors as well as secret societies and what today we would call guerilla groups. It was almost inevitable that something would come out of the area.
Anyway, McMeekin's book is both background and an hour-by-hour tracing of events, leading up to the assassination (what the assassins and supporters were doing) and after (the vast majority of the book). He literally notes the timing of telegraphed messages back and forth between Berlin, Vienna, St. Petersburg, Paris, and London, and tracks the discussions held by and with each of the leaders in the major powers. At the end of the book McMeekin seeks to apportion responsibility for the start of the war, and put the onus on Russia (with plenty of blame to spread around to Austria and Germany). It's Russia first because it began military mobilization before anyone else (but tried to hide it, not very successfully) and before the final Austrian response to Serbia was known—once that mobilization started, the other countries reacted in self-defense. Germany was the last to begin mobilization. But the Germans were fully blameworthy as well because they very early gave Austria a blank check in responding to the assassination in terms of its (Austria's) demands on Serbia—Austria would never have taken the aggressive posture it did toward the Serbian government had not Germany provided unqualified support. In doing so, McMeekin says, the Germans let the Austrians put a noose around their (the Germans') neck and then tighten it as events played out in the weeks following the assassination, events characterized in significant part by Austrian ineptitude. (But the Germans also brought grief on themselves by violating Belgian borders in attacking France, thus bringing Britain into the war.)
McMeekin also contradicts Barbara Tuchman, whose best-selling classic The Guns of August has pretty much been the standard interpretation of the events that led to WWI (in her analysis, it was largely Germany's fault). He points out that by mis-dating the Russian mobilization by two days, a critical mistake given how quickly things were happening, Tuchman misunderstood the sequence of events—and thus misunderstood the relative blameworthiness of Germany and Russia in setting the machinery of war in motion. He also highlights how much deception both the French and the Russians engaged in during the weeks after the assassination—something I hadn't known before: all during that July, Russia and France connived to hide what they were doing, because they were prepared to go to war with Germany but wanted Germany to appear to be the aggressor and thus blameworthy.
What was also striking, to me, was the exchange of "Dear Willy" and "Dear Nicky" messages between Kaiser Wilhelm II and Tsar Nicholas II (they were cousins). The two of them, however much power they had—and in neither case was it absolute—were led by their own government officers into a war that neither of them wanted and ultimately were unable to prevent. Both of them, of course, lost their thrones because of the war—and the tsar and his family also lost their lives after the Bolshevik revolution in 1917. (The tsar at one point, in response to a message from Willy, reversed the mobilization order, because "I will not become responsible for a monstrous slaughter." But shortly thereafter he changed his mind again, under pressure from his political and military advisers, and approved mobilization—and was responsible for "a monstrous slaughter," including, as it turned out, that of himself and his family.)
Clark covers the same
ground as McMeekin but goes into a much more extensive review of the years before the
assassination, to provide context; one date in Serbian history that stands out
in its lore is June 28, 1389, the Battle of Kosovo, which the Serbians
eventually lost to the Ottomans but which acquired mythic nationalist meaning
in the 19th Century in Serbia.
There seem to be a lot of June 28s in this tale.
I
learned more about Serbian politics and political intrigues 1900-1914 than I
really cared to know—but at the same time I can say it provided a new outlook
on the predicament that Austria-Hungary found itself in (even before the
assassination). The situation in Serbia,
and in the Balkans generally, was a threat to Austria-Hungary, and the
assassinations raised the situation to one of paramount concern. What was also interesting to me was that
Austria (but not Hungary, in the Dual Monarchy that made up the empire) had
been making demonstrable progress in accommodating the needs of its many political
minorities—while Hungary continued to impose Magyar domination on everyone and Serbia
was seeking "greater Serbdom" to the disadvantage of all the other
ethnic groups in the Balkans.
Clark,
by the way, maintains that the "Dear Willy" and "Dear Nicky"
messages were carefully crafted by their foreign ministers.
Clark doesn't attempt to finger the blameworthy; "the outbreak of war in 1914 is not an Agatha Christie drama at the end of which we will discover the culprit standing over a corpse in the conservatory with a smoking pistol. There is no smoking gun in this story; or, rather, there is one in the hands of every major character." It was a tragedy, he wrote. Nonetheless, I think his narrative leads one to the same conclusion as McMeekin, even though Clark doesn't say so: although perhaps not significantly more "responsible" for the war, it was the Russians who, at the least, kicked the tripwire by mobilizing. After that, given the exigencies of mobilization in that day, war was nearly inevitable as the other players responded—and the Russians knew it.
Clark also makes the point that one cannot remove "contingency, choice, and agency" from the events; even though in retrospect the war looks like an inevitable outcome, the participants at the time could not see the future and didn't know what apocalypse they were rushing toward. There were multiple causes for the outbreak of war, but if one adds cause on top of cause on top of cause, it makes it appear that the war was unavoidable—and that clearly was not true. Choices were made and chance played a role. In addition, Clark suggests, "a striking feature of the interactions between the European executives was the persistent uncertainty in all quarters about the intentions of friends and potential foes alike." As the days passed in July, 1914, and communications and meetings took place, the options for all of them narrowed, but even up until the onset of military action there were possibilities for avoiding the war.
Probably the most famous comment made on the eve of the war (likely August 3, 1914, at the British Foreign Office, the date that Germany declared war on France, the day before it invaded Belgium) predicting what the war would bring was by British Foreign Secretary Edward Grey: "The lamps are going out all over Europe, we shall not see them lit again in our life-time." Given life in upper-class Edwardian England, I think it is safe to say that those lamps have never again been lit. And never will be. But Grey, in the eyes of some, bears considerable responsibility for those lamps going out.
Baron Andrew Adonis, a scholar, historian, journalist, and British cabinet member, argues, in a review of a biography of Grey, that
he was such a disastrous
minister: arguably the most incompetent foreign secretary of all time for his
responsibility in taking Britain into the First World War, having failed in
July 1914 to do all within his power to stop the conflagration. . . . We cannot
know what would have happened had British policy been more effective. Probably
it was within the power of Asquith and Grey to have kept Britain out of the
war. Possibly they could have prevented it entirely, dissuading Germany from
supporting Austria in the chain reaction that led from Archduke Franz Ferdinand's
assassination in Sarajevo on 28 June to the German invasion of Belgium on 4
August.
Adonis's description of Grey's stance accords fully
with the two books.
In the July crisis, he may have desired peace, yet his policy produced the opposite result. So how far was he to blame? . . . [The author of the biography] does not address this question, beyond noting . . . Grey's stark irresolution throughout July 1914 on the basic issue of whether or not Britain would support France in resisting a German invasion—which had the fatal effect of encouraging both German and Austrian militarism and French and Russian resistance.
Adonis is equally critical of Grey's Prime Minister,
Herbert Asquith.
On 29 July, Asquith
concluded a meeting of the cabinet with the decision that, on the critical
issue of any German violation of Belgian neutrality, "Sir E Grey should be
authorised to inform the German and French ambassadors that at this stage we
were unable to pledge ourselves in advance, either under all conditions to
stand aside or on any conditions to join in." This one sentence contains the most damning
indictment of Asquith's and Grey's leadership and policy. It is evident that Asquith did not appreciate
the magnitude of the European crisis until 1 August, three days before the
German invasion of Belgium. . . . A
miscalculation of British intentions on the part of the other European powers
was critical to the outbreak of war. This happened for a simple reason: Britain's
intentions were unclear. The
responsibility for this lay above all with Grey.
After I had finished reading
McMeekin's and Clark's books, in the middle of July, I had to wait for
MacMillan's book until the end of October.
MacMillan provides a more
magisterial view of the world in the late 1890s through June, 1914; she flew at
the 20,000-foot level more than the other two.
But she also delved into the central factors that led to war.
She noted, for example, apropos of military planning,
that "what is striking about the decision-making in 1914 in how was
accepted that even the briefest of delays meant danger." So once one country started mobilizing (e.g.,
Russia), the others felt they could not hold off. The militaries also had such rigidity in
their plans that most felt they had no choice about what to do once
mobilization started; the Russian plan, for example, allowed only simultaneous
mobilization against both Austria and Germany—never allowing only a defensive
posture with Germany and a struggle with Austria in the Balkans. All of the parties also developed a deep
faith that only a military offensive would protect them, rather than staking
out a defensive position. This faith was
encouraged and reinforced by various events from 1905 to 1914, and led to an
arms race among the powers that significantly increased the probability of war,
no matter the intentions of the parties.
MacMillan reiterated the point that what was undertaken as defensive
reaction on the part of one country (England kept building ships for its navy
because Germany kept building new ships) was not perceived that way by the
others.
MacMillan (who cites Clark in her book) agrees with
him: while in retrospect war seems inevitable, there were choices that could have been made.
She highlights, as does Clark, the flaws in the Kaiser and the Tsar,
both weak characters who occupied their thrones at an unfortunate time for
human history and who exercised their vast power in erratic and ill-informed
ways that exacerbated the uncertainties facing all of Europe's leaders at the
time (including their own in Germany and Russia).
The upshot of making 2013 the year of reading about 1914 was that for me, anyway, the received wisdom about World War I has been somewhat upended. France and Russia were not blameless by any means and England (Grey) contributed significantly to the way events ultimately unfolded. What is also clearer to me, however, is that trying to identify ultimate causes is a fool's errand—because there were too many of them coalescing all at once. Bad leaders who in some cases abdicated responsibility to the military, bad governments, unstable nationalisms, military planning, a lack of imagination on the part of both civilian and military leaders, a decade of events that made all the powers jittery, and a belief that somehow war would be avoided (because it had been up to that point, despite repeated alarms in the preceding decade). Sometimes a nation, or humanity, simply has bad luck because bad things all come together at once, and we had it on June 28, 1914.
June 28 has suddenly become a date I will recall every year. Think how the world has changed as a result of events on June 28, 1914 and June 28, 1919: two world wars, the rise and fall of bolshevism/communism in Europe, and all of their consequences. (I am surmising that Lenin and the Bolsheviks would not have succeeded had not Russia fared very poorly in the war—by 1917 facing food shortages and enormous casualty numbers—which led ultimately to Lenin's triumph.) A number of historians view the period 1914-1945 as a single war interrupted by a lengthy truce, and it is difficult to avoid the conclusion that the results of WWI set the stage for WWII.
I mentioned earlier the view that we live in a world bequeathed to us by Hitler. It is even more true that we live in a world bequeathed to us by the events of WWI because it spawned WWII as surely as the sun rises in the east. WWI essentially brought down the curtain on three hereditary monarchies that had ruled much of central and eastern Europe for centuries; it created an embittered Germany and a situation in central Europe (as well as the Middle East) that was clearly unstable and that became especially so with the onset of economic depression; it also brought to an end a life, at least among the reasonably affluent, that now seems dreamily idyllic when looking back 100 years. The duration and deaths of WWI led to a much different view about life and society for those who experienced it and, one can argue, generated the modern age, a more cynical take on life.
As a footnote to this digression on WWI, I've noted in the news in the past that Nicholas II was elevated to sainthood by the Russian Orthodox Church (this from Wikipedia):
In the last Orthodox Russian monarch and members of his family we see people who sincerely strove to incarnate in their lives the commands of the Gospel. In the suffering borne by the Royal Family in prison with humility, patience, and meekness, and in their martyrs' deaths in Yekaterinburg in the night of 17 July 1918 was revealed the light of the faith of Christ that conquers evil. . . . The Russian Orthodox Church inside Russia rejected the family's classification as martyrs because they were not killed on account of their religious faith. Religious leaders in both churches also had objections to canonizing the Tsar's family because they perceived him as a weak emperor whose incompetence led to the revolution and the suffering of his people and made him partially responsible for his own murder and those of his wife, children and servants.
The great historian Simon Schama has observed that "history is the enemy of reverence." I would think that anyone who reviews many of the actions that took place during his reign as Tsar—pogroms against the Jews, bloody suppression of the 1905 uprising, murder of political opponents, and a generally repressive autocracy—which he presumably was at least informed about even if he didn't direct them, would choke on the assertion that he "strove to incarnate . . . the commands of the gospels." It is probably true that he was the wrong man at the wrong place in history—perhaps not the brightest bulb in the chandelier—but to elevate him to sainthood seems to me to devalue sainthood, if one wants to believe in that kind of thing. History, in this case, seems not to have overcome reverence.
All three of these books were well worth the effort to read. I don't know that in the time left to me on the planet, however, I'm going to read in any further depth about the origins of WWI. The causes are multiple and one can construe them in a number of ways. There will never be a definitive answer to the question "why?"
SAINT,
n. A dead sinner revised and edited.
BATTLE, n. A method of untying
with the teeth of a political knot that would not yield to the tongue.
Kathy, Krystin, and I were chatting
one night, and when the conversation lagged a bit, Kathy pulled out a small
book by the publishers of the American Heritage Dictionary, 100 Words Almost Everyone Confuses & Misuses. All three of us know the correct usage for
most of the terms that are troublesome for many (complimentary, complementary; uninterested,
disinterested; imply, infer; and so on).
All of us, however, were surprised to learn that "peruse"
means to read carefully and thoroughly.
We all thought it meant to scan quickly; I have been using that word
incorrectly my entire life. How
disappointing.
Within two weeks after we had that
conversation, in A.Word.A.Day, a daily email I receive, the word "peruse"
showed up:
MEANING:
verb tr.:
1. To read or examine with great care.
2. To read or examine in a casual manner.
Give me a break. Even with the vagaries of English, how can the word mean both of those things? The OED, one of the final authorities, is unhelpful: "Reading; examination, study; an instance of this." So I turned to the usage panel of the American Heritage Dictionary (which didn't have this explanation in their little book that started this inquiry). They write that peruse "has long meant 'to read thoroughly,' . . . which was acceptable to 75 percent of the Usage Panel in our 2011 survey. But the word is often used more loosely, to mean simply 'to read,' . . . Seventy percent of the Panel rejected this example in 1999, but only 39 percent rejected it in 2011. Further extension of the word to mean 'to glance over, skim' has traditionally been considered an error, but our ballot results suggest that it is becoming somewhat more acceptable. When asked about the sentence "I only had a moment to peruse the manual quickly," 66 percent of the Panel found it unacceptable in 1988, 58 percent in 1999, and 48 percent in 2011." I still think it's odd to have a word mean one thing and also its opposite, although I imagine there are other such examples in English.
Brad Leithauser in the New Yorker
wrote that he "was seeking a
replacement for "unfathomable" but, upon looking up "depthless,"
found that it was indeed a synonym for "unfathomable"—but that it also means the opposite: "Having no depth; shallow." He decided the "word
was what I think of as an auto-antonym (a term that doesn't appear in Webster's
Second): it's its own opposite. Which is to say, it's a mostly unusable word." Another one I use fairly often, and have to
be careful about context, is "sanction."
Kathy and Krystin and I also agreed
that the use of some confusing words will always be beyond our reach (and each
of us has different ones that we cannot master).
I, for example, will never be able to distinguish between the times when
it is appropriate to use "that" and when it is appropriate to use "which." I've never been able to understand the
difference between restrictive and non-restrictive clauses, so I fall back on a
simple-minded rule that gets me through most difficulties: If the clause needs commas around it, I use "which";
if it doesn't, I use "that."
My long-time informal personal writing mentor and friend, Regents Professor
Tom Clayton, will probably roll his eyes when he reads this, and justifiably
so.
And there was that poor sucker
Flaubert rolling around on his floor for three days looking for the right word.
Writing is the art of applying
the ass to the seat. (Both Parker)
I was glad to see that
astrobiologists believe the earth will be inhabitable for at least another 1.75
billion years, and maybe up to 3.25 billion years. I guess I should make sure my will is up to
date. (At some point the sun will begin
to expand before it collapses. The
astrobiologists point out, however, that we may make the planet inhospitable to
human life long before that because of anthropogenic climate change, for
example.)
The same day I saw the article about
the astrobiologists' publication there was also a news piece in the Chronicle of Higher Education about a
conference at the Library of Congress to discuss how long human civilization
might last. It's made it 5000 years—out
of the 4.5 billion years the earth has existed.
The question is whether we'll survive our own technologies; we've put a
strain on the planet since the invention of the steam engine. If there were a cataclysmic event, human
civilization could come to an end. One
way to try to ensure survival is to get some of us off the earth and to another
planet, Mars being the most likely candidate in the "near"
future. (Meantime, the astrobiologists
comb the galaxy looking for planets that can also support life as we know it,
but at present we have no feasible way to get some small chunk of humanity to
Mars, much less planets that are light-years away from us.)
One worrisome fact is "the
great silence," the fact that there's no evidence of any
technologically-advanced civilization elsewhere in the universe. It may be, a former NASA historian observed,
that civilizations just don't last much longer than that, so never get to the
point of settling off planet. (This is
the "Fermi Paradox," formulated by physicist Enrico Fermi in
1950: "the apparent contradiction
between high estimates of the probability of the existence of extraterrestrial
civilization and humanity's lack of contact with, or evidence for, such
civilizations.") Elliott
speculates, quasi-facetiously (and perhaps quasi-hopefully), that perhaps there
is a benign intergalactic civilization out there that has conquered the problem
of traveling faster than the speed of light, with ships that have "warp
drives," and that it does not interfere with planetary civilizations, or
invite them to join intergalactic society, until they are sufficiently advanced
and have demonstrated that they have gotten beyond warlike and other primitive
belief systems.
Although our technology is stone-age
compared to the technology that an intergalactic civilization would possess, I
wondered if ours is sufficiently advanced that it could at least detect large
ships flitting about the universe and perhaps other evidence of social
organization. So I asked a friend in
astronomy, Professor Roberta Humphreys.
She tells me, among other things,
that there's at least one guy looking for alien ships! He's a University of California astronomer,
Geoff Marcy, who was quoted in the Huffington Post as saying (reflecting the
common wisdom among any who think about it) that "the universe is simply
too large for there not to be another intelligent civilization out there. . .
. Really, the proper question is: 'How far away is our nearest intelligent
neighbor?' They could be 10 light-years,
100 light-years, a million light-years or more. We have no idea." He said he'll look for ships the same way
astronomers look for planets orbiting stars:
by dimming of the starlight as the object (planet, large ship) passes
between the star and earth. So who
knows, maybe he'll find some ships. Then
we can rest easy, knowing that it won't be the scene depicted in the movie "Independence
Day" but instead that someday we'll be invited to join the universe. Right.
Roberta also sent me the précis of
an article (not available yet when I composed this) about some recent
astrobiologists' work. In looking for
the origins of life when planets are formed, they traced the elements most used
by life on earth (carbon, hydrogen, oxygen, and nitrogen). The article authors conclude that "it is
now clear that the early stages of star formation fosters the creation of water
and simple organic molecules with enrichments of heavy isotopes. . . . Although the exact details are uncertain, it
is likely that the birth process of star and planets likely leads to
terrestrial worlds being born with abundant water and organics on the surface."
Anyway,
Roberta tells me that she goes through this line of thinking with her
classes. "The nearest habitable
planets should be only about 20 light years away, but that is habitable, a
terrestrial type planet in the life zone of its star. It doesn't mean
that there is life on it, although we would give life a high probability. The biggest uncertainty is intelligent life
(what is intelligence?) and the probability that it develops an advanced
technological civilization that wants to explore the Universe. These are big ifs. And even so, any communication means that our
civilizations have to overlap in the same time. Interstellar travel past the nearest stars is
impractical and probably impossible."
As a
result, she says, "we must use interstellar communication which travels at
the speed of light. Let's say an
advanced civilization exists in the Milky Way in our time frame, that is
now! But their planet is 1000 light
years away." And that means, she
explained, that "it takes 1000 years for the message to get here and 1000
years for them to get our answer."
So, "perhaps we just haven't been listening long enough."
As
for Elliott's intergalactic civilization, Roberta wrote me that the assumption
is that "this advanced civilization has as its primary goal to spread its
civilization, its seed, across the Galaxy.
That is, create a 'galactic empire.'
In this exercise we are limited by the laws of physics, so let's assume
the interstellar ships can travel at 1/10000 the speed of light. You can show that it will take only 100,000
years to cross the Galaxy [that is, the Milky Way]. So where are they? There are lots of answers." She concluded that in her view, "the
Universe is full of life, probably intelligent life as well, and asking the
same questions." (She explained
that a civilization would "cross the galaxy in steps—hops, colonizing the
appropriate planets along the way. If
the nearest habitable planet is 20 light years away, each step takes 20,000
years, add a little time to develop the resources on the colony planet
sufficiently to take the next step, and so on across the Galaxy, about 100,000
light years in diameter.")
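(For anyone who wants to check the arithmetic, here is a small back-of-the-envelope sketch, written in Python purely as my own illustration. The distances are the ones Roberta uses; the assumed ship speed of 1/1000 the speed of light is simply the speed implied by her 20,000-year, 20-light-year hop, not a figure she gave me.)

    # A back-of-the-envelope sketch of the timescales described above.
    # The 1,000-light-year message distance and the 20-light-year hop come
    # from Roberta's email; the ship speed of 1/1000 the speed of light is
    # my assumption, inferred from her 20,000-year hop time.
    SIGNAL_DISTANCE_LY = 1_000        # distance to a hypothetical civilization
    HOP_DISTANCE_LY = 20              # nearest habitable planet
    SHIP_SPEED_FRACTION_C = 1 / 1000  # assumed ship speed, as a fraction of light speed

    # Light (and radio) covers one light-year per year, each way.
    round_trip_years = 2 * SIGNAL_DISTANCE_LY
    print(f"Round-trip message time: {round_trip_years:,} years")   # 2,000 years

    # A single colonization hop at the assumed ship speed.
    hop_years = HOP_DISTANCE_LY / SHIP_SPEED_FRACTION_C
    print(f"One 20-light-year hop: {hop_years:,.0f} years")         # 20,000 years

Run it and you get the figures in her email: a 2,000-year round trip for a single exchange of messages, and 20,000 years per colonization hop.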
I
commented to Roberta that perhaps any civilization would be daunted by the
prospect of sending off a ship that has to travel for thousands of years before
getting anywhere, unless their lifespans are also thousands of years. And
sending off the ships PLUS advancing technology to spread further would add to
the challenge, I would think--each little outpost would have to advance technology
OR the home civilization would--but then it would take centuries more to spread
around the advanced technology.
Communication, even at the speed of light, across solar systems and
galaxies would be similar to the speed of communication between our capital and
the diplomats in Europe when the U.S. was founded: it took months for a
letter to go and be answered.
It's
interesting to contemplate whether any civilization would actually have the
long-range vision, ambition, and resources to start an effort that only pays
off a little bit every 20,000 years--and the planet that starts the effort
would never know (except in 40,000-year bits, as information trickled back,
assuming it did) what happened. To which
Roberta responded "that's the point.
Would ours? We can't even commit to going back to the Moon, let
alone Mars."
The
upshot of the astrobiologists' work is that, given ~100 billion galaxies with ~100 billion stars each, there is other life out there, and perhaps the galaxies are teeming with it. Given the limits of physics, we each live
in our own little area, unable to reach any other. So it seems to me that the only effective way
to do the galactic empire would be to find warp drives or worm holes. I
know the limit of the speed of light, and I remind people of that every time
they want to wax poetic about interstellar travel and civilization. But I just
hate to believe we're limited that way. I don't want to believe it.
Of
course, if that limit can be exceeded exponentially, permitting easy
interstellar travel, I want it to be humans that do it first (I think), rather
than another species that might not take kindly to us. A la Stephen Hawking's comment about aliens vis-a-vis what happened to the Native Americans when Columbus arrived in the Americas: the result was not pleasant. I
would like to believe that if humans start zipping about the universe, our
first inclination upon discovering another habitable planet that is in fact
inhabited by intelligent species would not be to destroy them and take over the
planet—and indeed that we would adopt a peaceful approach until learning
otherwise—and that if the reaction to our presence were hostile, but the beings
on the planet were unable to attack or give chase, we would simply back away
and leave them alone. Part of the
problem with Columbus and Native Americans, of course, was biological as well
as political, and one can imagine that issues of infection and disease could be
substantial were humans to encounter other species—be they intelligent or
simply tiny bugs akin to what we see every day.
I had been fed, in my youth, a
lot of old wives' tales about the way men would instantly forsake a beautiful
woman to flock around a brilliant one. It is but fair to say that, after
getting out in the world, I had never seen this happen.
If you wear a short enough
skirt, the party will come to you. (Both
Parker)
April 23 was a sad day for me. I received an assessment from a local tree
service company (made at my request) about the two very large maples in our
back yard. Holes had developed in the trunks of both trees, where limbs had been removed and the cuts never healed over, and the wood at the center had rotted out. The report from the first company said the trees were a disaster waiting to happen.
Kathy suggested I get more than one
evaluation; after all, these are companies in the business of removing
trees. So I did; the second company sent
an arborist out and then he called me.
He said that the trees were in terrible shape and that he was "surprised
that they were still standing."
The trees were planted as saplings
by my great aunt and uncle in 1940 and 1941, and when I visited the house as a
child, I used to climb them. They have
almost been part of the family—and over the years we've planted dozens of hosta
in the back yard, plants that like shade.
For me, it was a great loss to have them come down.
But we did receive unexpected
third-party confirmation of our decision.
There's an outfit in Minneapolis, Wood from the [neighbor]Hood that
takes wood from trees that have been taken down and recycles it into cutting
boards, bread trays, etc. So I called
them and told them about these two large maples that were coming down, because
Kathy and I thought maybe we could have a couple of slices of the trunks as
mementos—table tops or trays, perhaps.
The guy from Wood from the Hood said he couldn't use the wood from those
trees because there was too much that was rotten. Besides, he said, the wood from those kinds
of maples doesn't hold up well to drying and will crack apart when they try to
make anything from it. So he didn't want
it and refused to make any pieces for us because he said we'd be unhappy with
the result.
After the tree folks removed the
trees and ground out the stumps, the back yard looked like a bomb had gone
off. There was no shade and there was
mud and river rock everywhere. We had an
arborist in to advise us on a replacement tree; he told us we should get rid of
the river rock ground cover because it heats up the ground in the summer and it
cools it in the winter—and is not good for a new tree. Well, that was easier said than done. We had a heck of a time finding any way to get
rid of the several tons of river rock—until finally we started to get bids from
various outfits to haul it away.
We also had no idea how vexing the
question of the replacement tree would be.
We learned a lot about trees by searching the web and getting advice
from horticulture colleagues and the Minnesota Extension Service. We changed our minds three times and finally
ended up with, of all things, an elm.
The horticulture people have bred elm cultivars that are resistant to
Dutch elm disease, so that's what we decided on, after choosing and then
rejecting a catalpa, a linden, and a honey locust. Fortunately, the guy at Bachman's was quite
understanding. (He's the one who talked
us out of the linden, which we really liked and which a couple of colleagues
recommended on the basis of personal experience. He said it's quite susceptible to being
defoliated by Japanese beetles and that we would be unhappy with him if he sold
us a linden. The web sources on the
linden backed him up.)
Bachman's came, the crew installed
the elm, leveled the ground and added a layer of soil, and laid flagstone for a
walk. We then, within a week, expanded
one hosta garden, added two more gardens on the site where the maples once
stood, planted ground cover plants beside and among the flagstone, laid more
flagstone and planted more ground cover, and planted new plants. We are both hurry-up-and-get-the-project-done
types, so we didn't waste time. We now
have the busiest back yard in south Minneapolis, I think, with pots of flowers
everywhere and paths and some river rock and trees and gardens.
WHEAT, n. A cereal from which a
tolerably good whisky can with some difficulty be made, and which is used also
for bread. The French are said to eat more bread per capita of
population than any other people, which is natural, for only they know how to
make the stuff palatable.
Los
Angeles: Seventy-two suburbs in search of a city. (Parker)
Kathy and I had decided that 2013
would be the year of travel in the U.S. and that we would remodel the kitchen,
which badly needed it. But then suddenly
we needed a new furnace, and had to take down big trees, and the kitchen
project got pushed into the future and we stayed home. In addition, small things piled up, like
replacing the service door to the garage, repairing wood in a corner of the
family room floor that had rotted because of a crack in the stucco, buying a
generator to deal with power outages, and (voluntarily) re-furnishing the
living room. Between that list, the
furnace, and the tree removal and associated costs to reconstruct our yard,
Kathy commented ruefully that this is for us "the year of the house"—as
opposed to spending the money in a more interesting and enjoyable way. Given Krystin's surgery and with various
people coming and going to do work in the house and yard, we finally decided we
just couldn't travel much this summer.
Drat.
We
certainly have not put off indefinitely plans for travel and do have tentative
plans for Italy and India. I hope to have
a trip planned the year of my funeral, whether I die at 70 or 80 or 90 or
later.
I
is the first letter of the alphabet, the first word of the language, the first
thought of the mind, the first object of affection. In grammar it is a
pronoun of the first person and singular number. Its plural is said to be
_We_, but how there can be more than one myself is doubtless clearer to the
grammarians than it is to the author of this incomparable dictionary.
Conception of two myselfs is difficult, but fine.
At
a party, an arrogant young man told Parker, as he looked around the room at the
guests, "I'm afraid I simply cannot bear fools." "How odd,"
Parker replied. "Your mother could, apparently."
In the
"funny language stories" category:
The Associated Press reported in March that the Swedish Language
Council, a semi-official body that's the language watchdog, compiled its annual
(2012) "list of words that aren't in
the Swedish dictionary but have entered common parlance" and included "ogooglebar"
on the list. It was defined as "something
'that cannot be found on the Web with a search engine.'" Google objected and insisted that it be noted
that Google is a registered trademark.
The Council deleted the word from its list but scolded Google for trying
to control the Swedish language. My
friend Nils Hasselmo—a linguist—wrote to me that "the Swedes should obviously
have their mouths washed with soap for using patented language!" He went on to add that "language is a
wild beast that will not be subdued!"
On that same subject, more or less, I recalled last
year for Nils a lunch I had had with him many years ago. He and I were having lunch in the Campus Club
on the University campus when an academic visiting from Norway, whom Nils knew (though I didn't know at that moment where he was from), happened by our table, and the two of them had a few moments of conversation. When Nils sat back down at the table, I asked him if they had both been speaking
Swedish. He said no, the colleague had
been speaking Norwegian. So they
understood each other perfectly well, I asked?
They did, Nils said.
I wrote recently to ask Nils whether that is also true
for Danes speaking with either Swedes or Norwegians. Can they understand one
another perfectly well? He wrote to me
that yes, they do. "Speakers of
Swedish, Norwegian and Danish understand each other, but the degree may depend
on how close to the other country they live (or grew up). Western Swedes understand Norwegians very
well, and southern Swedes understand Danes very well. . . . Regarding Norwegian, it also depends on
whether the Norwegians are speaking Dano-Norwegian, the language spoken by the
upper classes during the 500-year rule by Denmark and continued by many
Norwegians, or New Norwegian, a language developed by Ivar Aasen in the 19th
century as part of efforts to stress Norwegian culture after the forced union
with Sweden [ended]. The former is
easier for Swedes than the latter, which was based on more western Norwegian
dialects."
There appears to be a reason that the language
speakers in the three countries can continue to understand each other. After the "ogooglebar" news, I learned
from Wikipedia that three of the Scandinavian countries (Denmark, Norway,
Sweden) have language councils that work together on a daily basis. The Danish council (Dansk Sprognævn) is
official, part of the Danish government, and among its tasks are updating the
official Danish dictionary, Retskrivningsordbogen (they tally new words and when they appear often enough, they add them
to the dictionary—"which all government institutions and schools
are obliged by law to follow").
Interestingly, the three councils apparently work together to make sure
that "the three Mainland Scandinavian languages, which are more or less
mutually intelligible, do not diverge more than necessary from one another." (Finland and Iceland both also have language
councils, but those languages are quite different from Danish, Norwegian, and
Swedish.)
Now, if we could just have such cooperation between
England, Scotland, Ireland, and the United States (that's a joke). Of course, the idea of a language council in
the United States isn't likely to get very far—and were one proposed, it would
no doubt be advanced by the wingnuts on the right who insist that America have
one official language that everyone must speak.
My response would be that the people who live in Mississippi and Alabama
should also learn to speak English.
Ducking
for apples - change one letter and it's the story of my life. (Parker)
ME,
pro. The objectionable case of I. The personal pronoun in English
has three cases, the dominative, the objectionable and the oppressive.
Each is all three.
Gary to Krystin and
Elliott:
It
has been suggested (not by Kathy) that I am too involved in you guys'
decision-making in your lives--that I should step back and just let you handle
things as the adults you are.
You
should tell me if you think that's true. It's of course hard for me to
know--parents don't have any good way to make comparisons. If there are times
when you want to say "I can handle this myself, dad," that's fine. I
start with the assumption that if I can help you, I should--but perhaps I
should start with the assumption that you'll ask for my help if you want it and
otherwise stay out of your decisions.
Krystin to Gary:
I
know with us still living under your roof, it's hard not to be involved in our
decision making, especially if it affects how much longer we stay in the house,
or if we're doing anything to move on and move up. To be honest, the only time
I felt pressure from you recently about decision making was about whether or
not to go back to Korea. The ONLY reason I didn't end up going back, the only
thing keeping me from going, was you. I don't mean that in an accusatory way or
to make you feel guilty. I'm just saying that is the only time (since I've
returned from Korea) I've felt you were maybe too involved in my decision
making. I know that you only had my best interest at heart, and I'm happy now
with my job and I know working here at the U has given me lots of benefits and
whatnot, but Korea does still pull at my heart, and I will want to go back.
Other
than that, I can't think of anything else off the top of my head. I don't mind
you being involved in my life, and I haven't really had any other big decisions
I've had to make anyway, so for now, I don't think it's a problem. Obviously I
can't speak for Elliott, but that's my take.
Elliott didn't
respond. I feel somewhat vindicated
about the pressure on Krystin not to go back to South Korea. With the medical issues that arose, living
there would have been a major problem.
She grudgingly agreed.
In
youth, it was a way I had,
To do my best to please.
And change, with every passing lad
To suit his theories.
But now I know the things I know
And do the things I do,
And if you do not like me so,
To hell, my love, with you. (Parker)
(GE:
And when we are both at that place, things work out quite nicely)
Elliott, Kathy, and I were chatting one summer night and
he asked if black humor about the Holocaust or Shoah would ever be
acceptable. I said I thought not. There is much black humor I can accept, but
some evil is so far beyond the pale that it cannot be joked about. Elliott wondered if that would still be true
in 500 years. Who knows. Right now, I said I thought the only ones who
could indulge in black humor about the Holocaust are the Jews (and perhaps the
gypsies, and a few others who the Nazis sent to the camps). But I certainly am not going to do so.
Almost immediately after that exchange came the news
about Michael Karkoc, the Ukrainian immigrant who's lived in Minneapolis since
shortly after the end of WWII and is now alleged to have worked with and at the
direction of the SS in murdering quite a number of people during the war. The reader/blogosphere had mixed reactions,
but at least in the places I read, the significant majority favored deportation
to Germany or Poland for trial if either country decided there was sufficient
evidence to warrant prosecution. Karkoc
is 94, so some argued that he should be let alone on the grounds of age and
that he's apparently been a good citizen ever since he came to Minneapolis in
the late 1940s.
I think those who argue for possible deportation and
trial, if the evidence warrants it, have the stronger position. And I didn't encounter anyone making jokes
about the situation.
I've also told Krystin—the aspiring writer—that I don't
use the "nazi" descriptor as a noun (for example, "grammar nazi")
unless I am writing or speaking about Nazi Germany because I think the trivial
use of the term demeans the Shoah. Views
on this vary, with the majority of the websites (with both Jewish and
non-Jewish authors) arguing against use of the term because it trivializes what
happened. On the other hand, "there are
those who would assert that the only way to properly treat the Nazis is to deny
them the dignity of respect for their name not by avoiding using it, but
instead to use it freely for anything even vaguely negative; to make a mockery
of the Nazis by turning their name into a joke." That view may have some merit, but I go with
the trivialization view.
ABSURDITY,
n.: A statement or belief manifestly inconsistent with one's own opinion.
Heterosexuality
is not normal, it's just common.
(Parker)
It was an interesting two
weeks in Lake Wobegon, one on the home front and one nationally.
At the national level, as
far as I can tell, the Supreme Court told the gay community it could now be
legally married and do what it's been doing in the bedroom for millennia—and
the Court then did that very thing to the community of people of color. Did I get that right? But it was an odd juxtaposition of events,
because at the same time the Court gutted the Voting Rights Act, television
chef Paula Deen was getting dumped by major sponsors, even that bastion of
liberalism Walmart, for unacceptable comments about Black/African-Americans. So corporate and commentariat America was
according due respect to people of color at the same time the Court said the
right-wing political establishment of the South and a few other places could
now take steps to restrict the impact of their voting without fear of federal
intervention. (Which, of course, several
promptly began to do.)
One is reminded of Ambrose
Bierce's imaginary conversation in his definition of the devil:
SATAN, n. One
of the Creator's lamentable mistakes, repented in sashcloth and axes.
Being instated as an archangel, Satan made himself multifariously objectionable
and was finally expelled from Heaven. Halfway in his descent he paused,
bent his head in thought a moment and at last went back. "There is
one favor that I should like to ask," said he.
"Name it."
"Man, I understand, is about to be created. He will need laws."
"What, wretch! You, his appointed adversary, charged from the dawn of eternity with hatred of his soul—you ask for the right to make his laws?"
"Pardon; what I have to ask is that he be permitted to make them himself."
It was so ordered.
"Man, I understand, is about to be created. He will need laws."
"What, wretch! You, his appointed adversary, charged from the dawn of eternity with hatred of his soul—you ask for the right to make his laws?"
"Pardon; what I have to ask is that he be permitted to make them himself."
It was so ordered.
I
wish you could have heard that pretty crash Beauty
and the Beast made when, with one sweeping, liquid gesture, I tossed it out
of my twelfth-story window. (Parker)
CONSOLATION, n. The knowledge
that a better man is more unfortunate than yourself.
From the annals of
cribbage: Elliott and I have taken to
playing it once in a while. I taught him
to play early last summer. I won't play "Magic"
and he won't play bridge, so we compromised on cribbage. I was ahead of him by four games at one
point, and then he double-skunked me. I
haven't played much cribbage in recent years, but I can't remember a time when
I saw a double skunk. I knew I wasn't
going to fare well when, on the second deal, he had 16 points in his hand and
24 in his crib. It went downhill from
there for me.
But later I won 10 games in a row, clearly indicating I am the better player. ☺
In one game later in the summer, Elliott scored 21 pegging points on the
first hand—but still only barely managed to avoid my skunking him. He found these developments irritating.
Eventually the won-lost
totals began to balance out—except that I have maintained a consistent
9-12-game lead.
On the spur of the moment, when we'd both gotten big
hands, in one game we tracked the number of deals it took before one of us
won. It was six. We were amazed.
REASONABLE,
adj. Accessible to the infection of our own opinions. Hospitable to
persuasion, dissuasion and evasion.
I'd
rather have a bottle in front of me, than a frontal lobotomy. (Parker)
We were glad we took down the trees
because a heckuva storm came through town on June 21 and blew down lots of
trees—especially in our neighborhood of Minneapolis. (One tree toppled over just down the block
from our house—right on top of a car that was driving by. Fortunately, the driver was not injured, but
his car was squashed.) The storm also
took out a lot of power—reportedly the largest power outage in Minnesota history. The
storm brought a lot of rain, which followed a very rainy spring that had left
the ground already soaked. (And thus,
it was surmised, made it easier for the wind to topple the trees, because the
ground was softer than normal.)
We were among those who lost
power. When the power goes out, the sump
pump stops. When our sump pump stops,
the basement floods, which it duly did.
We had streams running across the floor to the drain from all
sides. We don't keep much directly on
the floor, so there was no damage of any significance, but it was still a big
mess to clean up.
This same thing happened about a
dozen years ago, when we were without power for about 5 days. No sump pump, basement flooded. What was really irritating last time was that
by the end of day 5 or so, it was only a small group of houses that had no
power—everyone across the street, behind us, and down the block had power, just
not 6-8 houses in a little group that included ours. One of our neighbors at one point saw a power
company truck driving slowly by and ran after it and insisted the crew come and
look at the transformer or whatever it is on the pole at the end of our
block. They did so, and used a pole to
reach up and flip a switch—and bingo, our power was back on. So we were without for 5 days because some
switch didn't get flipped (sort of like a blown circuit).
This time I decided I'd had it with
power outages and wet basements (Kathy agreed), so we went out to Menard's on
Sunday and, luckily, they had just received a shipment of (gas-fueled)
generators. We bought one and brought it
home and got it going, so it re-started the sump pump, the refrigerator (the
food in which was largely OK, after only two days), and the chest freezer (in
which everything was still frozen like a brick). So next time. . . . And I'm sure there will be a next time; one
of the predicted results of global climate change is more violent storms. So now I can at least keep the sump pump
going. (When I re-started the sump pump,
it immediately ran uninterruptedly for quite a while—and flooded a sizeable part
of the back yard.)
Meantime, that Sunday we (a
contingent of neighbors this time) saw an Omaha Power truck out in front
(brought in to help deal with the power outages) and as a group we asked the
crew to again look at the transformer on the pole at the end of the block. They didn't do so right then but did come
back that evening, we think because we'd pleaded with them to do so—and exactly
the same thing happened. One of them
took a pole, reached up and flipped a switch, and our power was restored. We decided later that as a group we'd invest
in a pole and after the next power outage we'd flip the switch ourselves. It was really irritating to be without power
for two days only because some circuit blew and it took so little to restore
service.
PAIN,
n. An uncomfortable frame of mind that may have a physical basis in
something that is being done to the body, or may be purely mental, caused by
the good fortune of another.
The
cure for boredom is curiosity. There is no cure for curiosity. (Parker)
I thought Margaret Talbot
in the New Yorker made a wise
observation after reporting and commenting on a report from the Pew Research
Center that now in 40% of households with children, women are the main
breadwinners. The majority of those
surveyed offered the opinion that raising children is more difficult when the
mother works outside the home (although fewer in the younger cohort believe
that than in the older one). She wrote:
When people talk about the
difficulty of rearing children today, they may actually be talking about
economics and about work. Life is harder
when mothers work outside the home because, obviously, there's more to do in
the same amount of time. . . . But life
is also stressful and often demoralizing in twenty-first-century America
because we all live under a speeded-up, coercively
multitasking, vacation-poor, debt-burdened, harsher, and less forgiving form of
capitalism than do the citizens of many other industrial countries.
Another reason, in my view, to
oppose a political philosophy that discounts a social/economic support
net: the more unrestricted the
competition and capitalism is, the worse the quality of life for everyone in
the system.
It
is worse, that is, except for the very wealthy and those who make it to the top
of the heap—but as Gladwell argues, in Outliers,
"the biggest misconception about success is that we do it solely on our
smarts, ambition, hustle and hard work" and that success "is not
exceptional or mysterious. It is
grounded in a web of advantages and inheritances, some deserved, some not, some
earned, some just plain lucky." If
one accepts Gladwell's view, which I do, then this speeded-up life we live in
this country isn't necessarily going to make the lower end of the
socio-economic spectrum any better off.
They just work harder and get less vacation time. (In the words of Grist blogger David Roberts,
"No one but the privileged believes they are in a
meritocracy.")
Coincidentally,
about the same time I wrote the foregoing, there appeared in the Washington
Post an article titled "40 maps that explain the world."
One of the maps was "Economic
inequality around the world," measured using the Gini coefficient (a
statistic measuring inequality). If in a
society of 100 people, 1 person received all the income and the 99 received
none, the Gini coefficient would be 1.0; if in a society of 100 people,
everyone received exactly the same income, the Gini coefficient would be
0. So, obviously, a lower Gini
coefficient means greater equality.
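(For the curious, here is a minimal sketch in Python of how a Gini coefficient can be computed from a list of incomes, using the standard mean-absolute-difference formula; the incomes below are made-up illustrations of the two extreme cases, not data from the Post's map.)

    def gini(incomes):
        """Gini coefficient of a list of non-negative incomes (0 = perfect equality)."""
        n = len(incomes)
        total = sum(incomes)
        if total == 0:
            return 0.0
        # Sum of absolute differences between every pair of incomes.
        diff_sum = sum(abs(x - y) for x in incomes for y in incomes)
        return diff_sum / (2 * n * total)

    print(gini([1] * 100))          # everyone earns the same -> 0.0
    print(gini([0] * 99 + [100]))   # one person earns everything -> 0.99

(With 100 people, the one-person-gets-everything case comes out at 0.99 rather than exactly 1.0; the coefficient approaches 1 as the population grows.)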
There was only one country
in the world with a Gini coefficient under 0.25: Sweden.
The rest of Scandinavia and part of Western Europe is in the 0.25-0.29
range; most of the remainder of Western Europe, Australia, and Canada are
0.30-0.34. The U.S. is in the 0.45-0.49
range, along with China, Mexico, and a few other places—so greater economic
disparities. What I find interesting is
that in international measures of happiness, such as the "Measures of Gross National Happiness" (2000 data) compiled by Ruut Veenhoven at Erasmus University Rotterdam, Denmark and Sweden rank #1 and #2, with the U.S. at #3. Another study at the University of Illinois
found, analyzing Gallup World Poll data, that "the research indicated that
people have higher life evaluations when others in society also have their
needs fulfilled. Thus life satisfaction
is not just an individual affair, but depends substantially also on the quality
of life of one's fellow citizens."
Another of the maps was telling as well, drawn from
data from "Save the Children," which identified the best countries in
which to be mothers. As one might
expect, Scandinavia, Western Europe, Canada, and Australia are in the first
tier. The U.S. is in the second tier,
along with Mexico, Argentina, and parts of Eastern Europe. Of the 35 "developed" countries examined, the U.S. ranks 34th in child poverty rates (that is, only Romania,
ranked 35th, has a higher rate of child poverty than the U.S.; oddly, 30th to
33rd are Italy, Lithuania, Spain, and Latvia).
These are disappointing data. They are also interesting findings at a time
when some U.S. politicians argue for cutting the social safety net and for laws
that increase disparities in wealth.
(Elliott said he might voluntarily take Swedish, in the time he has left
in college, because he likes it and because if the U.S. goes to hell, he'd want
to emigrate (back) to Sweden, from which his paternal great-great-grandparents
emigrated in 1880.)
There
were interesting data posted in a study from the London School of Economics by
a guy named David Binder, "a Family
Fiscal policy Consultant for the Christian social policy charity, CARE, where
he conducts in-depth research on Family Tax, Welfare and Benefits." He also "explores other areas relating
to society and culture, evangelical christianity (of which he is a believer),
and criminal justice."
What Binder reported, using standard Organization for
Economic Co-operation and Development (OECD, basically the 34 most-developed
nations of the world) data, is a fairly significant statistical correlation
between national levels of happiness (using several variables) and taxation as
a percentage of GDP: the higher the
taxation level, the happier the population.
The relationship isn't perfect: Switzerland
and Ireland have low taxes and a high level of happiness while Italy has high
taxes and a low level of happiness. The
U.S. is a slight outlier; it ranks 32nd of 34 in taxation but 22nd in level of
happiness. There are other cases among
the 34 where the rank in one doesn't correspond closely to the rank in the
other, but the overall relationship is pretty clear.
Mr. Binder appropriately notes caveats.
It might be the case, for
instance, that in EU countries such as Denmark, Belgium, Sweden, and the
Netherlands there are common characteristics (apart from high taxation rates)
within their cultures that lead them to experience greater levels of
wellbeing/happiness. Nevertheless, one
could feasibly argue that higher levels of taxation receipts might allow for
governments to invest more in public services ('progressives' frequently assert
this line of argument) such as publicly funded leisure and personal care
services (one of the variables used in this study.) It might also be the case that higher levels
of public taxation leads to better healthcare, education, transport and other
provisions, which are all areas that could have an influence on an individual's
happiness.
The
data are certainly suggestive.
I would dearly love to see the
leading libertarian and Tea Party lights in American politics adduce some
decent evidence—and I mean hard evidence, not philosophical claims—that their
minimalist view of government works better than the version adopted in
Scandinavia and, in various parts and various ways, throughout much of the rest
of western Europe and the developed world.
As far as I can see, the nations with minimalist governments are also
those that are basket cases (compare some in Africa and Asia). There are good examples of the successful
high-tax/high-benefits states; where are the examples of the opposite? (One cannot, of course, have
low-tax/high-benefits states.) (No, I
don't equate the Tea Party with libertarians, but there is significant overlap
between them.)
The primary example inside the U.S.
is the South: It is low tax, low
government—and ranks low on just about every measure of civilized society one
can dream up.
I
went to a convent in New York and was fired finally for my insistence that the Immaculate
Conception was spontaneous combustion.
(Parker)
IMPIETY,
n. Your irreverence toward my deity.
A Facebook "friend" posted
this: "Obama: Trayvon 'could have
been me'. This clown is in fricken believable. Impeach this god damn moron
Racist prick." This from a distant
relative whom I've never met and likely never will. One can infer the level of education the
person has achieved.
I posed a question to Kathy,
Krystin, and one friend: should I post a
comment along the lines "I always appreciate it when relatives engage in
thoughtful and rational discourse about public policy questions." Obviously snotty, but does it violate any
rules or norms of behavior for Facebook?
The answer to the question was "no," but I received conflicting
counsel about whether to post the comment.
Two of the three said I should not and should just de-friend the person;
one said I should. All three expressed
doubt I'd change the person's point of view. My friend Steve recalled a quote from Robert
Heinlein: "Never try to teach a pig
to sing. It wastes your time and annoys
the pig." He went on to observe
that "it's pretty difficult to hold a meaningful policy discussion with
someone whose cultural predispositions and education make it unlikely they can
deal constructively with the abstractions involved. . . . [they] simply lack
the tools and predisposition to have a reasoned discussion." I think that's right but wondered if a comment
would make others who read such posts stop and think about what they're
writing.
Kathy's advice was succinct: "Defriend this person. It's not worth the aggravation and this
person will never share your point of view." Krystin said I should post the comment but
also pointed out that "I doubt, though, that the relative would catch all
the sarcasm dripping from it."
In the end, I did nothing, heeding
the advice we all received from our mothers:
if you can't say anything nice, don't say anything at all. But I sure was tempted.
Later in the year, however, I could
not resist. He posted a picture of a
man, clearly affluent, silver-haired, sport jacket and shirt with cuff links,
sitting in a posh bar, with the subtitle "I don't deal often with Obama
voters, but when I do, I order French fries." I responded simply by saying that "gee,
almost all the Obama voters I know are MDs, JDs, and PhDs."
POLITICS,
n. A strife of interests masquerading as a contest of principles.
The conduct of public affairs for private advantage.
There
has been but one sweet, misty interlude in my [insomnia]; that was the evening
I fell into a dead dreamless slumber brought on by the reading of a book called
Appendicitis. (Parker)
Here's the Erma Bombeck memorial
portion of my letter.
-- As Rita Mae Brown famously paraphrased
it, "insanity is doing the same thing over and over again but expecting
different results" (the exact origin of an original assertion along that
line is not clear). Some among my
friends and relatives insist that one can put mostly-unrinsed dishes in a
dishwasher and that they will come out sparkling clean. I try this about 3-4 times per year, and
inevitably there are food residues baked onto the dishes by the washing/drying
cycle of the dishwasher. So I go back to
rinsing the dishes thoroughly, leaving barely a speck on them. But I keep trying the not-rinsing approach.
-- There is some sock-eating creature in
this house. As I compose this in late
summer, I have precisely 27 unmatched socks on the laundry table. Where do they all go?
-- Last year I wrote: "I have come in the last couple of
years to be astonished and irritated by what I describe as the cantankerousness
of inanimate objects. If I am pulling
hose across the yard, it finds a rock or plant to get stuck in. If I am vacuuming in the house, the cord finds
a chair or table to get ensnarled in." And so on. There is a word for this (a jocular coinage from the 1940s)! Resistentialism: "The seemingly spiteful behavior shown by inanimate
objects." Objects have continued to
resist me.
-- The Huffington
Post reported that "a new survey finds that men with beards
look as much as eight years older than those who are clean shaven,
according to a report in the Daily Mail."
Duh.
-- I assume most people know the story
behind the song "My Grandfather's Clock." (In a nutshell, in 1875 an American
songwriter stayed in a hotel in England that had a non-working large pendulum
clock in the lobby. The owners told him
that the previous owners, brothers, had died; when the first one died, the
clock lost accuracy and when the second one died it stopped completely—and no
one could get it to start again. The
American was apparently sufficiently amused by the story that he wrote "My
Grandfather's Clock," released in 1876.
It was wildly popular.) What I
didn't know was that it was because of the song that they became known as
grandfather clocks; before that they were "longcase clocks." How many of you learned the song in
elementary school? I have ascertained
that many of my friends did—but my kids did not.
-- For the first time in well over 20
years, I didn't pick my raspberries this year.
I think I have geriatric raspberry bushes—they don't grow very tall and
most of the berries were smaller than the last section of my pinkie
finger. Because I ignored them, the plot
also grew a good crop of weeds as well as raspberry bushes. I may just plow up the entire piece of ground
and grow something else. (Or, to be more
accurate, I may hire someone else to dig up everything.)
-- My friend Scott Eller and I used to
toss a Frisbee around when he lived on campus at the University and I'd hang
around with him. I can report that some
things never change: I regularly see
guys—and it is mostly guys—throwing Frisbees around on the Mall. I suspect that if Frisbees had existed in
1088, the year the University of Bologna was founded (the first university as
we use the term now), the students would have thrown them around then as well.
-- Kathy had questioned the veracity of my assertion that it's better to drive with the air set to "recirculate," but she conceded the point after research from the University of Southern California demonstrated pretty conclusively that doing so "is the best way to reduce exposure to harmful traffic pollution."
The researchers provided findings that were new, at least to me. Beyond recirculating the air, "exposures
are lower in newer cars, at slower speeds, and on arterial roads, where
pollutant concentrations are lower than on freeways" and "concentrations
of particle pollutants on freeways are often five to 10 times higher than
elsewhere." Wow. I wonder if that's as true on a congested
urban freeway as driving across Iowa.
And that finding certainly makes even less attractive those homes
nestled right next to freeways in the city.
In any event, what they found was that for the average vehicle, "recirculation
settings reduce in-vehicle particle pollution for very small particles from 80
percent (of on-road levels) to 20 percent, and from 70 percent to 30 percent
for larger particles, compared to air ventilation settings which bring in
outside air." This assumes you have
your windows closed; if not, you get what's on the road.
-- One of the
great natural phenomena is the way in which a tube of toothpaste suddenly
empties itself when it hears that you are planning a trip, so that when you
come to pack it is just a twisted shell of its former self, with not even a cubic
millimeter left to be squeezed out. (Benchley)
This happens to me regularly.
-- One part of enjoying
the deck this year has been the veritable ornithological zoo we've had. One night—this was
unusual—we had a pair of doves, a pair of cardinals with baby, a couple of
crows, a hummingbird, and (almost always) about 15-20 sparrows. It's a
cacophony out there, and the sparrows have taken to nesting in the ivy growing
on the garage, right behind our heads when we're sitting out. We sometimes joke that we will have to call
the police because of domestic disturbances that seem regularly to occur, as
the sparrows hover and flutter and squawk and squeak about who's going to be in
the ivy. In addition, we have a local bunny (who's eating the tops off
all our newly-planted ground cover plants and who runs—for a minute—under the
deck when I toss pebbles at it to scare it away) and multiple squirrels and
chipmunks (who will sometimes run right across the deck while we're sitting there). Sometimes the
sparrows also come perilously close to dive-bombing us, although we think not
intentionally.
-- I saw a Minnesota car license plate 862 OMG. I wonder if the powers that be in the Driver and Vehicle Services division of the Minnesota Department of Public Safety would allow 862 WTF, given its widespread use in social media and other places.
-- Kathy and I had a minor culinary evolution this year. Both of us, for our entire adult lives, have avoided Brussels sprouts because we disliked them. At one social event, our friend Ann brought Brussels sprouts prepared in a way that we actually ate and liked. Kathy's been making them about once a week ever since, and Elliott looks forward to them (he says that when Kathy fries them in olive oil with salt and pepper, they taste like popcorn). Obviously the taste bud genes he inherited did not come from me.
FLOP,
v. Suddenly to change one's opinions and go over to another party.
The most notable flop on record was that of Saul of Tarsus, who has been
severely criticised as a turn-coat by some of our partisan journals.
All I need is a place to lay my hat,
and a few friends. (Parker)
A few years ago, in an earlier
edition of this letter, I noted the work of a young philosophy professor, Nick
Bostrom, who thinks about existential threats to humanity. The usual examples are a large asteroid strike (akin to the one that hit the Yucatan and likely wiped out the dinosaurs),
nuclear holocaust (less likely, at least for now), a lethal virus invented in
some lab that is so contagious that it nearly eradicates humanity, grey goo
(nanobots released to clean up an oil spill, for example, but due to a
programming error they devour all carbon-based objects, replicating and destroying everything, turning the planet to dust), global warming (that makes the planet uninhabitable for humans), and so on.
Astronomers
know that in 3-4 billion years the sun will expand into a red giant, burning up
Earth, before it shrinks to become a white dwarf. If we've survived but aren't off the
planet—out of the solar system—by then, we are in some trouble.
Aeon,
an electronic newsmagazine from England, reports that Bostrom now runs a center
at Oxford University, the Future of Humanity Institute, which thinks about this
stuff in a more organized fashion. [The
quotes that follow are from Aeon and
retain the British spelling.]
Interestingly, what he now worries about is not so much the
possibilities in the preceding paragraph but instead artificial intelligence
(AI), and in particular artificial intelligence that exceeds human intelligence
by a little or by a lot. The world is a
long way (yet) from building AI that is comparable to or exceeds human
intelligence, but those in the field think the day is coming when that will
happen.
One
of the fellows at Bostrom's center points out that "an artificial
intelligence wouldn't need to better the brain by much to be risky. After all,
small leaps in intelligence sometimes have extraordinary effects. . . . 'The difference in intelligence between humans and chimpanzees is tiny,' but it's [the difference] between 7 billion inhabitants and a permanent place on the endangered species list." The reason an AI could be dangerous is that it would lack human emotions like empathy and altruism; it would do what it is programmed to do in the most efficient way possible (e.g., "If its goal is to win at chess, an AI is
going to model chess moves, make predictions about their success, and select
its actions accordingly. It's going to be ruthless in achieving its goal, but
within a limited domain: the chessboard"). In the larger setting of the earth, it would
be equally ruthless, and designing a "friendly" AI is not likely to
be easy. The guy told a story.
No rational human
community would hand over the reins of its civilisation to an AI. Nor would
many build a genie AI, an uber-engineer that could grant wishes by summoning
new technologies out of the ether. But some day, someone might think it was
safe to build a question-answering AI, a harmless computer cluster whose only
tool was a small speaker or a text channel. . . . [Call it Oracle, after the Oracle at Delphi.]
'Let's say you have an
Oracle AI that makes predictions, or answers engineering questions, or
something along those lines,' . . . 'And
let's say the Oracle AI has some goal it wants to achieve. Say you've designed
it as a reinforcement learner, and you've put a button on the side of it, and
when it gets an engineering problem right, you press the button and that's its
reward. Its goal is to maximise the number of button presses it receives over
the entire future. . . . We might expect
the Oracle AI to pursue button presses by answering engineering problems
correctly. But it might think of other, more efficient ways of securing future
button presses. It might start by behaving really well, trying to please us to
the best of its ability. Not only would it answer our questions about how to
build a flying car, it would add safety features we didn't think of. Maybe it
would usher in a crazy upswing for human civilisation, by extending our lives
and getting us to space, and all kinds of good stuff. And as a result we would
use it a lot, and we would feed it more and more information about our world.'
'One day we might ask it
how to cure a rare disease that we haven't beaten yet. Maybe it would give us a
gene sequence to print up, a virus designed to attack the disease without
disturbing the rest of the body. And so we sequence it out and print it up, and
it turns out it's actually a special-purpose nanofactory that the Oracle AI
controls acoustically. Now this thing is running on nanomachines and it can
make any kind of technology it wants, so it quickly converts a large fraction
of Earth into machines that protect its button, while pressing it as many times
per second as possible. After that it's going to make a list of possible
threats to future button presses, a list that humans would likely be at the top
of. . . . . You would have this thing that behaves really well, until it has
enough power to create a technology that gives it a decisive advantage — and
then it would take that advantage and start doing what it wants to in the
world.'
Lovely thought.
The other subject Bostrom and his
colleagues think about, which I also wrote about a few pages back, is whether there
is a barrier or limit to an advanced civilization. A Russian physicist, Konstantin Tsiolkovsky
(1857-1935), first proposed the notion of the "cosmic omen." That is, the universe is likely to be a
cradle of civilizations, given its vastness and the number of stars and planets
that likely exist—and modern astronomy is indeed identifying planets that could
support life. But so far we haven't
encountered any advanced civilizations.
That's Enrico Fermi's famous question that I mentioned earlier, "where
are they?"
Robin Hanson, a research associate
at the Future of Humanity Institute, says there must be something about the
universe, or about life itself, that stops planets from generating
galaxy-colonising civilisations. There must be a "great filter", he
says, an insurmountable barrier that sits somewhere on the line between dead
matter and cosmic transcendence.
If life on Earth is a fluke, then
there may not be any "great filter" and the future of humanity is an
open question. Bostrom hopes that the Mars
Rover does not find any sign of life on Mars, because if it does, then that
means life has evolved at least twice just in this solar system—and didn't make
it very far. We've made it farther than
multi-cellular organisms that might be found on Mars, but if life can evolve twice here, it surely could have evolved millions of times elsewhere in the universe—and yet we see no ships flying by us, which would suggest that the "great filter" lies ahead of us rather than behind us. It may be that every civilization ends up using advanced technology to destroy itself, for example.
If, however, there is no cosmic omen
and humans don't have some barrier to get past, and they survive their own
follies (nuclear weapons, global warming, etc.), the folks at the Center have
tried imagining how humans could expand into the universe, and offer a view similar
to that of Professor Humphreys. "The
consensus among them is that the Milky Way galaxy could be colonised in less
than a million years, assuming we are able to invent fast-flying interstellar
probes that can make copies of themselves out of raw materials harvested from
alien worlds. If we want to spread out
slowly, we could let the galaxy do the work for us. We could sprinkle starships into the Milky Way's
inner and outer tracks, spreading our diaspora over the Sun's 250 million-year
orbit around the galactic center."
The problem with the "cosmic
omen" and the lack of civilizations, as Professor Humphreys pointed out,
is that they don't take into account that fundamental law of physics, the speed
of light. As I think I have mused
before, even traveling as fast as we know how, it would take a ship carrying humans tens of thousands of years to get to the nearest star. The observable universe is more than 90 billion light-years across. At least now, there
are no wormholes to jump to other parts of the universe, there's no warp drive,
there's no Crazy Eddie Point (see The
Mote in God's Eye). So unless there's
some way around the limit imposed by the speed of light, we are stuck here,
waiting for the sun to die.
I don't like to believe that. Maybe I've read too much science fiction and
liked Star Wars movies too much. So I'm
hoping that (1) we find a way to travel across space, (2) other unfriendly
beings, if they exist, do not, or (3) at least they don't find it before we do
and eradicate us.
I suppose it's silly to worry about
the future of the species when we have so many problems confronting us on the
planet now. But as the Aeon article points out, it is well for
any species that has the ability to reflect to think about its own extinction,
because 95% of all species that have ever existed have gone extinct, and if one
divides the history of the planet into 9 periods of roughly 500 million years
each (the Earth is approximately 4.5 billion years old), almost all of the
extinctions have come in the last period.
Which, of course, is when most complex life forms have existed.
Sigh. It's hard to be optimistic about this. But oh well, I guess I don't have to worry
too much about these conundrums. I
suspect my atoms will be re-mixing with the universe before we assemble an
Oracle AI and I am certain they will be before we learn whether there's a great
filter. I go back to reading history.
DOG, n. A kind of additional or subsidiary Deity designed to catch the overflow or
surplus of the world's worship. (For some, there is another
subsidiary deity, CAT.)
(From
Parker's "Constant Reader" book review of The House at Pooh Corner by A. A. Milne, in The New Yorker [20 October 1928]):
And it is that word 'hummy,' my darlings, that marks the
first place in The House at Pooh Corner
at which Tonstant Weader fwowed up.
A friend of mine and I were
deliberating together over email about why it is that large organizations
(e.g., 3M, the Roman Catholic Church, large universities) sometimes make
decisions that strike outsiders as utterly stupid and counterproductive. In the course of this exchange, I came across
a comment by Robert Conquest (historian and poet). "The behaviour of any bureaucratic
organisation can best be understood by assuming that it is controlled by a cabal
of its enemies."
That
would be a good thing for them to cut on my tombstone: Wherever she went,
including here, it was against her better judgment. (Parker)
MAUSOLEUM,
n. The final and funniest folly of the rich.
We had a great reunion of sorts this
summer. In the early 1970s—I don't
recall exactly when—my friend Scott Eller and I played in a now-long-defunct
student bridge club at the University, when we were both students. In the course of playing we met Husain and
Durriya Sarkar; both had grown up in India, near Mumbai; Husain was a graduate
student in philosophy at the University and Durriya worked there in the medical
labs. Somehow we struck up a friendship
and began seeing them socially. Scott
and I introduced Husain to poker, a game at which he quickly became
excellent. (One time we were playing
poker—at that point, not that far from high school, it was mostly a group of
Scott's and my high-school friends—and on one deal the stakes got way higher
than was normal for us. Husain won about
$100, and we learned after the fact that he bought a new TV set for them. Husain was also sometimes a frustrating
bridge opponent. After 3-4 tricks had
been played, he would lay down his hand, tell you what cards you had, and claim
the tricks he had—and about 90% of the time he was right.)
Husain and Durriya's first child,
Casim, was born in 1976, about six months before they left Minnesota for Baton
Rouge, LA, where Husain was appointed to a tenure-track position at Louisiana
State University. I drove down to Baton
Rouge with Husain to help them move. And
there they have remained ever since, Husain as a professor of philosophy and
Durriya holding scientific appointments.
(Their second child, Ashifa, was born in Baton Rouge and is now married
and living in Mumbai with her husband and their infant daughter. My joke with the Sarkars is that Ashifa looks
like and probably passes for a native—until she opens her mouth and talks like
an American.)
There is some mild confusion about
how often I have seen them since they moved.
Krystin is positive she has met the Sarkars; I didn't think I'd seen
them since they'd moved. I must be
wrong. But it had clearly been a long
time, in any event.
By an odd turn of career, Casim Sarkar was appointed to a tenured faculty position in biomedical engineering
at the University of Minnesota last year and began his appointment this past
summer. His residence here prompted a
visit from Husain and Durriya, so we got to see them several times in
August—and it was great fun to revive the friendship on a face-to-face
basis. (We had remained in touch by
email and occasionally Facebook, but that's a poor substitute for live
interaction.) It was a hoot to listen to
Durriya recall the events surrounding their purchase of a car—neither of them
had driven before—and the first time she drove was in a snowstorm, and ended up
driving up on the sidewalk because she had lost control of the car in the snow.
I took Casim to lunch at the
faculty-staff dining club at the University and then toured his lab. I met his postdoc, Ivan, who I assume (from his accent) is from Russia or somewhere in Eastern Europe. He was incredulous when I happened to mention
that Casim had been born in Minnesota. A
very nice Indian kid, with degrees from Texas and MIT, comes as a professor to
Minnesota—returning to the place he was born.
(Yeah, I know he's not a "kid" at his age, but he's a kid as
far as I'm concerned. I saw him when he
was a new-born.)
Constant
use had not worn ragged the fabric of their friendship. (Parker)
SELFISH,
adj. Devoid of consideration for the selfishness of others.
I quit taking a multi-vitamin. About four decades ago a well-educated
girlfriend contended that taking a multi-vitamin was simply an insurance
policy: in case your diet that day didn't
contain what your body needed, the vitamin supplement would provide it. That made sense and I have taken a
multi-vitamin more or less consistently since.
Whether or not the logic of the
argument was supported by science is another matter. A few years ago I inquired of a faculty
friend in our Medical School whether taking vitamins was useful, along the line
my girlfriend had hypothesized. He said
they (the medical establishment) did not know.
They could not find evidence that vitamins taken in tablet form were
actually absorbed into the bloodstream, as they are when obtained from
food. But they weren't sure that they
were not, either. So, I thought, may as
well keep taking them; they're not that expensive and doing no harm.
Last summer, however, I read an
article in the Atlantic about
vitamins and the pharmaceutical industry, "The Vitamin Myth: Why We Think
We Need Supplements." The research
reported was disturbing.
On October 10, 2011, researchers
from the University of Minnesota found that women who took supplemental
multivitamins died at rates higher than those who didn't. Two days later,
researchers from the Cleveland Clinic found that men who took vitamin E had an
increased risk of prostate cancer. . . .
In 2004, researchers from the University of Copenhagen reviewed fourteen
randomized trials involving more than 170,000 people who took vitamins A, C, E,
and beta-carotene to see whether antioxidants could prevent intestinal cancers.
Again, antioxidants didn't live up to the hype. The authors concluded, "We
could not find evidence that antioxidant supplements can prevent
gastrointestinal cancers; on the contrary, they seem to increase overall
mortality." When these same
researchers evaluated the seven best studies, they found that death rates were
6 percent higher in those taking vitamins. After reviewing studies that included over
136,000 people, Johns Hopkins researchers "found an increased risk of
death associated with supplemental vitamin E. Dr. Benjamin Caballero, director of the Center
for Human Nutrition at the Johns Hopkins Bloomberg School of Public Health,
said, "This reaffirms what others have said. The evidence for supplementing with any
vitamin, particularly vitamin E, is just not there. This idea that people have
that [vitamins] will not hurt them may not be that simple."
The
statements that really caught my eye, however, were these:
In 2007,
researchers from the National Cancer Institute examined 11,000 men who did or didn't take multivitamins. Those who took multivitamins were twice as
likely to die from advanced prostate cancer. . . . On October 10, 2011, researchers from the
University of Minnesota evaluated 39,000
older women and found that those who took supplemental multivitamins,
magnesium, zinc, copper, and iron died at rates higher than those who didn't. They
concluded, "Based on existing evidence, we see little justification for
the general and widespread use of dietary supplements." (My italics.)
Following a study at the Cleveland Clinic, "Steven Nissen, chairman of
cardiology at the Cleveland Clinic, said, 'The concept of multivitamins was
sold to Americans by an eager nutraceutical industry to generate profits. There was [sic] never any scientific data
supporting their usage.'"
While
it is obviously unwise to make medical decisions based on articles in the
popular press, I have found that reporting in the Atlantic is among the best in the country, and the research the
reporter cited seemed to me pretty convincing.
Moreover, these are not small studies with only a few dozen or couple of
hundred subjects. Yes, one can quibble
endlessly about methodology and the appropriate use of statistics and whether
one can infer causation from correlation (of course you can, in certain
circumstances), but when all the evidence, data, and commentary from the medical establishment are taken into account, it doesn't seem to me that
difficult to reach a conclusion.
The Atlantic article, moreover, follows
others over the years that have reported study after study that found that,
with a few exceptions, there is no documented benefit from taking
multi-vitamins or doses of specific vitamins—and that there is harm from too
much of some vitamins. The exceptions,
for example, noted in 2008: "some
extra vitamins have proven benefits, such as vitamin B12 supplements for the
elderly and folic acid for women of child-bearing age. And calcium and vitamin D in women over 65
appear to protect bone health," reported the New York Times' Tara Parker-Pope, a well-respected author and
blogger on health issues. Vitamin C may help moderate and shorten
colds. The Harvard Women's Health Watch
suggests that taking vitamin D may be of benefit to a number of groups,
including those who live in northern climates (i.e., north of the 37th
parallel, which is roughly the line at the top of Arizona, New Mexico,
Oklahoma, Arkansas, Tennessee, and North Carolina). The National Institutes of Health reached the
same conclusion and recommends vitamin D supplements during the winter. Apart from specific cases, however, the bulk
of the evidence suggests not taking multi-vitamins.
But
then the Harvard School of Public Health has this on its website: "Trying to follow all the studies on
vitamins and health can make your head swirl. But, when it's all boiled down,
the take-home message is actually pretty simple: A daily multivitamin, and maybe an extra vitamin
D supplement, is a good way to make sure you're getting all the
nutrients you need to be healthy."
But I can't find a date on that advice.
An August 2013 health update from NIH was ambivalent about
multi-vitamins.
My
sense was that multi-vitamins don't do much (if any) good and there is
sufficient correlational evidence to suggest they may do harm. So I asked my doctor, whom I admire greatly (he cites the research literature when he talks to me); he is firmly evidence-based, which fits nicely with my equally firm views about relying only on evidence-based medicine. His advice
matched what I read: take a vitamin D
tablet, a baby aspirin, and nothing else, unless I like to have expensive
urine. So I quit.
ZEAL,
n. A certain nervous disorder afflicting the young and inexperienced. A passion
that goeth before a sprawl.
The only "ism" hollywood
believes in is plagiarism. (Parker)
I subscribe to Pandora, an online
music service, to play music when I'm just working in my office. One picks the artists one wishes to hear;
mostly I listen to baroque music but also some from the 19th Century. One of the composers on my list is Richard
Wagner.
There were news reports last spring
noting that 2013 marks the 200th anniversary of Wagner's birth, he of the Ring of the Nibelung cycle and a
number of other operas. Celebrations in
the U.S. have been muted because Wagner was an avowed anti-Semite; his music
received the kiss of death when Hitler proclaimed Wagner his favorite composer. (Woody Allen is alleged to have said that "Every
time I listen to Wagner, I get the urge to invade Poland.") Performances of Wagner's music are informally
banned in Israel. The records
demonstrate that Hitler did indeed love opera, and Wagner specifically, and the
fact that Wagner was anti-Semitic doubtless didn't hurt his cause in Hitler's
eyes. Wagner's music was played at Nazi
rallies. Wagner died in 1883, long
before Hitler and the Nazis ever came along, so Wagner himself can hardly be
blamed for the Nazis, but members of the Wagner family maintained close ties
with the Nazis and with Hitler in particular, and Wagner's political views were
not incompatible with those of the Nazis.
Many
do not face any dilemma because they never listen to opera so don't give a hoot
about Wagner's views. Some like opera
but do not listen to Wagner because they simply do not like his music; neither
Kathy nor her mother, both lifelong opera buffs, cares to listen to Wagner, nor
does my friend Geoff Sirc, also a master opera fan. So Kathy is always willing to go to the
opera, just not Wagner. I've never seen
a performance of a full Wagner opera—when there have been Met Opera simulcasts
of Wagner in local theaters, Kathy's been unexcited about going, and I respect
her tastes.
I confess to a certain
ambivalence. On the one hand, I do like
some of the pieces from the operas, pieces we are all familiar with: the Ride of the Valkyries, the Overture to Die
Meistersinger, Overture to Tannhäuser, Overture to Rienzi, the Prelude to
Lohengrin, the Prelude to Tristan und Isolde, among others. Some of them are stirring. There are plenty of people in the music
business who regard Wagner as a towering genius for his operas and for his
innovations.
On the other hand, is it possible to
separate the political views of an artist from his or her artistic works? I did a quick web survey to learn if there is
any historical evidence for anti-Semitism on the part of, for example, Bach,
Beethoven, or Mozart. The short answer
is "not much, if any." The
most controversial piece is the St. John
Passion by Bach, drawing on the Gospel of John, which is the most inflammatory of the gospels in its language
concerning the Jews. It seems, however,
that Bach actually played down the anti-Semitism as much as he could (his
contract required that he use the exact words from the Bible, but other
materials he drew on softened the message to mitigate the anti-Semitism). To the extent Beethoven was anti-Semitic, it
was (at worst) no more so than the conventional views of his time (1770-1827)
in Lutheran Germany—and anyone who knows anything about the history of
Protestantism is aware that Luther himself was vigorously anti-Semitic. There's no evidence about Mozart at all. So one can embrace those three German
(Austrian, in Mozart's case) composers without dismay about their political or
religious views.
One web piece I looked at, on
jewishworldview.com, made the interesting point that while Bach's music (in the
case of the St. John Passion, for
example) included some anti-Semitism, none of Wagner's actual music was about
anything related to such matters; the Ring
cycle is all Norse mythology. So a
visitor from another planet, knowing nothing about the composers but listening
to their music, might conclude that anyone whose background is Jewish would
boycott Bach and listen to Wagner. (But
that same web author nonetheless drew the line between Bach and Wagner: Bach is fine, Wagner is not.)
As I read a bit more, I came to
conclude the picture is somewhat more ambiguous than might appear at first
glance.
Daniel
Barenboim, the famed (Israeli) pianist and conductor (who's been music director
at symphonies in Germany as well as the Chicago Symphony Orchestra and the
Orchestre de Paris, and who's also criticized Israel for occupying Palestinian
territories) has recalled the prevalence of anti-Semitism in Germany at the
time and concluded that the "historical background does not change the
fact that Richard Wagner was a virulent anti-Semite of the worst kind whose
statements are unforgivable"—but who also concludes that "as
revolting as Wagner's anti-Semitism may be, one can hardly hold him responsible
for Hitler's use and abuse of his music and his worldviews." Barenboim also observed that "the Jewish
composer Ernest Bloch, for one, refused to accept Wagner as a possession of the
Nazis: 'The music of the Nazis is not
the prelude to Die Meistersinger,' he said, 'but rather the
Horst-Wessel-Lied; they have no more honor than that, further honor cannot and
shall not be given them.'"
Barenboim
also opposes the Israeli ban on Wagner performances. He argues that the ban means "we are
giving Hitler the last word, that we are acknowledging that Wagner was indeed a
prophet and predecessor of Nazi anti-Semitism, and that he can be held
accountable, even if only indirectly, for the final solution." He rejects that proposition.
Moreover,
whatever his revolting public pronouncements may have been, Wagner picked
Jewish conductors and artists to lead and perform his music. One can always wonder whether people like
Wagner—and one can think of hundreds of different examples unrelated to
anti-Semitism—if plucked out of 19th-Century Germany and placed in 21st-Century Germany, would take a look at history and change their minds completely. One can suspect, too, that it would have been
a very long leap for Wagner to jump from his political views to endorsement of
genocide.
There
was a book published in 2005 by a lawyer who is also an art and music critic, Richard Wagner and the Jews, that was
summarized on Amazon (I have not read it).
The author noted Wagner's public anti-Semitism: "Wagner's
close ties with many talented Jews, then, are surprising. Most writers have
dismissed these connections as cynical manipulations and rank hypocrisy. Examination
of the original sources, however, reveals something different: unmistakable,
undeniable empathy and friendship between Wagner and the Jews in his life. Indeed, the composer had warm relationships
with numerous individual Jews. Two of
them resided frequently over extended periods in his home. One of these, the rabbi's son Hermann Levi,
conducted Wagner's final opera--Parsifal, based on Christian legend--at
Wagner's request; no one, Wagner declared, understood his work so well. Even in death his Jewish friends were by his
side; two were among his twelve pallbearers."
It is difficult to imagine that
someone with these kinds of relationships would have come anywhere near
endorsing the Nazis. At the very least,
Wagner must have faced (and ignored) a certain level of cognitive dissonance.
As I
often do when I encounter these kinds of questions, I asked a few friends who
think clearly about such things.
One,
a psychologist whose father was a secular Jew, wrote that "I tend to think
that in listening to Wagner, even when I find something 'awesome' or beautiful,
I find myself thinking 'but he was such a small, hostile person' and it
diminishes the listening experience."
A
faculty friend who teaches ethics wrote to me that "If you enjoy Wagner's
music, and not because it is anti-Semitic, I see no problems. The value of the music should, surely, be
kept separate from the value of the man."
Another
faculty friend, Jewish, wrote me that "I remember my Mom saying something
about Frank Sinatra's misogyny and outright nasty treatment of many women:
'You have to differentiate between the man and the music'. . . . And, by the way, I drive a VW Eurovan, but it
originally was the 'people's' car, designed in part by Hitler, and endorsed as
the official car of the Nazi party."
A faculty friend who's not Jewish (but who is an ordained Protestant minister) wrote to me that irrespective of whether the allegations of the connections between Wagner and Nazism are valid, "your questions remains: what do we do with art/science/other products that are created or even made possible in settings that are determined to be evil? . . . There is considerable discussion of this with regard to the medical/scientific findings that were learned in experiments that happened in the [Nazi concentration/extermination] camps. In fact, when Art Caplan was at [the University of Minnesota], he even had a large international conference considering these same questions. Currently the scientific consensus is that inappropriate methods undermine the validity of the findings, and the results should be considered skeptically if at all. . . . Regarding art that is produced in such settings, it remains a great question. We do value art that comes from asylums, or from other groups seen as demonic (think of the music from Stalin times, from Mao's China, etc.)."
I replied that it seems to me there is a difference: scientific research findings may be utterly ill-gotten, and may also be invalid, but they might also lead to great social benefit (e.g., medical "research" data from the Nazi camps, if it were accurate and useful), whereas there's no particular gain or loss for society if I do or do not listen to Wagner. In the case of music and art, especially when the composer/artist is long dead, it is purely a matter of personal opinion. But the parallel is an interesting one.
Another friend, also Jewish, told me
that "My first take on it is that we enjoy a lot of art whose creators
were pretty disgusting, from Rembrandt the insolvent to Renoir the
fascist. Why should Wagner be any
different? We enjoy Marlowe's plays (in
the rare instance we get to see them), although he was certainly a thug and
probably a murderer."
Those comments bring to mind a
thought I've had a number of times over the years. Why should anyone pay any attention to the statements
of artists about political matters—when most of them, in my experience and in
reading about them, have little or no education or learning about the
construction of a society or effective socio-political policies and practices? While I think it's perceptive when a leading
Hollywood figure endorses and financially supports a candidate I like, I don't
harbor any illusions that they may necessarily have any great wisdom on society
in general. So perhaps that is how I should treat Wagner as well.
Kathy also made an interesting
point. Is there a difference between
listening to a piece of music when there are still royalties going to someone
and listening to a piece of music that's in the public domain—and the artist,
however reprehensible, is making no money?
She observed that if Bruce Springsteen, for example, were to give voice
to views she found vile, she'd never listen to him again, much less purchase
his music. In that instance, it would
also be current. What about when, as with Wagner, over 150 years have passed since some of the music was written—what is
the possibility, as I suggested before, that his views about the world would be
quite different today than they were in the middle of the 19th Century? Do we give some allowance, some grace, for a
different age?
And so perhaps I can simply put
aside Wagner's obnoxious views and continue to enjoy his music. It would be an odd stretch if I were to "boycott"
Wagner, in sympathy with my Jewish friends at the horror of the Nazis and the
putative link with Wagner, when those friends don't themselves boycott Wagner!
Authors and actors and artists
and such
Never know nothing, and never
know much. (Parker)
PATIENCE, n. A minor form of despair, disguised as a virtue.
Those of you who attended the
University of Minnesota, or have been on the Minneapolis campus much, know that
Northrop Auditorium is perhaps the iconic campus building. It was opened in 1929 and for several decades
was home to the Minneapolis Symphony Orchestra and site of Metropolitan Opera
performances, in addition to hosting a multitude of other cultural events and
countless graduation ceremonies. But
time took its toll and the University never had the money to effectively maintain
or improve it (capital funds went first to classrooms and labs and other
academic facilities). It was never a
good facility acoustically—and the original architects and users knew it. Dimitri Mitropoulos, conductor of the Minneapolis Symphony from 1937 to 1949 (before he left for the New York
Philharmonic), was once asked what would improve the acoustics of
Northrop. His response was "dynamite."
In the early 2000s it became clear
that something had to be done. As one
University official put it, during a discussion at a meeting I was in, the
choices were to demolish it, fix the exterior and use it as a parking ramp, or
board it up and let it sit there. The
HVAC system was the original one installed in 1929; the environmental safety
staff at the University determined that during events when the 4800-seat
auditorium was full or nearly so, the CO2 level in the auditorium got
sufficiently high that it made everyone sleepy.
The stage was being held up by pieces of 8x8 wood jammed under the
boards to keep them from sagging. And so
on. Moreover, unless something was done
to secure the exterior—roof, walls, windows—the building was in danger of
disintegration to the point it could not be saved.
Demolition of the iconic building at
the head of the Mall was simply not an option—everyone agreed about that. Making it a parking ramp was, of course, a
joke. But simply mothballing it was a
possibility.
So, much to his credit (in my
opinion), then-president Robert Bruininks authorized emergency expenditures of
about $14 million from University reserves and repair funds to stabilize the
exterior to prevent the building from falling down. He also launched an ambitious fund-raising
project to raise money to renovate it.
In a complicated funding plan using private support, some institutional
and some state funds for repairs and renovation, and a projected income stream
from future events, the University cobbled together enough money to completely
redo Northrop, at an additional cost of $87 million. Except for the ornate front lobby, the
building was completely gutted—only the shell remained. It was no doubt more expensive to gut and
renovate the building than it would have been to simply demolish it and start
anew, but it was inconceivable that that would happen.
There
will continue to be an auditorium in Northrop Auditorium, but a smaller one (a
decision made after extensive review of other performance facilities in the
Twin Cities area and consultation with other performance groups, such as local
theaters and music venues). What many
applauded was the intention of having the new auditorium match the acoustical
quality of the Concertgebouw in Amsterdam, one of the most highly-rated musical
halls in the world. Unlike its history
for the last 90 years, now there will also be major academic programs housed in
the building, so it will become a home for both students and faculty, a living
building. For most students at the
University for the last several decades, Northrop was a place where one went
for graduation and convocation—and that was it, unless one attended some kind
of cultural event as well. Most of the
time it stood empty, ghostly.
I and others from the University
took a tour of the mostly-completed renovation in August. It will be a fabulous facility when it's
done.
Marvelous though the improvements to
Northrop may be, the larger campus was classified by Travel and Leisure magazine as one of the ten ugliest in the
country. The methodology for compiling
the list was perhaps a bit suspect; it "consulted the Princeton
Review, Unigo.com, and other forums where students hotly debate all aspects of
campus life." Ugliness is also in
the eye of the beholder. Methodology and
subjectivity aside, I'm inclined to agree that the institution has not
maximized its setting along the river to make itself more attractive and
park-like. It also made, in my opinion,
some awful architectural choices from the 1960s to the 1980s, so ended up with
a considerable number of buildings that one (Minnesota) professor of sociology
described as neo-penitentiary. I recall
attending a meeting of the University's Board of Regents a number of years ago;
one of the Board members exclaimed that the Twin Cities campus was one of the
most beautiful in America. Some of us in
the audience just looked at each other and rolled our eyes. It is one of the world's great universities,
without doubt; it is not one of the more attractive campuses.
EGOTIST,
n. A person of low taste, more
interested in himself than in me.
She
runs the gamut of emotions from A to B. (Parker
writing about Katharine Hepburn)
I cancelled my Macy's account this
summer after I read in the news that Macy's, among other corporations, lobbied
Texas Governor Rick Perry to veto a bill guaranteeing equal pay for equal work
for men and women. Perry, of course,
given his political views, vetoed the bill.
(A friend of mine posted on Facebook:
"Texans: They make the best
case against 'intelligent design.'")
I suppose I do business with a
number of corporations who engage in practices or lobby for laws that I
dislike, but I just don't know what they do.
When I learn of such activities, however, I cease giving them any of my
money. I have never set foot in a
Walmart and never will because I find their corporate human resources practices
to be reprehensible, and the political activities of the Walton family are
completely at odds with my views.
The difficulty with this approach is
that if I carefully examined the activities of every company/corporation from
which I purchase products or services, I might end up unable to buy
anything. So I act only when it's
blatant. (But I have gone on the web
recently and determined that there are products from Koch Bros. companies that
I can avoid purchasing.)
RESTITUTIONS,
n. The founding or endowing of universities and public libraries by gift
or bequest.
It
serves me right for putting all my eggs in one bastard. (Parker)
Elliott and I often exchange
interesting websites (we do so with Kathy and Krystin as well, but usually it's
him or me who starts it). One he sent me
was "50 Unbelievable Facts about Earth"; one of the facts is that the
highest mountains and deepest ocean trenches only equal about 1/5000th of the
earth's diameter. Elliott pointed out
that as a result, "globes lie when they have textured surfaces for
mountains. A giant cosmic hand would not
be able to distinguish the ruggedness because the ratio of the distance between
the highest mountain top and the lowest ocean trench relative to the total diameter
of the earth is smaller than the grooves on your fingerprints relative to the
total thickness of your finger."
EDUCATION,
n. That which discloses to the wise and disguises from the foolish their lack
of understanding.
When I was young and bold and
strong,
The right was right, the wrong was wrong.
With plume on high and flag unfurled,
I rode away to right the world.
But now I'm old - and good and bad,
Are woven in a crazy plaid.
I sit and say the world is so,
And wise is s/he who lets it go. (Parker)
Getting a college education and degree
has not been a direct or obvious path for Elliott. After two years at Minneapolis Community and
Technical College, where his academic record was outstanding, he spent last
year at the University, living on campus in one of the dorms. It wasn't a happy experience, for a number of reasons; his academic record was not great, and the year there soured him
completely on the University.
During the summer he didn't do anything about enrolling in school for the fall—anywhere—which by August was starting to drive me crazy. He knows as well as anyone that a college
degree is now virtually a sine qua non
for professional advancement in the world (notwithstanding the examples of
Gates and Zuckerberg—at whose companies, many have pointed out, one almost certainly
needs a college degree to work at professional jobs). We were cutting and tearing down vines that
grow up on the house one day mid-summer; Elliott said to me that I didn't have
to worry about him not completing college because it was jobs like this—tearing
down vines—that he certainly didn't want to have for the rest of his life.
He concluded, after his
disappointing University experience, when he'd aimed for a degree in
psychology, that he'd made a mistake. He
had earlier decided that his talent and interest in art—specifically drawing—would
remain avocational because while he loved to draw, he didn't like it when he
was told what to draw or what medium to use (e.g., in an art class). Last summer, however, he said he realized
that that was the wrong decision, and that what he really wanted to get a
degree in was something related to art that would link him to the creative side
of video games. I told him that was fine
with me, but that this was certainly going to extend his college career. He knew that.
By late August, however, nothing had
happened, and I was getting increasingly annoyed. I had determined that he was NOT going to
have some part-time job, not be in school, and spend the rest of his time
playing video games, and I had told him as much. Then one day he surprised us (and I think
himself); someone his mother works with had some relative who graduated from
Moorhead State University (across the river from Fargo) with a Bachelor of Fine
Arts (BFA) and had been doing very well in the video-game industry for the ten
years since he graduated. Pat's friend put her relative in touch with Elliott; the relative described what he'd done and how impressed he was with the Moorhead BFA program. So that's where Elliott decided he'd go.
Elliott expressed a little dismay
that he'd wasted a year at the University.
I reminded him that no learning is wasted, and pointed out that he'd
found his classes in Japanese history and in Viking history interesting, and
that the abnormal psychology class may not have been well run, in his view, but
he still learned a lot from it. He agreed and went on to make an observation
that surprised me: he said that the two
courses he'd had to take as part of his major but that he'd not been looking
forward to, research methods and statistics, had taught him an enormous amount
that he'd retain for the rest of his life.
I told him I was glad, and that anyone who's reasonably conversant with
statistical methods and how to analyze statistics will be better off than someone
who is not. (Lies, damn lies, and
statistics and all that.)
What's amusing to me about this turn
of events is that Elliott noted matter-of-factly that the BFA—whether from
Minnesota, Moorhead, or Harvard—requires art history courses, and since he's
not had one, he'll have to meet the program requirement for 2-3 such
courses. Elliott has usually avoided
going into art museums when we've traveled and he'd earlier resisted the
suggestion from both Kathy and me that he take an introductory art history
course. Now that it's integral to the
program he wants to pursue, of course, it's just a fine idea. (I will give him due credit, however, for
changing his mind. The guy he'd been in
touch with about Moorhead explained to Elliott that in his experience, the
people who were more successful in the artistic/creative side of the video-game
business were people who had a good grounding in the methods and lessons of
traditional art, and Elliott took that message to heart.) I did not, however, resist ribbing him about
skipping the art museums.
(Later in the fall, after watching
the politics in Washington over the federal shutdown, and being increasingly
irritated by the fundamentalist/Tea Party right wing, the continuing attempt to
infuse religion into public policy, repeated attacks on the social safety net,
and the widening gap between the 1% and the rest, Elliott reiterated the
thought that living in Sweden might be attractive—particularly because there
are over 30 video-game companies headquartered there.)
PLEASE, v. To lay the
foundation for a superstructure of imposition.
Don't
look at me in that tone of voice.
(Parker)
I was reminded by a news article of
a book by Paul Meehl, Clinical vs.
Statistical Prediction: A Theoretical Analysis and a Review of the Evidence,
published in 1954. Meehl, who was a
Regents Professor of Psychology, Law, and Philosophy, and one of the developers
of the Minnesota Multiphasic Personality Inventory (MMPI), argued in the book that
in the field of psychotherapy, statistical models almost always yield better
predictions and diagnoses than the judgment of trained professionals. His contention has been supported by reams of
additional research in subsequent decades—and has also been expanded to other
fields, including cancer patient longevity, cardiac disease, likelihood of new
business success, evaluation of credit risks, suitability of foster parents,
odds of recidivism, winners of football games, and future prices of wine. He really called into question the value of expert
diagnosis, a question that remains open today.
Meehl's primary academic field was
clinical psychology. I took a seminar
with him when I was a graduate student in psychology many years ago, and Thomas
Szasz' book The Myth of Mental Illness
was one of the topics of discussion.
Szasz argued that mental illness was a social construct, unlike diseases
such as cancer or the flu. Meehl
dismissed Szasz; he said in the seminar something to the effect that "I've
been in psychiatric wards and you can't tell me those people aren't crazy." He was nothing if not direct.
Fortunately for me, when I was
seeing a therapist after my divorce, I only wanted a third-party view of
whether I was acting and thinking and responding appropriately, not a diagnosis
of psychopathology of some kind. (All
along she told me I was doing fine.) But
reflecting on Meehl makes one take expert opinion in a wide variety of fields with
a grain of salt. (Except, of course, for
my own.)
MORAL,
adj. Conforming to a local and mutable standard of right. Having the
quality of general expediency.
Those who have mastered
etiquette, who are entirely, impeccably right, would seem to arrive at a point
of exquisite dullness. (Parker)
Kathy and I attended what I think is
the hottest football game I have ever been to, the opening Minnesota Gopher
game on August 29 at 6:00 in the evening.
The temperature was over 90 degrees and we were in the sun. There is a joke in my family that I never
sweat. It isn't true, of course, and my shirt at the game was proof of that.
It was uncomfortable and we left shortly after the half so we could go
home and cool down. (Unfortunately, we
also missed the most exciting part of the game, which the Gophers won 51-23.)
I had horrible visions during the
last part of August that our home central air conditioner would break down (it
is 17 years old), as the temperatures were into the 90s and the humidity was
high. Fortunately, it didn't. We would have moved temporarily to a hotel if
it had. Once again we Scandinavians don't
do well in the heat. I sometimes wonder
how my friend Nils Hasselmo, with whom I worked at the University off and on
for over 20 years, survives living in Tucson!
But on the other hand, in
mid-September we attended what was probably one of the most pleasant games I
can recall. The two of us had a nice
brunch beforehand, the weather was perfect for a game, and Minnesota won
42-24. The only shadow on Kathy's day
was that that evening I beat her 2-1 in cribbage.
PREJUDICE, n. A vagrant
opinion without visible means of support.
RUMOR, n. A
favorite weapon of the assassins of character.
There was what I thought of as a
rather ominous research finding published mid-year, a finding that confirmed my
own suspicions. Essentially, a group of
French researchers fed two different groups of mice a "high-fat, high-sucrose enriched diet, with one group
receiving a cocktail of pollutants added to its diet at a very low dosage." As reported in the Federation of American Societies for Experimental
Biology Journal, what the research findings suggest is that at least
certain kinds of disease and affliction may be more prevalent because of low ("safe")
doses of multiple pollutants. The
journal editor concluded that the research "shows
that evaluating food contaminants and pollutants on an individual basis may be
too simplistic. We can see that when 'safe'
levels of contaminants and pollutants act together, they have significant
impact on public health."
Two examples of
this have occurred to me over the past year or so—and these are only two very
small examples. Those of us who drive
cars know that one must replace tires every so often because they wear
out. What that means is that billions of
microscopic bits of rubber are being deposited on roads and nearby ground—or
floating in the air. Where do all those
little bits of rubber go? Presumably we
breathe them in and they get into groundwater as well as the soil in which
crops are grown. In addition, we fill up
the container under the hood with window-washing fluid so we can turn on the
wipers and clean dust and crud off the windshield. Have you ever looked at the label for that
liquid? It essentially declares the
stuff toxic to all living things. So we
routinely squirt it onto our windshields and watch it wash off onto the car and
the ground. Again, where is all that
toxic substance going? I almost don't
want to use the fluid, a resolution it would be easier to carry out in more
temperate climates; without it in Minnesota in the winter, water in the
container for washing the windshield would simply freeze.
And of course
cars also sometimes leak oil and antifreeze and other fluids.
While the
experimental research that was reported in this article is only one small
piece, the whole notion that having even low levels of multiple toxins in our
bodies may have effects beyond those of any single pollutant simply makes
sense. Apart from supporting strict anti-pollution laws (and their enforcement), legislators who vote for such laws and for the funding to enforce them, and laws and appropriations that would, at a societal level, move us away from the practices that contribute to these kinds of pollution, there isn't much
any one person can do. We all drive cars
and we all get dirty windshields. But if
we could get around without cars, we'd be better off. And I wouldn't have to buy car insurance.
SAUCE,
n. The one infallible sign of civilization and enlightenment. A people with no sauces has one thousand
vices; a people with one sauce has only nine hundred and ninety-nine. For
every sauce invented and accepted a vice is renounced and forgiven.
Take care of the luxuries and
the necessities will take care of themselves.
(Parker)
It's too bad I live
in one of the most geographically-dispersed metropolitan areas in the country
(as my good friend the late Professor Judith Martin, urban geographer, told me
many times). The data on urban density
(from the Census Bureau, 2010, round numbers) are interesting.
Minneapolis-St.
Paul (MSP) ranks 16th in the country
in population, at 2,651,000. (First is
the New York-Newark area, at 18,352,000; LA-Long Beach-Anaheim is 12,151,000;
Chicago-Gary is 8,608,000, etc.) MSP
ranks just above Tampa-St. Petersburg (2,442,000) and just below San Diego and
Seattle (2,957,000 and 3,059,000, respectively). The others with a greater population, in
descending order from more to fewer people, after Chicago, are Miami,
Philadelphia, Dallas-Fort Worth, Houston, Washington DC, Atlanta, Boston,
Detroit, Phoenix, and San Francisco.
In population
density, however, MSP ranks 105th,
with 2596 people per square mile. There
are only five cities with more than 2 million people that have a lower
population density than MSP: Tampa-St.
Petersburg, San Juan PR, St. Louis, Boston, and Atlanta. Except for Atlanta, the population density of
the other four is similar to that of MSP:
Tampa is 2551, San Juan is 2478, St. Louis is 2328, and Boston is
2231. (Atlanta is a huge outlier, at 1706
people per square mile but an urban population of 4,515,000; I had no idea it
was so spread out.)
To give you an
idea of density in other areas, LA is #1 at 6,999 people per square mile. Here are a few others, in descending order
from the top after LA:
San Francisco 6266
people per square mile
San Jose 5820
Delano, CA 5482
New York 5318
Davis, CA 5156
Lompoc, CA 4815
Honolulu 4715
Woodland, CA 4550
Las Vegas 4524
Santa Maria, CA 4478
Miami 4442 (12th in rank in U.S.)
And a whole lot more California cities that occupy the next 14 ranks,
down to 3775 people per square mile. A
few others you might know of:
Salt Lake City 3675 27th
New Orleans 3578 31st
Denver 3554 32nd
Portland OR 3527 34th
Chicago 3524 36th
Washington DC 3470 39th
Lexington KY 3315 51st
Phoenix 3165 57th
Baltimore 3073 61st
Houston 2978 67th
And then a lot more on down to MSP at 105th.
That dispersion has made mass transit a
challenge because the population density is low enough that it can be difficult
to build mass transit that enough people can use. I think the Twin Cities needs about 20
light-rail lines, in a spider web pattern, if the system were to work well
(that is an opinion uninformed by any information from anyone who knows
anything about urban mass transit). But
at a billion dollars a crack for a rail line, I don't think even any of my
potential grandchildren will see that system in place.
So it is no surprise that the residents of
MSP aren't provided with a high-quality mass transit system—and no surprise
that there are a lot of cars on the road releasing minute quantities of rubber
and using windshield fluids.
At the same time one can decry dispersion
as an impediment to effective mass transit, I suggested earlier that people
like green and water. Having lawns and
trees and shrubs means people are more spread out, rather than being piled
together in high-rises a la Tokyo and Seoul with nary a tree or bush in sight. What I want is a compromise between dispersal
and mass transit: One heck of a lot more
tax money put into mass transit so we can all get around efficiently and with
far fewer demands on the environment while still having yards and shrubs and
gardens and the urban forest—and I'd gladly take the increased taxes if they
also meant we could get by with one car rather than two and could use that one
car less frequently. I might even save
money on that deal.
MIND,
n. A mysterious form of matter secreted by the brain. Its chief activity
consists in the endeavour to ascertain its own nature, the futility of the
attempt being due to the fact that it has nothing but itself to know itself
with.
Bewildered
is the fox who lives to find that grapes beyond reach can be really sour. (Parker)
There were a couple of interesting articles on food in the late summer. One was a book review in the LA Times, of a book by some guy named Michael Marder, Plant-Thinking: A Philosophy of Vegetal Life. The reviewer talked about theoretical developments in the humanities, "striving to go beyond the traditional human subject in order to account for other types of existence and experience, including animals and autonomous machines. A new field has emerged, loosely labeled 'the posthumanities,' which attempts to fill in the millennia-long blind spots caused by our own narcissism. . . . Marder wants to forge an encounter with vegetal life, all the while respecting the alien ontology of floral ways of being. For while a shrub may not consciously 'experience' the world in which it grows, this does not, for Marder, mean that it is not thinking and doing in profound philosophical, and even ethical, ways." There is language about the idea of a plant "soul," "non-conscious intentionality," plants as role models, the human-centric views of plants, and so on.
"Plant-thinking does not oppose the use of fruit,
roots, and leaves for human nourishment," this final section claims:
Rather, what it objects to is the total
and indiscriminate approach to plants as materials for human consumption within
the deplorable framework of the commodified production of vegetal life. . . .
[I]nstead of "What can I eat?" we should inquire, "How
am I to eat ethically?" To put it succinctly, if you wish to eat
ethically, eat like a plant!
And more in this vein.
I could hardly keep from cackling
when I was reading the book review, and several of the readers commenting
pleaded that someone tell them this was a hoax. Such language as this makes it difficult not
to think that this is a joke:
Marder's book heralds an impressive and singular new voice, prompting a
slew of new questions around different ontologies and shared ecologies. It
succeeds in expanding the circle in which, to gesture to Donna Haraway, species
meet. Marder's work brings out the
profound pathos underwriting a generation that has more experience growing
digital carrots and apples in Farmville than cultivating actual fruits and
vegetables. The sheer number of couch
potatoes and human YouTubers cannot be underestimated, and I hope this author's
subsequent work also considers the relationship between technology and vegetal
life.
I do not know if this kind of
writing and line of thought represent current approaches in some branch of the
humanities. If it does, one can
understand why there might be widespread public skepticism about requiring that
students be educated in the humanities as part of their college degree program.
But there's a new way to avoid offending either plants or animals in one's
diet: about the same time this (I think
goofy) review of a goofy book appeared, there was also news about meat
cultivated in a lab, which did not involve killing any cows for the beef. Interestingly, PETA endorses the work
because it reduces or eliminates cruelty to animals (the process uses stem
cells). Other supporters, otherwise
vegetarians, note the reduction in demands on land and other natural resources. "Perhaps most strikingly, the
philosopher Peter Singer, whose book Animal Liberation (1975) was a
founding text for the modern movement, wrote in The Guardian: 'I haven't eaten meat for 40 years, but if in
vitro meat becomes commercially available, I will be pleased to try it.'"
The absolutists on animal rights,
however, don't support it, because they oppose any view of animals as
resources/property for humans. There
appears to be debate in the vegetarian/vegan community about the ethics of
eating in vitro meat. As one writer
observed, "The only logical way to make sense of the reluctance of many
vegetarians to back IVM [in vitro meat] is that their choices are not as driven
by animal welfare and environmental considerations as we—and they—assume.
Perhaps a distaste for eating meat is a visceral feeling that is only loosely
connected to an ethically motivated imperative not to cause undue suffering to
animals." It does indeed seem that
the animal welfare and environmental benefits arguments fall by the wayside if
there remains opposition to in vitro meat even though it would represent huge
gains on both scores and obviate the principled objections. I'm neither vegan nor vegetarian, but I think
this is a good idea; the resources required to produce meat for food vastly
outstrip those required to produce crops for food, so scaling in vitro meat
production up to factory production would dramatically reduce demands on the
planet, help address climate change (far fewer farting cows!), and perhaps allow
more people to be fed better (setting aside the problem that there are too many
people already). (The president and
co-founder of PETA doesn't get confused on the point: "'Any flesh food is totally repulsive to
me. . . . But I am so glad that people
who don't have the same repulsion as I do will get meat from a more humane
source.'")The researcher who announced the discovery, Mark Post at Maastricht University in the Netherlands, said that "potentially, you can do this in your kitchen. You can grow your own meat. But you have to know what you want to eat 8 weeks in advance." As I understand the situation from the media coverage, however, there's still a ways to go on getting in vitro meat to taste like meat from animals—and a considerable ways to go in scaling production to factory level. They have to culture some level of fat as well, among other things, to achieve the flavor most will want. I surmise that my children will eventually be able to eat in vitro meat and it will taste great. If they can pursue this to its logical end, then theoretically one will be able to eat veal, for example, without any qualms.
It's on the way to tablets
for food!
If you put together the extremes of these views, don't
eat plants and don't eat animals, it becomes unethical to do anything but
starve to death once one can no longer drink mother's breast milk. Of course, no one would achieve puberty and
thus be able to conceive children, so that option would disappear as well. (I know, it's unlikely that anyone's really
arguing that it's unethical to eat plants.
But the logic of some of these opinions, pushed to the extreme, does verge
on the hilarious.)
LAND,
n. A part of the earth's surface, considered as property. The theory that land
is property subject to private ownership and control is the foundation of
modern society, and is eminently worthy of the superstructure. Carried to its
logical conclusion, it means that some have the right to prevent others from
living; for the right to own implies the right exclusively to occupy; and in
fact laws of trespass are enacted wherever property in land is recognized. It
follows that if the whole area of terra firma is owned by A, B and C, there
will be no place for D, E, F and G to be born, or, born as trespassers, to
exist.
It's
not the tragedies that kill us; it's the messes. (Parker)
I
lost a good friend, colleague, and mentor in early September. Regents Professor Frank Sorauf died at 85 of
what probably was a combination of the effects of Alzheimer's and old age. I first took a class from Frank while an
undergraduate in political science, the American Judicial System, and then
worked for him as an undergraduate research assistant during my senior year; I
helped him do research on his book on the separation of church and state. (Hearing his tales about interviewing Madalyn
Murray O'Hair about the Bible-reading-in-school case she won in the U.S. Supreme Court was alone worth the price of the work [Murray v. Curlett, if you're interested].) At Frank's request, my friend Rolf Sonnesyn
and I read aloud to each other the page proofs of the book to catch any
printing errors—my, how the technology of printing a book has changed.
During
the year he'd hired me as research assistant he was appointed dean of the
College of Liberal Arts; he brought me along as a student assistant to the
dean, so he effectively launched my career working at the University. A number of faculty members whose memories
reach back 30+ years will still say that Frank was the best dean that CLA had
in a long time.
Frank was truly a renaissance man, a
pianist of near-concert quality, a music and opera expert, great chef, and
considerable athlete. I found him to be
a stimulating, interesting, and exacting teacher (who brooked no nonsense in
class); the fact that he was designated a Regents Professor signifies his
tremendous accomplishments in his field (he was one of the nation's leading
experts on campaign finance and co-authored a very influential amicus brief in
the Supreme Court case on the McCain-Feingold law). He had an incisive mind on public policy—and
most other—matters as well as a wonderful if frequently biting sense of
humor. Unfortunately, with the onset of
Alzheimer's in the last few years, we really "lost" him awhile
ago. So as is often the case with
Alzheimer's, the death was really a blessing.
But I miss my lunches with him and our comparing notes at the simulcasts
of the Met Opera performances, and he was a dinner guest at our house many
times over the years, to recall but a few of the times we socialized.
I was able to obtain, from a
colleague who had collected them, Frank's written notes about a memorial
service. They are pure Frank and can
perhaps convey, in a small way, part of his personality.
I really don't think a memorial is
necessary – we all write on sand, and what we have accomplished will be erased
by the desert winds. The best thing
about the nightly desert winds is that they erase everything – mistakes
included – and that they rearrange and smooth the sands for new and fresh
impressions. All of that became clear to
me on the sands of the Sahara I saw in southern Morocco at sunrise one January
morning.
Let there be a memorial event, though,
if people feel so inclined. For once, I
will be unaware of any excess or false praise heaped upon my memory. (Or any too faint praise!) However, I do not want a memorial occasion to
be one I would have dreaded to have to attend. I would not want one that ran beyond 40 or 50
minutes. And less would really be
more.
Just do what you can to keep it
short and to prevent the worst assaults on the truth, the most egregious
self-aggrandizement of the speakers, and any flagrant imposition on the
patience of the assembled. If a few
invited (and prepared) speakers don't use up the time, I would rather there be
some music . . . and then refreshments.
I would also favor a venue at which my estate could offer people a REAL
drink.
VIRTUES, n.pl. Certain abstentions.
ACCOUNTABILITY,
n. The mother of caution.
There was an article in the local
newspapers about an increase in suicide rates, showing that in 2011 there was a
"13 percent increase from 2010 in suicides overall. Most of that increase is among middle-aged men
55 to 59 years old"; the rate for those over age 65 also increased "to
the highest levels in 12 years."
These Minnesota statistics, the article reported, reflect national data.
An associate professor at St. Cloud
State University, Jennifer Tuder, wrote an op-ed piece in the Star-Tribune about a one-person
performance she does about surviving suicide, after her father's suicide. She takes sharp issue with the contention
that suicide can be noble or that survivors are better off without the deceased. She, too, read the statistics and then wrote
the op-ed piece.
We aren't
[better off]. Surviving the suicide of a loved one at least doubles your risk
for completing suicide. Suicide survivors are twice as likely as the general
population to suffer from mental illness, particularly depression.
We feel
stigmatized, ashamed and isolated. We have to live with painful, unanswered
questions, particularly the question of "why?" We are not better off.
So why
is it so important to these men, these men who are so like my father
demographically, to think that suicide is a noble choice? I believe the answer
lies in our traditional scripts for masculinity.
If men can't provide and protect and be in control,
then they are seen—and see themselves—as failing, she maintains.
That
may be true for some, but for others it is pure poppycock. I could not disagree with her more
vigorously—if one changes the context.
Camus wrote in The Myth of Sisyphus: "There is but one truly serious
philosophical problem, and that is suicide." I agree.
In my construction, ultimately the question is whether my life situation
is so grim, so unlikely to improve, and so likely to get worse, that the only rational response is suicide.
Another long-time colleague and friend of mine (and of Frank Sorauf's) and I were talking over lunch one day after Frank had been diagnosed with Alzheimer's (which had
started to show its effects, although only minimally at that time). My colleague—a husband, father, and
grandfather—said with grim determination that if he were diagnosed with
Alzheimer's, he'd commit suicide (before it became too advanced for him to know
what was happening). I said I would as
well.
Perhaps
Professor Tuder was not thinking of different contexts. One can understand how devastated a family
would feel if a senior male—which is what these statistics and her play are
about—committed suicide without notice and without any debilitating
conditions. I wouldn't take issue with
her under those circumstances. Whether
or not the "traditional scripts for masculinity" are characteristic
of all or most males seems to me to be questionable—and may well be
generational. Were I to lose my job, and
thus the larger part of the household income, I would become depressed if I
could not find another one. But with a
wife and kids I love, suicide wouldn't even cross my mind, and we'd have
sufficient income to muddle along.
I have made it clear to Kathy and the kids, however,
that if I am diagnosed with Alzheimer's, or something similar, and I still have
sufficient possession of my faculties to know what's going on, I will choose my
own exit. What possible benefit can
there be to my family or friends to watch me disintegrate to the point that I
not only lose my mind but, eventually, also control of bodily functions? From my standpoint, there is no
reason—none—to keep me alive at that point.
They would indeed be better off without me, and I would want them to be
comforted by the thought that I avoided being in a situation I vehemently
detested and took action to prevent.
There would be none of the "painful, unanswered questions" to
which Professor Tuder refers. (I have a
health-care directive that is starkly clear about my wishes if I am unconscious
and have virtually no chance of recovering my cognitive functions: no medicines, no heroic measures, no
nothing. Let me go.)
It
seems to me that in that kind of case, neither my spouse nor children should
feel stigmatized nor should they be at any greater risk of suicide than they
would have been otherwise. In fact,
given that we generally agree on this view about the end of life, I would think
they would be quite assertive with any potential critics in insisting that it
was absolutely the right thing for me to do—and that I'd make the same argument
to the critics if I were still around to do so.
It would be a logical impossibility, of course: arguing with critics, after my death, that it was right for me to commit suicide. I
suppose I could leave behind a vigorous defense of my rationale and action, and
I could compose such a document right now.
Maybe I should.
You may very well think "that's
easy for him to say now, but maybe not so easy if the time comes." I know, and you would likely be correct if
you thought that. An equal concern is that my grasp of reality will slip away before I could do anything about
it. In that case my survivors would have
to hope for a mercifully quick end from other causes (e.g., pneumonia or stroke
or cardiac arrest).
REFLECTION,
n. An action of the mind whereby we obtain a clearer view of our relation
to the things of yesterday and are able to avoid the perils that we shall not
again encounter.
BRAIN, n. An apparatus with
which we think what we think. That which distinguishes the man who is content
to be something from the man who wishes to do something. A man of
great wealth, or one who has been pitchforked into high station, has commonly
such a headful of brain that his neighbors cannot keep their hats on. In our
civilization, and under our republican form of government, brain is so highly
honored that it is rewarded by exemption from the cares of office.
A daily email I receive, ScienceDaily, with updates in various
fields of scientific endeavor, had two interestingly juxtaposed articles on September
4.
It seems that while cleanliness may
be next to godliness, it isn't next to good health.
One article reported research at
Cambridge University suggesting (with what appear to be pretty decent—methodologically
sound—data) that there is a significant correlation between "a nation's wealth and hygiene and the Alzheimer's 'burden'
on its population. High-income, highly
industrialised countries with large urban areas and better hygiene exhibit much
higher rates of Alzheimer's." The
report went on to note that the "study adds further weight to the 'hygiene
hypothesis' in relation to Alzheimer's:
that sanitised environments in developed nations result in far less
exposure to a diverse range of bacteria, viruses and other microorganisms --
which might actually cause the immune system to develop poorly, exposing the
brain to the inflammation associated with Alzheimer's disease, say the
researchers." Apparently the
hygiene hypothesis is already well established in public health, linking
hygiene to allergies and autoimmune diseases.
But this was apparently the first significant research linking hygiene
to Alzheimer's.
We have often joked in our household
that we shouldn't be TOO clean because it is good for the immune system to have
a little dirt now and then. We had no
idea that there was actually research that supports our joke. But the article says we're right: "Exposure to microorganisms is critical for the regulation of the immune system," write the researchers, who say that—since increasing global urbanisation beginning at the turn of the 19th century—the populations of many of the world's wealthier nations have had increasingly little exposure to the so-called "friendly" microbes that "stimulate" the immune system, owing to "diminishing contact with animals, faeces and soil."
Everything we have developed in first-world countries as a bulwark
against both disease and discomfort—"antibiotics, sanitation, clean
drinking water, paved roads and so on"—means less exposure to bugs that
humans have faced since humans evolved.
And that "lack of microbe and bacterial contact can lead to
insufficient development of the white blood cells that defend the body against
infection."
In retrospect I'm
glad we let the kids play outside in the sand and dirt in the back yard when
they were little—particularly because "childhood—when the immune system is developing—is
typically considered critical to the 'hygiene hypothesis.'" In addition, however, "the researchers
say that regulatory T-cell numbers peak at various points in a person's
life—adolescence and middle age for example—and that microorganism exposure
across a lifetime may be related to Alzheimer's risk." This does make one
wonder about how fastidious we should be in cleaning/scouring our bathrooms,
kitchens, and so on. Most of us were
raised to keep things clean—and to use cleaning agents that purport to kill
germs.
Of course, as Bill Bryson pointed out in A Short History of Nearly Everything, "because we humans are
big and clever enough to produce and utilize antibiotics and disinfectants, it
is easy to convince ourselves that we have banished bacteria to the fringes of
existence. Don't you believe it. Bacteria may not build cities or have
interesting social lives, but they will be here when the Sun explodes. This is their planet, and we are on it only
because they allow us to be." It
would seem, however, that our hygienic habits kill off enough of the
immunity-provoking bacteria that we increase our vulnerability to certain
categories of disease.
The second
article that same day reported on research at the University of California San
Francisco that seems to "reverse some of the negative effects of aging on
the brain, using a video game designed to improve cognitive control." One of the lead researchers "said his
game, NeuroRacer, does more than any ordinary game—be it bridge, a crossword
puzzle, or an off-the-shelf video game—to condition the brain."
In the game, . . . participants race a car around a winding track
while a variety of road signs pop up. Drivers are instructed to keep an eye out for
a specific type of sign, while ignoring all the rest, and to press a button
whenever that particular sign appears. The need to switch rapidly from driving
to responding to the signs . . . generates interference in the brain that
undermines performance. The researchers
found that this interference increases dramatically across the adult lifespan.
But after receiving just 12 hours of training on the game, spread
over a month, the 60- to 85-year-old study participants improved their
performance until it surpassed that of 20-somethings who played the game for
the first time.
The researchers, in physiology, neurology, and psychiatry, looked
at EEG recordings and the development of neural networks required for cognitive
control. They also found that the
effects of playing the game lasted.
The ScienceDaily article reported that "evidence
that the adult brain is capable of learning has been accumulating for more than
a dozen years." For all of us in
our 60s and beyond, that is reassuring.
But even so—as we all know, alas—brain function often erodes over the
years; the chap who developed the game, however, said that there are "some
exceptions, like wisdom."
Really? The claim is that wisdom
doesn't erode but cognitive functioning does?
How strange.
And darn it, here I thought playing bridge
into my dotage would help forestall or mitigate a decline in cognitive
abilities. Elliott must be equally
disappointed, because the games he buys seem not to have the same effect as
this one.
So I may have to take up specialized video
games as I get older—and sit in a dirty room while I play them.
DAY,
n. A period of twenty-four hours, mostly misspent. This period is
divided into two parts, the day proper and the night, or day improper--the
former devoted to sins of business, the latter consecrated to the other
sort. These two kinds of social activity overlap.
There was nothing separate
about her days. Like drops on the window-pane, they ran together and trickled
away. (Parker)
Anyone
who Googles "multitasking" will find a multitude of articles citing
scientific research demonstrating that multitasking is largely beyond the
capacity of the human brain and the idea that people can be more efficient by
multitasking is just garbage. It is true
that people can probably simultaneously use skills/capacities that are
unrelated to each other—one example is listening to the radio and folding
laundry—but trying to perform two tasks that both require a reasonable level of
cognitive function is almost certain to mean that both of them are done at a
lower level of quality than if done separately.
I have a personal example: try
petting two cats at the same time. That
does require a high level of cognitive function—and the cats let you know when
you're not paying attention.
THEOSOPHY, n. An ancient faith
having all the certitude of religion and all the mystery of science. The modern
Theosophist holds, with the Buddhists, that we live an incalculable number of
times on this earth, in as many several bodies, because one life is not long
enough for our complete spiritual development; that is, a single lifetime does
not suffice for us to become as wise and good as we choose to wish to become. .
. . Less competent observers are disposed to except cats, which seem neither
wiser nor better than they were last year.
A congeries of issues related to
privacy arose during the latter part of the year. Some were visible to all: the National Security Agency's access to emails
and telephone calls. Others, however,
were perhaps less prominent in the minds of many but they are ones that should
raise alarm bells for anyone who wishes, at his or her discretion, simply to be
let alone.
I consider the foundation document
in all of these debates to be Justice Brandeis's 1928 dissent in Olmstead v. United States. (I confess that Brandeis is in my personal
pantheon.) You can judge for yourself as
you read on, but in my view Brandeis was prescient in his assessment of the
interaction of technology and privacy.
The case was about wiretapping a telephone call and whether or not
federal agents doing so without judicial approval violated the Fourth Amendment
prohibition against unreasonable searches and seizures or the Fifth Amendment
protection from self-incrimination. The
Court ruled that the wiretapping did not violate the constitutional provisions;
Justice Brandeis dissented in what became one of the more famous dissents in
the history of the Court.
Brandeis went back to a 1910 case, Weems v. United States, to point out
that legislation and constitutional provisions may be enacted "from an
experience of evils" but the language should not "be necessarily
confined to the form that evil had theretofore taken. Time works changes, brings into existence new
conditions and purposes. Therefore a principle to be vital must be capable of
wider application than the mischief which gave it birth. This is peculiarly true of constitutions. They are not ephemeral enactments, designed to
meet passing occasions." At the
time the Bill of Rights was adopted, by and large the only way to force
self-incrimination was torture and the only way to search and seize was through
forceful entrance into a residence.
However, Brandeis warned, "Subtler and more
far-reaching means of invading privacy have become available to the Government.
Discovery and invention have made it
possible for the Government, by means far more effective than stretching upon
the rack, to obtain disclosure in court of what is whispered in the closet. . .
. The progress of science in furnishing
the Government with means of espionage is not likely to stop with wire-tapping.
Ways may someday be developed by which
the Government, without removing papers from secret drawers, can reproduce them
in court, and by which it will be enabled to expose to a jury the most intimate
occurrences of the home. Advances in the
psychic and related sciences may bring means of exploring unexpressed beliefs,
thoughts and emotions." Brandeis
couldn't specifically foresee computers, electronic mail, and neuroscience, but
the thrust of his thought was remarkably accurate.
Brandeis asked:
"Can it be that the Constitution affords no protection against such
invasions of individual security?"
His answer was a resounding "no." He wrote that "the tapping of one man's
telephone line involves the tapping of the telephone of every other person whom
he may call or who may call him."
Older means of spying "are but puny instruments of tyranny and
oppression when compared with wire-tapping." And its progeny, I am certain he would
say. Brandeis went on to cite a number
of cases in which the Court had construed broadly the meaning of the Fourth and
Fifth Amendments. He argued eloquently
and forcefully that
The makers of our
Constitution undertook to secure conditions favorable to the pursuit of
happiness. They recognized the
significance of man's spiritual nature, of his feelings and of his intellect. .
. . They sought to protect Americans in their beliefs, their thoughts, their
emotions and their sensations. They
conferred, as against the Government, the right to be let alone—the most
comprehensive of rights and the right most valued by civilized men. To protect that right, every unjustifiable
intrusion by the Government upon the privacy of the individual, whatever the
means employed, must be deemed a violation of the Fourth Amendment. And the use, as evidence in a criminal
proceeding, of facts ascertained by such intrusion must be deemed a violation
of the Fifth.
In my opinion, Brandeis's most important counsel was
this:
Experience should teach
us to be most on our guard to protect liberty when the Government's purposes
are beneficent. Men born to freedom are
naturally alert to repel invasion of their liberty by evil-minded rulers. The greatest dangers to liberty lurk in
insidious encroachment by men of zeal, well-meaning but without understanding.
Brandeis's opinion in Olmstead is a reflection of views he had
articulated nearly 40 years earlier, in a Harvard
Law Review article he co-authored with Samuel Warren entitled "The
Right to Privacy." Even then
Brandeis was raising alarms, with more foresight than he could have imagined: "Recent inventions and business methods
call attention to the next step which must be taken for the protection of the
person, and for securing to the individual what Judge Cooley calls the right 'to
be let alone.' Instantaneous photographs
and newspaper enterprise have invaded the sacred precincts of private and
domestic life; and numerous mechanical devices threaten to make good the
prediction that 'what is whispered in the closet shall be proclaimed from the
house-tops.'" After citing a
variety of areas of the law and legal precedent, Brandeis and Warren declare
that "the principle which protects personal writings and any other
productions of the intellect or of the emotions, is the right to privacy."
Now let's jump forward 85 years, to Slate this past September. A wife and mother, Amy Webb, writing for both herself and her husband, reports that they will allow nothing about their infant
daughter to be posted on the web. She
observed the excitement of a friend who was a new parent in posting pictures of
their daughter (pseudonym Kate) to Facebook but cautioned that she worried
about how those posts would affect the child later, "and the broader
impact of creating a generation of kids born into original digital sin." She's worth quoting, beginning with her
citation of a message from Facebook:
"We are able to
suggest that your friend tag you in a picture by scanning and comparing your
friend's pictures to information we've put together from your profile pictures
and the other photos in which you've been tagged." Essentially, this means that with each photo
upload, Kate's parents are, unwittingly, helping Facebook to merge her
digital and real worlds. . . . The problem is that Facebook is only one site. With every status update, YouTube video, and
birthday blog post, Kate's parents are preventing her from any hope of future
anonymity. . . . Why make hundreds of
embarrassing, searchable photos freely available to her prospective homecoming
dates? If Kate's mother writes about a
negative parenting experience, could that affect her ability to get into a good
college? We know that admissions
counselors review Facebook profiles and a host of other websites and networks
in order to make their decisions.
There's a more insidious
problem, though, which will haunt Kate well into adulthood. Myriad applications, websites, and wearable
technologies are relying on face recognition today, and ubiquitous
bio-identification is only just getting started. In 2011, a group of hackers built an app that
let you scan faces and immediately display their names and basic biographical
details, right there on your mobile phone. . . .
The easiest way to
opt-out is to not create that digital content in the first place, especially
for kids. Kate's parents haven't just
uploaded one or two photos of her: They've
created a trove of data that will enable algorithms to learn about her over
time. Any hopes Kate may have had for
true anonymity ended with that ballet class YouTube channel.
So what they did for their daughter
was create Facebook, Twitter, and other accounts for her—on which nothing has
been posted, nor will it be. They keep
the accounts active but private and all are tied to one email account. They also ask friends to remove any references
or "tags" about their daughter.
Shortly after this article, there
was a report that California had become the first state to adopt a law
requiring websites to allow minors (under 18) to remove postings on the
websites. Most of the major social media
sites already do allow removal, but one has to wonder how "removed"
those postings actually are. I've heard
any number of times, in meetings at the University, that once something is
posted on the web, it likely lives somewhere and can be found. I don't know if that's just urban legend.
Webb's manifesto did provoke a
response, from another parent, a father.
While I inevitably bridle
against the indiscriminate application of one person's morality upon everyone
else, regardless of individual circumstances, I will admit: Webb's critique has
a bite. It stings because there is truth buried in her holier-than-thou
denunciation of every parent who fails to live up to the author's pristine — "We
post nothing about our daughter online" — standards. It's true: Every time
a parent posts a picture or recounts an anecdote about their child online, it
is an invasion of privacy and a potential cause for peer embarrassment.
Webb's larger concerns —
she worries that advances in facial recognition technology and data mining
algorithms will mean that the digital trails we create for our children could
hurt their college admission and employment prospects — seem a little
over-controlling to me, but I'm not going to entirely dismiss them. Our
children will be more documented than any previous generation and it's hard to
know how to navigate that. There is much to learn.
He
argues that "modern society blows communities apart, splits family and
friends across continents, isolates us in our cubes, and works us too hard to
have as many physically embodied connections with each other as we would like." That seems like an overstatement as well, but
the point seems to be essentially valid.
"One meaningful reason why we value social media, even though it is
problematic in so many troubling ways, is precisely because it helps us stitch
our exploded communities back together, and keeps us in closer touch with the
people we love." Although I have to
say that on a personal level, the advance of the web hasn't precluded continued
physical connections to friends and family, and the web and social media simply
enhance them (and make routine transactions, like arranging a dinner, easier).
So is this paranoia? Or putting a finger in the dike? Her approach is not without merit, because
who knows what nefarious purposes corporations, government agencies,
terrorists, or any other kind of organization might pursue, now or in the
future? At the same time, it may be
quixotic. At the very least, however, it
may give the kid the breathing room to make her own decision. Of course, at about age 6 or 7 or whenever
she comprehends how to use social media, their daughter's privacy may go out
the window. Will they restrict her use
of the Internet in order to protect her privacy until she's old enough to
realize the implications of a decision?
Which would probably be well into her teens. (The Salon
author father noted this as well:
Webb's authoritarian approach may very well backfire once her daughter
reaches a certain age. Given the
ubiquity of social media and their use, it is probably more feasible to teach children to guard their privacy on the web than to ban them from using it in a quixotic quest for privacy.)
An even more menacing potential
attack on privacy comes from another direction.
Brandeis would be scared out of his wits if he read law review articles
written by Duke law professor Nita Farahany.
She's been thinking about brain scans and privacy and the implications
of neuroscience for interpretations of the Fourth and Fifth Amendments and what
she describes as the "coming siege against cognitive liberties." As fMRI scans and EEG detection grow more and
more accurate, and begin to detect individual thoughts, is that an invasion of
privacy, a "search" within the definition of American constitutional
law? (One example given in a Chronicle of Higher Education article about Farahany: psychologists doing some basic research with brain scans accidentally discover that one of the subjects has committed a murder. What do they do?) Here indeed is the path to the culmination of
Brandeis's predictions about the "psychic" sciences.
The advances in the field of brain
scans are startling. One researcher at
Berkeley using brain scans has been successful in reproducing on a video
monitor images of what the subject is seeing.
While those in the field agree that the ability to read thoughts now is
extremely limited, the guy at Berkeley is quoted as saying that "assuming
that science keeps marching on, there will eventually be a radar detector that
you can point at somebody and it will read their brain cognitions remotely." Is that an illegal search? Or, if were to reveal something
self-incriminating, would it violate the Fifth Amendment? How much information will the state be able
to gather legally, either by force or without someone's knowledge?
Another report came from SciTech Daily: "Scientists from the Stanford School of Medicine have developed a new method of recording brain activity that could lead to 'mind-reading' applications that, for example, would allow a patient who is rendered mute by a stroke to communicate via passive thinking. . . . 'We're now able to eavesdrop on the brain in real life,' said one of the study authors. . . . Conceivably, it could also lead to more dystopian outcomes: chip implants that spy on or even control people's thoughts." A lawyer at the Stanford Center for Biomedical Ethics, Greely, commented that "any fears of impending mind control are, at a minimum, premature. Practically speaking, it's not the simplest thing in the world to go around implanting electrodes in people's brains. It will not be done tomorrow, or easily, or surreptitiously."
These are troubling privacy
questions, to me, and it is these that begin to elicit worry about Orwell's 1984. (I do not agree with the critics of the NSA eavesdropping when they invoke Orwell; there would have been no national debate about this in the world
of Big Brother. Here, we're arguing
about it endlessly in Congress and the media.
This is no comment on the wisdom or legality of the eavesdropping—just
that it's not 1984. What Farahany is
thinking about comes closer.)
Another interesting question for the
"originalists" on the Supreme Court.
Exactly what did the Founding Fathers have in mind for brain scans when
they drafted the Fourth and Fifth Amendments?
From my perspective, there is reason
to worry both at the individual and societal level. One can imagine distasteful and threatening
government uses of brain scans and the like for malign "public policy"
purposes. But it is also possible to be
scared at the individual level. What if
I walk across the campus and think to myself "gee, I wish (U.S.) President
XXXX were dead" because I disagreed so vehemently with his/her
administration? I would never have any
intention of taking any action, never dream of taking action, and would not
have the means to accomplish anything even if I wanted to, but could some
zealous protector of the public interest, scanning thoughts across the
University, have me arrested and jailed?
Or what of a public policy researcher who develops policy proposals that
are against the interest of the governor/president/local police chief, and the
authorities seek to have that work squashed? Or a male undergrad who looks at some female
undergrad and imagines what she looks like without clothes or what she'd be
like in bed? These are trivial examples.
One can make a tally of the benefits
to the surveillance state; Stuart Armstrong did so in an article in Aeon, describing it as the "panopticon"
("a building, as a prison, hospital, library, or the like, so arranged
that all parts of the interior are visible from a single point"). The advantages include a sharp drop in crime (both on-street and white-collar, as well as police brutality); a reduction in abuses (child abuse, manager-employee abuse); a reduction in the size of police forces (because guilt would be known and they would just need to go arrest people); a reduction in the number of laws (they proliferate in order to be enforced selectively; if the laws hit the manor equally with the ghetto, there would be quick movement to repeal many of them); a reduction in military budgets, because countries would know what their potential enemies are doing and would not need to arm against all possibilities; the ability to catch possible pandemics early, as well as to expose "lax safety standards or dangerous practices"; the ability to nab those who might wish to engineer synthetic diseases (or nuclear devices or other weapons) for terrorist purposes; help in finding people after natural disasters; the collection of masses of data that could support or refute research hypotheses in many fields; the elimination of passwords and airport searches (the system knows where you are at any time, so it can grant you access to a computer or ATM or let you through security); and the elimination of credit cards, because one would simply take from a merchant what one needs and be billed for it automatically—among other benefits. Armstrong characterizes this watched society as having "the imagined virtues and vices of small villages, which tend to be safe and neighborly, but prejudiced and judgmental." He doesn't buy all these benefits as outweighing the danger, but concludes that "if we're headed into a future panopticon, we'd better brush up on the possible upsides. Because governments might not bestow these benefits willingly—we will have to make sure to demand them." These advantages, however, would only accrue
if every single square foot of indoor and outdoor space were under
surveillance. Otherwise I could do a lot
of nasty things in the basement and no one would know. . . .
The former Secretary of Homeland
Security, Michael Chertoff, makes a different but related point. It's the widespread availability and use of
recording devices that poses as much of a threat as a government agency. Because everyone seems to have a way to
capture action and speech instantly, "virtually every act or utterance
outside one's own home . . . is subject to being massively publicized." Chertoff goes on to observe that "the
true horror of the East German Stasi or the Maoist Red Guard was the
encouragement of informants—private citizens reporting on other private
citizens and even family members. No police agency could be omniscient. The
oppressiveness of those police states came from the fear every citizen had that
another citizen would disclose deviations from the party line." Chertoff asks if we are creating such a
society, where "every overheard conversation, cellphone photograph or
other record of personal behavior is transmitted not to police but to the world
at large? . . . Do we need to constantly
monitor what we say or do in restaurants, at sporting events, on public
sidewalks or even private parties?"
I believe strongly in advancing
research in neuroscience; I'd like there to be ways to fend off Alzheimer's and
Parkinson's, for example. But those
successes may be inextricably mixed up with the kinds of advances that
Professor Farahany contemplates, or even the less biologically-based
surveillance that Mr. Armstrong speculates about, and the advances in both
spheres likely will come with a high price in privacy if we are not careful. I think Brandeis was right: we must "be most on our guard to protect
liberty when the Government's purposes are beneficent." But his caution goes beyond "government,"
as Mr. Chertoff argues; we may have as much to fear from Google and from
ourselves as we do from Washington.
TELEPHONE, n. An invention of
the devil which abrogates some of the advantages of making a disagreeable
person keep his distance.
And if my heart be scarred and
burned,
The safer, I, for all I learned. (Parker)
The writer George Saunders (whose
works are widely known but which I have not read, whom TIME magazine declared one of the 100 most influential people in
the world, and who's a faculty member in creative writing at Syracuse
University) offered a few thoughts on modern technology that I found
captivating. He did an interview with
the Guardian about his reservations about the world of the web.
I have noticed, over the last few
years, the very real (what feels like) neurological effect of the computer and
the iPhone and texting and so on – it feels like I've re-programmed myself to
become discontent with whatever I'm doing faster. So I'm trying to work against
this by checking emails less often, etc., etc. It's a little scary, actually,
to observe oneself getting more and more skittish, attention-wise. . . .
I do know that I started noticing a
change in my own reading habits – I'd get online and look up and 40 minutes
would have gone by, and my reading time for the night would have been pissed
away, and all I would have learned was that, you know, a certain celebrity had
lived in her car awhile, or that a cat had dialled 911. So I had to start
watching that more carefully. But it's interesting because (1) this tendency
does seem to alter brain function and (2) through some demonic
cause-and-effect, our technology is exactly situated to exploit the crappier
angles of our nature: gossip,
self-promotion, snarky curiosity. It's almost as if totalitarianism thought
better of the jackboots and decided to go another way: smoother, more flattering – and impossible to
resist.
The
way it is used by many (not all, to be sure), Facebook is the perfect example
of what Saunders was talking about: "gossip,
self-promotion, snarky curiosity."
Perhaps Facebook is what he had in mind when he made the
observation. I have come to find 90% of
the postings worthless—and even worse, uninteresting. (Worthless but funny I would
appreciate.) But maybe I just have the
wrong Facebook friends.
SELF-EVIDENT, adj. Evident to
one's self and to nobody else.
ERUDITION, n. Dust shaken out
of a book into an empty skull.
With my perhaps oddball sense of
humor, I laughed out loud when I read this, one of the funnier things I've read
in quite a while.
Bartender asked Descartes if he'd
like a drink.
Descartes replied, "I think
not," and disappeared.
CARTESIAN, adj. Relating to
Descartes, a famous philosopher, author of the celebrated dictum, Cogito
ergo sum—whereby he was pleased to suppose he demonstrated the reality of
human existence. The dictum might be improved, however, thus: Cogito cogito
ergo cogito sum— "I think that I think, therefore I think that I am;"
as close an approach to certainty as any philosopher has yet made.
I regaled some of you with the
factoid that I received from Kathy's uncle Phil about President John Tyler
(1790-1862, president 1841-45), that he has two living grandchildren (!). Tyler was the father of 15 children (the most
prolific president). The 13th child,
Lyon Gardiner Tyler, was born in 1853, when Tyler was 63. L. G. Tyler died in 1935 and had 3 children
by his first marriage and 2 more by his second (he got married the second time
in 1923, after his first wife died); the second two (sons) were born in 1924 and 1928—when he was 71 and 75 years old. As of 2013, anyway, those two sons were still alive.
L. G. Tyler was an academic who
served as president of William and Mary College in Virginia 1888-1919; he's
credited with saving it after the disaster of the Civil War and laying the
foundations for it to become the excellent college that it is today. He also published quite a few books and
articles, primarily on Virginia history. So who says academics don't have fun? Yikes, becoming a father at age 71 and again
at 75.
This tidbit of history became the
subject of a conversation I had with two male friends. Kathy and I were staying with our friends Rolf
& Roberta Sonnesyn at their place on Leech Lake and were joined by our
friends Gary Voegele and Margie Durham (my college roommate and his
fiancée). Gary and Rolf and I were
musing about the practice of older men marrying much younger women and fathering
children at a time when most men are content to be grandfathers—if they're
still alive. We don't know how
widespread the practice was in the 19th Century, or in the 1920s when L. G.
Tyler married for the second time at age 70 and then had two more children. What was clear to the three of us, however,
is that it isn't something we would have the slightest interest in doing.
It may be that generation gaps didn't
matter as much 100 and more years ago.
Or perhaps it was culturally more acceptable. Irrespective of the norms, the idea of
getting involved with, much less married to, some woman in her 20s or 30s
(couldn't be a lot older than that and still have kids) is totally
unappealing. The experiential,
professional, and intellectual differences would be enormous and, at least for
the three of us, we agreed that social settings would be awkward. I can just imagine getting together with my
friends, most of whom are in their 50s and 60s, with a 20- or 30-something on
my arm. We agreed that maybe a hop in
the sack with some attractive young woman might be fun—I suspect most men of a
certain age don't find that idea repellant—but the relationship wouldn't last
more than a day or so. (And none of the
three of us would do anything of the sort when we are happily
married/attached. As Rolf said, maybe if
he were trapped on an island in the ocean and his only companion were a young
woman. . . .)
One might also posit that the role
of the father has changed in the last 100+ years, at least in certain socio-economic
classes. In my experience, and that of friends,
the dads were much more involved in parenting than were the males of my father's
generation. Obviously one is not going
to be much involved as a father when one is 75 years old when a child is born—he
won't even live that much longer, much less be involved in the child's
upbringing. So it would seem like
abandoning parental responsibility to have a child so late in life—not only
will the father not be there to help the mother, for the most part, the father
won't even be around long enough for the child to grow up to know his or her
father. (L. G. Tyler died in 1935, at
age 81, when the two boys were 6 and 10.)
Anyway, it was a surprise to learn
that someone who was president over 150 years ago still had living
grandchildren.
INTRODUCTION, n. A social
ceremony invented by the devil for the gratification of his servants and the
plaguing of his enemies. The introduction attains its most malevolent
development in this century, being, indeed, closely related to our political
system. Every American being the equal of every other American, it follows that
everybody has the right to know everybody else, which implies the right to
introduce without request or permission.
I misremember who first was
cruel enough to nurture the cocktail party into life. But perhaps it would be
not too much too say, in fact it would be not enough to say, that it was not
worth the trouble. (Parker)
Uff da. Elliott got ensnarled in the criminal justice
system as a result of an accident at work.
He was working one evening at the sub and pizza place, Davanni's—where he's worked for over a year as a part-time job while in college—and accidentally
served alcohol to minors during a rush period.
He asked for IDs but didn't scrutinize them carefully enough, and it
turns out that it was a sting. He told
me that as soon as he had uncapped the beers, a police officer stepped in and
put his badge on the counter and told Elliott he'd just served liquor to
minors.
So now Elliott is accused of a gross
misdemeanor and has a court appearance on December 11. His employer receives a citation and fine and
Elliott, under the law, is liable to a fine of up to $3000 or a year in
jail. Elliott and I exchanged a number
of text messages while he was still at work, and I tried to calm him down, but
he was pretty bummed out. I have to
confess that I would be, too, if I were in his position.
What's ironic about it is that if there's any 22-year-old whom I would think less likely to get into the criminal justice system, or to deserve to, it's Elliott.
He doesn't drink, doesn't use drugs of any kind, and doesn't drive. His worst faults are not cleaning up his
dishes after he eats and perhaps playing too many video games. So this is just ridiculous. To impose a penalty other than a modest
deterrent fine for what clearly was an accident seems to me to be a waste of
the resources of the legal system, to say nothing of the unnecessary effect on
the people who get involved by accident.
But I suppose they have to enforce the law.
When Elliott let me know what was
happening, I contacted a few of my attorney friends. They all told Elliott not to worry too much about
it and that he might, for example, receive a suspended sentence for a year,
after which the incident would be wiped from his record, assuming no
recurrence. Or it might be
plea-bargained down to something less serious and perhaps include a fine. But they all cautioned that he MUST get an
attorney, because a gross misdemeanor is not something anyone wants on his or
her record. Given that Elliott has
concluded, after his time at Davanni's, that he never wants to be in the food
service business again in his life—did it, learned what's it like, doesn't care
to do it again—I would say the risk of recurrence is zero.
One of the attorneys recommended to Elliott talked with him a couple of times and then had a conversation with one
of the city attorneys. He advised
Elliott to go to court, ask for a public defender (which he qualifies for
because his personal income falls well below the threshold making him eligible
for one), and ask for a continuance. He
also told Elliott not to lose any sleep over the matter; the City Attorney's
office has a huge number of these kinds of cases and doesn't want to pursue
most of them to any significant extent.
Meantime, Elliott will presumably meet with the public defender and seek
a plea bargain. As Elliott observes, he
can't plead "not guilty" because he did actually sell the alcohol to
the minor, even though it was a mistake, so he'll ask for a reduced
penalty. We're hoping for a petty
misdemeanor (which is equivalent to a speeding ticket and which isn't counted
as a criminal act). We may not know the
end of this story for several months.
CHILDHOOD, n. The period of
human life intermediate between the idiocy of infancy and the folly of
youth—two removes from the sin of manhood and three from the remorse of age.
APPEAL,
v.t. In law, to put the dice into the box for another throw.
I have opined from time to time in
recent years that I have finally come to conclude that Abraham Lincoln was
wrong: he should have let the South
go. That view received oblique
endorsement from Hendrik Hertzberg in the New
Yorker in late October. Hertzberg
was writing about the origins of the 14th Amendment and its relationship to
U.S. debt obligations. He wrote that "throughout
the Civil War and afterward, Republicans in Congress had enacted some of the
most forward-looking legislation in American history: a national currency, the Homestead Act, a
transcontinental railroad, support for higher education, the definitive
abolition of slavery—all thanks to the extended absence of delegations from the
self-styled Confederate states. Now that
era was about to end." And with the
possible exception of the Sherman Anti-Trust Act in 1890, it was the end of
major progressive legislation until FDR and the Great Depression. Further progressive legislation has only been
enacted in fits and starts since then, primarily (but not exclusively) under
Democratic presidents and congresses, and frequently the lead opposition to any
step forward has come from those formerly Confederate states, the most
retrograde parts of the country.
HOMICIDE, n. The slaying of one
human being by another. There are four kinds of homicide: felonious, excusable,
justifiable, and praiseworthy, but it makes no great difference to the person
slain whether he fell by one kind or another—the classification is for
advantage of the lawyers.
Then if my friendships break
and bend,
There's little need to cry
The while I know that every foe
Is faithful till I die. (Parker)
In mid-November we had a delightful
visit to my friend of nearly 40 years, Denise Ulrich, in Flowery Branch,
Georgia (just north of Atlanta). Denise
lost her husband Bob a couple of years ago and has since remarried; she had
never met Kathy, so two old friends met their new spouses.
We poked around Atlanta, including a
visit to the Atlanta Cyclorama, which they claim is the largest painting in the
world, 42' by 358', about the size of a football field, according to the guide. It depicts one day in the Battle of Atlanta,
in July of 1864, and includes the two commanding generals, Hood and
Sherman. One sits on auditorium seats in
the center that rotate to provide the entire view, which is narrated. It's really a spectacular painting; it was
done in Milwaukee in the 1880s, there was a traveling exhibition that included
Minneapolis, ownership changed hands several times, and it was eventually donated
to the City of Atlanta and put on permanent display in 1898. (As one can imagine, it deteriorated over the
years and went through a $14-million restoration in 1979. It is one of only two surviving cycloramas of
the Civil War; the other one is of Gettysburg.)
What I found slightly ironic about
the event was that the guide for the Cyclorama was a young Black man, who
presented the information in an absolutely neutral manner. The soldiers and officers in the painting are
referred to as Confederate and Federal.
I don't believe that William Tecumseh Sherman is much admired in
Atlanta, since he burned the city to the ground after taking it in September,
1864, but neither the painting nor the attached small museum takes sides.
Apart from visiting a very large
Hindu temple carved from stone and granite, containing no nails or timbers, and
visiting the Atlanta Botanical Garden (both of which made us jealous of the plants we can't grow in Minnesota), a good time was had by all. (We envy them their green and blooming plants
in November—but not a chance we'd ever want to live there, given what it's like
for the middle 3-4 months of the year.)
DELUSION, n. The father of a
most respectable family, comprising Enthusiasm, Affection, Self-denial, Faith,
Hope, Charity and many other goodly sons and daughters.
Outspoken?
By whom? (When Parker
was told that she was outspoken)
Elliott observed that Thanksgiving
Day and the Friday following it are two of our busiest days of the year. Because our house can't hold the family when
many are present, my brother and sister-in-law have been kind enough to host
their share of the holidays plus mine, for which we are profoundly
grateful. We figure the least we can do
is bring the turkey and a couple of other dishes, so we're up bright and early
Thursday morning cooking.
Friday, by tradition since the kids
were young, is the day we go to a tree farm north of the Twin Cities and cut
down our Christmas tree. Which we did,
as usual, this year. Elliott is right
about "busy": Driving to get the
tree and getting it decorated took from 9:00 to 5:15—it's a full-day
project. That is because we still use
the little C7 lights, in strings of 25, that I have to drape around the tree
and then clip to branches and get the 1940s reflectors all in place. That process alone takes 2 hours. Kathy and I had decided that this year we'd
count the number of ornaments we put on the tree—and then forgot, of
course. But hanging ornaments was
another hour-plus. But the tree is up
and we are on the way to the holiday season.
Because of our practice of getting
and decorating our tree on Friday, we obviously do not engage in any "Black
Friday" shopping. I'd rather clean
up the cat litter box than go out shopping that day. My brother told me an interesting fact he'd
read: retail margins in December are no
different from their margins the rest of the year, so either the Black Friday
discounts aren't any greater than normal or the manufacturers are giving the
retailers a deeper discount on the wholesale prices. In either case, we don't have any wish to go
shopping when there are 67 bazillion other people out.
What fresh hell is this? (Parker)
POSITIVE, adj.: Mistaken at the top
of one's voice.
A closing item about Dorothy Parker, from City Paper (Baltimore), 9/21/2005, in an
article titled "Best Under Appreciated Local Landmark: Dorothy Parker Memorial Garden":
Little known fact: When sharp-shooting wit Dorothy Parker passed
away in 1967, she left her literary estate first to Dr. Martin Luther King and,
upon his death, to the National Association for the Advancement of Colored
People, to which Parker's estate rolled over following the 1968 assassination
of the civil-rights leader. Parker
herself was cremated, and nobody claimed her ashes, which sat first at New York's
Ferncliff Crematorium and then in her lawyer's office. In 1988 the NAACP, which receives royalties
from Parker's manuscripts to this day, learned that her ashes remained
unclaimed, and it stepped forward, erected a memorial garden, and interred her
ashes there [that is, at the Memorial Garden in Baltimore]. Here, a circle plaque, surrounded by pine
trees, reads in classy tribute: This
memorial garden is dedicated to her noble spirit which celebrated the oneness
of humankind and to the bonds of everlasting friendship between black and Jewish
people.
Ambrose Bierce disappeared at age 71. He purportedly left for Mexico in October
1913, but even that "fact" has little supporting evidence. His body has never been found, and those who've
studied the events have basically thrown up their hands. No one knows what happened to him.
If you have any young friends
who aspire to become writers, the second greatest favor you can do them is to
present them with copies of The Elements
of Style. The first greatest, of course, is to shoot them now, while they're
happy.
I'd like to have money. And I'd
like to be a good writer. These two can come together, and I hope they will,
but if that's too adorable, I'd rather have money. (Both Parker)
Adam
Gopnik wrote in the New Yorker that "Dr. Johnson said, rightly, that
anyone who decides to write something believes [himself/herself] wiser or
wittier than the rest of mankind, and that it is up to the rest of mankind to
determine if [he/she] is." I don't
agree with Gopnik or Johnson because I certainly make no such claim. I write to pass along family news and offer this
meandering journal on events of varying degrees of importance—the latter mostly
in order to keep my brain from atrophying.
On that note, I'll leave you with my wishes for you for
2014 and beyond as articulated by Bierce.
FUTURE, n. That period of time
in which our affairs prosper, our friends are true and our happiness is
assured.