Good afternoon.
I was having lunch in
late December with an old friend. The
discussion turned briefly to politics (a topic I mostly try to avoid of late
because I am cognizant of its negative effects on my blood pressure). My friend spontaneously opined that Abraham
Lincoln should have let the South go.
That is an opinion I share; I've downgraded my ranking of the Lincoln
presidency because Lincoln wasn't prescient enough to realize the long-term
effect of keeping the southern states in the union. (I write that somewhat facetiously.)
A month or so earlier
I had read a brief extract from John Eisenhower's biography of Zachary Taylor,
who became president in March 1849 and died in July 1850. Taylor, it seems, had a much more deferential
attitude to Congress than did his predecessor, James K. Polk. Polk, you may know, was responsible for the
acquisition of more territory for the United States than any other president except
Thomas Jefferson (the Louisiana Purchase).
Through war and diplomacy, Polk added both the southwest and the
northwest, and he didn't pay a lot of attention to Congress in the process of
doing so.
It takes little
imagination to understand why Polk was upset at a comment that Taylor made
after his inauguration. Here is the
biographer Eisenhower quoting Polk:
"'Something
was said which drew from General Taylor the expression of views and opinions
which greatly surprised me. They were to
the effect that California and Oregon were too distant to become members of the
Union, and that it would be better for them to be an independent government. He said that our people would inhabit them and
repeated that it would be better for them to form an independent government for
themselves. These are alarming opinions
to be entertained by the President of the United States. . . . General Taylor's comments, I hope, have not
been well considered.'"
Although not for the reason
Taylor articulated, 167 years later there is occasional chatter about
California seceding from the U.S. I don't
blame some Californians for contemplating the possibility. If they were to do so—although there's no
constitutional means for secession to happen—the rest of us who count ourselves
as progressives would be screwed, because the departure of that many Democrats from
Congress would make it more conservative.
If Oregon and Washington followed suit—creating the Pacific States of
America—the situation in the leftover U.S. would be even worse.
* * *
As you know, we had
bad news delivered to us on October 17 of last year. The hospital social worker, sitting with us
and the physician, described Krystin's medical condition and then flat-out
said, "this is not survivable."
I realized then and later that that's the way I preferred to hear the
information—and learned thereafter that I'm in the majority on that score.
A study out of Brigham
Young University contends that people generally prefer to hear bad news with
candor and with little or no "buffer." A buffer is a way to try to soften the blow, "a
polite lead in." Whether any buffer
at all is seen is desirable varies with the situation. If the bad news is about a social
relationship, a small buffer may be preferred:
"we need to talk" and then get to the nub. The small buffer allows the recipient of the
news to understand that bad news is coming.
If the bad news is about a physical fact, the study participants wanted
no buffer at all: "your illness is
terminal" or "the water is polluted."
Those who must deliver
bad news are usually uncomfortable in the position. A different study commented:
That most
people have difficulty communicating bad news is reflected in what's called the
MUM effect ("keeping mum about undesirable messages"). . . . "If sharing bad news was in no way
negative for senders, then there would be no reason for senders to report
feeling uneasy, reluctant, and hesitant to share bad news. Indeed, these costs might very well represent
the essence of the MUM effect."
One imagines that someone in the position of the hospital social worker
is trained in how to deliver bad news and, with experience, can steel
themselves against the effects it will have.
The guy clearly wasn't happy about what he had to tell us, but of course
he never lost his composure or his steadiness in conversation, and he was
sympathetic.
The desire for
frankness is not universal. I can still
recall that my mother never asked for nor received an honest assessment of her
medical state when she was dying of cancer.
At one point her physician simply lied to her by telling her that she'd
be able to attend an annual event a year later.
We all knew that wasn't true and so did he. That refusal to acknowledge her
situation—which I'm sure she knew, deep down—made it difficult for the rest of
us to talk with her about it. We had to
tiptoe around the topic whenever it came up in conversation.
One thought about the
preference for receiving the news straight, no buffer, is that the person may
know the news is coming anyway, so delaying or complimenting beforehand just
creates anxiety. "Dressing up bad
news with a pretty bow doesn't make things any better. In fact, it could make them worse." Another element of the interaction that's
important is tone. Aggressive (e.g., in
a job termination) begets aggressive; sympathy and kindness beget as positive a
response as possible (which, of course, isn't going to be very much, in these
circumstances).
I'd just as soon not
be in the position ever. I hope I can
avoid it. Been there, done that, don't
need any more experience on that score.
* * *
Last
summer Kathy and I attended the wedding of a daughter of long-time
friends. It was a traditional (Catholic)
wedding and the Bible readings were ones we have heard many times at
weddings. (I want to make it clear at
the outset that this is not a
criticism of the couple who got married and their choices of readings at the
ceremony. My comments are in
general.) One of the readings has
annoyed me for decades, and I've heard it at perhaps 75% of the weddings I've
been to: I Corinthians 13:7 (Paul's
first letter to the Corinthians). In the
Revised (or English) Standard Version, it goes like this:
Love bears all things, believes all things, hopes all
things, endures all things.
In the King James Version, it reads as follows:
(4)
Charity . . . (7) Beareth all things, believeth all things, hopeth all things,
endureth all things.
I have for
years found that claim about love to be ridiculous, silly, and downright wrong
as a definition of love for a couple exchanging vows. For example, love shouldn't bear or endure an
abusive, drug-ridden relationship after prolonged efforts to deal with the
problems, or a relationship that involves abuse of children. We all know that love in real life does not bear, or believe, or endure all
things. One could argue, given increased
divorce rates over the last 50 years, that in fact love bears and endures quite
a bit less than it used to. What I have
learned, however, is that the language of I Corinthians 13:7 isn't quite what the
plain English would suggest. There are
two difficulties, one contextual and one translational.
The
contextual: A friend directed me to a
Biblical exegesis website. "In 1
Corinthians 13, we find one of the most beautiful and familiar chapters in the
Bible. This chapter is typically read at
weddings and anniversary celebrations.
It has even been set to music.
Yet, this was never the original intent.
Instead, Paul was writing a rebuke to a dysfunctional church for their
abuse of spiritual gifts. Typically
though, this understanding is often ignored." As my friend commented, "a marriage
ceremony is definitely out of context for these verses!"
The
translational: Paul wrote his letters in
Greek (at least that's what the best scholarship seems to say—as opposed to
Hebrew or Aramaic). The word he used was
agape, which the Oxford English Dictionary defines as "Christian love, as
distinct from erotic love or simple affection; charity." The Random House Dictionary has three
definitions: "1. the love of God or Christ for humankind. 2. the
love of Christians for other persons, corresponding to the love of God for
humankind. 3. unselfish love of one person for another
without sexual implications; brotherly love." So the "love" in 13:7 isn't the
love we usually associate with romantic relationships that lead to a marriage.
A
faculty friend who knows something about classical languages offered
observations on the use of the words. "I
was going to say that 'love' is a particularly bad translation and like so many
modern ones substitutes something general and vague for something pointed and
relevant even if not 100% accurate in representing the Greek. A term that came around a few decades ago, Caring, seems to me much of the right
idea, and comes from Latin Charitas,
like charity. The Greek term agape has no corollaries in English." He added, apropos of the use of I Corinthians
13:7 at weddings, "you are entirely right about that text for a wedding,
but the modern translations are so vacuous it hardly matters what gets used."
My
friend also provided me a link to commentary on I Corinthians 13 in the King
James Bible.
Some critics claim that the word "charity" is
either wrong or outdated. Newer
translations use the word "love" instead. The Greek word at issue is "αγαπη
(agapē)". Thayer defines this word
as "brotherly love, affection, good will, love, benevolence" (Thayer's
Greek Definitions). The definition of "charity"
is "benevolent goodwill toward or love of humanity"
(Merriam-Webster). The English word "charity"
comes from the Latin "caritas", which means "Christian love"
as opposed to sexual love (Online Etymological Dictionary). Throughout history, Latin theologians such as
Augustine have used "caritas" as a term of art to refer specifically
to Christian love (On Christian Doctrine, 3.10.16). Whenever "charity" appears in the
KJV, it is in reference to Christian love toward fellow Christians.
Of course
I'm not going to go on a rant about this the next time I attend a wedding. But I may not be able to resist a small shake
of my head (largely because the priest/minister should know enough to suggest
the passage not be used in the wedding ceremony).
* * *
My
exchange with my friend about Corinthians inspired a few other reflections on
matters indirectly related to the Bible.
He wrote, on the matter of being in church, "I haven't been in a
church [for over two decades]. But
confess I am often tempted to be, and with the right church (St. Agnes comes
very close, since it uses Latin at High Mass) I might actually go. Not for the theology but the liturgy. That would be a high Anglo-Catholic church
using Jacobethan English in Prayer Book and Bible, and they are few and far,
far between." That isn't a flavor
of church service that many would like, but he's a classicist!
I told him
that what he and I appreciate, and what my son does not, is the liturgy and the
music. Even though I am not religious, I
attended church as a youngster and grew familiar with the hymns and
readings—and, to a modest extent, church architecture. It's for that reason, I think, that Pat and
I—16 years ago—so enjoyed Evensong at Westminster Abbey when we sat with the
choir. If you aren't raised in a
religion, you don't have any of that history or the affection for the services
(even though you might not believe in the theology). I enjoyed the Catholic wedding that provoked
my response to Corinthians; it helps when the priest is good (i.e., funny,
thoughtful) at what he does.
My friend wrote back.
"You're a good example of what the church does for one whether one
believes or not. The liturgical
churches—the ones with serious esthetic expressions unlike the evangelicals—do
a great service for the spirit, for conduct, and even for (dis)belief, I
think. [I may not have believed in the
theology] but I could see how it all fitted and worked together in The Church,
and I respected that, without knowing until later precisely what my relation to
the Church was."
There was an added benefit, in my opinion. I opined to him that "I also think that
my experiences in church are what allow me to admire the breath-taking art and
architecture of the great churches/cathedrals, especially of Europe (although
the National Cathedral in D.C., along with St. Patrick's and St. John the
Divine in NYC inspire awe as well). Even
the St. Paul Cathedral!"
* * *
For some reason I was struck by a column by an editor at
the Chronicle of Higher Education on
quitting. Kelly Baker (Ph.D. in religion) wrote it after
becoming irritated by a newsletter from her kids' school urging that the
parents not let their children quit an extra-curricular activity even if they
don't like it. She realized that the
clear message is that quitting is failure.
She argues, emphatically, that it is not so and that the world shouldn't look at
it that way. She compiled a list of ways
quitting can be positive.
• Quitting can be brave.
• Quitting can be knowing your boundaries and limits.
• Quitting can be an affirmation that your time is valuable, so you care about how you spend it.
• Quitting can be about what matters to your life — or what doesn't.
• Quitting can be about shifting priorities.
• Quitting can be about respecting your values.
• Quitting can be about dignity.
• Quitting can be about economics.
• Quitting can save your life.
• Quitting can be about leaving abusive spaces or walking away from abusive people.
• Quitting can be about what you need or don't need, and about what you want or don't want.
• Quitting can be mundane or ordinary. Or it can be a radical declaration that you are done with one way of living and you're trying out a new way.
• Quitting can be a claim about who you are or who you are striving to be.
• Quitting can just be a choice.
For almost every one of those bullets I can think of having
quit something. (Not sure about the
first one; I can't recall that I was ever "brave" for quitting
something. In the case of the
alternatives in the antepenultimate bullet, I've never made a radical
declaration that I was done with one way of living.
To put it mildly.)
I have
always been grateful to my dad for not making me play little league/park board
baseball for more than about 3-4 games.
He was the captain of his high school (Minneapolis West) baseball team
and had aspirations to professional ball (but WWII got in the way; he was shot
in the shoulder by a German sniper, and that was the end of his baseball
career). But he remained a
lifelong baseball fan and, when he became father to a son, he of course wanted
me to play baseball. I found it
dreadfully dull and I hated standing out in the sun in the outfield. I'm sure he was disappointed that I didn't
want to play—but he didn't force the issue and he never said anything to me of
any disappointment he may have felt when I quit.
Dr. Baker
makes several points I agree with. "Quitting
is not inherently a failure."
And: "There's nothing heroic
about sticking around. . . . Suffering
isn't heroic. It's just suffering." Nor is it necessarily good (or bad). I'm sure she'd put qualifiers around these;
it can be both heroic and necessary at the bedside of an ill spouse or child or
parent. She asserts, I think correctly,
that "choices have contexts and histories. . . . We can't assume that all quitting is about
failure. We can't assume all sticking
around is about success."
She writes
that she was, for a long time, one of those who bought into the American creed
that winners never quit. The result was
that she stayed in terrible jobs, in terrible relationships, and endured
experiences that she would not have had to endure if she'd just quit. As the saying goes, you have to know when to
hold 'em and know when to fold 'em.
Quitting can be cutting your losses.
We also
tend not to hear, or tell, the stories of stumbling or of failure (except
perhaps to those to whom we are closest).
So the culture expects to hear about winning, not losing. (One comment on Twitter, in response to her
column, was that quitting is an essential skill in mountaineering: knowing when to turn back. If you don't know, lives can be lost,
including your own.)
* * *
Coincidentally,
a few months after the column on quitting there appeared a piece, also in
the Chronicle of Higher Education,
by David Gooblar about "grit" and teaching character. Those of you outside education may not have
noticed that there's been widespread reaction to Angela Duckworth's research on
"grit," which is "a blend of perseverance and passion. . .
. Her theory suggests that character —
and, in particular, the tendency to stay the course in pursuing difficult
long-term goals — might be as important as academic knowledge and skills in
determining student success."
Schools have been trying to measure grit in students and teachers as
well as trying to incorporate grit into their curricula.
Professor
Gooblar related that he encountered "the grit debate when I was mulling
the issue of how to teach character in a college classroom." He maintains that "we as faculty members
should be trying to help our students develop into more capable, ethical, and
critical citizens — and not just helping them master certain knowledge and
skills." (That's not a contention
that all faculty accept as a legitimate part of their job; one commented, "It's not
enough that in order to be considered a good teacher I must simultaneously be a
subject matter expert, a motivational speaker and a stand-up comic, but I must
also take on the role of parent? . . . If
my professorial workload has become laden with the responsibility for creating
intellectual character for every student who walks through my classroom (or
virtual classroom) door, I want a raise."
There is also the not-trivial problem of measurement: how do you decide when a student has learned "grit"?)
Gooblar
faults the hoopla surrounding grit. One,
it's too simplistic; even if it's as important as advocates say, there are
still many elements that affect student success in school. Two, the concept hasn't been adequately and
systematically explored; even Duckworth opposes using it in schools at this
point. Three—and I think this is a
central problem—it blames the student for failure and doesn't account for the
many structural problems that many students face: "because grit maps so easily onto
traditional American narratives of self-reliance and meritocracy — narratives used
for centuries to justify a 'natural' racist, sexist, and classist hierarchy —
it seems particularly problematic. . . . Poor students are performing worse than rich
students? They're just not gritty
enough." (This reminds me of the
distinction between emphasizing internal and external factors when
assessing character and status.)
In his
article, Gooblar went on to explain how he thought that intellectual virtues
should be taught in college classes, which I am not going to explore. What he urges getting away from, however, is
the simplistic notion of grit. Several
reader comments point out, though, that teaching "character" or
intellectual virtues has to start before kindergarten, not in college. The older the students, the less malleable
they become—assuming there's some degree of malleability to start with, and
most seem to believe that character *can* be taught—so by the time they're in
college, their "character" is probably pretty well set. Needless to say, parents play the most critical
role when the child is growing from infant to pre-schooler.
So it's perfectly
acceptable to quit—but if you do, you have no grit. (I'm sure that Dr. Baker would distinguish
between quitting a specific activity and the long-term commitment required to
acquire a good education—even if "grit" isn't the term she'd use.)
* * *
There is
out there in the world research that looks extremely interesting—and which I
cannot even fathom. Some of that
research occurs in physics. First, in
1928, Paul Dirac predicted the existence of antiparticles and that when an
antiparticle met its particle, the two would annihilate and release energy.
The prediction was subsequently verified. In 1937, Ettore Majorana predicted that "in
the class of particles known as fermions, which includes the proton, neutron,
electron, neutrino and quark, there should be particles that are their own
antiparticles." A U of
California/Stanford research team believes it has found the Majorana
fermion. Just in case you wanted to
know, "the particular type of Majorana fermion the research team
observed is known as a 'chiral' fermion because it moves along a
one-dimensional path in just one direction."
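To put a number on Dirac's prediction (a standard textbook illustration of my own, not something from the coverage of this research): when an electron meets its antiparticle, the positron, the pair annihilates into two photons whose energy comes from the particles' rest mass.

```latex
% Electron-positron annihilation (standard textbook example; my gloss, not the article's):
%   e^- + e^+  ->  2 gamma
% Each photon carries the electron rest energy, m_e c^2 of about 0.511 MeV:
\[
  e^- + e^+ \;\rightarrow\; 2\gamma,
  \qquad
  E_{\text{released}} = 2\,m_e c^2 \approx 2 \times 0.511\ \text{MeV} \approx 1.022\ \text{MeV}.
\]
```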
I think
this stuff is fascinating, even though I don't understand it. Even something apparently as recondite as
Majorana fermions may have practical application. "Far in the future . . . Majorana
fermions could be used to construct robust quantum computers that aren't thrown
off by environmental noise, which has been a big obstacle to their
development. Since each Majorana is
essentially half a subatomic particle, a single qubit of information could be
stored in two widely separated Majorana fermions, decreasing the chance that
something could perturb them both at once and make them lose the information
they carry."
The guy
who led the research has a puckish name for the particle: "the 'angel particle,' in reference to
the best-selling 2000 thriller Angels and
Demons, in which a secret brotherhood plots to blow up the Vatican with a
time bomb whose explosive power comes from matter-antimatter annihilation. Unlike in the book, he noted, in the quantum
world of the Majorana fermion there are only angels — no demons."
* * *
I found this among the excerpts from the novels of P. D. James that I
at one time compiled.
"The tragedy of loss is not that we grieve, but that we cease to
grieve, and then perhaps the dead are dead at last." From Original
Sin.
I think that may be right.
On that cheery note,
Gary