As
always when January comes, I find "de-Christmasing" the house to be one of the more depressing chores of the year.
For those of us who are Christians, or were raised as such, or who observe
Christmas in some fashion for whatever reason, Christmas was a wonderful time of the year
(as kids, of course, we looked forward to the presents, but everybody seemed a
little bit happier during the season). In some ways I'm still like a little
kid: I love putting up the tree, seeing
friends and family, and giving presents (like many who are getting older, I don't much care about receiving presents). So when it comes time to take down the tree,
put away the Christmas dishes, and put the living area back to normal, I am
always a little sad.
The "de-Christmasing" in 2017 came as soon as
possible. Our tree this year, for some
reason, stopped taking water after about 10 days. Because we cut the tree the day after
Thanksgiving, by Christmas it was thoroughly dry, needles were falling off, and
the branches were visibly drooping. We
had friends over for dinner on December 29, and I was half worried that the tree was
going to catch fire from the heat of the lights.
For those of us in the Upper Midwest, added to the mild
melancholy of the end of the holidays is the annual realization that we face
the three long, dark months of winter.
Sure, the onset of winter comes long before January 1, but late in the
calendar year we have the gatherings over Thanksgiving and Christmas to look
forward to. Even though I no longer go to work, and so don't have to face going out into the dark, cold mornings (except to say goodbye to Kathy), after the New Year it's still cold and the days are short until April approaches, whether or not I leave the house. I want to take a nap and, when I wake up, have it be April 1.
As soon as Kathy retires, I won't take a nap; we'll just head out of town until April 1 (we hope).
* * *
Vaguely on the "de-Christmasing" theme, each
year, too, I wonder whether I'll be here the next Christmas to open all the
boxes of lights and decorations. That
question isn't a reflection of age; every year for decades I have mused about
the uncertainty of another year of life as I put away the Christmas items. Except for those who are critically ill, I
suspect virtually all of us go to bed at night fully expecting to awaken the
next morning, and we make plans for the upcoming days and weeks. We even make plans for events months in
advance. But I surmise that few of us
make plans more than a year ahead of time, or at least do so only for very
special events, and I don't know anyone who'd make plans for something five or
ten years hence. Life just doesn't
provide those guarantees. Of course it
doesn't guarantee a next day, either (I had an uncle who died in his sleep at
age 49), but the opportunities for life to go off the track in 12 or 24 hours
are far fewer than they are in a year or more, as the risks accumulate. I think this is a matter of elementary
statistics. Fortunately, for most of us,
the accumulation of very small odds of demise remains very small. So I put away the decorations planning to be
here to take them back out again next Thanksgiving.
A friend of mine who reads these letters wrote an email
to me after reading the 2016 edition:
"Speaking of science, I discerned (and it was difficult not to)
that much of your letter this year spoke of death." Heavens, I certainly didn't intend the letter
to be a memento mori! So I will avoid
the topic in these messages (mostly).
Before
going on to other matters, however, I did run across an article discussing who
lives longer, people who eat meat versus people who do not. (More likely it is those who do not eat meat,
although the evidence seems to be somewhat mixed.)
What
caught my attention, however, were the opening two lines of the article: "Our ability to live a long life is
influenced by a combination of our genes and our environment. In studies that involve identical twins,
scientists have estimated that no more than 30 percent of this influence comes
from our genes, meaning that the largest group of factors that control how long
a person lives is their environment."
I confirmed this with a long-time colleague who studied identical twins
and the relative impact of genetics and environment on a multitude of human
traits.
The
number, 30%, disappointed me. I was
hoping it would be higher, since my forebears (mostly) lived to ripe old ages. So I have to take more responsibility for my
lifespan and wellbeing than I want to.
OK, so much for death and dying.
* * *
As
2016 came to an end, the majority of media outlets and op-ed pieces and many of
the people I know were glad to see it go.
Especially for those who consider themselves on the progressive or
liberal end of the political scale, much else that might have been good during
the year was overshadowed by the outcome of the U.S. elections. (Even my principled conservative friends have
been dismayed and often revolted by the events of late 2016 and into
2017.) In addition, many cultural icons
departed. It was a disheartening end.
Jesse
Miksic's take, in Berfrois, was one
that struck a chord with me. He wrote
that 2016
was
the last year we saw traces of reality, even just in the rear-view
mirror.
. . . What we didn’t realize was that,
in 2016, the rational bubbles in global politics and economics . . . the
narratives that made sense, that gave some warmth to universalist and
altruistic and aesthetic principles . . . those were gasping their last gasps,
shivering their last shivers as they prepared to implode. We didn’t realize that from that moment
onward, a sort of depraved incoherence would take permanent root in the bedrock
of reality.
Rationality
stopped being effective as a treatment for the ridiculous and the idiotic. We inherited Trump, Brexit, Putin, and Assad
as the foreground. "How could you
make sense of any of them? Each of them
represents, in some fashion, a triumph of 'reality' over intelligibility. . .
. Brexit and Trump . . . turned populism
against liberal democracy. They signaled
the reign of a new tribalism, linked appropriately to the ascendance of a new
global authoritarianism."
In a
future message I'll recount a short exchange with a friend about withdrawing
from participation in politics. Miksic
raised the question as well.
After
2016, those of us who weren’t cut out to be heroes just had to retreat. As the seasons became surreal, rationality
lost its promise of sanctuary. . . .
Some of us had to look closer to home for our solace . . . we had to
step back from the grand arcs, now reduced to rubble, and just focus on the
empty back-roads behind our parents’ houses.
It
was cowardly, these things we had to do.
We couldn’t all be fighters. We
sometimes just had to remind ourselves that we were here, creating little
sparks of meaning in an era where meaning was collapsing.
How is one to be a "hero" in such an age? "After 2016, heroism became garbled and
buggy. Was it heroic to be ascetic and
non-violent? Or was it heroic to be
infinitely principled and reactionary, now that principles were all being
burned on the trash-fires of history?
Compromise became dangerous and counterproductive, because 2016 had
proven that impulse and misinformation and sabotage were more effective than
forging alliances and making commitments. . . .
America changed for the worse. Democracy
spoiled into a one-party authoritarian government in perpetual war against
populist discontent."
The future turned out to be much worse than we'd expected
during most of 2016. The question is,
how much of this kind of exposition is appropriately apocalyptic and how much
will turn out, over time, to have been overblown? I don't know; I'm afraid it's the former but
I'm hoping like heck it's the latter.
2017 hasn't turned me into an optimist on that score.
That same issue of Berfrois had this summary of the
year: ". . . a hexed 2016, the year
of unironic Darth Vaders taking power across the globe. A radical sadness—a sadness unknown to this
or the previous generation or the generation before that—has gripped the good
people of this United States."
This pessimism about the year in various quarters
prompted me to reflect on *my* year in a larger sense, something I don't do
much. All in all, for me personally,
2016 had far more positives than negatives:
a marvelous trip to Italy with Kathy and our boys; a fun venture around
Florida with Kathy; Krystin getting to a place where she's healthy and happy;
retirement that has turned out to be much more fun than I expected; and Elliott
graduating from college. Life in
general, moreover, was on an even keel, with continued and continuing rewarding
relationships with friends and family, so I really can't complain much about
2016. Too bad that warm feeling was
overwhelmed by political developments.
* * *
Like the phrase "maiden name," the custom of only women wearing wedding rings is now antique. But it wasn't all that long ago that in the
vast majority of marriages in the U.S., only the woman wore a ring. There's an interesting history behind the
evolution of the practice of the double-ring ceremony. (The custom of wedding rings goes back to
pharaonic Egypt as well as ancient Rome and Greece.)
It was World War II that did it, just as it affected many
things in the life of the West (and the planet). In the 1920s the jewelry industry, hoping to increase sales, had tried to promote men's engagement rings. According to Livia Gershon in
JSTOR Daily, "the effort was a colossal failure. . . . [I]t simply didn’t fit a historical moment
when marriage as a union of equals was a fringe view and consumption was a
distinctly feminine activity. It didn’t
help that, within a few years of the campaign’s launch, the Depression led to
declining marriage rates. For couples
who did tie the knot, the idea of paying for a second ring was often a
non-starter."
In the 1940s, however, between a change in the view of
marriage and another push by the jewelry industry, the custom of men wearing
wedding rings became the norm. As men
went off to war, the ring was a link to home.
After the war, celebrities followed suit; Humphrey Bogart, for example, wore a ring when he
married Lauren Bacall. Before the war,
about 15% of nuptials had two rings; after the war it was 80% (according to the
website Gretna Green Wedding Rings; I have no idea of the actual source of the
data, but the latter number accords with my experience). Now the practice of both parties wearing
wedding rings is so widespread that the absence of a ring signals that one is
single, or so says Wikipedia.
I only have one friend left who was married shortly after
WWII. He told me that they "just
assumed that every couple exchanged rings, and so we did, too." My dad wore a wedding ring, but inasmuch as
he died in 2005, I can't ask him what he and my mother were thinking when they
each had a ring.
One
change in the view of marriage, so it seems, was that women were no longer
being "given away" by one person on whom they were dependent (their
father) to another on whom they would be dependent (their husband). It's still a quaint custom for the father (or
substitute, as in the case of my mother) to walk the bride down the aisle in a
church wedding, although more often than not, in my experience, both parents do
the walking with the bride. Does that
mean the bride, dependent on mom and dad, is being given to someone else? Probably not.
Most marital vows I've heard in the last several decades don't subscribe
to the proposition that the woman is a dependent of the man.
Some cultures call for wearing the wedding ring on the 4th
finger of the right hand, some on the 4th finger of the left. There are various theories of why each custom
emerged. There appears to be a general
consensus, however, that the 4th finger is the one least used in
day-to-day life, so wearing a ring on that finger won't disrupt daily
activities. That view obviously
developed before the advent of typewriters and computer keyboards, but even now
a wedding ring doesn't interfere with use of a keyboard.
The antique practice that has not disappeared is that it
is only the woman, the bride-to-be, who wears an engagement ring. The campaign of the 1920s has never
returned. (I don't know what the
practice is among gay and lesbian couples.)
In our case, Kathy had an engagement ring, and as is widespread practice
among the couples I know, she wears it with her wedding ring. (It never dawned on us to get me an
engagement ring.)
* * *
It is interesting to me that the Norman Conquest plays a
role in modern politics. An event in
1066 reverberates today, as more than just faint echoes from nearly 1000 years
ago. An article with the title "Old
English Has a Serious Image Problem" led me to understand the
reverberation. My first reaction to the
title was "Really? Who even cares
about Old English, much less worries about its image?" I learned who cares and why from Mary Dockray-Miller, a professor of English in the College of Liberal Arts and Sciences at Lesley University (Massachusetts) and an expert on medieval studies and English language history.
Old English, for those who may not know (I among them),
is the language spoken in Britain before the Normans invaded under William the
Conqueror in 1066 and brought their French to the court and the nobility. It's also called Anglo-Saxon, after the
Germanic peoples who migrated to Britain during the 400s after the fall of
Rome. Those Germanic settlers supplanted the Romano-British natives (whose culture was a fusion of the local Celts, there since roughly 1200 B.C., with the Romans, who arrived in Britain in the year 43). So the Anglo-Saxons, who pushed
aside the Romano-British natives, were in turn pushed aside by the
Normans. All the evidence suggests,
however, that this wasn't genocide; genetic studies seem to demonstrate that
the majority of English are descended from Anglo-Saxons, not Normans. But the Normans were dominant culturally and
economically.
Anyway, Old English became a subject of study in U.S. higher education (Thomas Jefferson, among others, studied it), especially after the Civil War. You would think
this a pretty innocent, if rather antiquarian, linguistic pursuit. If you did, you would be wrong. It appears, so Professor Dockray-Miller
contends, that
the
study of Anglo-Saxon played a part in the more general cultural belief—prevalent
at the time—in the superiority of northern European or "Anglo-Saxon"
whiteness. In 2017 . . . the American
neo-Nazi movement that calls itself the "alt-right" is resurrecting
medievally tinged celebrations of "European heritage" as part of its racist
agenda.
Inside colleges and universities in the 19th century, "Anglo-Saxon" mostly referred to the study of the language
of pre-1066 Britain.
Outside
the university, however, the phrase "Anglo-Saxon" did not refer to
early medieval English. Instead, it was
racial and racist, freighted with assumptions of privilege and
superiority. The cultural rhetoric of
Manifest Destiny specifically defined "Anglo-Saxons" as superior to
enslaved and free Africans, Native American Indians, Mexicans, and numerous
other groups defined as non-white, including Irish and Italian immigrants. . .
. These racist associations stemmed from
the medievalism and Anglo-Saxonism bred by nineteenth-century racial and
political theorists, who used supposedly scientific and religious proposals to
argue for the superiority of the Anglo-Saxon race.
The relative positions of the Normans and the
Anglo-Saxons were used by the South before and after the Civil War (in ways
that were grossly historically inaccurate).
Southerners identified more with the Normans (and feudalism) before the
war, in some half-assed justification of slavery; after the war, they switched
identification to the Anglo-Saxons, the people whose country was invaded but
who were determined to retain their own culture. (Part of this may be an outgrowth of the
notion of the "Norman Yolk," the idea that the nasty Normans imposed
restrictions on a supposedly freer Anglo-Saxon society. The Norman Yolk argument wasn't developed
until 500 years after the conquest, however, and was a political tool for the
16th century, not a studied historical conclusion. But it nonetheless survives in some
quarters.)
Meantime, by modern standards, as time passed the academy
was not without sin. In 1916 M. Carey Thomas, professor of Anglo-Saxon and president of Bryn Mawr (the women's college), addressed the college.
If
the present intellectual supremacy of the White races is maintained, as I hope
that it will be for centuries to come, I believe it will be because they are
the only races that have seriously begun to educate their women. . . . almost
all of our student body are early time Americans. . . . Our Bryn Mawr College students therefore as a
whole seem to belong by heredity to the dominant races.
"The study of
Anglo-Saxon . . . illustrates the overt and implicit racism at the turn to the
twentieth century that prioritized 'Anglo-Saxon' whiteness over all other
racial categories." The racism may
not have been enthusiastically endorsed in all quarters, but it was accepted. (One of my colleagues on the English faculty
takes issue with this claim.
"Anglo-Saxon was studied at Oxford because it was the beginning of
English; the curriculum was 'from Beowulf to Virginia Woolf.'")
In modern higher education, there's no issue because
almost no one teaches Old English anymore (outside of the occasional graduate
program) and any works from the period are read in translation. In modern politics on the right, however, it
may indeed remain an issue.
"American neo-Nazis . . . claim to glorify the 'greatness' of their
'European heritage.'" Not only is
that twaddle, it's arguably choosing the wrong horses. There's little doubt it was the Normans who
built England into the powerhouse that became the British Empire—for good or for ill. So if the neo-Nazis want to claim successful and powerful forebears in their British heritage, it should be the Normans, not the Anglo-Saxons.
(In the current pejorative popular in certain circles, the Anglo-Saxons
were losers!) The idea that either the Normans or the
Anglo-Saxons would want anything to do with the neo-Nazis, however, is
doubtful.
Nonetheless,
my point remains: it's fascinating (to
me) that an event nearly 1000 years ago, about which I am certain the vast
majority of people (quite understandably) know nothing, still weasels its way
into current politics, even if on the lunatic fringe.
* * *
It
seems that it's not just the field of Old English. The medievalists in general are facing a
problem.
Apropos
of the Norman Conquest and Anglo-Saxons, it appears that those in the broader
field of medieval studies find themselves in a puzzle succinctly summarized by
an article title in Pacific Standard: "What to Do When Nazis Are Obsessed With
Your Field." One medievalist, David
Perry (professor of history at Dominican University) wrote that "mostly
we're just a collection of predominantly white scholars who are surprised and
disturbed to discover our classes and books might be well-received by white
supremacists. Having discovered it, the
question is what to do."
The
problem is that "white supremacists explicitly celebrate Europe in the
Middle Ages because they imagine that it was a pure, white, Christian place
organized wholesomely around military resistance to outside, non-white, non-Christian
forces." Except, as the historians
will point out, it wasn't. The
supremacists cite the Holy Roman Empire (a polyglot empire if ever there was
one), the Knights Templar (central to the Crusades and active from the early
1100s until the Pope abolished the order in 1312), "Vinland" (the medieval
Viking name for North America, including a dreamt-up theory that whites are
native to the continent), and various other elements of medieval history. These claims about the Middle Ages are supported neither by historical research nor by historians. As we know, however, facts are often
irrelevant to these psycho-historical affiliations that bear little
relationship to reality (no matter how disputed history might be). "The alt-right’s 'fantasy' of the
medieval past couldn’t be further from the truth, says Suzanne Akbari, director
of the University of Toronto’s Centre for Medieval Studies. 'The medieval past
is actually highly integrated, highly diverse, with a tremendous amount of
cultural interchange. Reconstructing
those histories of exchange, both cultural and economic, is a very vital area
of our field.'"
Part
of the problem, hinted at in Perry's comments, is that the field itself is
pretty much white. It's also been
focused on Europe and, so some say, resistant to changing its approach (e.g.,
looking at critical theory or examining the biases in the disciplinary work
itself). That conservatism is beginning
to change, but in the meantime those who don't look positively on the alt-right
or white supremacists face the question of how to deal with their embrace of
medieval studies.
This
is a case where no attention is better than bad attention.