Friday, September 29, 2017

#13 the worth of history or not, history in the digital age, napkins, Elliott's conscience



The present engenders the past far more energetically than the other way around.  --Joseph Brodsky

But as no doubt many of you know, "the past is a foreign country:  they do things differently there."  --L. P. Hartley

            I have on occasion remarked on what seems to be the pointlessness of studying history, in part because people don’t take from it what they could, in part because it is difficult to draw lessons from history (it doesn’t repeat itself, but it sometimes rhymes), and in part because, for most of us, it has few implications for our daily lives.  I took that contrarian view despite the fact that most of my avocational reading, for most of my life, has been history—because I enjoy it.

            Farnam Street, a weekly blog (and one of the better ones around, in my opinion), featured a short discussion of, and excerpts from, a book by B. H. Liddell Hart, who was primarily a military historian.  He was also a British officer in WWI and lived to see both WWII and much of the Cold War (he died in 1970).  Hart’s final work, unfinished, offers among the most graceful and compelling reasons to understand history—but also to be an unapologetic advocate for truth.  I suppose that position verges on pious niceties about virtue and wisdom, but I never tire of encountering such admonitions.  I wish I could write as well as he did.  Forgive the excerpts, but I think they are splendid.

What is the object of history? I would answer, quite simply—"truth."  It is a word and an idea that has gone out of fashion.  But the results of discounting the possibility of reaching the truth are worse than those of cherishing it.  The object might be more cautiously expressed thus: to find out what happened while trying to find out why it happened. In other words, to seek the causal relations between events.  History has limitations as guiding signpost, however, for although it can show us the right direction, it does not give detailed information about the road conditions.

But its negative value as a warning sign is more definite.  History can show us what to avoid, even if it does not teach us what to do—by showing the most common mistakes that mankind is apt to make and to repeat. A second object lies in the practical value of history.  "Fools," said Bismarck, "say they learn by experience.  I prefer to profit by other people’s experience."

The study of history offers that opportunity in the widest possible measure. It is universal experience—infinitely longer, wider, and more varied than any individual’s experience. How often do people claim superior wisdom on the score of their age and experience. The Chinese especially regard age with veneration, and hold that a man of eighty years or more must be wiser than others. But eighty is nothing for a student of history. There is no excuse for anyone who is not illiterate if he is less than three thousand years old in mind.

            What a challenging proposition:  "no excuse" for not being 3,000 years old in mind.  For those who are literate but working two minimum-wage jobs to hold house and family together, Hart’s edict may be a little extreme.  When could they possibly find the time and energy to read enough history to become sophisticated enough to analyze current events in that light?  And when half of Americans don’t even vote in one of the most heated and divisive presidential campaigns in American history?

History is the record of man’s steps and slips.  It shows us that the steps have been slow and slight; the slips, quick and abounding.  It provides us with the opportunity to profit by the stumbles and tumbles of our forerunners.  Awareness of our limitations should make us chary of condemning those who made mistakes, but we condemn ourselves if we fail to recognize mistakes.

            In this Hart seems to speak to the apparent fragility of a democratic society.  The steps are slow, the slips are quick.  We will see what 2017 brings, but from the vantage of early in the year, and of one who likes progressive views, integrity, and honesty in elected officials, one can argue that the United States "slipped" badly.  Germany slipped badly in 1932 and it took 13 years, a world war, millions of deaths, and unspeakable human horrors to correct the course.  I can hope fervently that 2016 is not a slip that parallels 1932, and while the situations are vastly different in many ways, there are enough similarities that one must worry a little.

There is a too common tendency to regard history as a specialist subject— that is the primary mistake.  For, on the contrary, history is the essential corrective to all specialization.  Viewed aright, it is the broadest of studies, embracing every aspect of life.  It lays the foundation of education by showing how mankind repeats its errors and what those errors are.

We learn from history that men have constantly echoed the remark ascribed to Pontius Pilate—"What is truth?"  And often in circumstances that make us wonder why. It is repeatedly used as a smoke screen to mask a maneuver, personal or political, and to cover an evasion of the issue. It may be a justifiable question in the deepest sense.  Yet the longer I watch current events, the more I have come to see how many of our troubles arise from the habit, on all sides, of suppressing or distorting what we know quite well is the truth, out of devotion to a cause, an ambition, or an institution—at bottom, this devotion being inspired by our own interest.

I see this last observation as hauntingly appropriate for our times; one sees such suppression happening constantly in American politics, both national and local.  And truth to tell (yes, the word is chosen purposely), it is the right that is far more guilty than the left when it comes to "suppressing or distorting what we know quite well is the truth," if "truth" is defined as demonstrable facts rather than baseless claims.  Hart pleads for living in a fact-driven world; instead we have devolved to fact-free exchanges (it is an assault on the meaning of the term to call it "discourse").

We learn from history that in every age and every clime the majority of people have resented what seems in retrospect to have been purely matter-of-fact comment on their institutions.  We learn too that nothing has aided the persistence of falsehood, and the evils resulting from it, more than the unwillingness of good people to admit the truth when it was disturbing to their comfortable assurance.  Always the tendency continues to be shocked by natural comment and to hold certain things too "sacred" to think about.

I can conceive of no finer ideal of a man’s life than to face life with clear eyes instead of stumbling through it like a blind man, an imbecile, or a drunkard—which, in a thinking sense, is the common preference.  How rarely does one meet anyone whose first reaction to anything is to ask "Is it true?"  Yet unless that is a man’s natural reaction it shows that truth is not uppermost in his mind, and, unless it is, true progress is unlikely.

            Given the widespread concerns about "fake news" and how it may have affected the 2016 U.S. elections, I can hardly think of a more pertinent question that every citizen of the Republic should be asking.  Every day.

            Hart offers a rather harsh view of democratic governments that I don’t think can be justified by careful analysis of the data available on national innovation and the ability of those with great talent to exercise it.

We learn from history that democracy has commonly put a premium on conventionality. By its nature, it prefers those who keep step with the slowest march of thought and frowns on those who may disturb the "conspiracy for mutual inefficiency."  Thereby, this system of government tends to result in the triumph of mediocrity—and entails the exclusion of first-rate ability, if this is combined with honesty.  But the alternative to it, despotism, almost inevitably means the triumph of stupidity. And of the two evils, the former is the less.  Hence it is better that ability should consent to its own sacrifice, and subordination to the regime of mediocrity, rather than assist in establishing a regime where, in the light of past experience, brute stupidity will be enthroned and ability may preserve its footing only at the price of dishonesty.

I have to wonder what Hart was thinking about when he wrote the first two sentences in that paragraph.  They don’t align with my observations about history.  Even if he’s right about democracy, however, he takes the Churchillian view that democracy may be terrible but it’s better than all the alternatives.

            Is it possible that the triumph of stupidity can precede rather than follow the move to despotism?

            Hart is also of the view that the pursuit of truth through the study of history is not easy.  One "has to learn how to detach his thinking from every desire and interest, from every sympathy and antipathy—like ridding oneself of superfluous tissue, the 'tissue' of untruth which all human beings tend to accumulate for their own comfort and protection.  And he must keep fit, to become fitter.  In other words, he must be true to the light he has seen."  I don’t know many people who are able to do this.  I try, but I’m sure I’m not entirely successful.

            Finally, Hart takes a position contrary to one I espoused a few years back:  he maintains there are implications of history for personal life, and in a surprisingly simple fashion:  "the experience contained in history is seen to have a personal, not merely a political, significance.  What can the individual learn from history—as a guide to living?  Not what to do but what to strive for.   And what to avoid in striving.  The importance and intrinsic value of behaving decently.  The importance of seeing clearly—not least of seeing himself clearly."   Behaving decently:  something that has come close to vanishing from American politics.

            Lord Acton, the English historian and Catholic leader, about whom more in a few pages, had a view about history that modern historians are reluctant to embrace. 

There are principles, he believed, that transcend specific historical periods, and the historian should not hesitate to condemn cruelty or injustice whenever it occurred.  "Historical responsibility has to make up for legal responsibility. . . ."  Unjust men should not escape the judgment of history merely because our time may be more "advanced" than theirs.  To hold otherwise would mean that "we have no common code; our moral notions are always fluid; and you must consider the times, the class from which men spring, the surrounding influences . . . until responsibility is merged in numbers."

            Hart urges us to learn what to strive for, behaving decently.  Acton argues we can apply standards of decency (condemn "cruelty or injustice" and judge by a "common code").  Should one do both?  One can make the argument that the Nuremberg trials of Nazis after World War II were a bow in Acton's direction.

The German philosopher G. W. F. Hegel had a different take on history.  "We learn from history that we do not learn from history."




* * *

If A. J. P. Taylor is right, that "the present enables us to understand the past, not the other way round," understanding the past may be more difficult as time passes because the writing of history has become more complicated since Hart’s day.  From the Chronicle of Higher Education:  "The Problem of History in the Age of Abundance," written by a Canadian professor of, among other things, digital history, Ian Milligan.

Milligan describes the difficulties and the ethical questions that modern historians face given the enormous amount of information that has accumulated on the web in just the past 20 years.  He begins by recalling GeoCities, which provided free websites to everyone.  In 15 years the site grew from a few thousand users to over seven million, with 186 million different URLs by the time it was closed down in 2009.

Web archives offer the power to peer into the minds and thoughts of millions of people. The ethical onus lies on researchers. Historically, ordinary people did not leave behind many records, forcing historians to learn about them from the scant moments when they came into contact with large record-keeping institutions like censuses, churches, poor rolls, or the criminal-justice system. Between 1674 and 1913, the Old Bailey, the central London legal court, collected the transcripts of 197,745 trials. Today’s Old Bailey website describes its holdings as the "largest body of records documenting the lives of non-elite people ever published." This is not hyperbole. For 239 years, 197,745 trials are a good standard of historical documentation. Compare that with the seven million users and 186 million "documents" that were generated on Geo­Cities in 15 years, and which represent only a fraction of the web of that era, and you get a sense of the enormous scale historians now confront. 

Milligan acknowledges that even today not all "ordinary people" are captured in the archives of the web, because web access is still constrained by race and wealth and geographic location.

Milligan maintains, I think correctly, that any historian who ignores the information on the web—no matter the topic—will write bad history.  It would, he said, be "intellectually dishonest" not to rely on personal testimonials and other histories and commentary, but who has sufficient time to read everything that might be pertinent to one's historical inquiries?  In reaching into the deep past (pre-web), historians have to scrounge for materials (surviving correspondence, legal records, etc.); now their problem is that they are deluged with so much material it can hardly even be sifted, much less read and carefully analyzed and evaluated.  Web archiving began in 1996; the archives now hold 445 billion web pages and (as of 2014) over 10 petabytes of storage (a petabyte is 1,000 terabytes).

The mechanisms to deal with this avalanche of information are problematic.  Search algorithms produce what the algorithm is supposed to produce—but might not produce what the historian needs (and the historian won't even know what's missing).  As Milligan points out, everyone is driven by their search engines:  if we're looking for something, we'll usually read the first page of hits and not much more.  But if the point is to write a history, will you read the 100th and the 1,000th hit?  As he points out, if the writer relies solely on the algorithms, the algorithms should be credited as co-authors.

What is one to claim is our "cultural heritage" when we have more "heritage" than we can ever absorb?

There are also ethical issues.  Many web pages are/were created by people who didn't realize that those pages would have continued life in an Internet archive somewhere.  They didn't have the means to exclude them from web searches, nor will they necessarily be asked for consent to use them in research.  (The same is true for Twitter messages; they are public, but it may not be academically legitimate to include them in research without the consent of the person who tweeted.  I wonder how that concern will apply to the tweets of Mr. Trump.)  At the same time, the guesstimate for the average life of a web page is 100 days; if pages are not captured in an archive, they can disappear—which means still more history is lost.

Milligan argues, I think correctly, that the pages should be collected and retained, if history is to be accurate.  Moreover, if treated like publications—which they resemble in some ways—there is a legal right to cite and quote from them.  To also use them ethically, he suggests the same standards applied to oral historians:  their work needs to be considered by an Institutional Review Board, a body charged with approving research projects to ensure they meet appropriate ethical, safety, legal, and other applicable standards.  Yes, IRBs have themselves been subject to criticism (see, for example, recent controversies at the University of Minnesota), but they're better than no review at all and we haven't discovered any flawless method of review.

Milligan concludes:

Ultimately, web archives offer power: the power to reconstruct online lives, to peer into the minds and thoughts of millions of people from 20 years ago, and to move toward a potentially more democratic form of history. We need to think about the role of algorithms, to reach out to our librarian and archivist colleagues, and to begin a broader conversation about the implications. Without taking these steps, we will not be able to write honest histories of the 1990s or beyond.

The issue of search algorithms showed up again in the Chronicle of Higher Education a month later, with a darker hue.  ("Google and the Misinformed Public")

            The problem is the lack of curation, allowing the widespread distribution of what we have come to know as "fake news."  Even though Google and Facebook "may disavow responsibility for the results of their algorithms, . . . they can have tremendous — and disturbing — social effects.  Racist and sexist bias, misinformation, and profiling are frequently unnoticed byproducts of those algorithms."  Libraries and other agencies evaluate the credibility of sources; organizations like Google and Facebook don't—or if they do, how they do it is not transparent.

That misinformation can be debilitating for a democracy — and in some instances deadly for its citizens. Such was the case with the 2015 killings of nine African-American worshipers at Emanuel A.M.E. Church in Charleston, S.C., who were victims of a vicious hate crime. In a manifesto, the convicted gunman, Dylann Roof, wrote that his radicalization on race began following the shooting death of Trayvon Martin, an African-American teen, and the acquittal of his killer, George Zimmerman. Roof typed "black on White crime" in a Google search; he says the results confirmed (a patently false notion) that black violence on white Americans is a crisis. His source? The Council of Conservative Citizens, an organization that the Southern Poverty Law Center describes as "unrepentantly racist." As Roof himself writes of his race education via Google, "I have never been the same since that day."

Roof’s Google search results did not lead him to an authoritative source of violent-crime statistics. FBI statistics show that most violence against white Americans is committed by other white Americans, and that most violence against African-Americans is committed by other African-Americans. His search did not lead him to any experts on race from the fields of African-American studies or ethnic studies at universities, nor to libraries, books, or articles about the history of race in the United States and the invention of racist myths in the service of white supremacy. Instead it delivered him misinformation, disinformation, and outright lies that bolstered his already racist outlook and violent antiblack tendencies.

Professor Noble at UCLA, who wrote the article, observes that searches can erroneously simplify complicated issues and provide no index of "veracity or reliability."  Searches can appear impartial, but no one knows the algorithms used by Google and other firms because they are treated as intellectual property, and in many cases the searches don't produce information that we would normally acquire from teachers, books, and other legitimate sources.  Professor Noble argues for holding search platforms accountable and perhaps even regulating them.  I don't know if that could work, and I'd worry about who the regulators would be, but her larger point, that search algorithms can lead people astray, is surely legitimate.  I would like to think that those who have a decent education and who are modestly intelligent can tell the wheat from the chaff, but I have no data to support that claim.

* * *

            I am certain that every human being alive has quirks and idiosyncrasies (vis-à-vis their own culture).  I have mine, and one of them occupied my time for part of an afternoon last December, as it does periodically.  It’s napkins, specifically linen dinner napkins.

            Most people, reasonably, pay very little attention to napkins, other than to get them in their laps.  My idiosyncrasy is that I can’t stand it when the corners and edges of my linen dinner napkins don’t align when I have set the table in preparation for hosting friends or family.  It’s specifically linen, too, because napkins made of other fabrics come out of the dryer, get folded, and look just great.  (I’d like to say this is my only idiosyncrasy, but others who know me well, like my wife and children, might have a different view.)

            All of my dress/work clothes and the tablecloths go to the laundry to be done by someone else.  I haven’t ironed any of my own clothes for decades because I am not competent; no matter how hard I worked at it—those many years ago—I could not make the clothing look good.  So I decided to hell with it, I’ll bring them in to be done professionally.  But I cannot get the laundry people to get the napkins right—the corners are often off by an inch or more, they are not folded on center, etc.  They are neatly pressed—and crooked.

            So once or twice per year, as soon as I’ve run through the supply of linen dinner napkins, I wash them and then iron them myself.  And *I* get them aligned and the corners touching.  I suppose life would be easier if I just got rid of the linen napkins, but I will always remember my friend the late Dagny Christiansen one time exclaiming, when we had her and her husband Holger for dinner, something like, "oh my, linen napkins.  We only use those for our best friends."  I just looked at her.  She smiled.  What she didn't need to know was that some of the napkins were older than I am; my great-aunt Inez bought them in the 1940s and I inherited them.  If nothing else, they are a tribute to the strength of linen.  I still use them.  (When Kathy and I were in Ireland a few years ago, I bought a dozen or more Irish linen napkins, so replacements are at hand.)

            But then, as I sometimes do the older I get, I said to myself, "OK, Gary, you have X number of hours left on the planet if you live out your normal life expectancy; does ironing napkins fall under rule #2 that you established?"  (2.  Must it be done?  No.  Will I have fun doing it?  No.  Then I won't do it.  3.  Must it be done?  No.  Will I have fun doing it?  Yes.  Then I'll do it.)  I may soon reach the point where the answer will fall under #2.  I’ll let the unmatched corners be damned.  But for now, I still like giving dinner guests neatly pressed linen napkins.

* * *

            I'm not sure how to categorize this, but there is something I find humorous and a sign of how far we have come from real life.  Several times last winter I was in a setting where people turned on their TV sets to a fireplace fire, complete with crackling sounds.  Heavens.

* * *

Elliott and his dad, texting last fall (thus the abbreviated language)

Elliott:  Have said before sometimes I wonder why I in school when I could have just been a conman for a living.  Just saw another story of two English women catching and bottling "fresh country air" and selling it, mostly to Chinese people, for $115 a gallon.  And people are buying it.

I could have thought of that.  My conscience is keeping from being rich.

Dad:  The people on the religious right would say you couldn't have a conscience because you aren't religious.  So what's stopping you?

Elliott:  Last thing I ever want to do is support their ideas through my actions.



Wednesday, September 27, 2017

#12 living a long time to Mencken critique to Mencken on Gatsby to Fitzgerald and Princeton football to the Big Ten to Crisler departure to old scandals to University of Minnesota football – a series of segues





            I was musing in writing last year, via relating a conversation that Kathy, Elliott, and I had had, about living forever.  I happened upon an article about Ray Kurzweil, a futurist who has made a number of predictions that were borne out by events.  He’s best known for declaring that "the Singularity" will occur in 2045, at which point human and technological intelligence will be merged.  In the meantime, however, he maintains that humans will live forever, and that this will begin by approximately 2029.  Eternal life will be achieved by incorporating nanobots in our bodies to take over where the immune system fails; these little bots will race around our bodies repairing whatever goes wrong.  "Kurzweil imagines a future in which technology is so advanced that humans are able to hack the body and significantly increase our lifespans — perhaps forever."  (This isn’t wildly outside the bounds of current science; there is research going on in nanobots attacking cancer.)

            A friend of mine who’s a biologist, however, is skeptical.  "What biologists usually find when we simplify the human body (or any organism's body) is that we are being naive.  I am pretty confident that nanobots will have a place in human therapy, but we don't understand the complexity of the body well enough to program nanobots to fix everything.  So you survive cancer and die of an aneurysm or survive both of those and Alzheimer's gets you.  So my take is that he is being incredibly optimistic.  Of course, his prediction is only adding a few years, not living forever yet.  And if we lived forever, this planet would be in deep doo doo.  As would our economy, family relations, etc.  Imagine, your 7,000-year-old son moving back in with you."

            One of my favorite quotable sources in American history, H. L. Mencken, commented on living forever.  He, among many others, was asked by historian Will Durant to reflect on life, religion, and the purpose of life.  I share his contentment about simply disappearing.

I do not believe in immortality, and have no desire for it. . . .  What the meaning of human life may be I don’t know:  I incline to suspect that it has none.  All I know about it is that, to me at least, it is very amusing while it lasts.  Even its troubles, indeed, can be amusing.  Moreover, they tend to foster the human qualities that I admire most—courage and its analogues. The noblest man, I think, is that one who fights God, and triumphs over Him.  I have had little of this to do.  When I die I shall be content to vanish into nothingness.  No show, however good, could conceivably be good forever.

            My biologist friend was familiar with the Mencken quote.  "It is interesting that Mencken hits on a question that comes up many times in our biology classes:  students are always looking for the purpose behind life.  And at its base, the purpose of life is simply to make more life.  If you look at our evolutionary adaptations, they serve to help us reproduce.  So if we lived forever and stayed true to our biology, this planet's human population would get to extremes!"

* * *

            In pursuit of candor, I must acknowledge that Mencken has his severe critics, and rightfully so.  He was both racist and anti-Semitic, in addition to being an unmerciful critic of democratic society and an admirer of Germany.  In 2003 the art critic and essayist Hilton Kramer—himself no liberal—reflected after re-reading some of Mencken.

Even the political reporting that once gave me a chuckle now strikes me as more dispiriting than amusing.  The facile rhetoric of remorseless, uproarious ridicule that made Mencken a culture hero in the 1920s turns out, in retrospect, to have been exactly what Irving Babbitt said it was in 1928—"intellectual vaudeville," full of bluster and farce aimed at what now seem easy targets, but thin in intellectual substance and woefully lacking in a sense of history.  [That Mencken was seen as] vastly entertaining, and indeed liberating, by a great many intelligent people is not to be doubted. But not to be doubted, either, is that ours was a very different country and a very different culture in the early years of the twentieth century.  It was a far more provincial country with a far more philistine culture than comparable readers would find tolerable today.

Mencken was wildly popular on college campuses in the 1920s, but also liked by some of the leading figures in American political and intellectual life, such as Justice Oliver Wendell Holmes.  Mencken crashed and burned, however, in later years, and is now largely unread except for his witty quips about American politics and society.  So I quote him from time to time, but with an understanding of his limits.

            One Mencken piece was making the rounds after the November 2016 elections (and often incompletely, with words added that Mencken didn’t pen, or condensed in a way that hid Mencken’s point).  What Mencken wrote in 1920 about political contests was this:

The larger the mob, the harder the test.  In small areas, before small electorates, a first-rate man occasionally fights his way through, carrying even the mob with him by force of his personality.  But when the field is nationwide, and the fight must be waged chiefly at second and third hand, and the force of personality cannot so readily make itself felt, then all the odds are on the man who is, intrinsically, the most devious and mediocre — the man who can most adeptly disperse the notion that his mind is a virtual vacuum.

The Presidency tends, year by year, to go to such men.  As democracy is perfected, the office represents, more and more closely, the inner soul of the people.  We move toward a lofty ideal.  On some great and glorious day the plain folks of the land will reach their heart's desire at last, and the White House will be adorned by a downright moron.

            Does Mr. Trump represent the candidate who’s "the most devious and mediocre" and able to "adeptly disperse the notion that his mind is a virtual vacuum"?  I have to say he sure tried to live up to Mencken’s description.  I don’t think he’s a moron, however, although I don't think he's actually very smart. 

            Another from Mencken’s oeuvre, less well known, can be read as commentary as current as anything one can read on the web or news media.  It’s the opening sentence in one of his collected works, from an essay "On Being an American."  (This is all one sentence!)

And here, more than anywhere else that I know of or have heard of, the daily panorama of human existence, of private and communal folly—the unending procession of governmental extortions and chicaneries, of commercial brigandages and throat-slittings, of theological buffooneries, of aesthetic ribaldries, of legal swindles and harlotries, of miscellaneous rogueries, villainies, imbecilities, grotesqueries, and extravagances—is so inordinately gross and preposterous, so perfectly brought up to the highest conceivable amperage, so steadily enriched with an almost fabulous daring and originality, that only the man who was born with a petrified diaphragm can fail to laugh himself to sleep every night, and to awake every morning with all the eager, unflagging expectation of a Sunday-school superintendent touring the Paris peep-shows.

Mencken’s cynicism shows, but the daily panorama isn’t much different from when he wrote that in the 1920s.

* * *

            Not to dwell on Mencken, but I will anyway.  I happened across a book review Mencken wrote that appeared in the May 3, 1925 Chicago Tribune.  The review was of F. Scott Fitzgerald's The Great Gatsby.

            I have always been fond of Gatsby, even though it's ultimately quite depressing.  Mencken's review is almost a paraprosdokian:  he starts out seeming to pan the book and then ends up lauding it.  He wrote that it "is in form no more than a glorified anecdote, and not too probable at that. . . .  This story is obviously unimportant and, though, as I shall show, it has its place in the Fitzgerald canon, it is certainly not to be put on the same shelf with, say, This Side of Paradise.  What ails it, fundamentally, is the plain fact that it is simply a story — that Fitzgerald seems to be far more interested in maintaining its suspense than in getting under the skins of its people."

            Mencken praises the improvement he saw.

What gives the story distinction is something quite different from the management of the action or the handling of the characters; it is the charm and beauty of the writing.  In Fitzgerald's first days it seemed almost unimaginable that he would ever show such qualities.  His writing then was extraordinarily slipshod — at times almost illiterate.  He seemed to be devoid of any feeling for the color and savor of words.  He could see people clearly and he could devise capital situations, but as writer qua writer he was apparently little more than a bright college boy.  The critics of the Republic were not slow to discern the fact.  They praised "This Side of Paradise" as a story, as a social document, but they were almost unanimous in denouncing it as a piece of writing.

It is vastly to Fitzgerald's credit that he appears to have taken their caveats seriously, and pondered them to good effect.  In "The Great Gatsby," highly agreeable fruits of that pondering are visible.  The story, for all its basic triviality, has a fine texture, a careful and brilliant finish.  The obvious phrase is simply not in it.  The sentences roll along smoothly, sparklingly, variously.  There is evidence in every line of hard and intelligent effort.  It is a quite new Fitzgerald who emerges from this little book, and the qualities that he shows are dignified and solid.  "This Side of Paradise," after all, might have been merely a lucky accident.  But "The Great Gatsby," a far inferior story at bottom, is plainly the product of a sound and stable talent, conjured into being by hard work.

Thus Fitzgerald, the stylist, arises to challenge Fitzgerald, the social historian, but I doubt that the latter ever quite succumbs to the former.  The thing that chiefly interests the basic Fitzgerald is still the florid show of modern American life — and especially the devil's dance that goes on at the top.  He is unconcerned about the sweatings and sufferings of the nether herd; what engrosses him is the high carnival of those who have too much money to spend, and too much time for the spending of it. Their idiotic pursuit of sensation, their almost incredible stupidity and triviality, their glittering swinishness — these are the things that go into his notebook.

What I find striking in Mencken's review is its immediacy.  He's writing about the top 1%, or at least some of them.

            One recurring question in the field of literature is the place of The Great Gatsby in the American canon.  One of my friends in English told me it's "certainly an important book -- one of the contenders for the Great American Novel, and certainly more often quoted (if only from a few lines) than most others."  He went on to point out that one of the English faculty at the University of Minnesota has edited a collection of Fitzgerald's St. Paul short stories and that Garrison Keillor named a theater in St. Paul after Fitzgerald.

* * *

            Were I inclined to alliteration, I would title the next few paragraphs "Fritz, Fitz, and Football."

            In the course of a little reading about the initial reception of The Great Gatsby, I stumbled across a Wall Street Journal article from 2014 about F. Scott Fitzgerald and football, of all things.

            It seems that Fitzgerald was a rabid Princeton football fan.  According to the Journal article, he decided to go to Princeton after watching the Harvard-Princeton game in 1911.  In case you don't know, the Harvard-Princeton games are a "storied rivalry."  "Once there [at Princeton], he tried out for the team—but got cut on the first day, a well-chronicled disappointment that some scholars believe explains the sense of rejection that permeates his novels, especially 'The Great Gatsby.'"

            Fritz Crisler was head football coach at Princeton 1932-1937, and then went to the University of Michigan (where he essentially built Michigan into the athletic powerhouse it remains to this day).  He gave an interview in 1956 to a romance language graduate student, Donald Yates, which Yates later wrote about for the Michigan Daily; Crisler related that "F. Scott Fitzgerald called him ‘between 12 midnight and six a.m. of the night before our games—not just sometimes, but practically every eve of every home game.’"  Crisler pioneered the practice of creating different offensive and defensive teams—that is, two "platoons"; before that, the players were on both defense and offense.  Crisler is described by the College Football Hall of Fame as "the father of two-platoon football."  What two-platoon football allowed, in essence, was unlimited substitution of players during a game.

            The reporter who wrote the Journal article posed a query:  "The tantalizing question raised by the 1956 interview is:  Did Crisler get the idea [for two platoons] from Fitzgerald?  It is not a subject discussed in the ever-expanding library of popular and academic writing on Fitzgerald. . . .  Scholars who focus on Fitzgerald’s fascination with money, women, booze, jazz and 1920s Paris have never made much of his devotion to a Princeton football team that won 10 national championships in his lifetime.  His life as a devoted fan never fit well in the narrative of Fitzgerald as a tortured artist, heartbroken by his wife’s mental illness and confronted at every turn by commercial failure."

            The author of the 1956 article with the Crisler interview said that Fitzgerald may indeed have been the one who devised the idea of two platoons.
           
During his Princeton years, Crisler told Mr. Yates, his phone would ring late at night before games.  Answering, he would hear the voice of Fitzgerald, calling from Miami, Chicago or Hollywood.  The calls came "between 12 midnight and six a.m. of the night before our games—not just sometimes, but practically every eve of every home game," Crisler told Mr. Yates. Often, behind Fitzgerald’s voice, Crisler heard the laughter and cries of a dying party.

What Fitzgerald called to talk about was Princeton football.  "It wasn’t just a matter of the habitual old-grad spirit and enthusiasm," said Crisler.  "There was something beyond comprehension in the intensity of his feelings. Listening to him unload his soul as many times as I did, I finally came to the conclusion that what Scott felt was really an unusual, a consuming devotion for the Princeton football team."

. . .

He was a smart football fan, though, to judge from that 1956 interview. "Sometimes he had a play or a new strategy he wanted me to use," said Crisler.  "Some of the ideas Scott used to suggest to me over the phone were reasonable—and some were fantastic."

In the fantastic department, Crisler cited an example:  Fitzgerald, he said, "came up with a scheme for a whole new offense.  Something that involved a two-platoon system."

At the time of the interview, the coach was already known as the father of two-platoon football.  But Mr. Yates didn’t know that.  "I didn’t pay a lot of attention to sports," says Mr. Yates, now 84 and a professor emeritus of Latin American literature at Michigan State University.

So Mr. Yates didn’t ask Crisler the million-dollar question:  Did he get the idea for a two-platoon system from Fitzgerald?  Looking back at the statements Crisler made to him, Mr. Yates says, "That seems to be what he is saying."

Up until 1941, college football rules didn't allow substitutes except in the case of injury.  When the rules were loosened because of WWII, Crisler moved to two platoons (at Michigan).  It is entirely possible, the Journal reporter observed, that the idea originated elsewhere and that Crisler just adopted it, along with many other coaches.  (The NCAA then banned two-platoon football again in 1952 and only repealed the ban in 1964.)  Nonetheless, Fitzgerald was "way ahead of his time," said the current Princeton football coach, Bob Surace (who'd had no idea Fitzgerald was such a fan until the Journal reporter alerted him to the history).  But there was one other piece of evidence to support the claim that Fitzgerald might have been instrumental in adopting two platoons when it became permissible to do so.

In 1962, Fitzgerald acquaintance Andrew Turnbull wrote a biography of the author.  He recounts that Asa Bushnell, a Princeton athletic manager during the Crisler years, reported receiving a call from Fitzgerald promoting the idea of distinct units of players.  "Princeton must have two teams," Fitzgerald told Bushnell, according to the book.  "One will be big—all men over two hundred [pounds].  This team will be used to batter them down and wear them out.  Then the little team, the pony team, will go in and make the touchdowns."

            Not conversant with anything more than a brief outline of his biography, I didn't know that the last thing Fitzgerald read in his life was a Princeton Alumni Weekly analysis of the upcoming football season.  He died of a heart attack at age 44 while reading it.  Fitzgerald had written marginal comments in the article that one scholar described as "good prose," so "that makes college football the last thing he [Fitzgerald] ever wrote about."

            The question I had, and which I posed to the Journal reporter via email, was whether Crisler might have met Fitzgerald when he (Crisler) was at the University of Minnesota (1930-32).  The reporter told me that he didn't know.  So who knows, it may have started right here in Minnesota—but, having asked the question, I doubt it.  There's no evidence that Fitzgerald was a Minnesota football fan, and he (along with his wife Zelda and their baby girl) left St. Paul permanently in 1922, when Fitzgerald was 26 years old.

            My colleague the late Bob Geary, associate director of men's intercollegiate athletics at the University of Minnesota (who, along with his wife, was killed in the charter plane crash in Reno, Nevada, in 1985, and with whom I spent many, many hours working—and laughing, because of his marvelous sense of humor), more than once commented that in his opinion it was the move to two-platoon football that doomed Minnesota's continued dominance in the sport.  (Minnesota won five national championships before the war:  1934, 1935, 1936, 1940, 1941.  After that its performance was mediocre, although it did get to the Rose Bowl in 1961 and 1962—when one-platoon football was again the rule.  It has been up and down since, more down than up, and the peaks of the "up" haven't been all that high.)  One theory about the decline and its relationship to two-platoon football is that the more heavily populated states were better able to recruit and retain more of the outstanding high school football players; when only one platoon was permitted, schools could not grab as many players, so the players were better distributed over more institutions.  Minnesota, with fewer high school football players, benefited from the one-platoon rule.  That theory doesn't explain how Oklahoma did so well, of course, but obviously coaches have a significant role in recruiting and winning as well (as Bud Wilkinson demonstrated at Oklahoma during the 1950s).

            What does explain how Oklahoma and Nebraska and others in the Big 12 became football powers, my friend Holger Christiansen points out, is the decision by the Big Ten to adopt need-based financial aid for athletes in 1956.  There's a long history there, starting with the "Sanity Code" adopted by the NCAA in 1948, which permitted institutions for the first time to openly provide scholarships (more accurately, grants) and jobs to athletes, but the student had to demonstrate financial need.  Eight years later the NCAA voted to allow scholarships without regard to financial need or academic performance.

            The Big Ten chose a different path.  It

initially implemented one-year need-based aid to replace the previous job-based program. . . .  The Big Ten decided to base its financial aid on the expected ability of a student-athlete's parents to pay for college expenses. . . .  The Big Ten's need-based aid policy failed and in 1962 was eliminated in favor of full-ride scholarships.  Need-based aid failed not only because it increased the conference's administrative workload, but also because it hindered the ability of conference schools to recruit the best athletes within the rules.  Because other major football conferences had implemented full-ride scholarships, the Big Ten faced a significant competitive disadvantage in recruiting.  Prospects could receive a larger grant-in-aid package at schools in other conferences, including some conferences that offered four-year scholarships instead of one-year renewable scholarships.  Recognizing the Big Ten's noble yet naive attempt to implement a unilateral need-based aid system, former NCAA head Walter Byers noted that "competing schools laughed at the Big Ten as they mined the lode of athletic talent in the Big Ten area."

Thus the Big 12 rose to football prominence.

[Photo, January 21, 2015:  Nevada officials have erected a new plaque in memory of the 70 victims and lone survivor of the 1985 Galaxy Airlines Flight 203 crash in Reno, Nev., on Jan. 20.  Scott Sonner / AP]



* * *

            I can't resist passing along a story that some of you have heard about Fritz Crisler.  Before Crisler was at Princeton, he was at the University of Minnesota.  In early 1930, following the departure of Clarence "Doc" Spears (who coached 1925-1929, with a record of 28–9–3), and after frantic searching, President Lotus Coffman hired Fritz Crisler as both football coach and athletic director.  The incumbent athletic director, the University's first, was either forced out or resigned, because Crisler wasn't prepared to work under him.

            By late 1931, Crisler was on his way to Princeton, after less than two full years at Minnesota.  The public story was that Princeton was making an offer that Crisler couldn't refuse—and that Minnesota couldn't match.  Former Circuit Court of Appeals Judge George MacKinnon (formerly in the U.S. House of Representatives, a seatmate of Richard Nixon, politically conservative, a football player under Doc Spears and a part-time assistant coach under Crisler and then Bernie Bierman) says the public story about Crisler's departure was malarkey and that very few people knew what really happened.  (MacKinnon told me this in an interview I had with him and then in a number of letters back and forth in the mid-1980s.)
"Crisler had been fired privately" by Coffman, before he departed for the Orient in October of 1931.  "All the rest, continuing as Athletic Director, was just temporary window dressing to give him time to get another job." MacKinnon, who was no admirer of Crisler's ability as a coach, nonetheless said it was not Crisler's won-lost record that led to his demise.  (Although, he said, "his record would have been enough to fire him.  He had lost seven games in two years, against Spears' loss of four games in his last three years by a total of five points.")  Rather, it was Crisler's unacceptable behavior as a womanizer.  MacKinnon tells the tale.

President Coffman found out that Crisler, a married man, was carrying on a liaison with a leading lady actress at the Bainbridge Theatre in Minneapolis, and taking her on trips to Chicago.  President Coffman called in Crisler and confronted him with the charges.  Crisler denied them.  Coffman then appointed an investigating committee with Dean Everett Fraser of the Law School as Chairman.  The Committee hired a private detective to investigate the alleged liaison with the actress.  He investigated and reported back to the Committee.  Dean Fraser told me personally one night after a Law Review banquet, "You would have thought he [Crisler] would have had enough brains to pull down the shades."  When the Committee reported their findings to Coffman he fired Crisler.

It is MacKinnon's view that "the die had already been cast for Crisler to leave and Bierman to replace him" before Coffman left the United States.

So anyone who thinks football scandals are new doesn't know history.  They go back much further than 1932.  Here are a few excerpts from Collier's, December 2, 1905.

The University of Minnesota has entered into a sort of co-operative football alliance with the commercial interests of the city of Minneapolis. . . .  [I]n the annual game between these colleges [Nebraska and Minnesota] last fall Minnesota played two men who were entered for participation in this game only, and a third who was in college for football alone.  Usher H. Burdick of Mandan, North Dakota, a former Minnesota end, who had left college in June, was solicited to return by Mr. Frank Force, sporting editor of the Minneapolis "Tribune," was promised a position in one of the Hennepin County offices, and came back for the Nebraska game. . . .  Another participant in this game was Henry O'Brien, then a professional coach, employed by Macalester College.  O'Brien asserts that he entered practice for the Nebraska game, and played quarter in the second half only upon the solicitation of Coach Williams and "Ikey" Kaufmann. . . .  "Sunny" Thorp was the favorite with the Minnesota bleacher crowd during the season of 1904.  He was in college for football alone.  Thorp demanded a position which would pay him $60 monthly.  Frank Force, the newspaper man, claims to have found this position in the office of Hugh Scott, the county auditor, and when Thorp was not paid the full amount he asked, Force says that he collected the salary allowed, raised the difference by subscription, and paid the player.  "Sunny" Thorp was registered in the law school. . . .  Commercial Minneapolis has willingly supplied berths for players.

These facts seem without point, and explain nothing of a convincing nature in themselves, but they raise the vital question.  Can a football man attend the arduous daily practice, earn his way through college, and still be a student?  Minnesota has solved the problem of attendance through her "night law school," but the dean of that department answers the vital question.  Dean Pattee, who has had many football men under his charge, deprecates the practice, declares that "only about one-third of the football men can possibly maintain average standing," and asserts that, "because the game as now played is prejudicial to the highest interests of the university, the faculty of this law school seem to stand as a unit in favor of its abolition."  In this view Dean Pattee . . . differs with the president of his own institution.  President Cyrus Northrop, of the University of Minnesota, can see no real demoralizing evil in the game of football.  Out of twenty-nine college presidents, who replied to a communication sent out by Judge Victor H. Lane of Michigan, in an inquiry into the faculty attitude toward athletics, the reply of President Northrop was alone favorable to the game as now played.  He advances the opinion that "as long as football is skillfully and honestly played," he is in favor of it, and because "it has such a tremendous hold upon everybody" he can see "no use in fighting it even if it is an evil."

The demand for victory comes with no more striking force from the commercial interests of the Minneapolis than from the sporting element:  the habitues of the saloons, the cigar stores, and the gambling dives now languishing "under the lid."  Minneapolis has had a wave of municipal reform.  The city conscience has been exercised, and vice in its manifest forms has been driven from the town, or suppressed beyond the reach of the novice.  There are no curb men who now even dare to announce in a whisper chance games going on "inside," yet when the dignity of a university is loaned to the practice of betting thousands of dollars are openly wagered in public on the results of the larger games.

* * *

            The Fitzgerald story inadvertently provided a rather neat segue to football at the University of Minnesota.

As I was writing in early 2017, we in the Twin Cities were bombarded with news about the University of Minnesota football team.  For those outside the area, a number of players were suspended because of group involvement in sexual activities with a young woman; the team proposed to boycott playing in a post-season bowl game because, they charged, of a lack of due process for the players; the head coach, Tracy Claeys, tweeted support for the boycott; the University's internal report on the events was released, suggesting that the players indeed engaged in unseemly behavior; the team gave up its boycott; the team won the post-season bowl game even though they were underdogs; the athletic director fired the football coach after the bowl game.  The major issue was the toxic culture demonstrated by what was essentially a gang rape (although the woman may have initially consented) and how the University would respond to an extremely sensitive matter.

            College athletics, at the major-school level, and particularly in football and men's basketball, is a swamp.  But draining that swamp is a far greater challenge than the one Mr. Trump faces:  he and his administration could act unilaterally and get something done (although it's evident they will do nothing of the sort), whereas in college athletics no one has sufficient authority to take effective action, because the pattern is diffused throughout the culture.

            An academic friend of mine who's a fairly keen observer and fan of college sports supported the decision to fire the coach. 

I think that [the president and athletic director] did the right thing in firing Claeys. Even though he established a winning record and accomplished what [former head coach Jerry] Kill couldn't -- win bowl games -- the football team's toxic culture needs to be cleaned up, and Claeys is not up to that task.  Hopefully the next coach will be like the one [another school] just hired, who won't tolerate any improper behavior.  I was on a committee that interviewed the two finalists.  The first guy had a proven record but his answer to questions about Title IX was that he wanted to have a strong connection with the police so that he would be the one to get a call to get them out of trouble before it hit the media, whereas the other guy talked about establishing a culture of true respect and zero tolerance. Claeys struck me as an old school coach like the first candidate. Those guys need to retire, as this is a new age.

            My disagreement with my friend is over the idea that Claeys is "old school" and that he and coaches with similar attitudes will simply get out of the way in a new age.  Claeys is younger than I am by 17 years.  I don't think this attitude, a sort of "boys will be boys and I'll support my team" stance, is confined to "older" coaches.  It's an attitude that I saw quite a bit when I was more involved in athletics (admittedly, that was about 30 years ago), but based on what I have read in the news media in the years since, I doubt much has changed.  That macho culture seems to be alive and well among at least a fair number of those in college (and professional) sports.  It clearly isn't universal; there are a large number of male coaches who have no more tolerance for the abuse of women than I do.

My friend didn't disagree.

Both young and old coaches can be old school.  So to me that term is NOT about age, it's about a perspective.  Indeed, there were plenty of coaches from the dawn of football who didn't embrace a toxic masculinity culture. Perhaps what we should say is that as society becomes less and less tolerant of that then it will be harder for today's football coaches to embrace a culture that was much more widely tolerated in the past?  So it's easier for a younger coach who is just getting started to pay more attention to Title IX than a coach who grew up in an era before it was embraced, and that's a good thing!

I agree but I'm not sure that the macho culture is anywhere near extinction.
