Thursday, September 21, 2017

#9 counting to 1 million; cell phones; house woes-is-us; bathrooms; genetics of disease; Abraham Flexner; who affected lives; research bits; more on Flexner on universities





            There was a fun little piece in "Today I Found Out" on how long it takes someone to count to 1,000,000.  A software engineer did so, for "an average of 16 hours every 24 hour period for 89 straight days."  He live-streamed it; said he would have gone crazy if he hadn't.

            As for the question of how long it would take to count to 1,000,000,000, the best estimate, based on the time it took to count to 1,000,000, is somewhere between 244 and 285 years, assuming one counts for 16 hours per day every day.  So unless the medical folks can extend the human lifespan far beyond what it is now, no one will ever count aloud to a billion.
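The low end of that range is simple enough to check. A quick sketch of the arithmetic (assuming a flat 1,000x scaling from the measured time to a million):

```python
# Back-of-the-envelope check of the counting-to-a-billion estimate.
# Counting to 1,000,000 took 89 days at roughly 16 hours per day.
days_to_million = 89

# Naive scaling: a billion is 1,000 times a million.  This is the low
# end of the range, since nine- and ten-digit numbers take longer to
# say aloud than six-digit ones (hence the 285-year upper bound).
days_to_billion = days_to_million * 1000
years = days_to_billion / 365.25

print(round(years, 1))  # roughly 244 years
```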

* * *

One of a number of articles (gripes) about cell phones appeared in the American Scholar last spring.  The author related a story about being in Berlin and going with a friend to a trendy bar to have a drink and talk.  The bar had a no-smartphone policy; if you wanted to check email, you had to go outside.  "My initial inclination was to laugh at the preciousness of it all, but as the night wore on, I realized that the no-smartphone rule was a large part of what was allowing us to talk to each other so intensely."  The guy needed to write a note to himself about his regular column—but had to ask the bartender for a pen and paper (which the bartender provided).

By the time I finished, I couldn’t deny the pleasantness of it all—the feeling that through a few well-thought out rules, it’s more possible than I would have thought to make the relentless march of "progress" stop.

            I am as ready as anyone to complain about the often annoying ubiquity of technology in our daily lives but I'm also not particularly sympathetic to practices like bans on cell phones in restaurants.  I suppose it's fine if establishments want to adopt that rule, but in the conversations in restaurants that I have with Kathy, with Elliott, with Krystin, and others, often the phone is an aid rather than a hindrance.  We can get answers to questions that come up as we talk rather than speculate uselessly about something.  Google knows all.  The cell phone doesn't deter us from having intense conversations when they evolve, conversations that don't require allusion to facts; we just ignore the phone when it's unneeded.

            My somewhat skeptical view is that if you can't have a serious conversation because you have a cell phone available, you have bigger communication problems than having a cell phone available.

            There may be a real problem with cell phones, rather than a pseudo problem, if the folks at the University of Texas at Austin business school are correct.  Their provocatively titled paper, "Brain Drain:  The Mere Presence of One's Own Smartphone Reduces Available Cognitive Capacity," appeared in the Journal of the Association for Consumer Research.  What they found, through a couple of experiments, is that people did worse at computer tests requiring serious attention if their smartphone was in the room rather than located elsewhere.  Some phones were on the participants' desks; others were in pockets or bags; others were in a different room.  All were on "silent."  With the phone out of the room, participants did markedly better than with it on the desk; with it in a pocket or bag, they did only slightly better.

The findings suggest that the mere presence of one's smartphone reduces available cognitive capacity and impairs cognitive functioning, even though people feel they're giving their full attention and focus to the task at hand. . . . 'Your conscious mind isn't thinking about your smartphone, but that process -- the process of requiring yourself to not think about something -- uses up some of your limited cognitive resources.  It's a brain drain.' . . . Having a smartphone within sight or within easy reach reduces a person's ability to focus and perform tasks because part of their brain is actively working to not pick up or use the phone.

            I would like to think I don't have any such problem.  My cell phone is with me most of the time, but unless I'm suddenly at a point of waiting for something or somebody, or it rings or pings me with a text message, I don't even think about it—and I can concentrate on whatever I'm doing.  The authors claim that even when you're not thinking about the phone, you suffer brain drain.  I wonder if there's a generational difference:  those of us who grew up without a cell phone treat it differently than those who've had it as an appendage their entire life.

* * *

And then there are periods where everything in life does not go right.  Kathy posted the litany on Facebook.  I've been project and construction manager for much of the summer.  Normally none of us are particularly interested in home maintenance stories.  Perhaps this is an exception.

First it was the broken garage door spring, last spring.  OK, no big deal, $199 to replace and "tune up" the garage door opener.

Then it was a plumbing problem in the basement:  a 90-degree joint, in the original 1931 plumbing, developed a slight drip.  I got the plumber out here; two guys came and, after ditzing around for a while, told me they had to replace the pipe that goes up to the first-floor bathroom.  I believed them.  They also told me they either had to knock out part of a lath-and-plaster wall or part of the original ceramic tile in the first-floor bathroom.  I told them "no" and that they needed to find another option.  They did, but it took the two of them a full day to make the fix.

Then there was the rotting wood trim on the outside of the house that had to be replaced.  We knew that project was coming; Elliott had mushrooms growing on his bedroom exterior window sill.  Of course, the carpenter and I found more rot than I had expected, and discovered that the concrete sill under one of the basement windows had disintegrated, letting water run down the outside foundation and making for a wet wall inside.  So he put in a new sill.  This all was a week's worth of work for the guy.

The same day the carpenter first came to start working on the trim the washing machine apparently got out of balance while spinning—I wasn't here at the time—so it banged against shelves, which knocked the laundry detergent container on the floor, which popped the top off, which then poured out most of the contents.  The detergent came from Costco, so of course it was oversized—and there was a gooey mass about 8' x 8' in the basement corner under and around the washing machine.  That was such fun to get cleaned up—and I couldn't really get all that goo out of the absorbent concrete floor.  But for the fact they would have tasted like laundry soap, by the time I was done you could have eaten your scrambled eggs off that floor without risk of dirt or disease.

I knew that part of the trim repair/replace project would include repainting all the trim.  Elliott and I did the first floor, anything we could reach with a six-foot stepladder.  The second story and the peaks, however, were going to be done by someone I hired.  I was not going to get on extension ladders and be 20-30 feet above ground—and I didn't want Elliott on one either (even though he volunteered to paint the upper-level trim for less than what I would pay a professional painter.  I told him "no" because I valued his good health.  Over the years I've had two friends relate incidents where someone was killed as a result of falling off an extension ladder.)

We also discovered that the crawl space under the family room was damp along the base of the cinder blocks.  We determined that too much water was resting against the side of the house when it rained, so the trim-repair guy dug out the river rock, added a layer of plastic and dirt to angle water away from the foundation, and put the river rock back on top.  Another day's work.

Worst, in the course of doing the foundation work, we found out there was a leak in the roof or wall that had been going on for years and had rotted the wood between the outside and inside walls.  It was literally dripping water in the crawl space.  The sheet rock in the family room was mushy and the baseboards were warping.  The guy doing the trim work took out some of the sheet rock; the studs behind it crumbled to the touch.  All of this was behind a big chair and wasn't evident just from looking at the room (and I don't go peering around a basement crawl space with a flashlight.  I guess I should have.)

We had decided earlier that we should replace the central AC unit.  It was working fine, but a couple of technicians I'd had out to check it over earlier had warned us that the unit was living on borrowed time because it was years past its life expectancy.  So I got a local outfit with outstanding Angie's List ratings to replace it.  Better at a time of our choosing than on some 95-degree day when everyone wants their AC replaced.  (It took them three trips to get it right, but they did so.)

The night before the AC guys were to do whatever they do to the forced-air furnace to make sure the AC works properly, the water heater decided to start working only sporadically.  So I had both the AC guys and the water heater guy in the basement working at once.

Two tries at repairing the roof didn't stop the leak in the wall—when it rained, we could watch the water dribbling down the interior where the sheet rock had been torn off—so a while later we concluded we had to get an entirely new roof.  We had known this would be necessary as well, at some point, when the day came to sell the house, because—like the AC—it was 20 years old.

Meantime, there seems to have been spontaneous generation of water under our refrigerator, which caused the ceramic tiles to come loose and the base of the plaster wall to get soggy.  I say "spontaneous" because the refrigerator-repair guy spent about 45 minutes doing everything he could think of to find a leak in the refrigerator, without success.  But I had to get a guy over to re-set the tiles and repair the wall.  And, a task I loved, repainting the wall and baseboard next to the floor behind the refrigerator—where no one will ever see it.

A task I'd put off for several years:  the layers of paint applied since 1940 were flaking off the trim around the front door.  I loved my great-aunt Inez dearly, but I could have (figuratively, of course) strangled her for painting all the interior varnished woodwork in the house when they moved in.  In the spirit of all these construction projects, I finally bought the stripper, the stain, and the varnish and took the trim back to the original wood.  Stripping stained wood leaves the stain but it looks bleached, so I re-stained and then varnished.  This brought back memories of stripping and refinishing 14 doors in the house in 1997 when we remodeled.  I vowed then I would never again strip and refinish woodwork.  I was wrong.  (I knew I could have hired someone to do this work, but it's tedious and time-consuming to get 75 years' worth of paint out of all the creases and crevices in trim and then refinish the wood.  I was pretty sure I'd do a better job, anyway—I was an expert 20 years ago—so I did it myself.  It was close to a violation of my retirement rule #2:  "Does it have to be done?  No.  Will I enjoy it?  No.  Then I won't do it."  I could have let the next owner deal with the flaking paint—but we have to look at it for several more years and it just looked tacky.)

The most annoying part of watching all the money flow out was that our lives were not improved in any fashion.  All we'd done was maintain the status quo—having hot water, cool air in summer heat, a dry interior, a kitchen floor, etc.  I'd feel better about spending this much money if we had a major redecoration or a new addition or newly-configured space.  But nope, it's the same old house.  And the leak isn't fixed yet, so the wall remains ripped open to dry and stays that way until we know the water is held at bay, so to speak.

            Moving into a modern townhouse looks better and better.

* * *

            With all the hoopla surrounding transgender people using bathrooms, my long-thought solution has been strengthened by research!  Of course, my solution would lead many to recoil in horror, but that doesn't change my mind.  All human beings have to eliminate solid and liquid waste, just as we all have to eat and drink.  The latter we have both simple and elaborate ceremonies around; about the former we have society-wide inhibitions.

            I think we should just have one bathroom for everyone, a bathroom that multiple people of whatever sex or gender can use simultaneously.

            Scholars at Ghent University in Belgium (working in another field I didn't know existed, "queueing theory") have demonstrated that long lines for the women's room can be substantially reduced or eliminated, and thus "help in battling female-unfriendly toilet culture."  First, they investigated why women always face longer lines than men.  I doubt anyone would be surprised at the reasons:  there are often fewer toilets for women than for men (urinals take up less space, so the same square footage yields more "sites" for men); women spend more time per visit, because they typically clean the toilet and sometimes have more complicated clothing to deal with; and when there are crowds, the first two effects are amplified, leading to very long lines.

            Second, they modeled a number of different bathroom layouts, at both busy and calm periods, and learned that by far the best layout for reducing waiting is unisex toilet cabins, optionally supplemented with men's urinals.  The layout that works least well is fully segregated restrooms.
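The pooling effect behind that finding is easy to reproduce in miniature. Here is a toy discrete-event sketch (all numbers invented for illustration; this is not the Ghent group's model) comparing segregated stalls with one shared pool:

```python
import heapq
import random

def avg_wait(n_stalls, arrivals):
    """Average wait (seconds) when `arrivals`, a list of
    (arrival_time, service_time) pairs, share n_stalls stalls."""
    free_at = [0.0] * n_stalls          # min-heap: when each stall next frees up
    heapq.heapify(free_at)
    total_wait = 0.0
    for t, service in sorted(arrivals):             # serve in arrival order
        start = max(t, heapq.heappop(free_at))      # wait for earliest free stall
        total_wait += start - t
        heapq.heappush(free_at, start + service)
    return total_wait / len(arrivals)

random.seed(1)
# Invented numbers (not from the paper): 200 of each sex arrive over an
# hour; women average 90 s per visit vs. 60 s for men, the asymmetry the
# researchers identified.
women = [(random.uniform(0, 3600), random.expovariate(1 / 90)) for _ in range(200)]
men = [(random.uniform(0, 3600), random.expovariate(1 / 60)) for _ in range(200)]

# Segregated: 10 stalls per sex.  Unisex: one shared pool of all 20.
print("women, segregated:", round(avg_wait(10, women), 1), "s")
print("men, segregated:  ", round(avg_wait(10, men), 1), "s")
print("everyone, unisex: ", round(avg_wait(20, women + men), 1), "s")
```

Pooling servers cuts waiting for the same reason a single snaking line feeding many bank tellers beats one line per teller; segregated restrooms forfeit that statistical multiplexing.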

            Clearly, in order to eliminate waiting and to allow us all to move quickly through a task we all face but none enjoy, we should just build one big bathroom wherever there are bathrooms.  My guess is that a large majority of Americans would gasp at the idea.  While I'm conservative in my own social behavior, I think this fetish about elimination is silly. 

* * *

            Unfortunately for those with genetic diseases, and probably many of us with diseases of any kind, my late friend Burt Shapiro has been vindicated in a study out of Stanford.  He cautioned me many times, at our regular lunches, to be skeptical of claims that a gene (or a few genes) had been discovered to be the cause of a disease.  (There are rare cases where a single gene does control a disease, and those cases are not trivial:  examples include cystic fibrosis, sickle cell anemia, Tay-Sachs, muscular dystrophy, and Huntington's disease.)

            The Stanford researchers concluded that "the gene activity of cells is so broadly networked that virtually any gene can influence disease. . . .  As a result, most of the heritability of diseases is due not to a handful of core genes, but to tiny contributions from vast numbers of peripheral genes that function outside disease pathways."  For example, they discovered that height is influenced by the entire genome, not just a few genes for height.  "Any given trait, it seems, is not controlled by a small set of genes.  Instead, nearly every gene in the genome influences everything about us.  The effects may be tiny, but they add up."

            This moves the hypothesis about the link between genes, traits, and diseases from polygenic to (in the term devised by the researchers) "omnigenic."  A few genes may have a greater effect on a trait or disease, but so many additional genes have small effects that one can't say the trait or disease is "caused" by the few, because the cumulative indirect impact of the many outweighs the effect of the few core genes.  What this analysis means is that heritable diseases arise as much or more from the effects of the many as from the effects of the few.
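The "tiny effects add up" argument is just arithmetic. A deliberately invented illustration (the gene counts and effect sizes below are hypothetical, not figures from the study):

```python
# Hypothetical numbers, for illustration only (not from the Stanford
# paper): a few "core" genes with sizable individual effects versus
# thousands of "peripheral" genes with tiny ones.
core = 10 * 0.010              # 10 core genes, each contributing 1.0%
peripheral = 20_000 * 0.00003  # 20,000 peripheral genes at 0.003% each

print(f"core genes:       {core:.1%}")
print(f"peripheral genes: {peripheral:.1%}")
# The many tiny contributions dwarf the few large ones.
```

Even though each peripheral gene here contributes over 300 times less than a core gene, there are 2,000 times as many of them, so the periphery ends up explaining far more of the trait than the core.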

            This conception of genetically-inherited traits and diseases hasn't been fully embraced by the research community—but the reactions have been positive.  Of course, if it's correct, it means that treating anything other than single-gene (or perhaps few-gene) diseases becomes enormously complicated.  Alas.

* * *

            It is worth thinking anew about the words and work of a man the New York Times eulogized in 1959 in a front-page obituary:  "No other American of his time has contributed more to the welfare of this country and of humanity in general."  This is not a man well-known to most, but the Times wasn't too far off the mark.  I missed the sesquicentennial of his birth last year, so I'm a little late.  I am celebrating Abraham Flexner.

            Anyone who has ever visited the doctor, and certainly anyone who has gone to medical school, is a beneficiary of Flexner's early work.  Flexner had no training as a physician; his college education was in Greek and Latin, followed by a Master's in Philosophy.  He had been a high school teacher (apparently outstanding at pedagogy), had written a book about American colleges, The American College: A Criticism, and had traveled in Europe looking at pedagogical strategies.  The head of the Carnegie Foundation, Henry Pritchett, read the book and recruited Flexner to evaluate medical education in the United States.

            Flexner did extraordinary work.  In 1910 he issued his report, which essentially ended for-profit medical schools (which most were up to that point).  The majority of schools were rated substandard; one-third of them closed after the report was issued and another third had to make substantial improvements.  ("The Report became notorious for its harsh description of certain establishments, for example describing Chicago's 14 medical schools as 'a disgrace to the State whose laws permit its existence . . . indescribably foul . . . the plague spot of the nation.'")  The report called for, inter alia, increasing preparation, admission, and graduation standards, offering a curriculum that was purely science-based, getting medical school faculty to engage in research, and adopting stronger state licensing standards.  The recommendations were largely adopted across the country and medical education was transformed.  Flexner also "convinced some of the wealthiest men in America to donate massive sums for the establishment of modern, research-oriented medical schools at Chicago, Columbia, Rochester, and elsewhere."  The medical schools of today reflect Flexner's work.

            1930 saw the publication of Flexner's Universities:  American, English, German.  It expanded on his 1908 book, the one that had led Pritchett at the Carnegie Foundation to recruit him for the study of medical schools.  For anyone interested, I've appended at the end of this message a very brief précis of the book.  His views of what a university should be have not been embraced at all in the United States, although he set forth an archetype against which institutions can measure themselves.

            But that wasn't it for Flexner!  In the words of Donald Drakeman, "Flexner’s Institute for Advanced Study [IAS] is one of the greatest second acts in educational history."  (I'd call it a third act, following the medical school report and his 1930 book, but whatever.)  Flexner worked with a philanthropic brother and sister, Louis Bamberger and Caroline Bamberger Fuld, who donated $5 million in late 1929 (but before the stock market crash) to establish the IAS in Princeton, New Jersey.  It became, and remains, one of the great think tanks of the world, and along with Oswald Veblen (the first faculty member, nephew of the economist Thorstein Veblen), Flexner recruited some of the most brilliant mathematicians and physicists alive (with considerable help from Adolf Hitler).  The faculty and members of the IAS have won an astonishing number of international awards (e.g., Nobels).

            Flexner's advocacy for the IAS reflected the views he expressed in 1939 in Harper's, in an essay titled "The Usefulness of Useless Knowledge."  It was a tumultuous time to make that case, shortly after Germany had invaded Poland and set off the European theater of WWII.  Flexner was arguing against using the standard of "usefulness" in judging research and thinking, a position he adhered to in the philosophy of the IAS.  I need to let Flexner speak for himself; this is the second paragraph of his 1939 essay.

We hear it said with tiresome iteration that ours is a materialistic age, the main concern of which should be the wider distribution of material goods and worldly opportunities.  The justified outcry of those who through no fault of their own are deprived of opportunity and a fair share of worldly goods therefore diverts an increasing number of students from the studies which their fathers pursued to the equally important and no less urgent study of social, economic, and governmental problems.  I have no quarrel with this tendency.  The world in which we live is the only world about which our senses can testify.  Unless it is made a better world, a fairer world, millions will continue to go to their graves silent, saddened, and embittered. . . .  Now I sometimes wonder whether that current has not become too strong and whether there would be sufficient opportunity for a full life if the world were emptied of some of the useless things that give it spiritual significance; in other words, whether our conception of what is useful may not have become too narrow to be adequate to the roaming and capricious possibilities of the human spirit.

            There are so many examples that supported Flexner's view.  He was once in a debate with Kodak founder George Eastman, who lauded the work of Marconi.  Flexner dismissed Marconi as "inevitable" and pointed out that the real work that permitted the invention of radio transmission was that done by "Professor Clerk Maxwell, who in 1865 carried out certain abstruse and remote calculations in the field of magnetism and electricity. . . .  Other discoveries supplemented Maxwell’s theoretical work during the next fifteen years.  Finally in 1887 and 1888 the scientific problem still remaining — the detection and demonstration of the electromagnetic waves which are the carriers of wireless signals — was solved by Heinrich Hertz."  Flexner maintained that neither Maxwell nor Hertz was thinking about usefulness; they were driven by curiosity.  Marconi simply invented "the last technical detail."  Flexner adduced a multitude of other such work:  Faraday (electricity), Maxwell's equations ("underpin all electric, optical and radio technologies, including power generation, electric motors, wireless communication, cameras, televisions, computers"), Gauss's non-Euclidean geometry (used in piloting ships and planes around the globe), and so on.  None of them had anything close to real-life application when they were first devised nor was "usefulness" a standard by which the work was judged.  (Flexner put Thomas Edison in the same category as Marconi:  building on useless theoretical work performed earlier.)

            Flexner's argument, thus, is not that "useless" knowledge remains useless forever, only that we never know what "useless" knowledge might come to be critical for some advance in engineering, technology, or human life—or when.  So, he maintains, it is essential to continue to allow those who are able to pursue abstract thought wherever it may lead.  The key word is curious:  curiosity may kill cats but it often leads to profound discoveries in every field of human endeavor (including, Flexner contended, in the arts and humanities, not just the sciences).  Flexner stuck to his principles when hiring at the IAS, and brought in John von Neumann, Kurt Gödel, Alan Turing, and of course Albert Einstein.  All of their work initially had no practical use; all of it eventually became central to our lives in the 21st Century.

I am not for a moment suggesting that everything that goes on in laboratories will ultimately turn to some unexpected practical use or that an ultimate practical use is its actual justification.  Much more am I pleading for the abolition of the word ‘use,’ and for the freeing of the human spirit. To be sure, we shall thus free some harmless cranks.  To be sure, we shall thus waste some precious dollars.  But what is infinitely more important is that we shall be striking the shackles off the human mind and setting it free for the adventures which in our own day have, on the one hand, taken Hale and Rutherford and Einstein and their peers millions upon millions of miles into the uttermost realms of space and, on the other, loosed the boundless energy imprisoned in the atom.  What Rutherford and others like Bohr and Millikan have done out of sheer curiosity in the effort to understand the construction of the atom has released forces which may transform human life; but this ultimate and unforeseen and unpredictable practical result is not offered as a justification for Rutherford or Einstein or Millikan or Bohr or any of their peers.

            Flexner's corollary, implicit in all this, is that it can be inaccurate to ascribe the invention or development of anything to a single individual.  Virtually every discovery or invention has a long history behind it.  It may take genius to finally put the pieces together and produce an invention or a scientific advance, but whoever does so builds on decades or centuries of earlier work.

            What Flexner was arguing against was putting all the emphasis in educational programs on usefulness and practicality.  It is desirable, of course, to train competent lawyers and doctors and engineers.  A reviewer of Flexner's essay wrote that "even in the pursuit of strictly practical aims [in professional schools, for example] an enormous amount of apparently useless activity goes on.  Out of this useless activity there come discoveries which may well prove of infinitely more importance to the human mind and to the human spirit than the accomplishment of the useful ends for which the schools were founded."

            Flexner was acutely aware of what was happening in the world, of course. 

In certain large areas—Germany and Italy especially—the effort is now being made to clamp down the freedom of the human spirit.  Universities have been so reorganized that they have become tools of those who believe in a special political, economic, or racial creed.  Now and then a thoughtless individual in one of the few democracies left in this world will even question the fundamental importance of absolutely untrammeled academic freedom.  The real enemy of the human race is not the fearless and irresponsible thinker, be he right or wrong.   The real enemy is the man who tries to mold the human spirit so that it will not dare to spread its wings, as its wings were once spread in Italy and Germany, as well as in Great Britain and the United States.

I wish Flexner could be raised from the dead and have a chat with certain members of Congress about funding research.

* * *

            My little ode to Flexner reminds me of that pointless but amusing debate we can have over drinks about who affected the most lives in the 20th Century (for good or for bad).  I wouldn't argue for Flexner, of course.  Many, I imagine, would immediately suggest Hitler (or Stalin or Mao).

            To my way of thinking, two likely candidates are Alexander Fleming (who discovered penicillin) and Jack Kilby (and others, to be sure), who invented the integrated circuit, making all modern electronics possible [see Flexner's caution about attribution!].  Per Wikipedia:

There is no consensus on who invented the [integrated circuit].  The American press of the 1960s named four people:  Kilby, Lehovec, Noyce and Hoerni; in the 1970s the list was shortened to Kilby and Noyce, and then to Kilby, who was awarded the 2000 Nobel Prize in Physics "for his part in the invention of the integrated circuit".  In the 2000s, historians Leslie Berlin, Bo Lojek and Arjun Saxena reinstated the idea of multiple IC inventors and revised the contribution of Kilby.  [Footnotes omitted]

            I'd argue that between penicillin and computers/cell phones, either of those two affected the lives of far more people than Hitler or other political leaders of the 20th Century.

            TIME magazine's list of the 100 most important people of the 20th Century (which isn't the same standard as "affected the most people") has Einstein, FDR, and Gandhi at 1, 2, and 3.  I'd still go for penicillin and computers.

* * *

            It is news like this that makes me extremely wary about loosening regulations covering drugs allowed onto the market for widespread use.  London School of Economics and Political Science researchers (along with faculty at Stanford and the University of Pennsylvania) took a look at the outcomes when the Food and Drug Administration "fast tracked" approval of drugs—that is, let them be available to the public without sufficient clinical evidence of their effectiveness.  They are deemed "reasonably likely" to have benefits but don't have to meet the regular standards of review.

            Their study covered the years 2000–2013 (so this wasn't a "Bush" or "Obama" effect).  The researchers charged that the evidence used to approve the drugs was inadequate and flawed:  there were no randomized trials, but the drugs nonetheless became part of standard treatment regimens.

I don't have access to the full article, but it appears that what the researchers did not do was provide information about whether patients had been harmed by the drugs that received faster approval.  One would like to know that before being too critical of a more flexible approval process—but I'm still wary.

* * *

            And then there is research I could never, ever understand without someone giving me the kindergarten explanation:  "A team of Hokkaido University researchers has developed the world's first method to achieve the catalytic asymmetric borylation of ketones."  The kindergarten summary told me this success will make possible multiple new medicines.  Glad to know.  (I poke fun at these descriptions of research, but I understand that the technical language means something to those in the field—and that (presumably) the researchers could, if asked, explain in plain English what they're doing.  I also accept the likelihood that it is indeed important stuff.)

* * *

            More than you ever wanted to know and certainly don't need to read:  here's a little more on Flexner, taken from his 1930 book Universities:  American, English, German.  These are a few summary pages drawn from a long book, but I think you can get the gist of Flexner's thought.  (You should know that others have offered very different views of what a college or university should be:  John Henry Cardinal Newman, Thorstein Veblen, Robert Hutchins, Clark Kerr, Burton Clark.)

Universities are "inside the general social fabric of a given era"; they are an expression of the age.  Universities have changed dramatically; change came "sometimes slow and unconscious, sometimes deliberate and violent – made in the course of centuries by institutions usually regarded as conservative, frequently even as the stronghold of reaction. . . .  But a university should not be a weather vane, responsive to every variation of popular whim.  Universities must at times give society, not what society wants, but what it needs." 

            Universities should have four major concerns:  "the conservation of knowledge and ideas; the interpretation of knowledge and ideas; the search for truth; the training of students who will practise and 'carry on.'"  But the university "is only one of many educational enterprises" with certain functions; other functions should be fulfilled elsewhere.  "Creative activity, productive and critical inquiry – all in a sense without practical responsibility – that must bulk ever larger and larger in the modern university."  Conservation will take second place to the advancement of knowledge.  The demands of democracy, social and political, are the most dangerous with which to deal but are among the most important; "the chasm between action, on the one hand, and knowledge, on the other, is widening rather than contracting.  Practice cannot be slowed down or halted; intelligence, however, must be accelerated."  In this respect, Flexner, unfortunately, seems to have been prescient about the 2000s.

Flexner was at odds with the world of higher education when he wrote the book in 1930—and his views on the educational role of a university have been repudiated even further in the intervening years.  In his opinion, the "training of practical men is not the task of the university"; the journalist and merchant and politician must be educated in some other way. 

            Flexner agreed that universities should study contemporaneous issues.  There is a pressing need for the advancement of the social sciences, given the social problems that exist and the rate of social change—and one can argue that social change has accelerated in the succeeding 80+ years.  Societies must act, either intelligently or blindly; "the weight and prestige of the university must be thrown on the side of intelligence."

            Flexner argued for both the sciences and the humanities; science "periodically and episodically destroys the foundations upon which both science and society have just become used to reclining comfortably."  "While pure science is revolutionizing human thought, applied science is destined to revolutionize human life. . . ."  At the same time, the humanities become more important; society can be so taken with advances in knowledge "that we lose our perspective, lose our historic sense, lose a philosophic outlook, lose sight of relative cultural values."  Science "is indifferent to use and effect" but "taste and reason . . . sit in judgment on the uses to which society puts the forces which the scientist has set free."

            The modern university addresses itself "to the advancement of knowledge, the study of problems, from whatever source they come, and the training of men."  This includes the constitution of the stars, the atom, of Oklahoma and Kenya, and what is happening with them.  "All these are important objects to know about.  It is not the business of the university to do anything about any of them" (italics in original). 

            If one concedes for the sake of argument that all the things a university is doing are worth doing, are they things it should do?  They are not, says Flexner.  It would take all universities' capacity just to do what they should do—which they are not doing—yet they have assumed tasks that "are irrelevant and unworthy."  If society deems those functions worthy, it should find some other way to do them.

            Here's where Flexner parted company with virtually all of American higher education.  What belongs in a university?  The advancement of knowledge; "assuredly neither secondary, technical, vocational, nor popular education."  They are important activities, "but they must not be permitted to distract the university."  (Secondary education is concerned with the student's manners, morals, and mind.  "The university has no such complicated concerns. . . .  In the United States, 'secondary education' would swallow not only the high school, but much, perhaps most of the college.")

As for what the university should include, "of the professional faculties, a clear case can, I think, be made out for law and medicine; not for denominational religion, which involves a bias, hardly perhaps for education, certainly not at all for business, journalism, domestic 'science,' or library 'science.'"  The professions that belong are the "'learned professions,'" which "derive their essential character from intelligence."  Law and medicine may do nothing more than train practitioners, in which case they should be separate vocational schools; unless the faculties "live in the atmosphere of ideals and research, they are simply not university faculties at all."  So there goes a significant chunk of the modern university—in addition to journalism and the business school, Flexner would also have tossed out education, agriculture, and engineering (except as fields engaged in basic research).  He was consistent about medical schools:  if patient care was their primary mission, then out!  If focused on research and advances in science as well as training physicians, then they should be inside universities, which is essentially what he argued in his 1910 report.
           
            Flexner turned his eye on U.S. universities.  He maintained, and I think most historians of education agree, that the United States had no university (in his sense as well as in the sense of a modern research university) until Johns Hopkins opened in 1876.  Six decades later, he identified three parts to the American university:  secondary schools and college, graduate and professional schools, and "'service' stations for the general public." 

Flexner had a scornful view of high schools; it isn't clear to me that his criticisms are entirely mistaken, even today:  subjects "like Latin, mathematics, or history, and skills, like typewriting or cooking, are ingeniously combined on an utterly fallacious theory into 'units,' 'points,' and 'counts'" that accumulate to a diploma and a "'high school education.'"  It is "a kind of bargain-counter on which a generous public and an overworked and underpaid teaching staff display every variety of merchandize – Latin, Greek, science, agriculture, business, stenography, domestic arts" – that leaves the student to pick and choose and accumulate the required credits.  "The American high school is neither intelligent, selective, nor thorough; and it is the high school graduates of a given June—mostly untrained, mostly unselected, . . . who become the college students of the following autumn."
           
            Flexner's criticism of high schools extends to colleges as well.  Colleges try to select, but cannot, because they "do not know what they wish" (e.g., brains, industry, scholarship, character).  Once at the college, students receive the same kind of education that they received in high school:  "another bargain counter period is lived through in the college."  And again, the student "nibbles at a confusing variety of courses . . . so many hours of this, so many hours of that, so many points here, so many there . . . until again, at the close of four years, he has won the number of 'credits' or 'points' that entitle him to the bachelor's degree."

            The undergraduate at the typical (even elite) college spends two years on studies that should be in high school.  There is much opportunity to study subjects worthy of attention—but there is also much opportunity to complete degree requirements by such courses as "'principles of advertising,'" "'practical poultry raising,'" and "'clothing decoration.'"  Mixing different training harms them all, especially higher education, and "not all the weight of its wealth and numbers can place cooking and wrestling on an intellectual par with music, science, literature, and economics, or make it sound educational policy to jumble them together."

Why do some universities believe they must provide service?  State institutions feel compelled "to justify themselves to the man in the street or on the farm" because their income comes from the legislature; private institutions must be useful in order to obtain gifts and so that they are not seen as aristocratic.  "Useful" means immediately so, because Americans like results.  Flexner cites a lament by the late president of Brown University, in a passage that rings true today as well:

"I am inclined to think most Americans do value education as a business asset, but not as the entrance into the joy of intellectual experience. . . .  They value it not as an experience, but as a tool.  Possibly Americans value everything in that way—value church and religion and marriage and travel and war and peace, never for the sake of themselves, but always for the sake of something just around the corner.  The sense of tomorrow has amazing power in our American life."

Flexner eschewed what is part of the core responsibility of places like Minnesota, a land-grant institution obligated to lend help to the people of the state.  "Contacts between faculty and business, industry, health, sanitation, or philanthropy are essential to the faculty itself.  To that extent—and to that extent only—such contacts should take place," but universities are harmed when the contacts multiply.  It happens when faculty "are at the beck and call of industries" and work closely with agriculture or education.  "I admit that the line is hard to draw.  But meanwhile the university should by precept and by example endeavor to convince the public that in the long run it will suffer, not gain, if it treats its universities largely as service institutions."

In with service can be classed such units as "domestic science or household arts, schools of journalism, business, library science or librarianship, optometry, hotel management, etc.," none of which belongs within a university.  (Flexner cited contemptuously an M.A. thesis from the University of Chicago entitled "A Time and Motion Comparison on Four Methods of Dishwashing."  He must be spinning in his grave.)  "There may be need of training some persons in cooking, clothing, and housekeeping" and of people to teach them, but "neither from cultural, practical, nor scientific standpoints are the persons who need the training, the persons who give the training, or the subjects themselves of university calibre.  Separate modest, inexpensive, and unpretentious institutions, conducted by teachers who are graduates of the school of experience, will achieve the end far more effectively than the pretentious establishments that now encumber many universities."  Some of the instruction he so disliked is indeed delivered in technical institutes and community colleges, which most would describe as "modest, inexpensive, and unpretentious."  At least, inexpensive compared to other colleges and universities.

Business schools fall in the same lot.  "Is business a profession? . . .  I have already pointed out that professions are learned professions, that they have cultural roots and a code embodying an ideal; that in the long course of their history, one can make out the essentially intellectual nature of their attack on problems.  The case is different with business.  The profit-making motive must dominate; advertising is an element indispensable to success."  Business has always been around, always will be, and should be; it makes people happier and brings much to many.  But "is business today in itself an end fine enough, impersonal enough, fastidious enough, to deserve to be called a profession?  I do not myself think so. . . .  Modern business does not satisfy the criteria of a profession; it is shrewd, energetic, and clever, rather than intellectual in character."  Business is indisputably of great importance in society, and therefore universities should study it.  "It is one thing for economists and sociologists to study the phenomena of modern business in a school of business . . . and it is quite another thing – and, in my judgment, an irrelevant and unworthy thing" to offer courses in advertising, sales, finance, and so on, "to furnish advertisers, salesmen, or handy men for banks, department stores, or transportation companies."

"A genuine university is . . . an organism, characterized by highness and definiteness of aim, unity of spirit and purpose."  But the best ones in America "are merely administrative aggregations, so varied, so manifold, so complex that administration itself is reduced to budgeting, student accounting, advertising, etc."  They "do not possess scientific or educational policies, embodied in some appropriate form. . . .  In the reckless effort to expand, and thus cater to various demands, the university as an organic whole has disintegrated."




