Wednesday, December 19, 2018

#55 more on funny words, marketplace of ideas, dinner parties, how do you think?, inheriting personality traits, passion and learning

Good afternoon.

            I sent you my wishes for the season.  That's what now passes for a Christmas/holiday card from me.

A linguist friend of mine wrote back to me after my little report on funny words.  "There is one more possible explanation for words being funny: their similarity to words in other languages - which may, of course, be because those words are related to one or more of the factors that make words funny within a language. There is a linguistic technical term for this phenomenon: 'pernicious homophony.'" 

He provided examples from Swedish.  "When we boys encountered the German word for cousin, which is 'vetter,' we found that word incredibly funny.  Why?  Because it sounded like the plural of 'vagina' in our dialect.  Which, of course, confirms that funniness may be related to sex.  Another example is the Swedish word 'fack,' one of whose meanings is something like 'trade,' for example in the name of the prominent art and design school in Stockholm, named 'Konstfack,' School of 'Art Trade.'  The funniness is, I am sure, obvious to you.  I have encountered Americans who have found a particular Swedish road sign very amusing, not to say hilarious:  'Full fart,' [which means] 'full speed'!"  Finally, "you cannot even say farewell in English to a Swede without running the risk of 'pernicious homophony'!  'Good' is, of course, 'god' in Swedish - no problem there.  But 'bye' in 'good bye' may pose a problem.  I have heard little Swedish children laugh uproariously when hearing somebody say 'good bye.'  Why?  Because in Swedish a word pronounced the same way as 'bye' in English means 'poop.'  So, Gary:  'Good poop' to you!"

This was one of those moments, reading an email, when I really did LOL.

* * *

Only someone miraculously innocent of history could believe that competition among ideas will result in the triumph of truth.
— John Gray

            I have long wondered about the validity of the concept of "the marketplace of ideas" and the proposition that the truth will out, given enough free speech.  I worry that Gray may be correct (although he *is* sort of a "nattering nabob of negativism" among philosophers who write about liberal society).  First Amendment doctrine is grounded in the belief in a marketplace of ideas.  Although the concept is rooted in earlier writings,

[t]he first reference to the "free trade in ideas" within "the competition of the market" appears in Justice Oliver Wendell Holmes, Jr.'s dissent in Abrams v. United States.  The phrase "marketplace of ideas" first appears in a concurring opinion by Justice William O. Douglas in the Supreme Court decision United States v. Rumely in 1953:  "Like the publishers of newspapers, magazines, or books, this publisher bids for the minds of men in the market place of ideas."

            It's a claim, not a law of nature or society.  I've not seen any data to support it (or refute it).  I'm not sure how one would gather the data or even what the data would be.  But if Gray is correct, and the principle is wrong, then what?  The alternatives sound even worse, such as state control of speech (e.g., 1984, Nazi Germany).

            What prompts the concern, of course, is the vast amount of disinformation (that is, outright lies and other factually incorrect statements) that is spread on social and other media in the 21st century.  If the falsehoods and misdirection triumph, in the sense that they guide elections and public policy and government action, then we are all in for a rough ride into the future.  It doesn't help that the current occupant of the White House is the chief spreader of manure.

* * *

            A marvelous occasional publication from Drexel University, The Smart Set, recently carried an article titled "Reviving the Dinner Party."  It was written by a young woman, a self-described Millennial, who told of her path to hosting dinner parties.  Such events were not part of her family life while she was growing up; her parents never had them.  (And, she added, they also didn't have many close friends.  What a surprise.)  Living in apartments with roommates during college and immediately thereafter didn't provide a venue for anything like a "dinner party."  Sure, they'd get together for pizza or spaghetti and play video games or watch a sporting event, and more often it would be "meeting at the gym, carpooling, or joining a club, all of which I've done in pursuit of deepening friendships."  Or dining out or meeting at the local coffee shop.

            But when she was a guest at others' homes,  

I enjoyed the feeling that dinner at other people's houses gave me; every person . . . who has allowed me to visit their home and dine with them has made me feel so connected and supported as they let me in. It was a fundamentally different and positive feeling from meeting at a corporate or public space. . . .  So I wanted to host dinner parties partly to respond to the good feelings I received when other people hosted me in their homes.

Heading into her late 20s and living in a smaller Midwest city, she was having difficulty developing new friendships of the kind that she'd known in college and a little later.  "I could see that I wasn't getting very far, emotionally, just sitting next to people in restaurants or joining them at yoga class. Everyone was very pleasant, but something about hosting people in my home felt more like friendship to me, no matter how stressed out it made me feel."

            One of the hurdles she had to get past was the stereotyped view of a "dinner party."  That is, "a single host or hostess preparing a planned dinner, where enough people would come together to fill a table and would eat together" and converse before, during, and after the meal.  She relates that a writer for the New York Times declared that such events are disappearing; he "sees stylized civility and elegance as fading, as more and more diverse groups of people no longer hold the same rigorous social mores that characterize a certain form of dinner party nostalgia. (Think: champagne glasses and a ton of forks to choose from)."  She reports that various media observers have written eulogies for the dinner party as traditionally envisioned, hosted by the socially and financially elite. 

Unlike a family potluck, where rules might be family specific, there were very specific forms of decorum that separated regular folks from the truly refined. In a modern era where millionaires and billionaires are born or made every day into a wide variety of cultural contexts, it's hard to expect that everyone attended some form of 'finishing school' to get them up to speed on these rules.  [For example, that scaramouche in the White House.]

She worries about several matters in contemplating a dinner party. 

As a greater percentage of people from middle and low income backgrounds go to college and join a diverse set of college graduates, the dinner party I could throw is becoming populated with many more diverse voices and experiences than before. I want to invite a wide variety of people to a party, but what if they clash on conversational topics? . . . Do I dress up for my guests? Do I have food waiting when everyone arrives and let them serve their plates, or do I get everyone to sit down and bring the food out?

She concludes that while there's no Amy Vanderbilt or Emily Post to tell her how to have a dinner party, it's also true that "no one really should be able to tell me how to have dinner with my friends, but it does leave a nascent hostess constantly wondering about a potential misstep."

            So she resolved to have several dinner parties in the following year. 

I was realizing that friendship, so direct in kindergarten and so prolific in college, was hard to come by in my young professional life. It was quite easy to meet people, and quite easy to then recognize people again at a later date, but things tended to stall around there. I wanted some kind of shortcut to intimacy, something that would show others that I was serious about becoming friends even when I was still figuring out if that was true. I had figured out that social media wasn't cutting it, and just "friending" someone wasn't drawing anyone new into my circle. The dinner party became my means.

My parents' social pattern was almost the opposite of hers.  All the time I was growing up, they were either hosting a dinner gathering or attending someone else's, many times during the year.  There is a chicken-and-egg question here, but my parents also had reams of good friends; did they have friends (at least in part) because of dinner parties, or did they have dinner parties because they had a lot of friends?  I guess liking to host and attend dinner parties is genetic, because a rough calculation suggests to me that we've hosted well over 300 dinner parties in the house I've lived in for nearly 30 years (including summer events on the deck, which count as a "dinner party" in my sense of the term).  [The crack about genetics is a joke—but if genetics play a role in what one might call "sociability"—a broader personality trait—then it's not entirely without substance.  Read on for more on this topic.]  And, of course, both Kathy and I have been to countless dinner parties, some more casual than others, but dinner parties nonetheless.  We enjoy them all (well, almost all).

In contrast, in conversations with a couple of friends in the last year or so, I was surprised by what they said:  "Oh, we never entertain" and "I think we had one party in our house once" (in a house they've lived in for decades).  I don't criticize others for not hosting, but it's sure not our style.  I can't imagine not entertaining friends from time to time.  The young Smart Set writer is also correct, in my experience, that a dinner party is one way to both build and cement friendships.  (Dinner parties are not the *only* way to do so, but they're one obvious and comparatively easy way.)  They are also a way to determine that an acquaintanceship should (or should not) progress to a friendship.

* * *

            Darn it, I read a quotation attributed to John Kenneth Galbraith and later I couldn't find it.  If I remember it correctly, Galbraith was asked if he thought in words or pictures.  He replied, "I think in thoughts."  Without doing any reading in neuroscience or philosophy, and relying solely on my own reflections, my experience is the same as Galbraith's.  I don't think in words or pictures, either.  I don't know what I think in, except that whatever it is, it gets translated from brain through the fingers or the mouth into words, which is how any of us turns whatever is in our brain into something for others to behold.  Surely the thoughts precede the printed or spoken word.  Don't they?  Apart from the instances when we speak without thinking, which often turn out badly—but even then, presumably there had to be some microsecond of thought before the words were uttered.

* * *

            A neurogeneticist at Trinity College Dublin, and author of Innate: How the Wiring of Our Brains Shapes Who We Are (2018), recently wrote an article that supported two opinions I've long held.  Given that some of this is contested territory, my reaction may be a case of confirmation bias, but the puissant logic of his position seems to me difficult to get around.

Many of our psychological traits are innate in origin. There is overwhelming evidence from twin, family and general population studies that all manner of personality traits, as well as things such as intelligence, sexuality and risk of psychiatric disorders, are highly heritable. [So the difference among people on such things as] IQ scores or personality measures is [partly, but significantly] attributable to genetic differences between people. The story of our lives most definitively does not start with a blank page.
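
(For those curious how "highly heritable" gets quantified: the classic back-of-the-envelope estimate from twin studies is Falconer's formula, which doubles the gap between identical-twin and fraternal-twin correlations for a trait.  The article doesn't show any arithmetic, so the little sketch below uses made-up correlations, purely for illustration.)

```python
# Falconer's formula: h^2 = 2 * (r_MZ - r_DZ)
# r_MZ: trait correlation between identical (monozygotic) twins
# r_DZ: trait correlation between fraternal (dizygotic) twins
# The correlations below are hypothetical, purely for illustration.

def falconer_heritability(r_mz: float, r_dz: float) -> float:
    """Rough heritability estimate from twin correlations."""
    return 2 * (r_mz - r_dz)

r_mz, r_dz = 0.75, 0.45   # hypothetical twin correlations for some trait
h2 = falconer_heritability(r_mz, r_dz)
print(f"estimated heritability: {h2:.2f}")   # 0.60, i.e., ~60% of variance
```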

            What he goes on to clarify, however, is that there are few "direct links from molecules to minds" and that the "commonly used 'gene for X' construction is unfortunate in suggesting that such genes have a dedicated function: that it is their purpose to cause X.  This is not the case at all."  The term "gene" has two different meanings, which causes confusion.  In biology, it's "a stretch of DNA that codes for a specific protein. So there is a gene for the protein haemoglobin, which carries oxygen around in the blood, and a gene for insulin, which regulates our blood sugar, and genes for metabolic enzymes and neurotransmitter receptors and antibodies, and so on."  In the study of the heritability of traits or characteristics, a gene is something that can be passed between generations that is related directly to a trait or condition (such as sickle-cell anemia or blue eyes).  What links the two is "variation: the 'gene' for sickle-cell anaemia is really just a mutation or change in sequence in the stretch of DNA that codes for haemoglobin."

            Genetic variants are what cause differences among people on traits and conditions.  But, he cautions, they "might be having their effects in highly indirect ways."  Human brains vary widely "in the size of various parts of the brain," characteristics that are heritable to some extent, but "the relationship between these kinds of neural properties and psychological traits is far from simple."  Links between various brain structures and behaviors have sometimes been identified, but they "have not held up to further scrutiny."  The reason, he says, is "that the brain is simply not so modular"; functions are spread all over the place and rely on the interconnections.  "There is no one bit of the brain that you do your thinking with."

So there's no gene for intelligence.  Probably not even just a few genes for intelligence.  There are variants that "have now been associated with intelligence," but each one is very small in its effect.  Interestingly, they seem to be involved with brain development—so intelligence may (at least in part) be a reflection of, "much more generally, how well the brain is put together."  Similarly with other traits; these genes "are multitaskers: they are involved in diverse cellular processes in many different brain regions."  Moreover, because all these systems are linked together, variants in one part affect others.  So don't go looking for single genes that control or affect particular traits.

The relationship between our genotypes and our psychological traits, while substantial, is highly indirect and emergent. It involves the interplay of the effects of thousands of genetic variants, realised through the complex processes of development, ultimately giving rise to variation in many parameters of brain structure and function, which, collectively, impinge on the high-level cognitive and behavioural functions that underpin individual differences in our psychology.

And that's just the way things are. Nature is under no obligation to make things simple for us. When we open the lid of the black box, we should not expect to see lots of neatly separated smaller black boxes inside – it's a mess in there.

            Mess though it surely is, it appears to be a mess because we can't figure it out, at least not now.  But that doesn't mean it doesn't work.  I see too many instances of kids with traits that reflect the parents—and upbringing isn't a satisfactory answer because it often seems too simplistic.

            He puts paid, once again, to the notion that ailments and traits are linked to specific genes, thus endorsing the proposition that you should be *very* skeptical any time you read or hear a news piece reporting a link between a gene and something or other. 

* * *

"Do what you like to do.  It'll probably turn out to be what you do best."
— Wallace Stegner

            Great students don't need passion for schooling, or so recent (and I think pretty solid) research says.  I'm trying to figure out if I or any of my friends were passionate about school.  I think I probably wasn't.  In junior high school, obviously not, given my grades.  But not in high school, either, I think, even though I performed well.  (The research is on 15-year-olds, so the "school" is secondary, not college or university.)

            Most of us would think passion a critical part of success in most endeavors.  You have to be passionate about the violin to put in the thousands of hours of practice to play at the top tier.  You have to be passionate about gardening to grow beautiful flowers and tasty vegetables.  You have to be passionate about farming to voluntarily adopt it as a life pursuit.  And so on.  Stegner's hypothesis is supported by data.  My long-time faculty friend Jo-Ida Hansen, who for many years ran the Center for Interest Measurement Research at the University of Minnesota, once commented to me that the research is clear:  People do well at things in which they are interested and, conversely, they are not likely to do well at things that do not interest them.  One can debate whether "interest" = "passion," but they're clearly related.
           
Professor Jihyun Lee at the University of New South Wales (Australia) does interesting work.  Trained at Columbia and Harvard, she worked for the Educational Testing Service before going to Australia.  She focuses in part on making psychometric tests more accurate and usable in educational settings, a worthwhile goal.  In the course of her work, she looked at results from the Programme for International Student Assessment (PISA), an international survey of students conducted every three years; the 2015 round included students in 72 countries.  In addition to testing reading, math, and science, PISA asks students about their attitudes toward school.

What Professor Lee found is that there is nearly zero correlation between attitude to school and academic performance, and her findings from the 2015 PISA results parallel those in the earlier PISA surveys.  She analyzed the data by country (developing or developed), gender, and socio-economic status.  None of those variables matters.  (Statistical analysis led her to conclude that about 2% of the variation in academic performance depends on passion, so it's essentially irrelevant.)  "This means that in most countries, academically able students do not hold their schooling in high regard.  Similarly, academically less able students do not necessarily have low opinions about their schooling.  There's simply no connection."
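
(A statistical aside: "2% of the variation" corresponds to a correlation of only about r = 0.14, because the share of variance explained is the square of the correlation.  Here's a minimal sketch with synthetic data; the numbers are invented, and only the r-versus-r-squared relationship is the point.)

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Synthetic data: "attitude" only weakly related to "score" (target r ~ 0.14).
attitude = rng.standard_normal(n)
noise = rng.standard_normal(n)
r_target = 0.14
score = r_target * attitude + np.sqrt(1 - r_target**2) * noise

r = np.corrcoef(attitude, score)[0, 1]
print(f"correlation r = {r:.3f}")              # ~0.14
print(f"variance explained r^2 = {r**2:.3%}")  # ~2%: essentially irrelevant
```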

So, she asks, what motivates students to do well in school?  Drawing on other research based on PISA results, she concludes that the drive "comes from within. . . .  What sets academically able and less able students apart is self-belief about their own strengths and weaknesses.  Individual psychological variables such as self-efficacy, anxiety and enjoyment of learning in itself explain between 15 per cent and 25 per cent of the variation in students' academic achievement." 

What accounts for the other 75% of the variation?  One guesses it's raw ability, socio-economic status (apart from passion), attendance, parental education, and a multitude of other factors.  School quality surely matters.  The debate about what factors weigh more heavily, and can be manipulated to the advantage of students, has been going on in the field of educational research for decades. 

This was one of those articles where the reader comments were largely on point and thoughtful.  One weakness that several commentators noted was that 15-year-olds around the world are being asked to respond to these four questions in the PISA survey:

(a) school has done little to prepare me for adult life when I leave school
(b) school has been a waste of time
(c) school helped give me confidence to make decisions
(d) school has taught me things that could be useful in a job

At 15 years old, how would they know?  *Maybe* they could legitimately answer (c), but they certainly can't accurately assess (a), (b), and (d).  As one reader wrote, he teaches 15-year-olds and loves them—but they're knuckleheads.  A variety of other commenters, however, thought Professor Lee made a valid and useful point.  I think the critics have the better argument.

            The comment about knuckleheads parallels my skepticism about relying too heavily on student evaluations of teaching.  There are plenty of examples of the "hardest" teacher being the one whose students take away the most from a class—something they realize decades later.  Even college students can be knuckleheads.

            The more I thought about this, the more I realized I was *not* passionate about school.  In fact, I often allowed myself to be distracted from studying.  I very much wanted to *learn* the subject matter (some in high school, more in college, and especially in graduate school), but *studying* is hard work and, for me, not something I looked forward to.  Sitting down with book or journal article, highlighter in hand, was something I had to force myself to do.  I knew I had to do it, and I always did, and on occasion I *did* enjoy the actual work of learning, but for the most part not.  (I should add that this description is not accurate when one is doing one's own research; in that case, reading and studying the work that's already been done is fun.)

            I'm done studying for this message, so I'm sending it.

-- Gary


Saturday, December 15, 2018

end of the year wishes for you

 
Good morning.
Whatever the occasion you mark at this time of year, if any, I wish you contentment.  I hope you find yourselves at ease, secure, and fulfilled in your personal lives and relationships, able to face the worries of the world, large and small, from that firm foundation.  May your 2019 have more positives than negatives, may the positives bring delight to you, may the negatives be minor, and may the year find you at peace in your life.

-- Gary

Thursday, December 6, 2018

#53 retiring abroad, the worst year to be alive, bridge, funny words, wind tunnels

Good afternoon.

Kathy noticed an article on the BBC about an American couple who, upon retiring, decided to sell everything they owned and travel around the world.  He's 72, she's 62.  They've already gotten to 80 countries and 250 cities and "said they spent on average $90 per night, spread across 163 different Airbnb rentals, to enjoy their different type of retirement. . . .  They have funded their travels from their savings, and through selling their Seattle home, boat and cars."  Kathy suggested, I assumed facetiously, that we could consider such an option.

It is true, we could.  Anyone doing this would have to have a reasonable amount of money available ($90 per night x 365 nights = ~$33,000, and you still haven't eaten or paid for transportation costs or for any other activity).  Assuming that some day you'd want to come "home," wherever that might be, you'd need to have a nest egg for renting/buying a place and furnishing it.  Or, as Kathy observed, you rent a storage unit and keep your flatware and dishes and the like.
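
(Carrying the back-of-envelope arithmetic a bit further: only the $90-per-night figure comes from the article; the food and transportation numbers in the sketch below are my own placeholder guesses.)

```python
# Back-of-envelope annual cost of the "Airbnb retirement" described above.
# Only the $90/night figure comes from the article; the rest are
# hypothetical placeholders for illustration.

nightly_rate = 90          # average Airbnb rate, per the BBC article
nights = 365

lodging = nightly_rate * nights
food = 50 * 365            # hypothetical: $50/day for two
transport = 10_000         # hypothetical: flights, trains, local transit

print(f"lodging:   ${lodging:,}")   # $32,850 -- the ~$33,000 in the text
print(f"food:      ${food:,}")
print(f"transport: ${transport:,}")
print(f"total:     ${lodging + food + transport:,}")
```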

Speculating about this is like speculating about winning the Powerball lottery:  it can be amusing but it ain't ever gonna happen.  The BBC article makes no mention of children or grandchildren; most people who have them would not want to be away permanently.  I am also enough of a creature of habit that I wouldn't want to be away from my house and my friends and my regular activities in life.  Can I be gone Jan-Mar somewhere warm?  You betcha.  Indefinitely?  Nah.

* * *

            When was the worst year to be alive?  Arguing for one year versus another is one of those interesting parlor games we can play.  To make a persuasive case, of course, you need a decent grasp of the entire span of human history.  One Harvard historian and archaeologist, Michael McCormick (chair, Harvard University Initiative for the Science of the Human Past), argues that the worst year was 536.  Or at least 536 began a period when life was a challenge.

A mysterious fog plunged Europe, the Middle East, and parts of Asia into darkness, day and night—for 18 months. . . .  Temperatures in the summer of 536 fell 1.5°C to 2.5°C, initiating the coldest decade in the past 2300 years. Snow fell that summer in China; crops failed; people starved. . . .  Then, in 541, bubonic plague struck the Roman port of Pelusium, in Egypt. What came to be called the Plague of Justinian spread rapidly, wiping out one-third to one-half of the population of the eastern Roman Empire and hastening its collapse.

            That the mid-sixth century was a dark period (both literally and figuratively) has been known to historians for a long time.  It was the analysis of ice cores that gave away the story:  volcanic eruptions.  McCormick says the evidence suggests Iceland; there are other data points that hint at North America.  In McCormick's view,

a cataclysmic volcanic eruption in Iceland spewed ash across the Northern Hemisphere early in 536. Two other massive eruptions followed, in 540 and 547. The repeated blows, followed by plague, plunged Europe into economic stagnation that lasted until 640, when another signal in the ice—a spike in airborne lead—marks a resurgence of silver mining. . . .  When a volcano erupts, it spews sulfur, bismuth, and other substances high into the atmosphere, where they form an aerosol veil that reflects the sun's light back into space, cooling the planet.  [Another research team] found that nearly every unusually cold summer over the past 2500 years was preceded by a volcanic eruption.

The result was that 536-545 was the "coldest decade on record in 2000 years."

            Earlier research had discovered that lead also disappeared from the atmosphere from 1349-1353, a period of the Black Death (bubonic plague returned), "revealing an economy that had again ground to a halt."

            I don't have any point to make in telling this story, other than to be impressed with the combination of historical records with meticulous scientific research.  Given the prospects for global climate change (i.e., warming), perhaps we need a few more volcanic eruptions.

* * *

            I have played bridge since the summer of 1967, when three high school friends who played needed a fourth—so they spent quite a few evenings with me during summer break teaching me to play.  For several years thereafter I was a "slop" bridge player—I didn't know about or learn the various bidding conventions or the logic and percentages of play.  In the mid-1970s I came to know, through work at the University, Bob Geary of the Department of Men's Intercollegiate Athletics, an excellent bridge player who had prepared a "cheat sheet" on bidding.  Encountering that summary made me explore and think about the game more thoroughly.  And so I've played the game with varying frequency over the years, sometimes more often, sometimes only 6-7-8 times annually.  (I can attest that your bridge skills deteriorate when you play so infrequently.)

            In recent years I have, on occasion and not terribly seriously, urged Elliott to learn to play.  His natural retort was "who would I play with other than you?"  His point was well taken; there are few people in his age cohort, few Millennials, who play bridge.  Alas.  Elliott would also reply that he'd learn to play bridge if I'd learn to play "Magic: The Gathering," an incredibly complex "combat" card game that now has about 20,000 different cards.  New cards are issued every year.  I did sit for one session with him to learn how Magic is played; I told him afterward that it was too complicated for me—and besides, "who would I play with other than you?"  There are few among Baby Boomers who play the game.  

Wikipedia tells me:  "Released in 1993 by Wizards of the Coast, Magic was the first trading card game created and it continues to thrive, with approximately twenty million players as of 2015, and over twenty billion Magic cards produced in the period of 2008 to 2016 alone."  (Prize money for national and international tournaments (first place) reaches $50,000.)  Elliott recently told me that he'd gone back to the rules for Magic to refresh his memory and reported that the glossary of terms alone totals 39 pages.  Here's a picture of one Magic card:

[image: a sample Magic card]
            This isn't about Magic, however, it's about bridge.  Elliott recently agreed to learn to play.  He's doing it as a favor to me; in the 6-couple bridge game I've played in since 1978, my long-time partner now spends much of the year in Florida, so I must continually find a sub.  I renewed my request to Elliott, this time telling him I needed a regular partner.  He said he'd play and has taken to the game with enthusiasm.  So if things work out, he'll become a "regular sub" in our group, the youngest member by far.  *I'm* the youngest of the 12 members (by about 12 hours), so Elliott will be younger by nearly 40 years.  (In the case of one of the other players, Elliott will be 63 years younger.)  Fortunately, the other members of the group think it's marvelous that Elliott's learning the game.

            Several bridge-playing couples who are friends of mine have been kind enough to have us over so that Elliott can sit at a table and actually play the game.  Teaching the game (with only two people) is boring; reading to learn how to play ranks as one of the most dreadfully boring tasks I have ever encountered.  I am grateful to those who are good at the game who are willing to go back to basics and teach a beginner.

* * *

Sometimes I stumble on research, the usefulness of which even I wonder about.  Psychologists at the University of Alberta have determined that upchuck, bubby, boff, wriggly, yaps, giggle, cooch, guffaw, puffball, and jiggly are the "10 funniest words in the English language."  Just as American legislators—both state and federal—sometimes take potshots at (apparently completely useless) research conducted with public funds, I wonder if there are parallel criticisms from Canadian legislators.  The little article I read about these funny words did not suggest any useful application of the findings.  Often, I will argue, legislative potshots are misguided, or political, because the legislator(s) in question do not understand basic research or how sometimes goofy-sounding research can lead to useful improvements in human or other life.  In this case, however. . . .

It seems there are two factors that can make a word funny:  the form of the word or the meaning of the word.  Form-funny words are funny for reasons that have nothing to do with their meaning—so, for example, probably wriggly and cooch.

(I have to confess:  after I keyed the word "cooch," I had to look it up.  I was not familiar with it.  Oh.  I guess I don't get out enough or don't move in company that uses the word.  The Urban Dictionary, (1):  "A vagina. This is where a penis would normally be inserted (although there are other points of insertion). In other cases, it is where a woman inserts her vibrator, or where someone inserts their tongue, or fingers, etc. Lots of things can go up there, but they're all working towards the same goal."  (2) "That very special place on a woman that men spend their lives striving to visit over and over and over again.")

Anyway, "the purpose of the study was to understand just what it is about certain words that makes them funny."  Predictions of the humor in a word's meaning "were taken from a computational model of language and measure how related each word is to different emotions, as well as to six categories of funny words: sex, bodily functions, insults, swear words, partying, and animals. . . .  It turns out that the best predictor of funniness is not distance from one of those six categories, but rather average distance from all six categories. This makes sense, because lots of words that people find funny fall into more than one category, like sex and bodily functions -- like boobs."  Did you know that there are six categories of funny words?
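
(The "distance" at issue is distance in a word-embedding space, where words that appear in similar contexts sit near one another.  The article doesn't spell out the mechanics, so the sketch below is my own reconstruction of the idea, with small random vectors standing in for real embeddings.)

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in word embeddings (real studies use vectors learned from large
# text corpora; these random 8-dimensional vectors are placeholders).
vocab = ["boobs", "wriggly", "tax", "puffball", "senate"]
embed = {w: rng.standard_normal(8) for w in vocab}

# One centroid vector per category of funny words, also placeholders.
categories = ["sex", "bodily functions", "insults",
              "swear words", "partying", "animals"]
centroid = {c: rng.standard_normal(8) for c in categories}

def cosine_distance(u, v):
    return 1 - np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))

def avg_category_distance(word):
    """The predictor the study found best: a word's average distance
    to ALL six category centroids, not its distance to the nearest one."""
    return np.mean([cosine_distance(embed[word], centroid[c])
                    for c in categories])

for w in vocab:
    print(f"{w:10s} avg distance to six categories: {avg_category_distance(w):.3f}")
```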

So there you have it:  why words are funny.  And I learned a new one—that won't be useful in any venue in which I speak or write.

* * *

            The student newspaper at the University, the Minnesota Daily, reported that the "mechanical engineering department received a wind tunnel last month that will provide a new way to gain experience and learn new skills in agriculture research for undergraduate and graduate researchers."  It's another one of those cases when (at least for me) the first reaction is "who'd have thunk it?" and then, on reflection, realize that "of course!"  It's "23 feet long and can produce wind speeds of up to 12 miles per hour."

            What use is this?  One is that it "allows aerosol researchers at the University to better understand the use of sprays in agricultural settings and minimize over-spraying in fields."  A researcher at the donor company (a Land O'Lakes subsidiary), who had been a research assistant at the University as well as an alum, arranged the donation.  So now faculty and students can play with breezes and aerosols.  I would think the Mechanical Engineering faculty and students would find this a wonderful addition to curricular and research possibilities.  One research agreement with the company is aimed at "understanding droplet transport in agricultural settings."

            This kind of work is what defines a land-grant university, which Minnesota is.
 
            No, I don't have anything to say about the nomination of Joan Gabel to be the University's next president.  Her CV looks good, as does her experience.  My only (tiny) regret is that she apparently has no connection whatever to the University or the State of Minnesota.  Such a connection isn't essential, but it adds an element to one's understanding of the place.  (Yes, Eric Kaler had such a connection; he earned his Ph.D. in Chemical Engineering at the University.)  If she is finally selected, her previous accomplishments suggest she'll do well.  I certainly hope so, for the sake of both the institution and the state.

            My best—

            Gary
