Monday, October 16, 2017

#15 moving to a condo?, marital cheating, algebra or statistics, a memorial, my artistic talent, friends versus family in older folks, algorithms for health, artificial intelligence and decision-making





Kathy and I have begun to debate the merits of moving into a condo when she retires.  We have both been averse to the idea whenever we've thought about our next housing option, but as we talk about leaving for 2-3 months in the winter, and traveling at other times during the year (including perhaps renting a lake place for a period in the summer), the ability simply to walk out and close the door has its attractions (we would have to have no pets and no plants, alas, and we wouldn't be able to garden).  Finding someone to take care of, or stay in, a house or townhouse while we were gone would be a challenge every year.  So we're rethinking whether we could live in a condo.  We know that millions of people do so happily.  I'm not sure we would.

I'd be interested to hear opinions from those who have confronted this decision—or are thinking about it.

Of course, our decision to move is somewhat contingent on the U.S. economy and the housing market.  If the market heads south and we would get significantly less for our house than we want, we might end up staying put longer than we plan.  Apart from the market, the biggest challenge is finding a townhouse (especially a one-floor townhouse) that's not located 20 miles out of the city.  A condo would probably be easier to find, although in either case the question is whether we can get the space we want at a price we're willing to pay.

* * *

            From the University of Utah: 

Older Americans are cheating on their spouses more than their younger counterparts, with 20 percent of married Americans over age 55 reporting they've engaged in extramarital sex.  Just 14 percent of those under age 55 say they've cheated.

            Maybe the younger folks just haven't been married long enough to catch up.

* * *

"In real life, I assure you, there is no such thing as algebra."  -- Fran Lebowitz

In her humorous vein, I think she's correct.  Which is why I have argued, with no success, that the University of Minnesota should require statistics, not algebra, for admission.  (Statistics is admissible for meeting the University's graduation requirement in "mathematical thinking.")  But for admission, a student must have four years of math of some kind, including "two years of algebra, one of which must be intermediate or advanced algebra, and one year of geometry."  Further, the website explains that "examples of 4th year math include calculus (preferred), pre-calculus, analysis, integrated math 4."

I'm not sure what "analysis" or "integrated math 4" is, but I suspect they're not statistics.  I am most certainly not sure why the institution would "prefer" calculus.  One could expand Lebowitz's wisecrack to include calculus, in my opinion. 

An interjection:  I'm using the University of Minnesota to register my complaint, but I'm pretty sure its requirements do not differ much from those of the vast majority of other major research universities.

Fields that need algebra and calculus—the physical sciences and engineering, for example—can of course require them.  But for all students, a course or two in statistics would be infinitely more useful in life, for example in enabling the intelligent, critical, and skeptical reading of graphs, poll results, and the multitude of other bits of data that are thrown at us daily.  Elliott had to take senior-level statistics when he was a Psychology major at the University (before he transferred to Moorhead to get his degree in Art), and he has said repeatedly since that it was probably the most useful course he ever took.  (He also had to take the research methods course, which is equally useful in looking at the world, but I wouldn't go so far as to require that for every undergraduate.  A couple of solid statistics courses could cover research methods.)

The evening before we departed for California in late June, I drove Elliott to what we are assuming was his last undergraduate college work, the last exam in an online math class he had to take to meet Moorhead State's distribution requirements.  He could have folded it into coursework in an earlier semester when he was on campus, but he appealed the requirement late in his college career.  The denial of the appeal meant he had to take a course after he had already returned to Minneapolis.  The appeal—which I think should have succeeded—was based on the statistics and research methods courses he'd taken in Psychology, which were far more advanced, rigorous, illuminating, and educative than the goofy freshman-level math class he had to take.

In any case, he was glad to be done with all math classes forever (he thinks and hopes) and with college (at least for now).

Mid-year I read a blog by a community college dean asking his followers to comment on a proposal in California to permit students in fields outside of engineering, math, etc., to take statistics rather than algebra to meet graduation requirements.  One of the problems with algebra is that it is a major barrier to graduation; it's probably the course that causes the most student attrition.  (As I noted, this isn't a problem at Minnesota, where statistics does count toward graduation requirements.)

One argument against substituting statistics for algebra is that it could be seen as "watering down" degree requirements.  I think that's just balderdash.  A rigorous statistics course is just as challenging as an algebra course—but the content is more understandable for many people (including me). 

Another argument is that students need to be exposed to algebra so they can decide whether they want to go into any of the STEM fields (most, if not all, of which use algebra).  It seems to me there are other and better ways to let students learn about STEM fields without first subjecting them to algebra.  If they're interested, I would think they'd then have more motivation to learn algebra.

The most persuasive argument against the proposal, at the community college level, is that most 4-year institutions require algebra to graduate.  My point is that the 4-year institutions should change!

The final argument I've been able to find is that one needs algebra to understand statistics.  That claim doesn't receive universal assent, and the modest algebra required for statistics could be picked up in the first few sessions of the statistics course.  In any event, I've taken more statistics courses than I care to remember, and I don't recall needing algebra to understand them.

            One can turn Elliott's and my "usefulness" argument around:  if usefulness is a criterion, why do we require the study of history or literature?  One of the commenters on the blog post made an observation about that question. 

The question then is what is the purpose of the broad requirements, whether Numeracy or Humanities or whatever?  Seems the "logical" answer has to be something like enriching the way that students view and think about the world, with the different areas offering different perspectives.  If the purpose of a math requirement, then, is to demonstrate how mathematics helps us to conceptualize the world, I'm not sure that algebra would win out over calculus or statistics. . . .  So the issue should not be whether statistics is easier than algebra which is easier than calculus which is easier than . . . , but whether the ultimate purpose of the requirement is met by statistics whatever that purpose might be.

            As one professor—who acknowledged he's a statistics prof—put it, "it's stupid that thousands of students are taking algebra as their 'last' math class. . . .    They don't get high enough in the sequence to get to the interesting stuff, the word problems are all horrible at that level, and it's mostly computation and memorization, which they then dump.  Statistics is a vastly better course and will teach them things they can use in daily life or in just about any career."

* * *

            One sad event this year was going to a visitation for a faculty member in Veterinary Medicine, Bob Morrison.  I didn't know Bob through the University; Kathy and I knew Bob and his wife Jeanie because we traveled in the same group to India, where we got to know them reasonably well.  We had dinner with them later, and talked with them at post-India-trip gatherings of the tour group.  Bob was an internationally renowned swine researcher, and he and Jeanie were at a conference in Prague.  The six-passenger vehicle they were riding in was hit; three of the six people aboard were killed, Bob among them.  Jeanie survived, although with serious injuries.

            All of us who had toured with Bob and Jeanie were shocked.  Kathy and I went to the visitation to talk to Jeanie, and were dumbfounded at the number of people there.  We had to stand in line for an hour and 15 minutes in order to have about 30 seconds of conversation with her.  It was over 90 degrees that July day, and the line extended outside the church; inside, it wound around several rooms in feeble air conditioning.  There were more people at that visitation than I have known my entire life, I think.  That may be a little hyperbolic—but not by a lot. 

            In any case, Jeanie seemed surprised and pleased that we'd come.  She looked remarkably good for having lost her husband, gone through several surgeries, and been in the hospital for an extended period, first in Europe and then here.  We agreed we would get together for dinner once her life settled down a little more.

* * *

A Facebook post from me after we returned from the West Coast.

I have approximately 1% the artistic talent of either Kathy or Elliott.  That may be an exaggeration.  Today I tested the limits of my talent:  I varnished rocks.  We picked up pockets full of colored rocks at Agate Beach in CA on our trip.  Bright when wet, the rocks turn dull when dry.  The Google says I can put varnish on them to restore the color (and since it's on the web, it must be correct, right?).  We've also had rocks we've collected in various places over the years in our flower boxes; I scoured those off and varnished them as well.  There's my contribution to artistic endeavor.  Beyond mineral oil on the seashells.

* * *

            One of the items that appeared in Futurity (the daily report of findings from major research universities) last summer had this headline:  "Friends beat family for aging well."  Before either of us read the precis, Kathy and I talked about what that might mean.  My first-blush reaction was that the claim was counterintuitive—that people, as they age, get closer to children or siblings or someone in their family.  As we talked about it, however, we realized that for daily life, friends are likely more important to older adults than most of the relatives are:  the kids and siblings have their own lives and only occasional contact with a parent.  What has more of an impact is those with whom one spends the rest of the time, the non-family time.

            The research looked at survey data from nearly 280,000 people about relationships, strain, and chronic illness: roughly 271,000 respondents from 100 different countries and, separately, about 7,500 older American adults.  In the first survey, "both family and friend relationships were linked to better health and happiness overall, but only friendships became a stronger predictor of health and happiness at advanced ages."  In the second, if friends caused strain, people were more ill; if friends were supportive, people were happier.

            Friendships are optional; relatives are not, author Professor William Chopik pointed out.  While family relationships can be a source of support, they can also be negative or boring.  Friends are people we've chosen to stay connected with; those we don't choose tend to drop away over the years.  Friends also matter for people who have few or no close relatives or who don't rely on family members for support.  "Keeping a few really good friends around can make a world of difference for our health and well-being.  So it’s smart to invest in the friendships that make you happiest."

Chopik noted that relationship research hasn't focused much on friendships.  That may be a mistake, "especially considering that they might be more influential for our happiness and health than other relationships.  'Friendships help us stave off loneliness but are often harder to maintain across the lifespan.  If a friendship has survived the test of time, you know it must be a good one—a person you turn to for help and advice often and a person you wanted in your life.'"

            Kathy's and my hypothesized reasons why the research might show what it does didn't turn up in the summary.  I bet we're not wrong.  As we looked at our own parents, my father when a widower and Kathy's mom now (alive and well), in both cases it was/is friends who were important to their daily life, not their kids.  It isn't that I didn't love my dad, or see him on occasion; he didn't need my care and attention because he had his own life to live.  The same is true for Kathy and her mom.  I suppose you could argue that they have their friendship circles because their kids ignore them, but that misplaces the chronology:  the friendship circles typically existed before the older age.

* * *

            Another study, this one out of Stanford, wasn't the least bit surprising to me.  Also from Futurity, "Algorithm beats experts at diagnosing heart rhythm."  Here's a paragraph from my annual letter last year:

I was reminded by a news article of a book by Paul Meehl, Clinical vs. Statistical Prediction: A Theoretical Analysis and a Review of the Evidence, published in 1954.  Meehl, who was a Regents Professor of Psychology, Law, and Philosophy, and one of the developers of the Minnesota Multiphasic Personality Inventory (MMPI), argued in the book that in the field of psychotherapy, statistical models almost always yield better predictions and diagnoses than the judgment of trained professionals.  His contention has been supported by reams of additional research in subsequent decades—and has also been expanded to other fields, including cancer patient longevity, cardiac disease, likelihood of new business success, evaluation of credit risks, suitability of foster parents, odds of recidivism, winners of football games, and future prices of wine.  He really called into question the value of expert diagnosis, a question that remains open today.

            The Stanford researchers found that some heart monitors collect rhythm data that can be analyzed by an algorithm for dangerous arrhythmias—and the algorithm does at least as good a job as cardiologists, sometimes better.  Some arrhythmias are difficult to detect, and others look very similar to one another even though some require immediate attention and others none.  The algorithm can detect the differences between them.
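            For readers who want a concrete picture of what "an algorithm analyzing rhythm data" can mean, here is a minimal sketch in Python.  It is emphatically not the Stanford model (which, as reported, is a deep neural network trained on enormous numbers of labeled ECG recordings); it simply trains an off-the-shelf classifier on made-up beat-interval features to label segments as normal or arrhythmic, and every feature and number in it is invented for illustration.

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)

    # Hypothetical features for each ECG segment: mean beat-to-beat
    # interval (seconds) and its variability.  Synthetic data only,
    # standing in for real labeled recordings.
    n = 1000
    normal = np.column_stack([rng.normal(0.85, 0.05, n), rng.normal(0.03, 0.01, n)])
    arrhythmic = np.column_stack([rng.normal(0.60, 0.15, n), rng.normal(0.12, 0.04, n)])
    X = np.vstack([normal, arrhythmic])
    y = np.array([0] * n + [1] * n)  # 0 = normal rhythm, 1 = arrhythmia

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    model = LogisticRegression().fit(X_train, y_train)

    print("held-out accuracy:", model.score(X_test, y_test))
    print("label for a new segment:", model.predict([[0.55, 0.15]]))

            The real systems are far more sophisticated, but the workflow is the same: label examples, fit a model, and measure how often it agrees with the experts (or with outcomes) on cases it hasn't seen.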

            The evidence continues to accumulate.  Go see your local artificial intelligence doctor, not your live physician.  (I'm half joking.  But only half; within a few years, I suspect AI is going to be doing a lot of diagnosing—and better than your internist.)

            (An unrelated story about Meehl.  I had him for a seminar in the mid-1970s when I was a graduate student in psychology.  The seminar was on clinical psychology and diagnosing patients.  Meehl took aim at those who argued that mental illness is a social construct; he announced, in no uncertain terms, something along the line of "I've spent a lot of time dealing with patients in mental hospitals and you can't tell me they aren't crazy.")

* * *

            In that same vein, the Wall Street Journal had an article mid-summer reporting that the folks who create artificial intelligence (AI) sometimes don't know what the machines are "thinking."  (I will put quotation marks around that word when applied to machines because no one has demonstrated that the work of circuits is anything like the functioning of neurons in the human brain.)  AI is used now in sentencing and bank loan decisions; it may be incorporated into self-driving cars to judge who should have "the best chance to live" when an accident is unavoidable.  ("Career of the Future: Robot Psychologist; Scientists Go Inside Minds of Machines.")

            There is a variety of AI based on "neural networks," "systems that 'learn' as humans do through training, turning experience into networks of simulated neurons."  What results is an assemblage of millions or billions of artificial neurons (rather than code written by a programmer), "which explains why those who create modern AIs can be befuddled as to how they solve tasks."  When that's the case, and we don't know how it works, then all sorts of problems can arise.  How does it decide who should live in an impending accident?  Will it make decisions based on race or sex or age?  Will those biases in the machine only become known when it has made many decisions?

            As the reporter noted, you can't ask the machine how it makes its decisions.  "Artificial intelligences can excel at narrow tasks, but even those that talk have introspective powers about on par with a cockroach."  The reporter did explain how this unpredictability can come to be.  If an artificial neural network is shown many images of a cat, it is eventually able to reliably identify cats.  "The tricky bit is that neural networks learn by altering their own innards. This is basically how your brain works, too.  And like the connections between the 86 billion or so neurons in your brain, the precise way an AI 'thinks' is incomprehensible."
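            To make "altering their own innards" a little more concrete, here is a deliberately tiny Python sketch, nothing like the image-recognizing systems the article describes.  It trains a two-layer network to compute the XOR function (output 1 only when its two inputs differ) purely by nudging its internal weights, and what the network ends up "knowing" is just a grid of numbers with no human-readable explanation attached.

    import numpy as np

    rng = np.random.default_rng(1)

    # Toy training data: the XOR function.
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    # The network's "innards": two layers of randomly initialized weights.
    W1 = rng.normal(size=(2, 8))
    W2 = rng.normal(size=(8, 1))

    for step in range(20000):
        # Forward pass: the network's current guesses.
        hidden = sigmoid(X @ W1)
        out = sigmoid(hidden @ W2)
        # Backward pass: adjust the weights slightly to shrink the error.
        grad_out = (out - y) * out * (1 - out)
        grad_hidden = (grad_out @ W2.T) * hidden * (1 - hidden)
        W2 -= 0.5 * hidden.T @ grad_out
        W1 -= 0.5 * X.T @ grad_hidden

    print("predictions:", out.round(2).ravel())  # should end up near [0, 1, 1, 0]
    print("learned weights, opaque to a human reader:")
    print(W1.round(2))
    print(W2.round(2))

            Even for this four-example toy, the only "explanation" of the trained network's behavior is those decimals; nothing in them says "this is XOR," which is the article's point scaled down by a few billion neurons.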

            So the work goes on to understand how AI works—once they've gotten it working!  One approach is to use the same cognitive tests that psychologists use with children; apparently doing so allows understanding of how AI thinks.  The step after that is persuading it to change its mind; "because engineers typically create many versions of an AI when trying to discover the best one, the use of cognitive psychology could give engineers more power to choose the ones that 'think' the way we want them to."  On the other hand, maybe we don't want it to think like humans because it may discover something new.

            The outcome, potentially, is better decision-making in a variety of fields, "with fewer mistakes and more accountability, because their output is measurable and we might be able to trace exactly how they make decisions.  We ask humans to do this all the time—in a court of law, when dissecting a business decision—but humans are notoriously unreliable narrators."  Presumably, any bias that crept in could be eliminated.

            I wonder if Paul Meehl saw the application of AI to decision-making as a logical outcome of his conclusion about statistical versus clinical decisions.  My bet is that he did.  "Clinical" is nothing more than a human deciding; "statistical" can stand as a metaphor for a computer (AI).
