Good morning! I hope you've had a pleasant start to 2018.
Contrary to what I
wrote just a few weeks ago, I've accepted Kathy's wisdom (and the urging of
others, SDE, NHE) and created a blog site.
(To make that sentence accurate, I should write that "Kathy created
a blog site and then showed me how to use it.") Although I don't know why anyone would do so,
I put up all the messages since I began them last September in case you wish to
re-read any of them. I'm going to put up
all my previous annual tomes as well, just so everything's in one place. (My electronic files are not well ordered. . . .) At least for the time being,
however, I'm also going to continue to send the text in the email message
itself. I only change this technology
stuff in small, incremental steps.
A blog site provides
the option for comments (that are public).
Of course I'm glad to have thoughtful comments. Please, however, write to me privately by email if you wish.
* * *
Our local newspaper had a headline in late December that made me angry,
sad, and glad: "U tests new
transplant treatment for Type 1 diabetes."
The University of Minnesota—which is one of the major centers in the
world for research on diabetes—is going to be the third institution to try
using embryonic stem cells (from embryos not used in in vitro fertilization)
to treat type 1 diabetes.
"Treat," in this case, would essentially mean
"cure," because the process involves injecting immature pancreas
cells (that have been coaxed into developing from stem cells) that will begin
to produce insulin. The patients will
have to take immunosuppressant drugs (to prevent rejection), but if the
treatment works, they won't have diabetes.
It is hard to predict where this line of research and trials will go,
but I can conceive of a day when a diabetes patient's own stem cells could be
used to create the pancreatic cells, thus eliminating the need for
immunosuppression. (I'm not predicting
that; I know squat about the details of ongoing research on diabetes and stem
cells. It strikes me, however, that it might be a logical next step.)
I am angry—one of the multitude of reasons I intensely disliked George
W. Bush as president, and still do—because the Bush administration restricted federal funding for embryonic stem cell research to the cell lines existing at the time. (Congress also messed around with stem cell
research, and the Minnesota legislature made similar attempts as well.) From all I've read, that put many potential
advances in stem cell research on hold for the eight years of his
presidency. I am sad because had this
breakthrough occurred eight years earlier, it could well have helped, even
saved, Krystin. I am glad because the
development holds out hope for the millions of people with diabetes. Not an immediate "cure" or treatment,
but surely one that has great promise—and promise in the lifetime of most
patients. That's what we'd always hoped for, for Krystin: the development of a
treatment, within her lifetime, that would stave off further damage from the
disease and allow time for advances to improve her overall health. (It may be, in this specific case, that it
was the device to carry the stem cells, rather than the stem cells themselves,
that was the medical advance, but my reaction stands: the stem cell research was delayed, which
meant this research was delayed.)
* * *
Some would not concur with my subject description as
"edible and electronic." The
latter, yes; the former, no. A friend
and I had a comical email interlude. We
were writing about Christmas. He opined
that, compared to holidays in other religious traditions,
frankly the
food is better at Christmas, . . . particularly so compared to the spam my
former mother-in-law literally served at my first Christmas in Kansas. It was the largest whole piece of spam I ever
saw and it was covered with a horribly smelling instant gravy from a
store-bought packet. I had never seen or
eaten spam before that fateful day and I must say that it exceeded in a
negative way all my preconceived ideas about spam. I politely choked down a few pieces and vowed
never to eat spam again.
I wrote
back to him:
As for Spam,
of course I live in its home state. My
grandmother used to make me Spam sandwiches:
she cut it into thin slices, sprinkled a little garlic salt on it, fried
it, and put it on bread with cheese. I
loved them. Elliott's and my little
secret is that we still like a fried Spam sandwich, perhaps once per year; it's
especially good with Velveeta. Kathy
leaves the room so she doesn't barf. . . .
Fried spam--the only way to eat it--is not that different from eating
bacon!
My friend conceded.
"Okay, okay, I get it—spam is wonderful (for a very select few).
Congrats on your willingness to eat highly unusual items." He expressed doubt, however, that spam
qualifies as red meat or bacon.
Hormel insists that whenever the word is used, it's
supposed to be SPAM. I decline to follow
that protocol. Spam was invented in 1937
and includes pork shoulder, ham, salt, water, sugar, and sodium nitrite. There are different stories about what, if
anything, the four letters stand for.
Probably nothing, although Hormel at one point said it was short for
"spiced ham," and at another time "shoulder of pork and
ham." Who knows.
When I related this exchange to Kathy, she asked how the
name of the Hormel meat product became the descriptor for unwanted electronic
mail. It seems that the answer to that
question is "Monty Python."
According to http://www.todayifoundout.com:
The real
origin of the term comes from a 1970 Monty Python’s Flying Circus skit. In this skit, all the restaurant’s menu items
devolve into SPAM. When the waitress
repeats the word SPAM, a group of Vikings in the corner sing “SPAM, SPAM, SPAM,
SPAM, SPAM, SPAM, SPAM, SPAM, lovely SPAM!
Wonderful SPAM!”, drowning out other conversation, until they are
finally told to shut it.
Exactly
where this first translated to internet messages of varying type, such as chat
messages, newsgroups, etc., isn’t entirely known as it sort of happened all
over the place in a very short span of years, in terms of the name being
applied to these messages. It is,
however, well documented that the users in each of these first instances chose
the word “spam” referring to the 1970 Monty Python sketch where SPAM singing
was drowning out conversation and SPAM itself was unwanted and popping up all
over the menu.
Other websites confirmed this history. So unwanted "meat" led to naming
unwanted email.
* * *
Elliott
and I have gone around about the virtue of potatoes. I sent him a link to an article informing us
that while it is certainly not healthy to eat only one food, if you are in a
desperate situation, a single food that you could survive on for a while is
potatoes. I told Elliott "See,
they're good for you!"
I have
said a number of times that I've never met a potato (that is, a way of preparing potatoes) that I didn't like. Elliott
didn't dispute the content of the article: "I know they're good for you. I just don't want them for dinner every day."
Of course
I didn't give up. "You can eat
different versions every day of the week!
these are just off the top of my head.
fried
mashed
au gratin
potato salad
baked
fries
hash browns"
Elliott
was unpersuaded, although his claim was clearly inaccurate: "They all taste the same."
If anyone wants to invite me for dinner, please feel free to serve potatoes.
* * *
In response
to the data on the increasing number of oldsters and their health care costs, a
friend related that she'd read a book a few years ago on that topic. It was Sick to Death and Not Going to Take It Anymore (2004), which described the problem:
our
healthcare systems and support systems like Social Security were designed to
meet the needs of a population where you were healthy, healthy, healthy,
dead. But with advances in medical
science, we are now a population of healthy, healthy, healthy, less healthy,
even less healthy, dependently unhealthy, close to death for a while, then
dead. Our systems were not designed for
the prolonged illness and dependency.
One aspect
of the "dependently unhealthy, close to death for a while" period is
our inability to make a decision about dying.
My friend wrote:
More
recently, Atul Gawande's Being
Mortal: Medicine and What Matters in the
End is an exploration of making medical decisions that make more sense both
for the person and for society. (That
is, just because there is a medical response you could take doesn't mean it is
something you should do.) I remember
when we brought my mom up here to [a medical facility]
-- she had
basically collapsed after chemotherapy and radiation following surgery for
colon cancer. I asked the clinical nurse
manager on her floor if she had any recommendations for an oncologist in the
area. Her response was "what do you
need an oncologist for? Your mom is now
too weak for surgery, chemo, radiation or any other treatment. What you need is a physician that will help
your mom live out the life she has left with the minimum of pain and
problems." Perfect answer. And what we did. More treatment would have killed her faster
than the cancer eventually did.
My friend
also related hearing an item on National Public Radio about retirement. "The program also interviewed a
retirement researcher who posited that the workers in the last 100 years are in
a sweet spot where retirement is possible.
Before that, most died before they could retire; after the baby boomer
generation retires, there will be no way to support retirement benefits for
those who come after."
I told her that of course there's a way to support
retirement benefits for our children.
One would be to raise taxes to ensure the solvency of Social Security
(and change the law so there's no income cap on the tax, and impose a means
test, so the wealthy don't receive the money—another way they can contribute in
a way that is beyond the ability of most of us). Another would be to establish mandatory
401(k)-type plans for everyone who works.
Those are not mutually exclusive.
I'm sure there are other possibilities.
No matter what is chosen, I for one do not want to see our children and
grandchildren left without a secure and reasonable retirement income. Or without health coverage.
* * *
A friend
of mine, a long-time faculty member at Colorado State, wrote back to me about
on-line education, something he's been delivering for years. He made several pertinent observations.
(1) It isn't cost effective to do a single course of modest
size, but when there are large numbers of students taking the same course(s),
it can be (and is). (2) It allows more
students access to lectures by the best faculty, although it's not face-to-face
in a classroom; that may be better than in-class instruction with a mediocre
instructor. (3) Student engagement can
be an issue. Is it the delivery or the student? (In my experience, there will be students who
aren't engaged in almost every undergraduate class. One of the rare instances when people pay
money for something they don't take.)
(4) The great virtue of on-line courses is flexibility for the
students—they don't have to drive to campus and go to a brick-and-mortar
building (so can spend a lot less time and money on commuting and parking and
more on actual education and on other life demands). This is less true, of course, of students who
live on campus. (5) Some subjects can be
adapted to on-line education better than others. My friend suggests that math, business, and engineering can be adapted more easily than the humanities and liberal arts,
with the social sciences perhaps in between.
That's probably right. (I suspect
on-line education also works better for introductory or survey courses; for
juniors and seniors, primarily on-line education—no matter the field—would be
sort of crappy, in my opinion. I learned
much and enjoyed the junior and senior seminars in political science,
experiences that would have been impossible to replicate on line.)
I don't intend to denigrate on-line education, because it
serves a useful purpose. What my friend
didn't mention, but which is one place where it works effectively, is
professional education for people who aren't close to a college campus (e.g.,
continuing education in law, business, health fields, etc.) or people who want
to earn a degree but are bound in place (family, job, home) far from a campus—or,
as my friend mentioned, people whose lives don't permit spending a lot of
time on a campus, no matter how close. In those instances, it's clearly an advance in
our ability to deliver education. The
trick, always, is whether the institutions have the resources to deliver
high-quality education on line. I
suppose a lousy education is better than none at all—but we (the institutions)
shouldn't be delivering lousy educations, nor should the state permit them to be delivered.
What I would not like to see on-line education used widely
for is traditional undergraduate education.
The interaction with peers and with faculty cannot be replaced by a
screen, and it is an essential part of the "growing up" process of
going to college when you're 18-22 or thereabouts. I and many people I know made what became
lifelong friends when we were in college.
That would not have happened if all or most of the education had been on
a screen.
* * *
An article I found jocular—although I don't think the
author intended it that way—appeared in Slate;
the subject was the use of double and single quotation marks. The content is beside my point, but in brief,
the author noted the standard American rule:
double quote marks for quotations, single quote marks for quotes inside
quotations. What the author had discovered, in writing classes for college freshmen, was that
some
students adopt another rule: double
quotes for long quotations, single quotes for single words or short
phrases. They'll quote a long passage
from Measure for Measure accurately,
but when they want to quote one of Shakespeare's words, a cliché, or some
dubious concept like 'virtue,' they'll go with single quotes.
He found, on further exploration and to his surprise, that the practice turned up in several
unexpected places—news reports, professorial papers, blogs, and more. He turned to a linguist at the University of
California-Berkeley for an explanation of how this kind of practice could
evolve. The linguist, Geoffrey Nunberg,
responded, "People just make shit up."
(On the
serious side of the question, the author wrote that
I suspect
the main driver of this use is a matter of simplification: the desire to avoid the shift key. . . . Several commenters cite labor-saving as a
justification for singles, and I believe that's why I've found myself using
them. When dashing off a quick word in a
chat or marginal comment, I sometimes just don't work up the energy to reach
for that shift key. . . . People type
more now than ever before. We like to
take it easy now and then, and hitting shift to capitalize or properly
punctuate can sometimes feel like a hassle—if the most trivial in human
history.
There may
be something to that!)
* * *
One of the
difficulties of being a woman and being employed is the challenge of childbirth
and child rearing. Research out of
England documents the issue (following a long, long line of research on the
general topic of women and motherhood and working). "In a study of workers' attitudes,
mothers who took time off to care for babies were seen as less committed and
competent at work. Meanwhile, those who
continued working were viewed as less caring parents." It is a no-win situation for the women, it
seems.
The
consequences varied by the focus. If a
mother took a leave, she was viewed as less competent or committed at
work. If she didn't take a leave, she
was viewed as uncaring.
There is a
solution to this problem that few people would likely endorse, although I think
it the best one. If it is only women who
are given leave (paid or not), or mothers are given a longer leave than fathers, women will always be slipping behind in career and salary
progression (especially if, as has been the case forever, mothers do more of
the parenting and child care).
Maternity leave does women no favors vis-à-vis the workplace. The equitable solution is to make parental
leave mandatory and make it apply equally to mothers and fathers. If you have or adopt a child, both mom and
dad must take x months off work (and not work—be put on leave, with no
continuing responsibilities, such as via the Internet). In that way women are not disadvantaged by
parental leave and men can stay home and actually help with the baby!
* * *
Heavens, I
went to three movies in five days. I
can't recall any time in my life I did that, and the phenomenon is even more
surprising because I only see 1-2 movies per year—and then largely at Kathy's
or Elliott's prodding. The only one that
I saw that has any Oscar potential is Gary Oldman playing Winston Churchill in
"Darkest Hour." As for "The
Last Jedi," I've come to the conclusion that almost every
"action" movie—western, sci-fi, whatever—comes down to two guys
fighting with some kind of sticks. In
"The Last Jedi," it's with lightsabers, a high-tech stick. The movie was well done, in the "Star
Wars" franchise tradition, a good afternoon's entertainment. By comparison, "Darkest Hour" led
Kathy and me to have a conversation about the state of the world in May 1940.
* * *
Every year
we see news about the rankings of universities around the world. They are all junk. Unfortunately, people pay attention to
them. Here is the ranking of the
University of Minnesota (Twin Cities) for 2017 by four of the major ranking
outfits. Each one uses different
criteria to rank.
56 -- Times Higher Education (London), tied at 56 with the U of North Carolina-Chapel Hill. Oxford and Cambridge ranked #1 and #2. Think the Times favors British universities?
34 -- Academic Ranking of World Universities, ShanghaiRanking (formerly Shanghai Jiao Tong University, China). Harvard and Stanford ranked #1 and #2.
137 -- QS World University Rankings. MIT and Stanford ranked #1 and #2.
71 -- U.S. News & World Report, national universities (tied with Baylor and Stevens Institute of Technology). Princeton and Harvard ranked #1 and #2.
Some of these rankings have rather suspect institutions
ranked higher than the University of Minnesota, almost no matter the ranking
criteria used. Really, Minnesota is tied
with the Stevens Institute of Technology?
(Its Institutional Vision: "Stevens
will become a premier student-centric technological research university,
focusing on six “foundational pillars,” areas of true societal need where
Stevens possesses significant depth and expertise: artificial intelligence,
machine learning, and cybersecurity; data science and information systems;
complex systems and networks; financial systems and technologies; biomedical
engineering, healthcare, and life sciences; and resilience and sustainability.
As our education and research capacity grows in these areas, so will our
influence.") I'm sure it's a fine
educational institution, but it has a far narrower mission and set of
responsibilities than the University of Minnesota.
* * *
A faculty friend wrote back to me about the economist
William Bowen.
I'd only add
that whereas government (state, mostly) taketh away, government (national,
mostly) also giveth back in the form of student loans and grants, which both
cushion and enable the tuition hikes.
More importantly, though, they giveth research money, which is now the
primary target, other than tuition, for universities who are trying to maximize
income. This in turn distorts -- in this
liberal arts person's view -- the intellectual priorities of the
institution. The current craze for more
and more STEM can be seen in part as a rationale for whoring after big bucks
from government and industry. Not that
science is bad or training isn't important.
But it's gone overboard, feeding as well into the idea that college is
just job training. We will need to
rebalance things at some point.
The only
modest correction I'd offer is that for the public colleges and universities,
the state both giveth (when the state economy is doing well) and taketh away
(when the state economy is in the tank).
A friend of mine at the University, who knows its budget upside down and
backwards, pointed out that "various measures can be applied here: (1) funding in inflation-adjusted dollars,
(2) funding as a percentage of the overall state budget. Some other priorities, especially social welfare
(especially healthcare) and K-12 education have absorbed a good part of the
budget." The University of
Minnesota took one of the biggest proportional reductions in state funding of
any public university in the country during the last recession. It has never returned to the same level of
state funding (in inflation-adjusted dollars).
One
additional observation: those on the
fiscally conservative end of national politics echo my friend's point about
"student loans and grants, which both cushion and enable the tuition
hikes." They see that as bad public
policy. The cure, in their eyes, is to
reduce the amount of student loans and grants provided by the federal
government, on the theory that institutions won't be so quick to raise tuition.
If that happens, either students are
screwed (because the institutions will continue to raise tuition in order to
protect quality) or the institutions will have to cut back on both breadth and
quality.
I agree
with the assertion that more and more STEM education does further encourage the
idea of college as job training. Nobody
wants to discourage STEM field training, but if that's all a student gets out
of going to college, we don't need colleges and universities to provide
it. And, of course, there are places
where someone can obtain the technical training without any of the other benefits
of a college education (that is, exposure to the "liberal arts" and
all that it entails beyond job training; see, e.g., Stevens Institute of
Technology).
* * *
A friend
and I were having lunch and at one point the conversation wound around to the
point where I related a story from a graduate seminar I took decades ago. The seminar was on organizational
leadership. The faculty member leading
it observed that people in all kinds of organizations--the Church, the
military, corporations, higher education--want to have a great leader to take
the organization in new/better directions.
People need to be careful what they wish for; she said that people tend
to forget that the German word for leader is führer.
That anecdote led us to wonder whether the noun führer has seen diminished use in Germany because the proper noun, Der Führer, is so tainted with Hitler. I wrote to a friend who
visits Germany frequently. He told me
that "it seems to me that the simple word "Fueher" (leader) is
little used today; it has been replaced by "Leiter." But the word is still used frequently in
compound nouns, e.g., Zugfuehrer (railroad engineer, train driver), etc." It later occurred to me that I could Google
the question; multiple websites gave me the same answer. "Führer" is never used in German
politics but it is still part of compound words.
(I learned that the term Lebensraum—"living
space"—has taken a different course.
Lebensraum was coined in the late 19th century to describe
the space/territory Germany needed to survive and grow. Hitler used it specifically to describe
Germany's need to conquer land in the east—e.g., Poland—to provide room for
German people and food production. The
term is now used exclusively in the field of animal science and husbandry: the space that animals need to survive or
breed. It is never used for people. Another word has largely disappeared from
German: Endlösung. Ordinarily, Endlösung simply means an end
solution, but after the Nazis used the term to describe their program to eradicate
the Jews and others they thought undesirable, "the Final Solution,"
there's distaste for the word. It is
politically incorrect to use Endlösung in modern Germany, so I read.)
If you're in the upper Midwest, stay warm!
Gary