Good afternoon on a bright Sunday (even though we still have snow cover).
I wonder why the two times in recent memory when I felt compelled to buy a sweatshirt (because I was so cold and needed another layer) both happened in California.
I'm way
behind on commenting on bits and pieces of research that I liked, so I'm going
to race through some of it. More in the
future.
* * *
Responsibility
Exchange Theory will help you understand why you say "thank you" and "I'm
sorry"—and why there can be bad feelings when you decline to do so when
circumstances suggest you should. Or so report researchers Shereen Chaudhry and
George Loewenstein at Carnegie Mellon University. The theory also speaks to blaming and
bragging.
All four of
those kinds of statements "are tools used to transfer responsibility from
one person to another. . . . They relay
information about credit or blame, and they involve image-based trade-offs
between appearing competent and appearing warm." One of the co-authors maintains that "these
communications—and their absence—can make or break relationships and affect
material outcomes ranging from restaurant tips to medical malpractice
settlements."
A central idea is that these messages project either competence or warmth.  "Thanking and apologizing
make the speaker appear caring or generous, but usually at the cost of seeming
incompetent or weak. The opposite is
true of bragging and blaming, which can bolster the speaker's perceived
competence and status, but at the cost of seeming selfish or inconsiderate."  I suppose we all have idiosyncratic feelings about these claims, but (at least now that I'm old enough to think I'm wise about such things) I don't feel weak or incompetent if I express appreciation; it's probably true,
however, that most of us feel at least a little sheepish if we have to
apologize (or much more than sheepish if, for instance, we caused an accident
that wounded or killed someone). Perhaps
like many people of a certain age, I don't feel much urge to bolster my perceived
competence or my status; at this stage I can do what I can do and I am what I
am. So bragging is unnecessary and
blaming is usually counterproductive.
(Except, of course, when you declare that "I'm not saying it's your
fault, I said I'm blaming you" 😊)
"The
recipient of the communication experiences a different impact on their image:
Thanking and apologizing elevate both perceived competence and warmth for the
recipient, while bragging and blaming decrease both." And why would we not do the former for most people
with whom we interact? I can think of a
few for whom I have no interest in elevating perceived competence and warmth (perhaps even the contrary), but not many.
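For my own bookkeeping, the competence/warmth claims above boil down to a small two-by-two table. Here it is as a sketch in Python; the lookup-table framing is my own summary device, not the authors' formalism, and the plus and minus signs simply paraphrase the quoted claims.

```python
# Image effects of the four communication types, per the claims quoted above.
# "+" means the act tends to raise that perception, "-" that it tends to lower it.
# The table is my paraphrase, not the authors' formal model.
IMAGE_EFFECTS = {
    ("thanking", "speaker"):      {"warmth": "+", "competence": "-"},
    ("apologizing", "speaker"):   {"warmth": "+", "competence": "-"},
    ("bragging", "speaker"):      {"warmth": "-", "competence": "+"},
    ("blaming", "speaker"):       {"warmth": "-", "competence": "+"},
    ("thanking", "recipient"):    {"warmth": "+", "competence": "+"},
    ("apologizing", "recipient"): {"warmth": "+", "competence": "+"},
    ("bragging", "recipient"):    {"warmth": "-", "competence": "-"},
    ("blaming", "recipient"):     {"warmth": "-", "competence": "-"},
}

def image_effect(act, role):
    """Look up how a communication act shifts a party's perceived warmth and competence."""
    return IMAGE_EFFECTS[(act, role)]

print(image_effect("thanking", "recipient"))   # {'warmth': '+', 'competence': '+'}
```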
Chaudhry
and Loewenstein went on to hypothesize (in advance of experimental work) that it
doesn't matter if the person is the one doing the favor/causing the harm or the
one receiving the favor/the victim; in either case the preference will be for
thanking or apologizing rather than bragging or blaming—so the latter two
should be infrequent. At first blush
that seems like a no-brainer prediction; of course people would prefer thanks
or apology rather than braggadocio or blame.
Another hypothesis was that people would, in conversation, steer away
from blame and bragging and toward apology and thanks.
Interestingly,
the experimental work confirmed the prediction about the frequency of
thanks/apologies—however, "thankers will do so more begrudgingly in some
cases compared to others. . . . When they were in an environment in which it
was more important to appear competent . . . participants preferred neither
person say anything about the credit or blame at stake." (They also found that people did indeed steer
conversations toward apologies/thanks.)
Not
surprisingly, they observed that "some communications that might appear to
be apologies, but which don't accept responsibility – 'I'm sorry you feel that I hurt you' – are not accepted as authentic.  In addition, it [Responsibility
Exchange Theory] explains why thanking and apologizing are much less likely to
occur after the other side has bragged or blamed."
And guess
what? Women feel more obligation to
thank or apologize than men do because of the social expectation of more warmth
from women. I wonder if that has changed
over time.
One can
always wonder if this is a WEIRD study:
conducted in a country that is "Western, educated, industrialized,
rich and democratic." Would the
same results obtain in rural China or India, or a Japanese corporate boardroom,
or an African village?
* * *
This can't happen too quickly as
far as I'm concerned. "Research
shows most United States presidents fade quickly from the nation's collective
memory—a fate that could even befall Trump." Henry Roediger, psychology and brain science
professor at Washington University, says it's simply the passage of time. "By the year 2060, Americans will
probably remember as much about the 39th and 40th presidents, Jimmy Carter and
Ronald Reagan, as they now remember about our 13th president, Millard Fillmore." Roediger has been testing undergraduates (and
other people) on their ability to recall the names of presidents for over 40
years.
If a president is remembered (that
is, stands out in history), Roediger says it's because of events during the
person's term of office, not necessarily the president himself. "If Trump is remembered in 75 years, it
will probably be because of what happens during his term of office, not his
personality. Presidents during cataclysmic events such as wars are likely to be
better remembered than those who governed in times of peace." His research suggests that most presidents "will
be largely forgotten within 50-to-100 years of their serving as president."
Roediger has some amusing findings based
on multiple tests given over many years.
Among the half-dozen presidents who
served around the time the test first was given in 1973, Harry S. Truman,
Lyndon B. Johnson, and Gerald R. Ford faded fast from historical memory,
whereas John F. Kennedy has been better retained. That study estimated Truman
will be forgotten by three-fourths of college students by 2040, bringing him
down to the level of presidents such as Zachary Taylor and William McKinley.
It may be that assassination is the reason Kennedy is better remembered, though James Garfield and William McKinley were also assassinated and are not well remembered.
Those
taking the test in 1974 clearly remembered Johnson; by 1991, "the numbers
who recalled him dropped to 53 percent. By 2009, it plummeted to 20 percent. .
. . Their research shows that less than
20 percent of the participants are able to recall more than the past eight or
nine presidents in a row. As of today, that means somewhere around Ford and
Nixon are the end of the memorable line—and Johnson is history." Disheartening, of course, for those of us who
lived through the Vietnam War. I suppose
Korean War vets feel the same way about faded Mr. Truman.
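Out of curiosity, here is how such a projection could be roughed out. This is only a toy sketch in Python: it assumes a simple exponential forgetting curve (my assumption for illustration, not Roediger's actual analysis) fitted to the two Johnson figures quoted above.

```python
import math

# Toy extrapolation from the two Johnson recall figures quoted above:
# about 53% of participants in 1991 and about 20% in 2009. The exponential
# decay model is my simplifying assumption, not Roediger's method.
t0, r0 = 1991, 0.53
t1, r1 = 2009, 0.20

decay_rate = math.log(r0 / r1) / (t1 - t0)   # fitted decay per year (~0.054)

def projected_recall(year):
    """Projected fraction of students able to recall Johnson in a given year."""
    return r0 * math.exp(-decay_rate * (year - t0))

for year in (2020, 2040, 2060):
    print(f"{year}: about {projected_recall(year):.0%}")
```

Even this crude fit puts Johnson's recall in the low single digits by 2040, which is at least consistent with the trajectory Roediger describes.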
What comes
as no surprise is that few people can name all the presidents in order. Many can get the first four or five, and Lincoln
and Johnson and Grant, but "Number 8 Martin Van Buren to number 30 Calvin
Coolidge—you can forget them. And you have, according to their previous
research. There has been a level of poor recall for this presidential period
spanning the four decades of Roediger's research."
I suppose
it's because part of my avocational reading all my life has been American
history, but I have little trouble naming all the presidents in order and their
terms of office. The way I can do it is
to associate names and dates—and, as Roediger claimed, because I can connect
events and dates in an approximate enough way that I can get the presidents and
their terms of office in order.  When I'm teased into reciting them, the taunt that usually follows is either "who were their vice presidents?" or "bet you can't name their wives."  Which, of course, I cannot.  Ah well, the price of boasting, even when
provoked.
* * *
Here is
another field that I had no idea existed:
neuroeconomics. Christian Ruff
and colleagues at the Zurich Center for Neuroeconomics at the University of Zurich did
brain research to look at conflicts between moral values and money (that is, "our
actions are guided by moral values [but] monetary incentives can get in the way
of our good intentions") and where the conflicts are resolved.
As a
general proposition, I'm not sure this kind of research, using electrical stimulation
of parts of the brain, is going to prove to be all that fruitful. From what little I know of brain processes,
there is such decentralization and dispersal of cognitive functions across the
brain that trying to pin down certain behaviors or thoughts to one part seems
unproductive. But I am ready to stand
corrected by any of my biologist friends who read this. Having said that, I'll relay what they found,
which at least has some entertainment value.
When donating money to a charity or
doing volunteer work, we put someone else's needs before our own and forgo our
own material interests in favor of moral values. Studies have described this
behavior as reflecting either a personal predisposition for altruism, an
instrument for personal reputation management, or a mental trade-off of the
pros and cons associated with different actions.
Seems to me that the second one isn't particularly
altruistic. Giving in order to be seen
to be giving.
Anyway, they
start with the assumption that "people have a moral preference for
supporting good causes and not wanting to support harmful or bad causes." Without knowing anything about brain
stimulation, I could have guessed that "depending on the strength of the
monetary incentive, people will at one point switch to selfish behavior."  They found that stimulating the brain region of interest, which made people more deliberative, led the research subjects to focus more on the money and less on the morals.  "If we don't let the brain deliberate on
conflicting moral and monetary values, people are more likely to stick to their
moral convictions and aren't swayed, even by high financial incentives." It is of interest, Ruff says, because one
could suppose that people are driven by money and are only altruistic when they
think about it. But it seems that the
reverse is the case.
So we're
moral as long as we don't think about it too much. I have to wonder about that.
* * *
A huge
study out of Tufts University health sciences pretty firmly puts paid to the illusion that taking vitamin and mineral supplements does anything for your
health. In short, "adequate intake
of certain nutrients from foods -- but not supplements -- is linked to a
reduction in all-cause mortality. There was no association between dietary
supplement use and a lower risk of death."
In addition, taking at least 1,000 mg/day of calcium from supplements was associated with an increased risk of death from cancer.
Here's the
summary:
- Adequate intakes of vitamin K and magnesium were associated with a lower risk of death;
- Adequate intakes of vitamin A, vitamin K, and zinc were associated with a lower risk of death from CVD [cardiovascular disease]; and
- Excess intake of calcium was associated with higher risk of death from cancer.
When sources of nutrient intake (food vs. supplement) were evaluated, the researchers found:
- The lower risk of death associated with adequate nutrient intakes of vitamin K and magnesium was limited to nutrients from foods, not from supplements;
- The lower risk of death from CVD associated with adequate intakes of vitamin A, vitamin K, and zinc was limited to nutrients from foods, not from supplements; and
- Calcium intake from supplement totals of at least 1,000 mg/day was associated with increased risk of death from cancer, but there was no association for calcium intake from foods.
A few years
ago I spoke with my physician about supplements. Several decades ago I had a girlfriend who
said she took a regular multi-vitamin just as insurance. I thought that sounded right, so I began
doing so as well. When research began
appearing quite a few years ago questioning the value of supplements, I mostly
stopped (except, I think, for vitamins C and D).
When I talked to my doctor, his view was that I was welcome to continue
taking supplements if I wanted to have expensive urine. He said that moderate doses of minerals and
vitamins probably don't cause any harm—but there's no evidence they do much
good, either. The Tufts study confirms
his advice.
* * *
In the "this
is sort of scary" category, researchers at universities in the Netherlands
and Germany examined the extent to which humans are concerned about robots and apply
moral principles in dealing with them.
(Relevant, given that robots are on the way to becoming nursing assistants and to providing other help to people.)
The
researchers posed moral dilemmas: when are people prepared to sacrifice robots to save humans, and when would they consider saving a robot over humans?  They presented the robots variously as compassionate and human-like or as clearly just machines.  They found that "the more the robot was
humanized, the less likely participants were to sacrifice it. . . . [when] the robot was depicted as a
compassionate being or as a creature with its own perceptions, experiences and
thoughts, [that was] more likely to deter the study participants from
sacrificing it in the interests of anonymous humans." If the robot had emotions, "many of the
experimental subjects expressed a readiness to sacrifice the injured humans to
spare the robot from harm."
The
researchers conclude, wisely in my opinion, that "attempts to humanize
robots should not go too far. Such efforts could come into conflict with their
intended function -- to be of help to us."
* * *
Matt
Browning, professor of recreation, sport and tourism at the University of
Illinois, co-authored a study with one of his graduate students demonstrating
that Medicare costs are lower in counties with more greenery (i.e., forests, trees,
shrubs, etc.). My first-blush reaction
was "well of course, those are typically wealthier areas with bigger lots
and more trees." But Professor
Browning is a good researcher and knew to control for "age, sex, race,
median household income, health care access and health behaviors," so it
wasn't just more affluent areas. They obtained
data for 3,086 of 3,103 counties in the continental U.S., so quite a data pool.
The study appears in the journal Urban Forestry and Urban Greening, and the authors recognize that the data are correlational.  It's not hard to draw the
conclusion, however, that there might be an indirect relationship at least,
given the volumes of other research being published demonstrating the value of
green space to health. "For
example, studies have shown that people in intensive care units recover more
quickly and have fewer complications after surgery if their hospital rooms look
out over trees rather than parking lots. Other studies have found that forest
walks can influence potentially health-promoting hormone levels or anti-cancer
immune cells in the blood."
They found
that "each 1 percent of a county's land that was covered in forest was
associated with an average Medicare expenditure savings of $4.32 per person per
year," which "amounts to about $6 billion in reduced Medicare spending
every year nationally." If you
include shrublands, the number goes up to $9 billion.
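As a sanity check on how a per-person number scales up to billions, here is a back-of-envelope calculation in Python. Only the $4.32 figure comes from the article; the beneficiary count and the average forest cover are round numbers I am assuming purely for illustration.

```python
# Rough scaling of the per-person savings quoted above to a national total.
# Only the $4.32 figure is from the article; the other two inputs are my
# assumptions, chosen as plausible round numbers for illustration.
savings_per_person_per_pct = 4.32      # dollars per person per year, per 1% county forest cover (from the article)
medicare_beneficiaries = 44_000_000    # assumed number of beneficiaries behind the county data
avg_forest_cover_pct = 31              # assumed average county forest cover, in percent

national_savings = savings_per_person_per_pct * medicare_beneficiaries * avg_forest_cover_pct
print(f"Estimated national savings: ${national_savings / 1e9:.1f} billion per year")
# With these assumed inputs, the total lands near the ~$6 billion figure quoted above.
```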
So we all
need to live near a forest. Or in the
forest. Quite a number of my friends do
live adjacent to wooded areas. I wonder
if those data would hold up if urban forests are taken into account (that is, living in a city, but one where the houses are surrounded by trees and bushes).  One can imagine there would be a middle
ground in effect: less benefit than
living in the forest but more than living in a concrete jungle like parts of Manhattan,
downtown Chicago, and similar central metro areas.
I, of
course, also try to follow the research by having many green plants inside the
house. To Kathy's occasional
dismay. Some rooms do look like little
forests, which is what we're supposed to be living in. That's my story and I'm sticking to it.
* * *
On the
topic of what you have in your house, besides plants, Swedish researchers have
provided additional evidence that exposing kids to dirt and crud (and thus
microbes and bugs) helps ward off allergies and "other autoimmune conditions."  Among the ways to expose children to germs of various sorts is to have pets.  The
Swedes found "that this effect is dose-dependent—that is, the more pets in
a baby's house, the lower the risk that the child will go on to develop
allergies years later."
What they
found was that "the risk of allergies among the children decreased
steadily with the number of pets they'd lived with as infants. Those with four
pets had half the risk of the children who'd had none."
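Just to put a number on "dose-dependent": if the decline is steady in a multiplicative sense (my simplifying assumption) and four pets cut the risk in half, each additional pet corresponds to roughly a 16 percent drop in relative risk. A quick sketch in Python:

```python
# Toy reading of the dose-response claim above: four pets halve the risk,
# and I assume (for illustration) a constant risk factor per additional pet.
pets_to_halve_risk = 4
per_pet_factor = 0.5 ** (1 / pets_to_halve_risk)   # about 0.84 per additional pet

for pets in range(pets_to_halve_risk + 1):
    print(f"{pets} pet(s): relative allergy risk ~{per_pet_factor ** pets:.2f}")
```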
So when the
grandchildren come to visit, make sure you have a menagerie in your home.
That's
enough for one day.
Enjoy the
remainder of the weekend.
--Gary