Good afternoon!
Elliott
and I had a brief conversation last September while I was driving him to a
cat-sitting job, and then I picked up the same topic with Kathy when I returned
home. It started because I happened to
have back pain at the time, something that recurs every 4-5 years and makes it
painful to bend over and to stand up from a sitting position. I get over it; it's just a bother for a few
days and then it goes away. No big
deal. It did, however, elicit from me
the observation that one of the only good things about getting older is that
you're not dead.
Elliott related that he'd heard a
podcast with some neuroscientist expert who proclaimed that in the next century
we'll either discover how to keep people alive forever—or we'll learn that it's
impossible to do so. Elliott recalled a
comment by Neil deGrasse Tyson, who'd said that if people lived forever,
nothing would ever get done, because you could always postpone things until
tomorrow or next year. (I find that true
of retirement!) Elliott also pointed out
that there wouldn't be any novelty to travel, for example, because you'd have
time to visit every place on the planet.
I ventured the suggestion that
people would be bored after 10,000 years.
Kathy wondered, however, if boredom might not drive more creativity;
boredom is aversive and people might start doing innovative things to keep it
at bay. I can imagine getting bored, but
the Protestant work ethic many Minnesotans (and others) inherited from our
families and culture would drive me to do something (at least for a couple of
thousand years). If one assumes the
markets work more or less as they do now, everyone's 401(k) could be in good
shape after working for 50, 60, or 70 years, so theoretically you could retire
forever around age 100. (Kathy also
joked, in light of us replacing our roof at the time this chat took place, that
there'd be a lot of roof replacements in one's lifetime.)
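(For the curious, the arithmetic behind that quip is just compound growth, which a few lines of Python can illustrate; the $10,000 annual contribution and 5% real return below are numbers I invented purely for the sketch.)

def balance_after(years, annual_contribution=10_000, real_return=0.05):
    # Future value of level annual contributions at a constant return.
    total = 0.0
    for _ in range(years):
        total = (total + annual_contribution) * (1 + real_return)
    return total

for years in (30, 50, 70):
    print(f"{years} years: ${balance_after(years):,.0f}")

# Roughly $0.7 million after 30 years, $2.2 million after 50, and
# $6.2 million after 70; the extra decades do most of the work.

In other words, even an immortal saver would hit "enough" well before the 70-year mark.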
In one of those coincidences that
occur frequently as I compose, the daily "Now I Know" message, a day
after our conversation, focused on boredom and creativity. In the Creativity
Research Journal in 2014, two researchers at the University of Central Lancashire
in Britain reported on experimental work they'd done to test whether boredom
leads to creativity ("Does Being Bored Make Us More Creative?").
First, Sandi Mann and Rebekah Cadman
described boredom (they explained that, for research purposes, there's no
commonly accepted definition of the word).
Contrary
to popular wisdom, boredom is not the result of having nothing to do. It is very hard to come up with a situation where
a person's options are so limited that he or she literally can do nothing. Rather, boredom stems from a situation where
none of the possible things that a person can realistically do appeal to the
person in question. This renders the
person inactive and generally unhappy.
Thus, boredom is the result of having nothing to do that one likes,
rather than nothing to do per se. For
most people, boredom is a negative experience.
What is agreed about
boredom is that it is a “complex phenomenon” . . . a “state in which the level
of stimulation is perceived as unsatisfactorily low.” Lack of external
stimulation leads to increased neural arousal in search of variety—failure to
satisfy this leads to the experience of boredom (citations omitted).
People in general use different
words but largely agree on what it is: “feeling stressed, agitated, yet at the
same time lethargic.” One might chuckle
at studying boredom, but there are troublesome outcomes when people are bored;
it affects work performance and academic performance in a multitude of ways,
none of which are positive.
The two researchers, in two
different studies, gave participants one of two tasks, one boring and one not, and
then charged them with activities that required creative approaches. In both studies, the participants who'd
engaged in the boring tasks were markedly more creative in the later
activity. The authors are careful to
note the considerable limits on what one can conclude from their experiments, but the results
are at least suggestive. So Kathy was
likely correct!
Obviously there would need to be
population controls if everyone lived 10,000 years or more—unless we go
off-planet and start settling elsewhere in the galaxy. Otherwise, presumably the only childbirth
allowed would be replacements for people who were killed in accidents, floods,
etc. It takes little imagination to
picture the responses to such sharp restrictions on having children.
(There was a research article
published in October 2016, arguing that 135 years was the maximum human
lifespan. Several papers followed during
2017 rebutting the claim. So it remains
an open question whether it will be possible to extend our average length of
life.)
* * *
The Pew Research Center tells me
that Kathy and I were part of a small minority.
According to their data, only 1% of Americans used a dating website or
app in 2005, but 15% did so in 2015.
If the percentage increased linearly, presumably about 6-7% did so when
Kathy and I did in 2009. But we were
part of an identifiable minority, it seems.
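(For what it's worth, the linear assumption is easy to check with a few lines of Python; the endpoints are Pew's figures, while the straight-line growth is only an assumption.)

def interpolate(year, y0=2005, p0=1.0, y1=2015, p1=15.0):
    # Straight-line estimate between Pew's two data points.
    return p0 + (p1 - p0) * (year - y0) / (y1 - y0)

print(interpolate(2009))  # 6.6 -- i.e., 6-7% under the linear assumption

Adoption curves are rarely straight lines, of course, so the true 2009 figure may well have been lower.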
The other Pew findings were more
intriguing, I thought. Not surprisingly,
the young—more than older people—use the internet to find love. More surprisingly, the "liberal elite"
is also much more likely to use it than are lower-income households: "Americans from households with an
annual income of more than $75,000 are nearly twice as likely to know an online
suitor as those earning less than $30,000.
A similar disparity exists between those who identify as 'very liberal'
and 'very conservative.' The ratio
between college graduates and high-school dropouts is almost four to
one." Our household income is above
$75,000, we probably fall in the
"very liberal" category, and we both have college degrees; Q.E.D., I
suppose it's less of a surprise that we used match (dot) com.
What
I also learned is that there are some snooty websites for pairing up. One matches up people based on income,
education, and profession (and forget it if you don't have a college degree);
another "will only 'draft' you once it has scanned your Facebook and
LinkedIn profiles and deemed you suitably high-achieving: 100,000 people are on the waiting
list." Yet another requires
demonstration that you earn $200,000 per year or more.
The
growth of these specialized high-end sites makes me wonder if Kathy and I would
have met on match (dot) com had we both been looking 10 years later. Would either or both of us have gone to one of the
more selective sites? I suppose if we
both had, we still could have met. I
don't know what I would have done.
Probably used more than one.
* * *
(I wrote the following paragraphs
last summer. I modified them slightly
after Krystin's death. Since it hadn't
yet happened, that event wasn't on my mind when I first began exploring the concept
of empathy.)
Most of the people I know were
likely raised to be (or try to be) empathetic with people who are in troubled
circumstances (implicitly, I think, in most cases, troubles that were not of
their own making; few would be empathetic with a felonious criminal being put
in jail). There has recently been some
fascinating research on empathy (the piece I saw came out of work at the State
University of New York at Buffalo by two psychologists, Michael Poulin and Anneke
Buffone) suggesting that "the idiom that suggests 'walking a mile in their
shoes' turns out to be problematic advice."
The gist of the findings is that
there are two different ways to be empathetic, one of which they designate as
"imagine-other
perspective-taking (IOPT)" and
the other as "imagine-self
perspective-taking (ISPT)" (my
emphasis, just to keep track of which is which). The first "observes and infers how
someone feels. . . . The other way to
empathize is for helpers to put themselves into someone else's situation, the
imagined 'walking a mile' scenario."
In other words, one either infers how someone else in that situation
might feel (IOPT) or takes the person's situation on oneself (ISPT). It turns out that it's more upsetting and
distressing for people who practice the "imagine self" approach than
for those who rely on the "imagine other" perspective. Putting yourself in someone else's shoes can
be bad for your own health.
Before I could understand this,
however, I had to try to figure out how the
"other" approach differs from the concept of "sympathy." Professor Poulin agreed that there are
several closely-related concepts, including empathy, sympathy, and compassion,
but said they decided to set aside definitional questions "in favor of
simply manipulating perspective taking, which can be more precisely
defined" (and which was the approach used in previous research).
So what do you do with a definition
question? You go to the dictionary, of
course. I lean heavily on dictionary
(dot) com, which provided me the following definition and synonyms (excerpts)
from the Random House 2017 dictionary:
3. the fact or power of sharing the feelings of
another, especially in sorrow or trouble; fellow feeling, compassion, or
commiseration.
7. Psychology. a relationship between persons in which the condition of one
induces a parallel or reciprocal condition in another.
Sympathy,
compassion, pity, empathy all denote the tendency, practice, or capacity to
share in the feelings of others, especially their distress, sorrow, or
unfulfilled desires. Sympathy is the
broadest of these terms, signifying a general kinship with another's feelings,
no matter of what kind: in sympathy with
her yearning for peace and freedom; to extend sympathy to the bereaved. Compassion implies a deep sympathy for
the sorrows or troubles of another coupled to a powerful urge to alleviate the
pain or distress or to remove its source: to show compassion for homeless
refugees. . . . Empathy most often refers to a vicarious participation in the
emotions, ideas, or opinions of others, the ability to imagine oneself in the
condition or predicament of another:
empathy with those striving to improve their lives; to feel empathy with
Hamlet as one watches the play.
Professors Poulin and Buffone relied
on physiological measures (cardiovascular measures of anxiety/stress) rather
than self-reports to gauge the difference between ISPT and IOPT. Poulin commented that
you
can think about another person's feelings without taking those feelings upon
yourself (IOPT). . . . But I begin to
feel sad once I go down the mental pathway of putting myself into the place of
someone who is feeling sad (ISPT). I
think sometimes we all avoid engaging in empathy for others who are suffering
partially because taking on someone else's burdens (ISPT) could be unpleasant.
On the other hand, it seems a much better way to proceed is if it's possible to
show empathy simply by acknowledging another person's feelings without it being
aversive (IOPT).
And so? It takes little thought to realize how
important the distinction is for caregivers:
think, just for a start, of doctors and nurses; you can add in social
workers, teachers, ministers/priests/rabbis, and probably a raft of other professions. There are significant implications for
curricula in several fields of study and even, the article suggested,
child-rearing. I imagine that many in
helping professions just figure out on their own how to be empathetic without
taking on the burden of those being helped, but it surely would be an advance
to provide them guidance in doing so before they encounter the challenge. There's also the possibility that being
unable to do so contributes to burnout in a number of fields.
Parents
might even consider the study's finding when thinking about how they speak
to their children in certain circumstances. "Rather than saying to a
child, 'How would you feel if that were done to you?' maybe we should be
saying, 'Think about how that person is feeling.'"
I told Professor Poulin that it
struck me that the difficulties with ISPT arise--at least in his research--when
someone is actively trying to help another with whom they are being
empathetic. What about the case when
someone is empathetic but not doing anything about it? I might be empathetic with someone but have
no ability or inclination to DO anything about my empathy. I wonder if that would reduce the negative
effects of ISPT, simply because one realizes they won't be involved. He wrote back to me that his suspicion
"(which is supported by other research) is that when we engage in ISPT but
are not involved in helping, there is no incentive to keep engaging in ISPT, so
we simply shift our mental focus to something else. That is, once it feels aversive to consider
our fate in someone else's shoes, we stop doing so if we can." That makes perfect sense. I bet it's easier to disengage when the
empathy is directed to a casual acquaintance, or someone one doesn't even know,
than it is when the person is a child, parent, sibling, or very close friend.
There are limitations to their
study, ones the authors themselves point out.
They only used one situation of need; ISPT/IOPT may have different
effects depending on the situation (e.g., a close relative versus a stranger);
people may be more inclined to help if they perceive they have the emotional
resources to do so (or more than the person who needs help); theirs was a
short-term task and there may be differences with long-term care, for
example. There may also be differential
effects on the person receiving the help; either IOPT or ISPT may be
better. Another question is whether
there are long-term effects from taking different perspectives (e.g., with
caregivers). It may also be that ISPT is an
immature form of empathy, a perspective a child might take, while IOPT is the
more mature form.
One big question is whether anyone
can decide which approach to take. Are
some people wired to imagine themselves in the position while others can take
the more detached approach? If you can't
change your perspective, the research has uncovered phenomena that are
interesting but, like hydrogen bonding to oxygen to form water, about which
not much can be done.
Earlier research has shown that
while individuals may be motivated to help others who are troubled in some way,
they can also avoid the victim or get away from the setting. Or they may decide the person doesn't need
help in order to stay uninvolved. It's
also been suggested that "threat negatively impacts performance, . . .
suggesting that ISPT may predict lower help quality, in addition to greater
likelihood to disengage from a helping task."
Anyway, I thought this was interesting
stuff. On a personal level, the
physician who spent time talking and emailing with me after Krystin's death was
explicit in saying he could not empathize with me—because he could not put
himself in my shoes. (He related that he
has a 27-year-old daughter and couldn't imagine his reaction if she were to
die.) He extended profound sympathy but
said it was impossible for him to empathize; he (without knowing of this
research, I'm sure) rejected even the proposition that he could offer the
"imagine-other
perspective-taking" approach to
empathy. I actually thought about this
research when I was talking to him; I understood his position.
* * *
A friend and I were emailing about
our children's lack of interest in our possessions. As I have written before, Elliott looks at a
lot of my stuff and says "eBay."
Some of the items I have, however, have no significant resale or market
value; they are simply things handed down from my parents or grandparents or
great-grandparents, things that have only sentimental value. (I once
investigated what my King James Version of the Bible, dated 1878 and with my
great-grandmother's hand-written notes of birth, marriage, and death dates,
might be worth. There appear to be plenty of those Bibles
floating around; it was worth almost nothing.
I wasn't going to sell it under any circumstances, but I wondered if it
had any value. It's true that sterling
silver flatware has cash value, depending on the price of silver, and there's a
brass and glass miniature grandfather clock that's probably worth a couple
thousand dollars, but other than that, there isn't much. I doubt my great-grandmother's wicker sewing
basket is going to fetch a high bid on eBay.)
My friend, who has two adult
children, observed that one of them is beginning—barely—to take an interest in
the stuff his parents have. That kid
also now has a child. We both wonder if
having a child/children changes your perspective on keeping items that are part
of family history. We'll have to see how
it goes with Elliott.
* * *
Everyone who reads my missives knows
about the Dunning-Kruger Effect. Perhaps
not by that name, but you all know it from life experience (per
Wikipedia):
a
cognitive bias wherein people of low ability suffer from illusory superiority,
mistakenly assessing their cognitive ability as greater than it is. The cognitive bias of illusory superiority
derives from the metacognitive inability of low-ability persons to recognize
their own ineptitude; without the self-awareness of metacognition, low-ability
people cannot objectively evaluate their actual competence or incompetence.
("Metacognition"
is "cognition about cognition, or more informally, thinking about
thinking. . . . Metacognition also
involves thinking about one's own thinking process such as study skills, memory
capabilities, and the ability to monitor learning.")
The title of the journal article
that Professors Dunning and Kruger published in 1999 told the story: "Unskilled and unaware of it: How difficulties in recognizing one's own
incompetence lead to inflated self-assessments."
People
who performed worst at certain tasks, such as judging humor, grammar,
and logic, significantly overestimated how good they were at these tasks. . .
. The study also found that people who
performed slightly above average at identifying how funny a given joke was
tended to be the most accurate at assessing how good they were at the assigned
tasks, and that those who performed the best tended to think they performed
only slightly above average.
The point is a simple one: if you haven't the ability or training to
think about your own thinking, you have a high risk of over-estimating your own
abilities. It is always rewarding for
social scientists to have their hypotheses vindicated. I am sure that Professors Dunning and Kruger
(both are still on the faculty of their respective schools, Michigan and NYU)
see proof of the Dunning-Kruger Effect:
the current occupant of the White House demonstrates its validity almost
daily.
I was amused to see that after I
drafted these paragraphs, the Atlantic had
an article making the same point I have about the Dunning-Kruger
Effect.
* * *
"Science Friday" had a
piece on a topic that I've thought about as well. Titled "Scientists warn we may be
creating a 'digital dark age,'" it addressed concerns that all our digital
information won't survive. The material
posted on Facebook or blogs, your emails, etc., won't necessarily be there in
10 or 20 years. "Unlike in previous
decades, no physical record exists these days for much of the digital material
we own. Your old CDs, for example, will not last more than a couple of decades.
. . . 'We may [one day] know less about
the early 21st century than we do about the early 20th century,' says Rick
West, who manages data at Google."
Material from a century ago is in paper or film; a large part of what is
being created in the 21st century is digital to begin with—with no
corresponding "hard copy." And
thus eventually perhaps unrecoverable.
The experts in the field apparently
call this potential era of lost information the "digital dark
ages." Some organizations are
trying to prevent the losses.
Surprisingly,
many of the world’s largest companies and data-based enterprises still rely on
an old storage medium: magnetic tape. . . .
The medium has come a long way, says Lauren Young, Science Friday's web
producer. . . . A single cartridge of
today’s magnetic tape can hold hundreds of terabytes of data, the equivalent of
hundreds of millions of books, Young says. "This past summer, IBM
increased the amount a cartridge can hold to 330 terabytes, which is 330,000
gigabytes per cartridge. Big companies like
Google and particle physics labs like Fermilab all have massive libraries of
tape with thousands and thousands of cartridges."
In other words,
"in many cases, magnetic tape is the backup to the backup."
None of the current solutions are
invulnerable, however. There is
something called "bit rot," which is the decay of information stored
electronically (no matter the format).
There is also "software rot," which occurs when "old
files, games and other data becomes [sic] unusable because no format exists to
read and reproduce the information."
It appears that magnetic means of storing information have a life of
10-14 years—and things like DVDs and CD-ROMs last an even shorter time.
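(The standard home remedy for bit rot, as I understand it, is to store a checksum alongside each file and recompute it periodically; any mismatch means some bits have silently decayed. A minimal sketch in Python follows; the file name in it is hypothetical.)

import hashlib

def sha256_of(path, chunk_size=1 << 20):
    # Compute the SHA-256 digest of a file, reading it in one-megabyte chunks.
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Record the digest when the file is first archived...
original = sha256_of("missive-2017.docx")

# ...then recompute it years later; a difference means the bits have rotted.
if sha256_of("missive-2017.docx") != original:
    print("bit rot detected; restore this file from another copy")

A checksum only detects the rot, of course; surviving it still requires keeping a second copy somewhere else.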
There is a new technology on the
horizon: storing data on (synthetic)
DNA, a "technology" that has lasted for millions of years. I don't pretend to understand how synthetic
DNA stores information, so I'll take their word for it. But its storage capacity is huge: it "is measured in petabytes; that is,
millions of gigabytes. Science Magazine writes: 'A single gram of DNA could, in principle,
store every bit of datum ever recorded by humans in a container about the size
and weight of a couple of pickup trucks.'"
I
am a pessimist about this to the extent that:
(1) I have made paper copies of all of these missives and my earlier
annual letters; (2) I have saved all of my materials related to Krystin as MS
Word documents and will print them out in the near future; (3) I make print
copies of the photos I am sure I want saved (for Elliott, and in Kathy's case,
for Spence); and (4) I generally print out and file any
important emails or other documents I have stored on my hard drive or in the
cloud. I have no confidence that the
electronic versions will survive like paper.
(Of course, I'm not printing them on acid-free paper, so even my copies
will deteriorate over time—but these are not papers that are significant to the
history of humanity, just to me and, perhaps, Elliott and any kids he might
have.) No, I do not print out all the
emails that I exchange with all of you!
With
warm greetings—
Gary