Good morning from continuingly cold Minneapolis.
Why
are some people "liberal" and some "conservative"? Much research has been devoted to that question, and there is a multitude of studies offering answers of varying degrees of believability. One
hypothesis is that liberals are more open to ambiguity; another is that family
background has a significant effect:
liberals have liberal kids, conservatives have conservative kids.
I am taken with research out of the
University of Illinois and New York University by two psychologists, Larisa
Hussak and Andrei Cimpian. Tom Jacobs
reported on their work in Pacific
Standard. The gist of their findings
is that if you think external factors (luck, social forces, socioeconomic status at birth) determine or strongly affect the person/country/situation, you're more likely to be liberal. If you think it is internal
characteristics—hard work, ability, and so on—then you're more likely to be
conservative. Think about the poor: "Are they unworthy, or just
unlucky?" When I read about their
findings, my gut response was "that seems right."
Cimpian and Hussak hypothesize that
there's a bias in humans about explaining things. Social phenomena are extremely complicated,
so people use explanatory shortcuts, and there's a bias toward attributing status/circumstances to inherent characteristics rather than
social forces. That bias, in turn,
"leads to a tendency to (1) view socioeconomic stratification as
acceptable and (2) prefer current societal arrangements to alternative ones,
two hallmarks of conservative ideology."
They found these attitudes in children and that they affected later
political views.
Jacobs, in Pacific Standard, observes that previous research suggests that
inherent
explanations come to our minds more easily than extrinsic ones. Considering the
many external factors that play a role in an individual's success or failure
requires considerable cognitive effort.
In contrast, "those people are simply like that" is a simple
idea to process—a way to make a reasonable-seeming snap judgment and move
on. If your tendency is to simply go
with that initial explanation, you will find yourself in sync with conservative
values, including the idea that society is basically fair, and people get what
they deserve.
On
the other hand, if you believe people's circumstances and actions are shaped by
society, family, and luck—external factors—you're less likely to condone
economic inequality and will be more likely to support social programs that aid
the poor, for example.
I
doubt the researchers would maintain that this attribution of causes is the
sole explanation for political differences.
It makes sense to me, however, as one significant factor. Anyone who takes coursework in psychology,
sociology, and related fields—and pays attention—will have a hard time relying
heavily on internal factors as the primary explanation for where people are in
life. Perhaps that is one reason why the
less-well-educated tend to be more conservative and less willing to support
social programs (even when those programs support them). Of course, attitudes beget attitudes in
families, so attributing poverty to laziness passes from one generation to the
next.
As
Jacobs observes, the research suggests an approach "for parents who wish
their children to grow up with open minds. . . . It's easy and natural to blame or celebrate
people for their status in life. The
fact that larger, society-level forces play a major role has to be taught." Yeah, including by going to college, but K-12
education should at the very least make those forces known to students.
I've
tried to remember what kinds of explanations my parents proffered. I think they used both. Whatever they said, they didn't freeze my
beliefs (and likely didn't intend to), because the more courses in the social
and behavioral sciences that I took, the less likely I became to attribute much
of an individual's status and circumstances to his or her own effort and
worth. My guess is that parents who
embrace inherent characteristics as the cause of status and situation and who
are raising children with the same view are not the people who are reading
journal research ("Investigating the origins of political views: Biases in explanation predict conservative
attitudes in children and adults." Developmental
Science. 2017;e12567. https://doi.org/10.1111/desc.12567).
* * *
I've had a
few amusing responses from friends that made me chuckle (or laugh out loud).
-- "I’m glad to
finally have ‘permission’ to quit what needs quitting. I feel better now. And looking back, I feel even better! A good thing."
-- "I totally
concur with the Corinthians verse. Having been raised Catholic, I heard that damn
thing all the time and for sure in every Church wedding I ever attended. It is ridiculous and makes me think: this is the best they can come up with and
that is why it is the only one ever read?"
-- "As a child
during and shortly after World War II, spam was often the only meat available,
so we ate it fairly often. It was in
fact my favorite meat when I was 5 or 6 years old, and I chose it for a few
years as my special birthday dinner. My
mother fried it mostly, but also baked it topped with canned pineapple. Those were the days! :)"
* * *
Shifting from solid to liquid intake: there is research demonstrating that "beer can lift your spirits." Scientists at a German university looked at 13,000 (!) food components to identify those that stimulate the appropriate part of the brain to "make people feel good." They could examine 13,000 because they used computer simulations, not laboratory tests.
Some foods make us happy. Well,
maybe not happy but they make us feel good. That is why we cannot stop eating
when we have had enough. Scientists call this hedonic hunger -- the drive to
eat for pleasure rather than to satisfy an actual biological need.
The reason you feel good from these foods is dopamine, whose release certain foods can stimulate. One of the substances that stimulates dopamine production (if I understand this process correctly) is something called hordenine, which beer and malted barley contain. So science proves
beer makes you happy. Obviously those
who primarily drink wine should consider changing their consumption habits if
they want to have a rosier outlook on life.
(https://www.sciencedaily.com/releases/2017/09/170927152838.htm)
* * *
Seeing the movie "Dunkirk"
last year, and then "Darkest Hour" this year, and Kathy watching
"The Crown," reminded me of an article I read about the actual
authority of the British monarch. I was
startled at how much residual power the Queen has, even though she almost never
exercises it. She doesn't need a
passport or a driver's license, but that's not exactly "power." She also pays taxes voluntarily, although she
doesn't have to.
One of the reasons the British
monarchy survived after WWI, according to one historian, is that the monarchs kept their mouths shut. As we've seen, when they do say things political, it can be controversial and reduce popular support for the monarchy.
Something I didn't know, although it doesn't surprise me.
For
almost two decades now the monarchy has regularly had polls run and focus
groups put together to keep track of how the general public feels about them
and their various actions. They also have individuals on the payroll whose job it is to ensure the Queen stays in the public eye in a way that is most likely to endear her to her subjects. As with politicians who rely on the voting public, each public choice she presents, right down to whether or not she carries a cell phone, is carefully calculated in terms of the impact it might have.
The powers the Queen retains include the following:
-- she can declare war without the approval of Parliament
-- she is immune from prosecution for any crime; combined with diplomatic immunity, she could commit a crime anywhere on Earth and not be prosecuted
-- she is exempt from requests for information
-- she could have anyone arrested and take their property (all are "subjects of the monarch," not citizens of the UK)
-- she "owns all of the sea beds around the UK and can commandeer any ship found in British waters 'for service to the realm'"
-- she could "administer any manner of punishment to an individual who offended or otherwise displeased her as the crown has 'prerogative power to keep the peace within the realm'"
-- she could dissolve Parliament and appoint a prime minister of her choosing no matter the outcome of elections, and keep on calling elections until she gets a Parliament she likes
-- she is commander in chief of all British armed forces; every member swears allegiance to the crown
-- she must approve laws passed by Parliament before they can go into effect
-- she must approve discussion of any bill that "affects the interest of the monarchy" (a power she has used)
As
anyone who follows European and British politics knows, the monarch does not
rule with the dictatorial authority she theoretically possesses. I would guess that if she ever did try to
flex her muscles in some significant way, that would be the end of the
monarchy. There is a potential benefit to the country in leaving the residual power alone, however. These powers still exist for a variety of reasons, including the possibility that they might be needed in a time of extreme crisis, when an individual ruling unilaterally for the good of her people could be of benefit. That is one of the few scenarios in which her subjects might not mind her flexing her political muscles a bit without consulting Parliament, depending on the circumstances.
The Queen's authority was in fact once exercised to dismiss a parliament, though it was the Australian parliament (in 1975, acting through her representative, the Governor-General).
Some of us, at this point, would
welcome the intervention of a crown.
* * *
This must rank as one of the oldest—if not *the*
oldest—public health studies in the western world. In 1842, a guy named Edwin Chadwick did a
study of life expectancy, controlling for geographic area and occupational type;
the study covered five urban and rural areas of England. Three researchers (from the Universities of
Liverpool, Oxford, and Glasgow) replicated the study 175 years later. Here is the table of data from the 1842 Chadwick
study:
Table 1. Average age of death for occupation group by location (after Chadwick, 1842).

Location      Professional Trades   Tradesmen   Labourers
Rutland               52                41          38
Leeds                 44                27          19
Liverpool             35                22          15
Manchester            38                20          17
Bolton                34                23          18
Rutland was a rural area; the others were urban. My goodness, people did not live long in that
era in England; the working conditions for laborers in all the urban areas must
have been tough. That may be an instance
where life as hunter-gatherers, before industrial development, would have been
longer and more pleasant. The data are
also provocative: those in the
"lowest" occupations in rural Rutland were living longer than those
in the "highest" occupations in urban areas (except for Leeds). However, as one would expect, those in the
lower occupational ranks still had higher premature mortality rates than those
in higher-status occupations, irrespective of geographic area. What Chadwick demonstrated was the effect of geography, and the conditions associated with it, on public health.
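For anyone who wants to poke at the numbers, here is a toy Python sketch of my own (not from the study or the article) that re-checks the crossover claim, using the figures transcribed from Table 1 above.

```python
# A toy sketch: did Rutland's labourers outlive the urban professional class?
# The dictionary below simply transcribes Table 1 (after Chadwick, 1842).

chadwick_1842 = {
    # location: (professional trades, tradesmen, labourers) -- average age of death
    "Rutland":    (52, 41, 38),
    "Leeds":      (44, 27, 19),
    "Liverpool":  (35, 22, 15),
    "Manchester": (38, 20, 17),
    "Bolton":     (34, 23, 18),
}

rutland_labourers = chadwick_1842["Rutland"][2]

for place, (professionals, _tradesmen, _labourers) in chadwick_1842.items():
    if place == "Rutland":
        continue
    if rutland_labourers > professionals:
        verdict = "outlived"
    elif rutland_labourers == professionals:
        verdict = "matched"
    else:
        verdict = "fell short of"
    print(f"Rutland labourers ({rutland_labourers}) {verdict} "
          f"{place} professionals ({professionals})")

# Prints: Rutland's labourers outlived the professionals of Liverpool (35) and
# Bolton (34), matched Manchester (38), and fell short only of Leeds (44) --
# consistent with the "except for Leeds" caveat above.
```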
During Chadwick's era, geographic context mattered partly because the cities he studied were, at the extreme, ‘cesspits’: rife with outbreaks of infectious disease due to insanitary conditions (cholera, for example, flourished for lack of clean water and safe disposal of human waste, killing thousands at a time), high levels of pollutants from unregulated industry, and overcrowded slum housing that facilitated the spread of disease.
In contrast, Rutland was an ‘idyllic’ rural settlement set aside from
the problems and squalor of Victorian cities.
Much of the field of Health Geography today owes its direction of
development and initial impetus to this single piece of evidence.
The modern researchers found that the relationship
between geography and health remained.
While the disparities had shrunk, because public health generally has improved dramatically since 1842, there remained differences between the urban and rural groups and between occupational levels.
Although there is no longer consistent evidence of individuals in the lowest occupational group in one area having lower mortality rates than those in the highest group in another, there were clear social gradients in mortality within each area, and the extent of these inequalities varied between areas.
As one of the study authors put
it, "it is remarkable that after 175 years, mortality rates in Liverpool
are still higher than in Rutland within each occupational group. What this demonstrates is that living in
certain locations offers very different life chances and health outcomes for
people within the same occupational groups."
To exaggerate, it seems that geography remains
destiny. I'm not sure what to make of
the data, but in the U.S. the situation appears to be reversed. Here's a small excerpt from a news release
from the Centers for Disease Control in January of 2017:
Some
46 million Americans — 15 percent of the U.S. population — currently live in
rural areas. Several demographic,
environmental, economic, and social factors might put rural residents at higher
risk of death from these public health conditions. Residents of rural areas in the United States
tend to be older and sicker than their urban counterparts. They have higher rates of cigarette smoking,
high blood pressure, and obesity. Rural
residents report less leisure-time physical activity and lower seatbelt use
than their urban counterparts. They also
have higher rates of poverty, less access to healthcare, and are less likely to
have health insurance.
From Medical Daily:
[From] 2005
to 2009, people who resided in large metropolitan areas had a life expectancy
of 79.1 years in comparison to [small towns at] 76.9 and [rural areas at] 76.7
years . . . during the same period. Back
in 1969 to 1971, city life expectancy was less than half a year longer than
nonmetropolitan areas. This gap
increased to two years between 2005 and 2009.
The idyllic life in the countryside, at least in the
U.S., is also a shorter one. I wonder to
what extent that would change if we had a national health insurance program so
all those people in outlying areas were covered. There would remain the problem of getting to
a provider—the simple fact is that there are much longer distances between
residents and health care providers in rural areas.
* * *
You may not be aware that the term that appears to be gaining acceptance for such words is semordnilap. A palindrome is a word or phrase that reads the same backward and forward; perhaps the most famous is "a man, a plan, a canal, Panama." The simplest ones are mom, noon, etc. But what of words that, read backwards, are a different word (a semordnilap, that is)? One example is evil (live); another is dog (god).
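To make the distinction concrete, here is a tiny Python sketch of my own; the word list is just a toy example, not a real dictionary.

```python
# Classify words as palindromes (the same word when reversed) or semordnilaps
# (the reverse is a *different* valid word). The vocabulary is a toy example.

words = {"mom", "noon", "evil", "live", "dog", "god", "stressed", "desserts", "cat"}

def classify(word, vocabulary):
    reversed_word = word[::-1]
    if reversed_word == word:
        return "palindrome"
    if reversed_word in vocabulary:
        return f"semordnilap (reverses to '{reversed_word}')"
    return "neither"

for w in sorted(words):
    print(f"{w:>9}: {classify(w, words)}")
```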
The
little article that's the source of information about semordnilaps tells me
that "one of the earliest direct references to the concept of semordnilaps
(though not the name) in English can be found in Lewis Carroll’s 1893 novel
Sylvie and Bruno Concluded," where Sylvie puts letters out spelling evil
and asks Bruno what they spell; he thinks for a bit and says "live,
backwards."
As
for the term itself, it probably originated in 1961 in an annotation by Martin
Gardner to the book Oddities and
Curiosities of Words and Literature.
"The word is self-referencing in that it demonstrates the concept
for which it describes—semordnilap is palindromes spelled backwards."
There
are well-known modern examples of semordnilaps.
The
mirror Harry stumbles upon in Harry Potter and the Sorcerer’s Stone, which
shows a person's innermost desire, is called the Mirror of Erised. Erised is
desire spelled backward. In addition, the engraving on the mirror’s frame reads
"Erised stra ehru oyt ube cafru oyt on wohsi." If you read this
sentence backward and shift the spaces, it reads "I show not your face but
your heart’s desire."
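Out of curiosity, a quick check of my own that the inscription really is the hidden sentence written backward, ignoring where the spaces and the apostrophe fall:

```python
# Verify that reversing the Mirror of Erised inscription, ignoring spaces,
# yields the hidden sentence (apostrophe omitted for the comparison).

inscription = "Erised stra ehru oyt ube cafru oyt on wohsi"
hidden = "I show not your face but your hearts desire"

reversed_letters = inscription[::-1].replace(" ", "").lower()
hidden_letters = hidden.replace(" ", "").lower()

print(reversed_letters)                    # ishownotyourfacebutyourheartsdesire
print(reversed_letters == hidden_letters)  # True
```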
Yensid
is the name of the sorcerer in Fantasia, which is "Disney" spelled
backward.
Harpo
is Oprah spelled backward, and is the name of her production company.
There's your factoid for the day.
* * *
A friend of mine disputed the sentiment in the quote I
had from one of P. D. James's novels about grieving. ("The tragedy of loss is not that we
grieve, but that we cease to grieve, and then perhaps the dead are dead at
last.") She introduced me to a poem
by Rabbis Sylvan Kamens and Jack Riemer, We
Remember Them, written in 1970 (I think), to be used at shiva or other
times of mourning the dead. It has been
adopted by the Unitarians and others for use in services and memorials.
I have concluded I agree with my friend: that the grieving stops (or diminishes) does not mean the remembering stops. The poem is one way to capture the remembrances, and it much better reflects my sentiments.
At the rising of the sun and at
its going down
We remember them.
At the blowing of the wind and
in the chill of winter
We remember them.
At the opening of the buds and
in the rebirth of spring
We remember them.
At the blueness of the skies
and in the warmth of summer
We remember them.
At the rustling of the leaves
and in the beauty of autumn
We remember them.
At the beginning of the year
and when it ends
We remember them.
As long as we live, they too
will live;
for they are now a part of us
as we remember them.
When we are weary and in need
of strength
We remember them.
When we are lost and sick at
heart
We remember them.
When we have joy we crave to
share
We remember them.
When we have decisions that are
difficult to make
We remember them.
When we have achievements that
are based on theirs
We remember them.
As long as we live, they too
will live;
for they are now a part of us
as we remember them.
-- Gary