Monthly Archives: May 2013

Perception of Risk

As discussed a few days ago, Robert is enjoying Daniel Kahneman’s book, Thinking, Fast and Slow.  In particular, it includes a good discussion of the differing views of risk perception and risk management held by Paul Slovic and Cass Sunstein.

Everyone agrees that people (and perhaps especially Americans) overestimate risks from certain categories of causes (e.g., chemical technologies, nuclear technologies, food preservatives) and tolerate high risk from other categories (e.g., skiing, bicycling, swimming). There are lots of folks who study the reasons for this, and the role of the media is a common focus. The question is what to do about it, and that is where the schools of thought diverge.

Anyhow, here’s a link to an old paper by Paul Slovic.  Here’s a link to an important paper by Sunstein and Kuran.

Sunstein, writing with Kuran, proposes that an “availability cascade” causes the havoc. According to Kuran and Sunstein:

“An availability cascade is a self-reinforcing process of collective belief formation by which an expressed perception triggers a chain reaction that gives the perception increasing plausibility through its rising availability in public discourse. The driving mechanism involves a combination of informational and reputational motives: Individuals endorse the perception partly by learning from the apparent beliefs of others and partly by distorting their public responses in the interest of maintaining social acceptance. Availability entrepreneurs -- activists who manipulate the content of public discourse -- strive to trigger availability cascades likely to advance their agendas. Their availability campaigns may yield social benefits, but sometimes they bring harm, which suggests a need for safeguards.”

Also, as a sort of illustration, here’s an excerpt from the Wikipedia entry on the Fukushima nuclear power plant, err, disaster.

Casualties

There were no casualties caused by radiation exposure, while approximately 25,000 people died due to the earthquake and tsunami. Future cancer deaths from accumulated radiation exposures in the population living near Fukushima are predicted to be extremely low to none.[31]

In 2013, two years after the incident, the World Health Organization indicated that the residents of the area who were evacuated were exposed to so little radiation that radiation-induced health impacts are likely to be below detectable levels.[114] The health risks attributable to the Fukushima radiation release in the WHO assessment were calculated largely by applying the conservative linear no-threshold model of radiation exposure, a model that assumes even the smallest amount of radiation exposure will cause a negative health effect.[115]

The WHO calculations using this model determined that the most at-risk group, infants in the most affected area, would experience an absolute increase in the risk of cancer (of all types) during their lifetime of approximately 1% due to the accident. The lifetime risk increase for thyroid cancer due to the accident, for a female infant in the most affected location, is estimated at one half of one percent [0.5%].[116][117] Cancer risks for the unborn child are considered to be similar to those of 1-year-old infants.[118]

The estimated cancer risk for people who were children or adults during the Fukushima accident, in the most affected area, was determined to be lower still than that of the most at-risk group, infants.[119] A thyroid ultrasound screening programme is currently [2013] ongoing in the entire Fukushima prefecture; due to the screening effect, this programme is likely to lead to an increase in the incidence of thyroid disease through early detection of non-symptomatic disease cases.[120] About one third of people [~30%] in industrialized nations are diagnosed with cancer during their lifetimes; radiation exposure can increase one’s cancer risk, and the cancers that arise are indistinguishable from cancers resulting from other causes.[121]

No increase is expected in the incidence of congenital or developmental abnormalities, including cognitive impairment, attributable to radiation exposure in the womb.[122] No radiation-induced inherited (heritable) effects or teratogenic effects have ever been definitively demonstrated in humans; studies on the health of children conceived by cancer survivors who received radiotherapy, and of the children of the Hibakusha, have not found a definitive increase in inherited disease or congenital abnormalities.[123] No increase in these effects is therefore expected in or around the Fukushima power plants.
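To put the quoted numbers in perspective (taking the excerpt’s figures at face value, which Robert will do just for the arithmetic): if roughly 30% of people in industrialized nations are diagnosed with cancer at some point in their lives, then an absolute increase of about 1% means the most exposed infants would face a lifetime risk of roughly 31% instead of roughly 30%. And that is the estimate for the most at-risk group, computed with a deliberately conservative model that assumes no dose is small enough to carry zero risk.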

Thinking, Fast and Slow

Robert is reading Daniel Kahneman’s book, Thinking, Fast and Slow, which was published in 2011. Mr. Kahneman is a fancy pants professor of psychology at Princeton, oh, and he also won the Nobel Prize in Economics.  He studies decision making and stuff like that.

One of the central notions of current judgment and decision-making research is the existence of so-called System 1 and System 2 thinking. System 1 thinking is the automatic, intuitive “thinking” that helps us make decisions. System 2 thinking is the more intellectual, workhorse type of thinking that kicks in when the brain cannot jump to a conclusion through the use of System 1. For example, consider the following multiplication problems:

2 x 4

17 x 24

In answering the first problem, everyone uses System 1 thinking to snap to an answer automatically. But in order to answer the second problem, you must engage System 2 and calculate an answer.
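For what it’s worth, the second one yields to a little deliberate System 2 effort: 17 x 24 = (17 x 20) + (17 x 4) = 340 + 68 = 408.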

Another central notion is that System 2 is very lazy.  We all resist its use, and it will only kick in when System 1 cannot frame (and often reconstruct wholesale) the question in a way that allows it to snap to an answer (often with shockingly little information).  To demonstrate the laziness of System 2, consider the following problem.

A bat and ball cost $1.10.
The bat costs one dollar more than the ball.
How much does the ball cost?

If you answered 10 cents, then you are like many people.  You are also wrong.  The answer is 5 cents. The striking thing is that it is very easy to check one’s answer to this question and confirm whether it is correct (if the ball cost 10 cents, the bat would cost $1.00, which is only 90 cents more than the ball, not one dollar more).  But when asked what is known as the bat-and-ball problem, most people will avoid even that small investment of System 2 thinking.  More than 50% of undergraduate students at Harvard, MIT, and Princeton do not make that check. People who answer 10 cents appear to be ardent followers of the law of least effort.  People who avoid that answer appear to have more active minds. But the thought experiment is not really about measuring intelligence.  It is about demonstrating how lazy a certain part of all of our minds is.
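Spelled out, the check that System 2 resists amounts to this:

Let the ball cost x. Then the bat costs x + $1.00.
x + (x + $1.00) = $1.10
2x = $0.10, so x = $0.05

The ball costs 5 cents, the bat costs $1.05, and the bat is indeed one dollar more than the ball.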

The bat-and-ball problem also leads into another theory written about in Kahneman’s book (and one Robert finds pretty fascinating).  Spinoza wrote about the theory of believing and unbelieving. According to Kahneman, modern psychologists have proposed that understanding a statement must begin with an attempt to believe it: you must first know what the idea would mean if it were true. Only then can you decide whether or not to unbelieve it. The initial attempt to believe is an automatic operation of System 1, which involves the construction of the best possible interpretation of the situation.  Paraphrasing Kahneman, even a nonsensical statement will evoke initial belief. Try the following: “Whitefish eat candy.” Your System 1 engages in an automatic process of associative memory, searching for links between the two ideas (fish and candy) that would make sense of the nonsense.  During this process, it can be said that you believe that whitefish eat candy, at least until System 1 exhausts itself trying to confirm the statement automatically. Only then does System 2 kick in and take over the unbelieving, which leads to the conclusion that the statement is false.

Very cool book full of stuff like this.  The problem is that after about 100 pages, the reader has very little confidence left in the reasoning abilities of mankind.  The price of a good book, Robert supposes.