Convert numbers and rates into equivalent traits or dispositions: Convert “85% of the taxis in the city are green” to “85% of previous accidents involved drivers of green cabs”. (Recent Kahneman interview)
Requisition social thinking: Convert “85%” to “85 out of 100”, or “Which cards must you turn over?” to “Which people must you check further?” (Wason test).
How to debias framing effects: Have people been trained to automatically think of “mortality rates” as “survival rates,” and the like? A good dojo game would be to practice thinking in terms of the opposite framing as quickly as possible, until it became pre-conscious and one was consciously aware of both what one heard and its opposite at the same time.
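A minimal sketch of what such a drill might look like in Python. The mortality/survival pairing and the percent-to-count conversion are just the examples from the suggestions above; this is an illustration, not a tested debiasing tool:

```python
def opposite_framing(rate_pct: float, frame: str = "mortality") -> str:
    """Restate a rate in its complementary frame, and as counts out of 100.

    The drill: hear "10% mortality rate", immediately produce
    "90% survival rate" plus the natural-frequency version "10 out of 100".
    """
    other = "survival" if frame == "mortality" else "mortality"
    flipped = 100 - rate_pct
    return (f"{rate_pct:g}% {frame} rate ({rate_pct:g} out of 100) "
            f"= {flipped:g}% {other} rate ({flipped:g} out of 100)")


print(opposite_framing(10, "mortality"))
# 10% mortality rate (10 out of 100) = 90% survival rate (90 out of 100)
```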
Fresh off the presses at Yale’s American Political Science Review from August: http://bullock.research.yale.edu/papers/elite/elite.pdf

An enduring concern about democracies is that citizens conform too readily to the policy views of
elites in their own parties, even to the point of ignoring other information about the policies in
question. This article presents two experiments that undermine this concern, at least under one
important condition. People rarely possess even a modicum of information about policies; but when they
do, their attitudes seem to be affected at least as much by that information as by cues from party elites.
The experiments also measure the extent to which people think about policy. Contrary to many accounts,
they suggest that party cues do not inhibit such thinking. This is not cause for unbridled optimism about
citizens’ ability to make good decisions, but it is reason to be more sanguine about their ability to use
information about policy when they have it.
(Emphasis mine.)
If one knew the extent to which one was biased by cues, and one knew one’s current opinion (formed from both cues and facts), it would be possible to back out what one’s views would be without the cues.
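A back-of-the-envelope version of that calculation, assuming a simple linear mixing model; the model and the 0–100 support scale are my assumptions, not anything from the paper:

```python
def views_without_cues(observed: float, cue_view: float, cue_weight: float) -> float:
    """Back out a fact-only opinion from an observed opinion.

    Assumed model: observed = cue_weight * cue_view + (1 - cue_weight) * fact_view,
    with all opinions on the same scale (say, 0-100 policy support) and
    0 <= cue_weight < 1 measuring how much of the opinion is cue-driven.
    """
    if not 0.0 <= cue_weight < 1.0:
        raise ValueError("cue_weight must be in [0, 1)")
    return (observed - cue_weight * cue_view) / (1.0 - cue_weight)


# Example: reported support of 70/100, party elites signal 90/100, and cues are
# estimated to account for half of the opinion -> fact-only view of 50/100.
print(views_without_cues(observed=70, cue_view=90, cue_weight=0.5))  # 50.0
```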
Now have a look at a very small variation that changes everything. There are two companies in the city; they’re equally large. Eighty-five percent of cab accidents involve blue cabs. Now this is not ignored. Not at all ignored. It’s combined almost accurately with a base rate. You have the witness who says the opposite. What’s the difference between those two cases? The difference is that when you read this one, you immediately reach the conclusion that the drivers of the blue cabs are insane, they’re reckless drivers. That is true for every driver. It’s a stereotype that you have formed instantly, but it’s a stereotype about individuals, it is no longer a statement about the ensemble. It is a statement about individual blue drivers. We operate on that completely differently from the way that we operate on merely statistical information that that cab is drawn from that ensemble.
...
A health survey was conducted in a sample of adult males in British Columbia of all ages and occupations. “Please give your best estimate of the following values: What percentage of the men surveyed have had one or more heart attacks? The average is 18 percent. What percentage of men surveyed both are over 55 years old, and have had one or more heart attacks? And the average is 30 percent.” A large majority says that the second is more probable than the first.
Here is an alternative version of that which we proposed, a health survey, same story. It was conducted in a sample of 100 adult males, so you have a number. “How many of the 100 participants have had one or more heart attacks, and how many of the 100 participants both are over 55 years old and have had one or more heart attacks?” This is radically easier. From a large majority of people making mistakes, you get to a minority of people making mistakes. Percentages are terrible; the number of people out of 100 is easy.
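The conversion Kahneman describes is mechanical enough to script. Here is a small sketch that restates the two survey questions as counts out of 100 people, which makes the subset relation (and the ceiling it puts on the conjunction) hard to miss; the 18% and 30% figures are the average estimates quoted above:

```python
def as_counts(pct_heart_attack: float, pct_over55_and_heart_attack: float,
              sample_size: int = 100) -> None:
    """Restate two percentage estimates as counts of people, and flag a
    conjunction that exceeds its parent category."""
    heart_attack = round(sample_size * pct_heart_attack / 100)
    both = round(sample_size * pct_over55_and_heart_attack / 100)
    print(f"Of {sample_size} men, {heart_attack} have had a heart attack.")
    print(f"Of {sample_size} men, {both} are over 55 AND have had a heart attack.")
    if both > heart_attack:
        print("Impossible: the second group is a subset of the first, "
              f"so it cannot exceed {heart_attack}.")


as_counts(18, 30)  # the average estimates from the interview; the count
                   # framing makes the conjunction error obvious
```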
Regarding framing effects, one could write a computer program into which one could plug in numbers and have a decision converted into an Allais paradox.
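A minimal sketch of what such a program might look like. It reuses the structure of Allais’ original numbers (the 0.89 common consequence and the 0.01 residual risk are hard-coded from that example); the sure amount and the big prize are the plug-in numbers, and handling arbitrary decisions would need more work:

```python
# A lottery is a list of (probability, payoff) pairs summing to 1.
Lottery = list[tuple[float, float]]


def allais_pair(sure_amount: float, big_prize: float,
                common_prob: float = 0.89) -> tuple[tuple[Lottery, Lottery],
                                                    tuple[Lottery, Lottery]]:
    """Build the two Allais-style choice pairs for a decision between a sure
    amount and a risky shot at a bigger prize.

    Pair 1 keeps a common `common_prob` chance of the sure amount in both
    options; pair 2 replaces that common consequence with nothing. By the
    independence axiom a consistent agent picks the same side in both pairs,
    so choosing 1A and 2B (the usual pattern) exposes the framing effect.
    """
    residual = round(1.0 - common_prob, 6)   # e.g. 0.11
    risky_win = round(residual - 0.01, 6)    # e.g. 0.10 chance at the big prize

    pair1 = ([(1.0, sure_amount)],                                               # 1A
             [(common_prob, sure_amount), (risky_win, big_prize), (0.01, 0.0)])  # 1B
    pair2 = ([(residual, sure_amount), (common_prob, 0.0)],                      # 2A
             [(risky_win, big_prize), (round(common_prob + 0.01, 6), 0.0)])      # 2B
    return pair1, pair2


def expected_value(lottery: Lottery) -> float:
    return sum(p * x for p, x in lottery)


(a1, b1), (a2, b2) = allais_pair(sure_amount=1_000_000, big_prize=5_000_000)
for label, lot in [("1A", a1), ("1B", b1), ("2A", a2), ("2B", b2)]:
    print(label, lot, "EV =", expected_value(lot))
```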
Thanks! I knew some of that stuff, but not all. But for the table of thinking errors and debiasing techniques I need the references, too.
http://edge.org/conversation/the-marvels-and-flaws-of-intuitive-thinking
One could commit to donating an amount of money to charity any time a free thing is acquired. (Ariely’s Lindt/Hershey’s experiment.)