Historical examples of flinching away
I’m looking for historical examples of “flinching away,” so I can illustrate the concept to others and talk about motivated cognition and leaving a line of retreat and so on.
The ideal example would be one of motivated skepticism with grave consequences. Like, a military commander who shied away from believing certain reports because they implied something huge and scary was about to happen, and then the huge and scary thing happened and caused great damage. Something like that.
What examples can you think of?
World War II seems to have some possible examples. Many Jews who didn’t think about escaping until it was too late, because it was too horrible to believe. Neville Chamberlain thinking he could appease the war away. Possibly Stalin’s reaction to Hitler’s invasion:
Hitler’s delusions of a German victory toward the end of the war. The Japanese holdouts.
These are the first things I thought of, anyway, although it now occurs to me that comparing your audience to Hitler would have some rhetorical drawbacks.
Upvoted for comedy.
I remember someone claimed that many Jews put off making plans to leave Germany because they had pianos, and pianos are hard to move. Basically the piano created an Ugh Field around the idea of moving.
I wonder if piano prices and costs of owning vs renting are therefore related...
“someone claimed Jews X” is pretty much the standard of evidence the Germans used (I know you don’t mean it this way of course)
And if Hitler did it it must be bad! (On the subject of ‘standard of evidence’...)
Well aware of the Hitler fallacy, it’s quite common among (us) Jews. “Someone said Y X” is still a shitty standard of evidence to begin with. Considering that people are not emotionally neutral to Jews in general, hearsay is even worse. In this case the undercurrent of meaning is quite possibly “greedy Jews would rather die than part with their pianos”. I suspect that Daniel is too refined of a person to catch that; it’s still not epistemically hygienic.
Most people just say “Y X”. Explicitly saying “Someone said Y X” is relatively good epistemic hygiene, because it communicates something about the evidence for the claim, not just the claim itself.
Agreed, it’s better than “Y X”, but still relatively worthless, especially when someone is looking for quotable examples for an article. Repeating stories like this is net negative since it adds social proof to things of very low probability. This is what I meant by hygiene.
I think we have to be careful to avoid hindsight bias when thinking of examples of this. For example, it is quite possible that Jews who chose not to leave Nazi Germany before Kristallnacht were in fact acting perfectly rationally. Certainly Weinberg makes a reasonable case in this essay that they were acting rationally (and yes, I realise that he has plenty of reasons to try to justify himself, but that doesn’t change the fact that it’s possible he made a sensible decision).
I don’t disagree at all. Also, the millions of Jews in areas subject to Nazi control had an enormous variety of differing constraints and circumstances.
No matter what, in order to find an unambiguous example of “motivated skepticism with grave consequences” from history (rather than in the context of an academic experiment), Luke is going to have to do his homework. First, the rational course of action has to be demonstrably certain. As you correctly point out, hindsight bias is a real problem. Second, Luke has to show that the actor not only behaved irrationally, but specifically suffered from “motivated skepticism,” rather than some other form of irrationality.
You would need a control group of other people at the time who, lacking the bias in question, made the ‘correct’ decision, in order to show that the bias was present.
I seem to recall that Japanese soldiers were especially trained to fight to the bitter end because failure against the Americans was the worst thing imaginable, etc. Can anyone point me to decent historical documentation of this, if this is indeed what happened?
Many of them committed suicide rather than be taken prisoner or surrender. There are many articles and such on the war in the Pacific theatre, and almost all of them mention that, although the ones with veterans’ interviews would be most helpful.
I’ve read—and this may be pure anecdote—that false claims about how the American forces were treating prisoners of war may have contributed to that.
I guess the value system can be more important here. They had an order, and they did what they were ordered to do.
Look at http://history1900s.about.com/od/worldwarii/a/soldiersurr_2.htm—the soldier surrendered immediately after receiving the order to do so read by his commander, but ignored all evidence of Japanese defeat before that.
The US bombing escalation in Vietnam.
Prior to the escalation in bombing in the Vietnam War, the Americans wargamed potential North Vietnamese responses in the Sigma I and II wargames. Regional experts were able to almost exactly predict the Vietnamese response, and working-level officers from the State and Defense departments, and the CIA, predicted the actual outcome. William Bundy, the guy running the games, thought the conclusion was “too harsh,” and the wargames never influenced actual policymakers. (see H R McMaster’s Dereliction of Duty)
(On a related note, something similar occurred with the Millennium Challenge 2002: the Red team used unexpected tactics to pull off unexpected early victories against the simulated US forces, so the general running the war game ‘refloated’ the sunk ships, then forced both sides to use prescripted plans of action, ignoring the unexpected initial events.)
The State Department’s Policy Planning Council published a separate study in 1964 which essentially also concluded that bombing wouldn’t work. Walt Rostow, its chairman, disagreed with its conclusions, so he worked to suppress them; the study did eventually influence policymakers, but only after the war had escalated, and even then its conclusions had to be bootlegged out of the council. (see David Halberstam’s The Best and the Brightest)
I think I’m detecting a trend.
It’s a little more complicated than that, or so I read. After the unexpected happened, the various generals—including the one in charge of the Red team—decided that the rest of the exercise would lose a lot of its value if they continued from the point they had arrived at (with much of the “U.S.” forces unable to participate in the simulated landing) and collectively decided to hit the reset button to see what would happen in that part of the exercise.
Well, yeah, that was General Pace’s justification. But the Red team was then forced to use a pre-set strategy for the rest of the exercise, which was restrictive enough that its commander, General Paul Van Riper, outright resigned midway through. He later said that “We were directed… to move air defences so that the army and marine units could successfully land. We were simply directed to turn [air defence systems] off or move them… So it was scripted to be whatever the control group wanted it to be.” He also later explicitly compared General Pace’s thinking to that of the Defense Department under McNamara, which is why I brought it up.
How the hell did Bundy get away with it?
The fall of Dan Rather.
His rise to fame was in the days when a few broadcast TV channels and the local newspaper were The News to almost everyone. By 2004, he was an established and powerful elder statesman-type at CBS. To make a long story short, his people were given a set of documents that seemed to substantiate what would have been a scandal regarding George W. Bush’s time in the Air National Guard. Rather went with the story. Almost immediately, a lot of people—especially bloggers—noticed that the documents not only looked like fakes, but really amateurish, silly fakes.
At this point, Rather could have cut his losses. He could have gone on the air and said, “Folks, I’m really sorry. We were lied to, and we retract the whole story. My staff and I would never deliberately try to deceive you, our viewers, but we just didn’t check this story out properly ahead of time. Ultimately, I’m the boss, so it’s my responsibility. I deeply regret the error, and I personally promise that this will never happen again.” This would have been extremely embarrassing, but Rather had a lot of stature, and if he handled it well, he might even have gained some respect.
But no, he decided to double down.
Would you consider the actions of Stanislav Petrov an example of flinching away? It seems like there might be historical examples where flinching had overwhelmingly beneficial consequences, if I understand flinching away right.
No, because he was updating on the evidence. If the satellites had detected a full launch of thousands of US missiles and Petrov had delayed launching, that would have been flinching away.
Wikipedia says:
Apparently there was a bit of reasoning that went into ignoring what seemed to be a strike, so it wasn’t flinching away dramatically. I suppose the point is that while flinching away is bad, you also can be overly impulsive, and history gives us examples of both. This, of course, does not say we should not continue to advocate against flinching away.
Also note that if the US strike were a real five-missile strike, retaliation would still be possible. Even half of the USSR stockpile was enough to level all the major military bases and industrial cities. If the USSR “retaliated” to something that was not a real first strike… Let’s just say that Petrov chose the cheaper risk.
There’s a natural cluster of “flinching away from the evidence because it makes you uncomfortable”, and that cluster does not include what Petrov did.
The subprime mortgage crisis of 2007-2008 seems like an obvious example, as does every other bubble. Warnings get ignored because “this time it’s different”.
To be fair, there is always someone warning of a bubble. As the famous quote goes, economists have predicted 9 out of the last 5 recessions. The problem is picking out who to listen to. (On the other hand, I don’t have detailed knowledge about whether there are more warnings from more reputable sources than usual before actual bubbles.)
This post by Noah Millman is a good discussion of the role “flinching away” played in causing the Eurozone crisis, the Iraq War blunder, and the housing bubble. The examples might be too politically controversial for pedagogical use, but it is a thought-provoking article worth reading. Excerpt:
“If a superintelligent AI is going to take over the world eventually, then what on Earth are we going to do about it?”
I recently read a book on Richard Sorge, a German Communist who pretended to be a Nazi and spied on the Japanese for Moscow. He had acquired a large amount of information indicating that Germany was preparing for an invasion of the Soviet Union, and was attempting to secure Japanese support for this invasion. His reports were viewed by Stalin and his cabinet, but Stalin refused to believe that his good buddy Hitler would betray him (I believe he referred to Sorge as “a little shit”). This was a bad decision.
It’s even worse than that: Barton Whaley’s definitive history of pre-war Soviet intelligence, Codeword “Barbarossa”, identifies no fewer than 84 separate warnings Stalin had.
Beware the hindsight bias: how many similar warnings did he have of things that didn’t happen?
A few months ago, EU leaders insisted that it was unthinkable that Greece would ever leave the euro.
Yeah, but such insistences serve a useful purpose.
It could, but I bet for all historically significant flinching away examples you could find someone who thought they could benefit from the flinching.
Also the purpose is basically “if we pretend it can’t happen it’s less likely to happen.” But surely this is the motivation behind most flinching examples.
I disagree. The unusual nature of banking and finance is explicitly recognized and dealt with: you can’t throw a rock in the fractional-reserve banking literature without hitting someone talking about self-fulfilling prophecies and the usefulness of central banks having printing presses enabling them to make credible commitments and so on and so forth. This is not the case in most fields and so definitely not the motivation behind most flinching examples.
(eg. Stalin ignoring Hitler’s build-up is not an example of optimism being a self-fulfilling prophecy but possibly entirely the opposite, a self-defeating prophecy—the lack of Soviet reaction encouraging the German plans.)
The sentence you quote and the sentence after “I disagree” support rather than contradict each other. (Not after the edit.)
Stalin could have felt that planning for a Nazi attack (which included talking about it because of the possibility of Nazi spies) would increase the odds of a Nazi attack.
Plus, I’ve read that Stalin had received lots of reports of invasions that proved to be false when the Nazis didn’t invade when the reports claimed they would, and Stalin did have good reason to think that the U.K. was trying to plant false evidence of a Nazi invasion. Furthermore, Stalin might have reasonably concluded that it would be strongly against the Nazis’ self-interest to invade Russia. Finally, I find it hard to believe that the extremely paranoid Stalin really didn’t consider the possibility of a Nazi invasion. Might Stalin’s critics be suffering from hindsight bias?
Also, it’s not like Stalin trying to do something would help the matter. Marshal Timoshenko was struggling to increase the Red Army’s readiness regardless of Hitler’s plans; every month Stalin spent in delusion was a month well spent by Timoshenko… Forcing events could easily have made the disaster even worse.
I’ve edited to be clearer.
Yes. But until we have closely looked at it, I am content to take the subject-area experts at their word when they say it was a blunder by Stalin, much like I accept their word about other mistakes by Hitler and Churchill and in general.
Check out the comments on Ugh Fields; a lot of LWers shared their own experiences with ‘flinching away.’
--STALINGRAD, Antony Beevor
General Custer attacked an Indian village on the Little Bighorn River despite reports from his scouts that the village was larger than any they had heard of.
Nitpick: Custer was a Lt. Colonel at Little Bighorn.
Possibly the Type 93:
There were many opportunities to figure out that the Japanese torpedoes had much, much longer range than the Allied ones, but the inference that torpedo hits were all from hidden submarines rather than superior technology may have been reasonable. Apparently, it took until after a torpedo was captured to figure it out.
--STALINGRAD, Antony Beevor
Irrationality by Sutherland had some good military examples, particularly about Pearl Harbor and some WWII European operations; IIRC the culprit there was Gen. Montgomery.
I recommend looking for examples to the “nodding donkey”/“lackey” Wilhelm Keitel and the “lapdog” Ernst Busch—both thought of rather poorly by their colleagues in the Wehrmacht, as evidenced by their nicknames. One good quote I can’t find was a series of remarks by Busch while he was presiding over the destruction of Army Group Centre, the most severe German defeat of WWII (“destruction” is the word ubiquitously associated with the debacle; there are over 300,000 hits for “destruction of army group centre/center” combined—which is still far fewer than the number of soldiers the Germans lost in the series of battles).
In many cases where Hitler gave orders and those orders led to military disaster, there were debates among officers between those arguing that following the orders would lead to success and those arguing it would lead to failure, and these are cited in German memoirs and other sources. Towards the end of the war, there was ever less plausible uncertainty through which one could believe the orders sound, even in the inherently uncertain environment of war. Busch was mostly trying to rationalize aloud what he was told, according to the recollection of the witnessing officer... I don’t remember where I read this. But anyone who has read about Operation Bagration might want to jog their memory.