A Black Swan is better formulated as:
- Extreme Tail Event : its probability cannot be computed within the current paradigm; its weight is p < ε (see the sketch below).
- Extreme Impact if it happens : Paradigm Revolution.
- Can be rationalised in hindsight, because there were hints. “Most” did not spot the pattern. Some may have.
If spotted a priori, one could call it a Dragon King: https://en.wikipedia.org/wiki/Dragon_king_theory
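A minimal sketch of the first bullet (the Pareto tail index and sample size below are my own assumptions, not anything from Taleb): within a finite sample from a heavy-tailed distribution, the empirical weight beyond the largest observed value is exactly zero, while the true exceedance probability out there is small but decidedly nonzero.

```python
# Toy illustration (assumed Pareto model; not from the original text):
# the "current paradigm" = what a finite sample of history can see.
import numpy as np

rng = np.random.default_rng(0)
alpha = 1.5                                   # assumed Pareto tail index
n = 10_000                                    # size of the observed history
sample = rng.pareto(alpha, size=n) + 1        # classical Pareto(alpha) with x_min = 1

threshold = sample.max()
empirical_tail = np.mean(sample > threshold)  # 0.0 by construction: nothing beyond the max
true_tail = threshold ** (-alpha)             # P(X > threshold) = threshold^(-alpha), roughly 1/n

print(f"largest value ever observed : {threshold:10.1f}")
print(f"empirical P(X > that value) : {empirical_tail:10.1e}")
print(f"true      P(X > that value) : {true_tail:10.1e}")
```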
The Argument:
“Math + Evidence + Rationality + Limits makes it Rational to drop the Long Tail for Decision Making”
is a prime example of a heuristic that falls into what Taleb calls “Blind Faith in Degenerate MetaProbabilities”.
It is likely based on an instance of the “Absence of Evidence is Evidence of Absence” fallacy (Argumentum ad Ignorantiam).
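A quick back-of-the-envelope version of why this is a fallacy in fat-tailed settings (every number here is my own assumption, chosen for illustration): an event with per-period probability 1e-4 is absent from most 1,000-period histories, yet if its impact is large enough it still dominates the expected loss.

```python
# Hedged arithmetic sketch; all numbers are assumptions for illustration.
p_tail      = 1e-4      # per-period probability of the tail event
n_obs       = 1_000     # length of the historical record
loss_tail   = 1e9       # loss if the tail event occurs
loss_normal = 1e3       # typical per-period loss in the "normal" regime

p_never_seen         = (1 - p_tail) ** n_obs        # ~90%: no evidence in the record
expected_loss_tail   = p_tail * loss_tail           # 100,000 per period
expected_loss_normal = (1 - p_tail) * loss_normal   # ~1,000 per period

print(f"P(tail event absent from the whole record): {p_never_seen:.1%}")
print(f"expected per-period loss, normal regime   : {expected_loss_normal:,.0f}")
print(f"expected per-period loss, unseen tail     : {expected_loss_tail:,.0f}")
```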
The central argument of Antifragility is that heuristics which allocate some resources to studying Black Swans / Dragon Kings and to contingency plans are infinitely more rational than “drop the long tail” heuristics.
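To make that comparison concrete, here is a toy simulation under assumed numbers (mine, not Taleb's): heuristic A drops the tail entirely; heuristic B pays a small per-period premium for a contingency plan that caps the tail loss. In this setup B's average cost comes out slightly higher, but its bad histories are far milder, which is precisely the trade the “drop the long tail” heuristic never prices.

```python
# Sketch only; the shock probability, losses, and premium are all assumptions.
import numpy as np

rng = np.random.default_rng(1)
runs, periods = 10_000, 1_000
p_tail, tail_loss = 1e-3, 500.0       # rare shock and its uncapped cost
premium, capped_loss = 0.6, 20.0      # per-period hedge cost and capped cost when hit

shocks = rng.random(size=(runs, periods)) < p_tail   # True where a tail event lands
hits = shocks.sum(axis=1)                            # tail events per simulated history

loss_A = hits * tail_loss                            # A: absorb the full hit every time
loss_B = premium * periods + hits * capped_loss      # B: steady premium, capped hits

for name, loss in (("A (drop the tail) ", loss_A), ("B (hedge the tail)", loss_B)):
    print(f"heuristic {name}: mean {loss.mean():7.1f}, "
          f"99th percentile {np.percentile(loss, 99):7.1f}")
```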