This is only true if the tails are long to the right and not to the left, which seems true to Ben. Most projects end up pulling no useful levers whatsoever and accomplish nothing, but a few pull crucial levers and solve open problems or increase capacity for coordination.
From Ben:

For what it’s worth, I disagree with this; I think we have lots of examples of small groups of concerned, passionate people changing the world for the worse (generally through unforeseen consequences, but sometimes through consequences that were foreseeable at the time), and lots of sleeping dragons that should not be awoken.
[Existential risk is sort of an example of a ‘heavy tail to the left,’ but this requires a bit of abuse of notation.]
From Richard:
Then, of course, there are the massive declines in poverty across Asia in particular. It’s difficult to assign credit for this, since it’s so tied up with globalisation, but to the extent that any small group was responsible, it was Asian governments and the policies of Deng Xiaoping, Lee Kuan Yew, Rajiv Gandhi, etc.
Tying in with the last point, I don’t think it’s the case that those specific people made good policy so much as unmade bad policy, and communism seems to me like an example of a left tail policy.
I generally agree with the idea of there being long tails to the left. Revolutions are a classic example, as is, more generally, any small group of ideologically polarised people taking extreme actions. Environmentalist groups blocking genetically engineered crops might be one example; global warming skepticism another; perhaps also OpenAI.
I’m not sure about the “sleeping dragons”, though, since I can’t think of many cases where small groups created technologies that counterfactually wouldn’t have happened (or even would have happened in safer ways).
For technology this is possible; here we get into arguments about replaceability and inventions that are “after their time” (that is, that could feasibly have been built much earlier, but no one thought of them). Most such examples that I’m aware of involve particular disasters, where no one had really cared to solve problem X until problem X manifested in a way that hurt some inventor.
For policy / direct action, I think this is clearer; plausibly WWI wouldn’t have happened (or would have taken a different form) if the Black Hand hadn’t existed. There must have been many declarations of adversarial intent that turned out quite poorly for the speaker, since doing so put them on some enemy’s radar before they were ready.
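To make the disagreement about tail shape concrete, here is a toy simulation of my own (all the numbers are made up for illustration, and nothing here comes from the conversation above): one distribution where project outcomes have only a heavy right tail, and one where a comparable left tail is added.

```python
# Toy sketch only: made-up numbers illustrating "long tail to the right"
# versus "long tails on both sides" for project outcomes.
import random

random.seed(0)

def right_tail_only():
    """Most projects pull no levers (~zero impact); ~1% are huge wins."""
    if random.random() < 0.01:
        return random.expovariate(1 / 100)   # rare large positive outcome
    return random.gauss(0, 0.1)              # everyone else: roughly nothing

def both_tails():
    """As above, but another ~1% wake a 'sleeping dragon': huge losses."""
    r = random.random()
    if r < 0.01:
        return random.expovariate(1 / 100)   # rare large win
    if r < 0.02:
        return -random.expovariate(1 / 100)  # rare large loss
    return random.gauss(0, 0.1)

n = 100_000
right = [right_tail_only() for _ in range(n)]
both = [both_tails() for _ in range(n)]

for name, xs in [("right tail only", right), ("both tails", both)]:
    print(f"{name:>16}: mean {sum(xs)/len(xs):+7.3f}, "
          f"min {min(xs):+8.1f}, max {max(xs):+8.1f}")
```

Under the right-tail-only picture, encouraging more attempts is nearly free upside, since the worst case is roughly zero; once a left tail of comparable weight is added, the extremes cut both ways and the expected value of a marginal project can wash out or go negative.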