Arguing about the most likely outcome is missing the point: when the stakes are as high as the survival of the human race, even a 1% probability of an adverse outcome is very worrisome. So my question to Robin Hanson is this: are you 99% certain that the FOOM scenario is wrong?
I disagree with this line of argument. It’s true that moderately lower-probability scenarios deserve extra attention if they’re higher-stakes, but if the full range of scenarios in which early AGI systems realize large capability gains totaled only 1% probability, then they would deserve little attention. In practice, there are simply too many other urgent medium- and high-probability scenarios competing for our attention for us to have the resources to focus on every 1%-probability futurist scenario.
If there are more than a few independent short-term extinction scenarios (from any cause), each with a probability higher than 1%, then we are in trouble: their combined probability adds up to a significant chance of doom.
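To make that addition concrete, here is a minimal sketch of the arithmetic (my own illustrative numbers, not figures from the debate): with n independent risks of probability p each, the chance that at least one of them occurs is 1 - (1 - p)^n.

```python
# Probability that at least one of n independent risks occurs.
# Illustrative arithmetic only; p = 1% and the values of n are assumptions.

def combined_risk(p: float, n: int) -> float:
    """Chance that at least one of n independent risks, each of probability p, occurs."""
    return 1 - (1 - p) ** n

for n in (3, 5, 10):
    print(f"{n} independent 1% risks -> {combined_risk(0.01, n):.1%} combined")
# 3 independent 1% risks -> 3.0% combined
# 5 independent 1% risks -> 4.9% combined
# 10 independent 1% risks -> 9.6% combined
```

Even ten such scenarios already push the combined risk to nearly 10%.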
As far as resources go, even if we threw 100 times the current budget of MIRI at the problem, that would be $175 million (see the quick calculation after the list below), which is
- 0.005% of the U.S. federal budget,
- 54 cents per person living in the U.S., or
- 2 cents per human being.
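These per-budget and per-capita figures are straightforward division; the sketch below reproduces them using round budget and population numbers that are my own approximations rather than values from the text, together with the roughly $1.75 million MIRI budget implied by the 100x multiplier above.

```python
# Back-of-the-envelope check of the figures above. The budget and population
# constants are my own approximate assumptions, not values from the original text.

HYPOTHETICAL_SPEND = 175e6   # $175 million (100x an implied ~$1.75M MIRI budget)
US_FEDERAL_BUDGET = 3.5e12   # ~$3.5 trillion per year (assumed round figure)
US_POPULATION = 325e6        # ~325 million people (assumed round figure)
WORLD_POPULATION = 7.5e9     # ~7.5 billion people (assumed round figure)

print(f"Share of U.S. federal budget: {HYPOTHETICAL_SPEND / US_FEDERAL_BUDGET:.3%}")  # 0.005%
print(f"Per U.S. resident:            ${HYPOTHETICAL_SPEND / US_POPULATION:.2f}")     # $0.54
print(f"Per human being:              ${HYPOTHETICAL_SPEND / WORLD_POPULATION:.3f}")  # $0.023
```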