Without even going into specific risks, you should beware the conjunction fallacy (or, more accurately, its flip side, the disjunction fallacy) when assigning such a high probability. A lack of detail tends to depress probability estimates for an event that could occur through many different causes: if you aren't visualizing a concrete scenario, it's tempting to conclude there's no way for it to happen.
You're effectively asserting not only that all of the proposed risks to humanity's survival are this minuscule in aggregate, but also that you're better than 99.9% confident that nothing else will be invented or discovered that presents a plausible existential threat. How do you arrive at that level of confidence?
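To make the aggregation point concrete, here's a minimal sketch with purely hypothetical, illustrative probabilities (none of these numbers are claims about actual risks), showing how several individually small, roughly independent risks combine into a disjunctive total that can easily exceed a 0.1% ceiling:

```python
# Illustrative only: hypothetical per-risk probabilities, assumed independent.
from math import prod

risks = {
    "risk_a": 0.0005,            # hypothetical
    "risk_b": 0.0003,            # hypothetical
    "risk_c": 0.0004,            # hypothetical
    "unknown_unknowns": 0.0005,  # hypothetical allowance for undiscovered threats
}

# P(at least one occurs) = 1 - product over i of (1 - p_i), under independence.
p_any = 1 - prod(1 - p for p in risks.values())
print(f"Aggregate risk: {p_any:.4%}")  # ~0.17%, already above a 0.1% bound
```

The point is only that a disjunction of small probabilities grows roughly linearly with the number of terms, so a 99.9% confidence claim implicitly caps the entire sum, known and unknown risks alike.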