Events that kill 90% of the human population can easily be extinction-level events, and in the 2014 LW survey more LessWrongers believed pandemics would do that than believed AI would.
I don’t disagree that it was discussed on LW… I’m just pointing out that there was little interest from the founder himself.
Killing 90% of the human population would not be enough to cause extinction: that would leave a population of roughly 800 million, higher than the world population in 1700.
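For concreteness, the rough arithmetic behind that claim (assuming a pre-event population of about 8 billion, and the standard estimate of roughly 600 million people alive in 1700):

$$0.10 \times 8\,\text{billion} = 800\,\text{million} > 600\,\text{million} \approx \text{world population in 1700}$$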
Shimux claims that Eliezer's emphasis was always on X-risk and not on global catastrophic risks. If that's true, why was the LW survey tracking global catastrophic risks and not X-risk?
I actually agree with you there: there was always discussion of GCR alongside extinction risks (though I think Eliezer in particular was more focused on extinction risks). However, they're still distinct categories: even the deadliest of pandemics is unlikely to cause extinction.
Modern civilisation depends a lot on collaboration. I think it's plausible that extinction happens downstream of the destabilization caused by a deadly pandemic, especially as the tech level grows.
That doesn’t ring true to me, and I’m curious why you think that, even though I’m irrationally short-termist: “100% is actually much worse than 90%,” my brain says dryly, but I feel like a 90%-deadly event is still totally worth worrying about a lot!