Shimux claims that Eliezer’s emphasis was always on X-risk rather than global catastrophic risks. If that’s true, why was the LW survey tracking global catastrophic risks and not X-risk?
I actually agree with you there: there was always discussion of GCR alongside extinction risks (though I think Eliezer in particular was more focused on extinction risks). However, they’re still distinct categories: even the deadliest of pandemics is unlikely to cause extinction.
Modern civilisation depends a lot on collaboration. I think it’s plausible that extinction happens downstream of the destabilization caused by a deadly pandemic, especially as the tech level grows.
That doesn’t ring true to me, and I’m curious why you think it. Admittedly I’m irrationally short-termist: my brain dryly notes that “100% is actually much worse than 90%”, but I still feel like a 90%-deadly event is totally worth worrying about a lot!