I suspect it’s related to the fact that we’ve gotten ourselves off-distribution from the emergencies that used to be common, and thus AI and the Singularity get interpreted as immediate emergencies when they aren’t.

I’ll also note that LW focuses on the tails, so discussion there tends to be more extreme than usual.