Is this an official position in the first place? It seems to me that they want to give the impression that—without their efforts—the END IS NIGH—without committing to any particular probability estimate—which would then become the target of critics.
Halloween update: It’s been a while now, and I think the response has been poor. I think this means there is no such document (which explains Ben’s attempted reconstruction). It isn’t clear to me that producing such a document is a “high-priority task”—since it isn’t clear that the thesis is actually correct—or that the SIAI folks actually believe it.
Most of the participants here seem to be falling back on: even if it is unlikely, it could happen, and it would be devastating, so therefore we should care a lot—which seems to be a less unreasonable and more defensible position.
You lost me at that sharp swerve in the middle. Without probabilities attached to the scary idea, it is a meaningless concept. If its probability were 1 / 3^^^3, should we still care then? I could think of a trillion scary things that could happen; without realistic estimates of how likely each one is, what does it matter?