I think that the more we explore this analogy & take it seriously as a way to predict AGI, the more confident we’ll get that the classic misalignment risk story is basically correct.
The analogy doesn’t seem relevant to AGI risk so I don’t update much on it. Even if doom happens in this story, it seems like it’s for pretty different reasons than in the classic misalignment risk story.
I don’t agree with this: “The analogy doesn’t seem relevant to AGI risk so I don’t update much on it. Even if doom happens in this story, it seems like it’s for pretty different reasons than in the classic misalignment risk story.”
Right, so you don’t take the analogy seriously. But the quoted claim was meant to be conditional, basically “IF you took the analogy seriously...”
Feel free not to respond; I feel like the thread of conversation has been lost somehow.