My take is that Eliezer is saying that we should be aware of the significant probability that AGI catches us unawares, and also that people tend not to think enough about their claims. He’s not saying “be certain that it will be soon,” but rather “any claim that it will almost certainly take centuries is suspect unless it can be backed up with specific, lower-level difficulty claims, expressed as estimated times for particular milestones to be reached.” I’m not sure whether this conflicts with your reading of the post, though.