One reason might be that AGI is really not that concerning, and the EA/rationality community has developed a mistaken model of the world, one that assigns a much higher probability of doom from AGI than it should, while the smart people outside the community simply do not share those beliefs.
Generally speaking, though, those people haven't thought about these risks in much detail, so the fact that they don't hold "the MIRI position" is not as much evidence as you'd think.