I think the survey results probably look a lot like this almost regardless of which world we are in?
Connor is at something like 90% doom, iirc, and explicitly founded Conjecture to do alignment work in a world with very short timelines. If we grant that organizations (probably) attract people with doom-levels and timelines similar to the organization's leader, maybe with some regression to the mean, then this is kinda what we'd expect regardless of what the world is actually like. I'd advise against updating on it, on the general grounds that updating on filtered evidence is a bad idea.
(On the other hand, if someone showed a survey from, like, Facebook AI employees, and it had numbers like these, that would be much, much stronger evidence.)