Yeah, ‘AGI takes control of virtually all resources but leaves many humans alive for years’ seems like it clearly violates one or more parts of the EY-model (and the Rob-model, which looks a lot like my model of the EY-model).
An edge case that I wouldn’t assume violates the EY-model is ‘AGI kills all humans but then runs lots of human-ish simulations in order to test some hypotheses, e.g., about how hypothetical aliens it runs into might behave’. I’m not particularly expecting this because it strikes me as conjunctive and unnecessary, but it doesn’t fly in the face of anything I believe.
Wrong. He is being quite clear about what he means.