Yeah, I think that’s another example of a combination of going partway into “why would it do the scary thing?” (3) and “wouldn’t it be good anyway?” (5). (A lot of people wouldn’t consider “AI takes over but keeps humans alive for its own (perhaps scary) reasons” to be a “non-doom” outcome.) Missing positions like this one is, unfortunately, a consequence of trying to sort views into disjoint categories.