I don’t think this is the case, but I’m mentioning this possibility because I’m surprised I’ve never seen someone suggest it before:
Maybe the reason Sam Altman is making decisions that increase p(doom) is that he’s a pure negative utilitarian (and he doesn’t know about, or believe in, acausal trade).
(I’m gonna interpret these disagree-votes as “I also don’t think this is the case” rather than “I disagree with you tamsin, I think this is the case”.)