My probabilities are very rough, but I’m feeling more like 1⁄3 ish today after thinking about it a bit more. Shrug.
As far as reasons for it being this high:
It seems plausible that conflict reaches this level of lethality (see edit, I think I was a bit unclear or incorrect)
AIs might not care about acausal trade considerations until it's too late (seems unclear)
Future humans/AIs/aliens might decide it isn’t morally important to particularly privilege currently alive humans
Generally, I’m happy to argue for ‘we should be pretty confused and there are a decent number of good reasons why AIs might keep humans alive’. I’m not confident in survival overall though...