I agree that it is kind of insane for an AGI which cares about scope-sensitive resources to treat sims in this way, and thus we should expect it to have a more sensible decision theory.
Introducing the option of creating lots of simulations of your adversary in futures where you win doesn't seem like it would change the result that Bob's share has size O(p). So if O(p) is only enough to preserve humanity for a year instead of a billion years[1], then that's all we get.
This seems right to me; I agree you shouldn't be able to mug AIs any better than other people are able to scam them. (AIs that care about sims in this way might get mugged for all their stuff.)
However, I think O(p) only needs to be quite small for the deal to go through: 1/100 million of expected resources seems like it should suffice to keep humans from being killed?
I'd naively guess that a lot of resources end up controlled by evolved life (say, 50%), that evolved life cares a lot about not getting exterminated, and that evolved life is also often willing to pay a moderate amount either to bail out other aliens or to save themselves in a UDT sense.
Even if you think the fraction controlled by evolved life is much smaller (e.g. 1/1000), I’d guess that it’s pretty cheap to avoid maximal slaughter from the AI?
(I’m not claiming that we should be willing to pay this money, just that people in practice are likely to, including aliens.)
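To make the orders of magnitude concrete, here's a quick back-of-envelope sketch using the guesses above. The 1/100 million ask and the 50% / 1/1000 evolved-life fractions are the numbers from the comment; the willingness-to-pay fraction is my own illustrative assumption, not something anyone has claimed.

```python
# Back-of-envelope check that the bailout "ask" is small relative to what
# evolved life could plausibly pay. All numbers are illustrative guesses.

ask = 1 / 100_000_000  # fraction of the AI's expected resources needed to spare humans (from above)

for evolved_fraction in (0.5, 1 / 1000):  # share of resources controlled by evolved life (from above)
    willingness = 1 / 1000                # fraction of their resources paid toward bailouts (my assumption)
    budget = evolved_fraction * willingness
    print(f"evolved fraction {evolved_fraction:g}: "
          f"budget {budget:.1e} vs ask {ask:.1e} (~{budget / ask:,.0f}x the ask)")
```

Even with the pessimistic 1/1000 fraction, the available budget exceeds the 1/100-million ask by a couple of orders of magnitude, which is the sense in which the deal looks cheap.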