I don’t think you should commit to doing this scheme; I think you should just commit to thinking carefully about this argument post-singularity and doing the scheme if you think it still seems good. Acausal trade is potentially really scary and I don’t think you want to make unnecessarily strong commitments.
The reason I wanted to commit is something like this: currently, I’m afraid of the AI killing everyone I know and love, so it seems like an obviously good deal to trade away a small fraction of the Universe to prevent that. However, if we successfully get through the Singularity, I will no longer feel this strongly: after all, my friends and I all survived, a million years have passed, and now I would need to spend ten juicy planets on this weird simulation trade that is obviously not worth it from our enlightened total-utilitarian perspective. So the commitment I want to make is just my current self yelling at my future self: “no, you should still bail us out even if ‘you’ no longer have skin in the game.” I expect that I would honor a commitment like that, even if trading away ten planets for one no longer seems like such a good idea.
I agree that acausal trade can be scary if we can’t figure out how to handle blackmail well, so I shouldn’t make a blanket commitment. But I also don’t want to just say “I commit to thinking carefully about this in the future”, because I worry that when my future self “thinks carefully” without having skin in the game, he will decide that he is a total utilitarian after all.
Do you think it’s reasonable for me to make a commitment that “I will go through with this scheme in the Future if it looks like there are no serious additional downsides to doing it, and the costs and benefits are approximately what they seemed to be in 2024”?
I also don’t think making any commitment is actually needed or important except under relatively narrow assumptions.