Hmm, the AI could have said that if you are the original, then by the time you make the decision it will already have either tortured your copies or not, based on its simulation of you, so hitting the reset button won’t prevent that.
Nothing can prevent something that has already happened. On the other hand, pressing the reset button will prevent the AI from ever doing this in the future. Consider that if it has done something that cruel once, it might do it again many times in the future.
I believe Wei_Dai one-boxes on Newcomb’s problem. In fact, he has his very own brand of decision theory, which is ‘updateless’ with respect to this kind of temporal information.
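For concreteness, here is a minimal expected-value sketch of why one-boxing comes out ahead, assuming the standard Newcomb payoffs ($1,000,000 in the opaque box, $1,000 in the transparent one) and an illustrative predictor accuracy of 99%; the function name and the accuracy figure are my own assumptions for the sketch, not anything from the thread:

```python
# Sketch: expected payoff of a fixed strategy in Newcomb's problem,
# against a predictor that is right with probability `accuracy`.

def expected_payoff(one_box: bool, accuracy: float = 0.99) -> float:
    """Expected dollars for one-boxing vs. two-boxing."""
    big, small = 1_000_000, 1_000
    if one_box:
        # The opaque box is filled iff the predictor foresaw one-boxing.
        return accuracy * big
    # Two-boxing: you always get the small box; the big box is filled
    # only when the predictor errs.
    return small + (1 - accuracy) * big

print(expected_payoff(one_box=True))   # 990000.0
print(expected_payoff(one_box=False))  # 11000.0
```

Under these assumptions one-boxing nets an expected $990,000 against $11,000 for two-boxing, which is the asymmetry you capture by refusing to update on the prediction already having been made.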