If it helps, remember that there is a significant likelihood that you are in an ancestor simulation. You have no knowledge of what is outside the simulation, so it is entirely possible that, regardless of your actions, you will be tortured upon death for a length of time best expressed in up-arrow notation (or perhaps literally forever, if the laws of physics/logic are different outside of the sim).
Thus, you shouldn’t be too stressed about destroying any information about yourself: it only makes a quantitative rather than a qualitative difference in terms of potential AI torture. That is, instead of P(AI torture | no information destruction) = 0.01 and P(AI torture | information destruction) = 0, it’s more like P(AI torture | no information destruction) = O + 0.01 and P(AI torture | information destruction) = O, where O is the probability that the AI outside the sim tortures you anyway. I find this a more soothing way to think about the problem, since it takes advantage of a few cognitive biases to make information destruction feel less emotionally critical.
I mean, that doesn’t really have any relevance to the question of how I should think or act. By independence of irrelevant alternatives, information destruction is exactly as important whether or not there is some chance you’re screwed anyway: the difference your decision makes is the same no matter what O is, as the sketch below illustrates.
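To make that concrete, here is a minimal arithmetic sketch. The 0.01 figure is taken from the comment above; the values of O are arbitrary placeholders I've assumed for illustration. Whatever the baseline O, the gap between the two conditional probabilities, which is the only thing the decision can change, stays fixed at 0.01:

```python
# Illustrative sketch: the 0.01 comes from the thread above; the baselines O
# are assumed placeholder values. The decision-relevant quantity is the
# difference P(torture | keep info) - P(torture | destroy info), and the
# baseline O cancels out of it, which is the IIA point.

for O in [0.0, 0.1, 0.5, 0.9]:
    p_keep = O + 0.01   # P(AI torture | no information destruction)
    p_destroy = O       # P(AI torture | information destruction)
    print(f"O = {O:.2f}: difference = {p_keep - p_destroy:.2f}")  # always 0.01
```

So the baseline risk O may change how doomed you feel, but it drops out of the comparison between the two actions entirely.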