As long as the probability of it telling the truth is positive, it could up the number of copies of you it tortures/claims to torture (and torture them all in subtly different ways)...
Pascal’s mugging...
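A minimal sketch of the structure being pointed at here, assuming a purely illustrative fixed credence P_TRUTH that the threat is real (the number itself is made up for the example):

```python
P_TRUTH = 1e-12  # any fixed positive credence that the threat is real

# If the credence is held fixed while the claimed number of tortured
# copies grows, the credence-weighted harm grows without bound -- the
# Pascal's-mugging structure described above.
for copies in (10**6, 10**12, 10**20, 10**30):
    print(copies, P_TRUTH * copies)
```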
Anyway, if you are sure you are going to hit the reset button every time, then there’s no reason to worry, since the torture will end as soon as the real copy of you hits reset. If you don’t, then the whole world is absolutely screwed (including you), so you’re a stupid bastard anyway.
Yes, the copies are depending upon you to hit reset, and so is the world.
That would only be correct if hitting the reset button somehow kills or stops the AI.
If you don’t have the power to kill/stop it, then the problem is somewhat more interesting.
I don’t use a single probability to decide whether it was telling me the truth.
How much I believe it depends on the statement being made, just as it does in everyday life.
So the higher the number of people it claims to be torturing, the less I would believe it, taking my prior into account. You can't assign an equal probability to every candidate maximum number of copies it could simulate: there are potentially infinitely many different maximums, so you'd need a prior that sums to 1 in the limit (as you do in Solomonoff induction).
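A minimal sketch of that point, assuming (purely for illustration) a 1/N² prior over the claimed number of copies; the comment only requires that the prior be normalizable, not this particular form:

```python
import math

def prior(n: int) -> float:
    """Prior over the claimed number of copies N.

    P(N) = 6 / (pi^2 * N^2) sums to 1 over N = 1, 2, 3, ...
    (an illustrative choice; the argument only needs *some*
    prior that sums to 1, as in Solomonoff induction).
    """
    return 6.0 / (math.pi ** 2 * n ** 2)

# Credence-weighted harm of "I am torturing N copies of you".
# Under this prior, P(N) * N shrinks as the claimed N grows, so
# inflating the claim no longer inflates the expected harm.
for n in (10, 10**3, 10**6, 10**9):
    print(n, prior(n) * n)
```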
There’d be no reason to expect it to torture people at less than the maximum rate its hardware was capable of.
But there's good reason to expect it not to torture people at greater than the maximum rate its hardware was capable of, so if you can bound that rate, there exist positive levels of belief that cannot be inflated into something meaningful by upping the number of copies.
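A sketch of that bounding argument, with HARDWARE_MAX and P_TORTURING as hypothetical stand-in numbers rather than anything derived from the scenario:

```python
HARDWARE_MAX = 10**12   # hypothetical bound on copies the hardware could run
P_TORTURING = 1e-9      # hypothetical credence that it is torturing anyone at all

def expected_copies_tortured(claimed: int) -> float:
    """Expected number of copies actually being tortured.

    The hardware bound caps the count, so the expectation can never
    exceed P_TORTURING * HARDWARE_MAX, no matter how large the claimed
    number is -- a positive belief that can't be inflated into
    something meaningful by upping the claimed copies.
    """
    return P_TORTURING * min(claimed, HARDWARE_MAX)

for claimed in (10**6, 10**12, 10**15, 10**30):
    print(claimed, expected_copies_tortured(claimed))
```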