Are you going with Torture v Dust Specks here? Or do you just reject Many Worlds?
Neither is relevant in this case. My claim is that it’s not worth spending even a second of time, even a teensy bit of thought, on changing which kind of randomization you use.
Why? Exponential functions drop off really, really quickly. Really quickly. The proportion of random bit strings that, when booted up, are minds in horrible agony falls off roughly exponentially with the complexity of the idea "minds in horrible agony": approximately 2^-(complexity).
To turn this exponentially small chance into something I’d care about, we’d need the consequence to be of exponential magnitude. But it’s not. It’s just a regular number like 1 billion dollars or so. That’s about 2^30. It’s nothing. You aren’t going to write a computer program that detects minds in horrible agony using 30 bits. You aren’t going to write one with 500 bits, either (a fraction of roughly one part in 10^151). It’s simply not worth worrying about things that are worth less than 10^-140 cents.
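A quick sanity check of that arithmetic, using the comment's own illustrative figures (the 500-bit complexity and the billion-dollar stake are just the example numbers above, not measured quantities):

```python
import math

# Back-of-the-envelope check of the numbers in the comment above.
complexity_bits = 500                  # assumed complexity of "mind in horrible agony"
fraction = 2.0 ** -complexity_bits     # share of random bit strings that qualify, ~3e-151
stake_dollars = 1e9                    # "a regular number like 1 billion dollars"
expected_loss_cents = fraction * stake_dollars * 100

print(f"fraction of bit strings: {fraction:.1e}")            # ~3.1e-151
print(f"expected loss in cents:  {expected_loss_cents:.1e}")  # ~3.1e-140
print(f"log2 of the stake:       {math.log2(stake_dollars):.1f}")  # ~29.9, i.e. about 2^30
```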
I’m saying I don’t understand what you’re measuring. Does a world with a suffering simulation exist, given the OP’s scenario, or not?
If it does, then the proliferation of other worlds doesn’t matter unless they contain something that might offset the pain. If they’re morally neutral they can number Aleph-1 and it won’t make any difference.
Decision-making in many-worlds is identical to ordinary decision-making. You weight the utility of possible outcomes by their measure, and add them up into an expected utility. The bad stuff in one of those outcomes only feels more important when you phrase it in terms of many-worlds, because a certainty of small bad stuff often feels worse than a chance of big bad stuff, even when the expected utility is the same.
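A minimal sketch of that point, with made-up numbers: a sure small harm and a low-measure large harm can have exactly the same expected utility, even though the "it happens in some branch for certain" framing makes the second one feel worse.

```python
# Expected utility is just the measure-weighted sum over outcomes.
def expected_utility(outcomes):
    """outcomes: list of (measure, utility) pairs whose measures sum to 1."""
    return sum(measure * utility for measure, utility in outcomes)

certain_small_harm = [(1.0, -1.0)]                      # small bad thing in every branch
rare_large_harm = [(0.001, -1000.0), (0.999, 0.0)]      # big bad thing in a low-measure branch

assert expected_utility(certain_small_harm) == expected_utility(rare_large_harm) == -1.0
```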