I think there’s a framework in which it makes sense to reject Pascal’s Mugging. According to SSA (the self-sampling assumption), the probability that the universe contains 3^^^^3 people and you happen to be at a privileged position relative to them is extremely low, and as the number gets bigger the probability gets lower (the probability is proportional to 1/n if there are n people). SSA has its own problems, but a refinement I came up with (scale the probability of a universe by its efficiency at converting computation time to observer time) seems more intuitive. See the discussion here. The question to ask is not “how many people do my actions affect?” but rather “what percentage of simulated observer-time do my actions affect, assuming all universes are being simulated in parallel and given computation time proportional to the probabilities of their laws of physics?”. So I don’t think you need to use ad-hoc heuristics to prevent Pascal’s Mugging.
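To make the cancellation explicit, here is a rough back-of-the-envelope sketch of why the SSA discount neutralizes the mugger’s huge number. The symbols p_physics, N, and u are stand-ins introduced here for illustration only, not part of the original argument:

```latex
% N = 3^^^^3 is the number of people the mugger claims to affect,
% p_physics = prior probability of the claimed laws of physics,
% u = utility at stake per affected person.
% Under SSA, "I happen to be the one pivotal observer among N" is discounted
% by a factor of 1/N, which cancels the N-fold payoff:
\[
  \mathbb{E}[\Delta U]
  \;\approx\;
  \underbrace{p_{\text{physics}}}_{\text{prior on the laws}}
  \cdot
  \underbrace{\tfrac{1}{N}}_{\text{SSA penalty}}
  \cdot
  \underbrace{N u}_{\text{claimed payoff}}
  \;=\;
  p_{\text{physics}} \cdot u
\]
% so inflating N buys the mugger nothing: the expected utility no longer
% grows with the size of the threatened number.
```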
Isn’t that easily circumvented by changing the wording of Pascal’s mugging? I think the typical formulation (or at least Eliezer’s) was “create and kill 3^^^^3 people”, and this formulation was “minus 3^^^^3 utilons”.
“Minus 3^^^^3 utilons”, by definition, is so bad that you’d be indifferent between −1 utilon and a 1/3^^^^3 chance of losing 3^^^^3 utilons, so in that case you should accept Pascal’s Mugging. But I don’t see why you would even define the utility function such that anything is that bad. My comment applies to utilitarian-ish utility functions (such as hedonism) that scale with the number of people, since it’s hard to see why 2 people being tortured isn’t twice as bad as one person being tortured. Other utility functions should really not be that extreme, and if they are then accepting Pascal’s Mugging is the right thing to do.
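To spell out the arithmetic behind that indifference claim (writing N as a stand-in for 3^^^^3; this is just the definition of expected utility, nothing beyond what the comment already says):

```latex
% A 1/N chance of losing N utilons has expected value (1/N) * (-N) = -1,
% i.e. exactly as bad in expectation as a certain loss of 1 utilon --
% which is what "indifferent" means here.
\[
  \frac{1}{N}\cdot(-N\ \text{utilons}) \;=\; -1\ \text{utilon},
  \qquad N = 3\uparrow\uparrow\uparrow\uparrow 3 .
\]
```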
Then torture one person twice as badly. Maybe you can’t, but maybe you can. How unlikely is it really that you can torture one person by −3^^^^3 utilons in one year? Is it really 1/3^^^^3?
I can’t parse your meaning from this comment.
Instead of torturing them for longer, torture them more intensely. It’s likely that there’s an upper bound on how intensely you can torture someone, but how sure can you be?