“Minus 3^^^^3 utilons”, by definition, is so bad that you’d be indifferent between −1 utilon and a 1/3^^^^3 chance of losing 3^^^^3 utilons, so in that case you should accept Pascal’s Mugging. But I don’t see why you would even define the utility function such that anything is that bad. My comment applies to utilitarian-ish utility functions (such as hedonism) that scale with the number of people, since it’s hard to see why 2 people being tortured isn’t twice as bad as one person being tortured. Other utility functions should really not be that extreme, and if they are then accepting Pascal’s Mugging is the right thing to do.
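The indifference claim is just linear expected utility. A minimal sketch (3^^^^3 is unrepresentably large, so a stand-in value is used; the algebra is identical for any N):

```python
# Under expected-utility maximization with a linear utility function,
# a 1/N chance of losing N utilons has the same expected utility as
# losing 1 utilon for certain -- for any N, including 3^^^^3.
from fractions import Fraction

def expected_utility(prob, utility):
    """Expected utility of a simple gamble: probability times payoff."""
    return prob * utility

N = Fraction(10) ** 100  # stand-in for 3^^^^3, which cannot be written out

certain_loss = Fraction(-1)                     # lose 1 utilon for sure
mugging = expected_utility(Fraction(1, N), -N)  # 1/N chance of losing N

assert mugging == certain_loss  # exactly indifferent, whatever N is
```

So defining "minus 3^^^^3 utilons" this way builds acceptance of the mugging into the utility function itself.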
Torture one person twice as badly. Maybe you can’t, but maybe you can. How unlikely is it really that you can torture one person by −3^^^^3 utilons in one year? Is it really 1/3^^^^3?
Instead of torturing them for longer, torture them more intensely. It’s likely that there’s an upper bound on how intensely you can torture someone, but how sure can you be?
I can’t parse your meaning from this comment.