I ask because I hypothesize that a rational theist/religious person almost definitely has to be vulnerable to Pascal’s Mugging.
I don’t see why they’d be any more vulnerable than a rationalist atheist.
Keep in mind we don’t even know how to describe a rational agent that’s not vulnerable to Pascal’s mugging.
The way we currently get around this problem is by having a rule that temporarily suspends our decision theory when we pattern match the situation to resemble Pascal’s mugging.
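A minimal sketch of the idea in that last comment (this is a toy of my own construction, not anyone's actual decision theory): a naive expected-utility chooser, plus an ad-hoc override that simply ignores outcomes whose probability falls below a threshold, mimicking the "pattern match to Pascal's mugging and suspend the calculation" rule. The option names, payoffs, and the 1e-6 cutoff are all made up for illustration.

```python
# Toy sketch, assuming utilities are additive and probabilities are given.
# The "mugging override" is the hack described above: drop any outcome whose
# probability is below a threshold before computing expected utility.

def naive_expected_utility(options):
    """Pick the option with the highest sum of p * utility."""
    return max(options, key=lambda o: sum(p * u for p, u in o["outcomes"]))

def with_mugging_override(options, min_probability=1e-6):
    """Same chooser, but ignore any outcome less likely than min_probability."""
    filtered = [
        {
            "name": o["name"],
            "outcomes": [(p, u) for p, u in o["outcomes"] if p >= min_probability],
        }
        for o in options
    ]
    return max(filtered, key=lambda o: sum(p * u for p, u in o["outcomes"]))

options = [
    # Pay $5; with tiny probability the mugger really delivers an astronomical payoff.
    {"name": "pay", "outcomes": [(1e-20, 1e30), (1.0, -5.0)]},
    {"name": "refuse", "outcomes": [(1.0, 0.0)]},
]

print(naive_expected_utility(options)["name"])   # the naive chooser pays
print(with_mugging_override(options)["name"])    # the override refuses
```

The override "works", but only by fiat: the threshold isn't derived from the decision theory itself, which is exactly why it counts as a suspension rather than a solution.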
I ask because I hypothesize that a rational theist/religious person almost definitely has to be vulnerable to Pascal’s Mugging.
A weird conclusion. I’d think that most theists would be likely to believe that such a huge disutility couldn’t be allowed (by God) to exist; at least not on the basis of some superdimensional prankster asking you for 5 dollars.
What’s the rational reason not to be vulnerable to Pascal’s Mugging?
Roughly the same reason to one box on Newcomb’s Problem—rationalists win.
I thought the whole problem with Pascal’s Mugging is that being mugged has a higher expected value—and so those who get mugged “win” more. Obviously we humans aren’t precise enough expected-utility calculators to be vulnerable to it, but the hypothetical super-AI could be.
The reason Pascal’s Mugging is a challenge is that expected utility calculations say to get mugged, but really strong intuitions say not to.
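One way to see why the expected-utility calculation can't be escaped just by assigning a tiny probability: for any fixed positive probability, the mugger can quote a payoff large enough that paying still comes out ahead. A small illustration (the helper name and the sample probabilities are mine, chosen for the example):

```python
# For a claim believed with probability p, find the smallest quoted payoff
# that makes "pay $5" have positive expected value: p * payoff > cost.
# Exact rational arithmetic avoids float overflow at absurdly small p.
from fractions import Fraction

def payoff_needed(probability, cost=5):
    """Smallest payoff (in utility) making p * payoff exceed the cost of paying."""
    return cost / probability

for p in (Fraction(1, 10**6), Fraction(1, 10**12), Fraction(1, 10**100)):
    print(f"p = {p}: mugger need only promise > {payoff_needed(p)}")
```

However small you make p, the required payoff is finite, and the mugger can always name a bigger number—which is why the intuition "just don't believe him" doesn't by itself resolve the problem.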