The probability of you getting struck by lightning and dying while making your decision is > 10^-40.
The probability of you dying by a meteor strike, by an earthquake, by … is > 10^-40.
The probability that you don't get to complete your decision for one reason or the other is > 10^-40.
It doesn't make sense, then, to entertain probabilities vastly lower than 10^-40 but not entertain probabilities much higher than 10^-40.
What happens is that our utility is bounded, at or below roughly 10^40. This follows by revealed preference from the fact that we ignore probabilities on the order of 10^-40.
If you value X at K times the utility of Y, then you are indifferent between keeping Y and exchanging it for a 1/K chance of getting X.
I am not indifferent here: I would keep the $1000 rather than trade it for a 10^-40 chance of anything, so my utility is bounded at roughly 10^40 times the utility of $1000. Others may deny that this is true for them, but they are being inconsistent: they ignore other events with higher probability which may prevent them from deciding at all, yet consider events to which they assign vastly lower probability.
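To make the arithmetic concrete, here is a minimal sketch in Python of the expected-utility comparison behind that revealed-preference claim; the numbers are illustrative assumptions, not a precise statement of my actual preferences:

```python
# A minimal sketch of the revealed-preference comparison above; the
# utility numbers here are illustrative assumptions, not measured values.

P_OFFER = 1e-40   # probability I assign to the mugger's promise coming true
U_KEEP = 1.0      # utility of keeping the $1000, normalised to 1

def accepts(u_promised: float, p: float = P_OFFER) -> bool:
    """An expected-utility maximiser hands over the $1000 only if
    p * U(promised outcome) exceeds U($1000)."""
    return p * u_promised > U_KEEP

print(accepts(1e39))  # False: 1e-40 * 1e39 ~= 0.1, less than the value of keeping the money
print(accepts(1e50))  # True, but only for an agent whose utility can reach 1e50
# If utility is bounded at about 1e40, payoffs like 1e50 are simply
# unattainable, so no promise the mugger makes can clear the bar.
```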
As I said in the other thread, realising that your utility function is bounded is sufficient to reject Pascal's mugging.
That said, I agree with you.