This article would benefit from working through a concrete example.
If you become super-exponentially more skeptical as the mugger invokes super-exponentially higher utilities, how do you react if the mugger tears the sky asunder?
You become less skeptical, but that doesn’t affect the issue presented, which concerns only the evidential force of the claim itself.
If someone tears the sky asunder, you will be more inclined to believe the threat. But past a certain point, increasing the threat further should decrease your expectation.
OK, so past a certain point, the mugger increasing his threat will cause you to decrease your belief faster than the threat grows. Past that point, increasing the threat further will cause (threat badness * probability) to decrease.
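As a toy illustration of that shape (entirely my own model; nothing in the scenario specifies this prior): suppose your credence in a threat of badness b is roughly flat up to some threshold and then decays faster than 1/b. The expected badness b * p(b) then eventually falls as the threat escalates.

```python
import math

def credence(b, threshold=1e6):
    """Hypothetical prior over threats of badness b: roughly flat up to
    a threshold, then decaying faster than 1/b beyond it."""
    if b <= threshold:
        return 1e-9
    x = math.log(b / threshold)
    return 1e-9 * math.exp(-x * x)  # falls faster than any power of b

# Past the threshold, expected badness b * credence(b) shrinks as the
# threat grows, so escalating the threat stops helping the mugger.
for b in [1e6, 1e9, 1e12]:
    print(f"badness {b:.0e}: expected badness {b * credence(b):.3e}")
```

The specific decay rate is arbitrary; the point is only that any prior falling faster than the threat grows makes (threat badness * probability) eventually decrease.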
That implies that if he threatens you with a super-exponentially bad outcome, you will assign a super-exponentially small probability to his threat.
But super-exponentially small probabilities are a tricky thing. Once you’ve assigned a super-exponentially small probability to an event, no amount of evidence in the visible universe can make you change your mind. It doesn’t matter if the mugger grows wings of fire or turns passers-by into goats; no amount of evidence your eyes and ears are capable of receiving can sway a super-exponentially small probability. If the city around you melts into lava, should you believe the mugger then? How do you quantify whether you should or should not?
I should define super-exponentially large, since I’m using it as a hand-wavy term. Let’s call a number super-exponentially large if its full decimal expansion is too long to write down using the matter available on earth. Numbers like a googol (=10^100, a 1 followed by 100 zeros) or 1,000,000,000^1,000,000,000 (=10^9,000,000,000, a 1 followed by nine billion zeros) are not super-exponentially large, since there is easily enough matter to write out their digits. A googolplex (10^10^100, or 1 followed by a googol zeros) is super-exponentially large, because there is not enough matter on earth to write a googol zeros.
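Here is the back-of-the-envelope arithmetic behind those examples. The 10^50 figure for atoms on earth is a common order-of-magnitude estimate, and "one digit per atom" is a generous assumption of mine:

```python
ATOMS_ON_EARTH = 1e50  # rough order-of-magnitude estimate (assumption)

# Digits needed to write each number out in full decimal,
# assuming (generously) one written digit per atom.
digits_googol = 101                  # 1 followed by 100 zeros
digits_billion_pow = 9 * 10**9 + 1   # 10^9,000,000,000: 1 followed by 9 billion zeros
digits_googolplex = 10**100 + 1      # 1 followed by a googol zeros

print(digits_googol < ATOMS_ON_EARTH)       # fits: not super-exponentially large
print(digits_billion_pow < ATOMS_ON_EARTH)  # fits: not super-exponentially large
print(digits_googolplex < ATOMS_ON_EARTH)   # does not fit: super-exponentially large
```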
A super-exponentially small probability is a probability less than 1 over a super-exponentially large number.
Speaking loosely, if you express a probability in scientific notation, a given amount of evidence lets you shift the exponent up or down by a constant amount. For most things, that’s pretty powerful. But a super-exponentially small probability has an exponent so large that shifting it into ordinary territory would take more evidence than you could write down with the matter available on earth.
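A sketch of that arithmetic (my own illustration, using standard log-odds updating): a piece of evidence with likelihood ratio 10^k adds k to the exponent of your odds, so evidence that would be decisive for ordinary claims barely dents a super-exponentially small prior.

```python
def update_log10_odds(log10_prior_odds, log10_likelihood_ratio):
    """Bayes' rule in log-odds form: evidence adds to the exponent."""
    return log10_prior_odds + log10_likelihood_ratio

# Ordinary case: prior odds 10^-20, evidence with likelihood ratio 10^6.
print(update_log10_odds(-20, 6))  # -14: a huge swing in credence

# Super-exponentially small prior: odds around 10^-(10^100).
# Even 10^50 independent pieces of that same evidence leave the
# exponent essentially untouched.
after_everything = update_log10_odds(-10**100, 6 * 10**50)
print(after_everything < -(10**99))  # still unimaginably improbable
```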
I don’t see why this is necessarily a problem.
The claim that the mugger will torture 3^^^3 people unless you give them $100 is so implausible that there should be no possible evidence that will convince you of it.
Any possible evidence is more plausibly explained by possibilities such as you being in a computer game, and the mugger being a player who’s just being a dick because they find it funny.
That is one resolution.