The problem, as stated, seems to me like it can be solved by precommitting not to negotiate with terrorists—this seems like a textbook case.
So switch it to Pascal’s Philanthropist, who says “I offer you a choice: either you may take this $5 bill in my hand, or I will use my magic powers outside the universe to grant you 3^^^^3 units of utility.”
But I’m actually not intuitively bothered by the thought of refusing the $5 in that case. It’s an eccentric thing to do, but it may be rational. Can anybody give me a formulation of the problem where taking the magic powers claim seriously is obviously crazy?
The two situations are not necessarily equivalent.
See my most recent response in the Pascal’s Mugging thread—taking into account the Mugger’s intentions & motives is relevant to the probability calculation.
Having said that, probably the two situations ARE equivalent—in both cases an increasingly high number indicates a higher probability that you are being manipulated.
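To make that point concrete, here is a back-of-the-envelope sketch in Python. Every number and decay rule in it is invented purely for illustration: a fixed tiny credence that the claim is genuine versus a credence that shrinks in proportion to the size of the claimed payoff (i.e. treating a bigger number as stronger evidence of manipulation), with the $5 treated as worth 5 of the same utility units just to keep the arithmetic simple.

```python
from fractions import Fraction

# Back-of-the-envelope sketch only: the credence numbers and decay rules below
# are invented for illustration, not anyone's actual prior.

BASE = Fraction(1, 10**20)   # hypothetical fixed credence in "this person really has magic powers"

def expected_gain(claimed_utility, credence):
    # Expected utility of humouring the claim, treating the $5 as worth
    # 5 of the same utility units (purely for illustration).
    return credence * claimed_utility - 5

def credence_fixed(n):
    # Credence that ignores the size of the claim entirely.
    return BASE

def credence_scaled(n):
    # Credence that shrinks in proportion to the claimed payoff, i.e. a bigger
    # number is treated as stronger evidence that you are being manipulated.
    return BASE / n

for n in (10**6, 10**100, 10**1000):
    fixed = "pay" if expected_gain(n, credence_fixed(n)) > 0 else "refuse"
    scaled = "pay" if expected_gain(n, credence_scaled(n)) > 0 else "refuse"
    print(f"claim of 10^{len(str(n)) - 1}: fixed credence -> {fixed}, scaled credence -> {scaled}")
```

Under the fixed credence, a large enough number always wins; under the scaled credence, the size of the number cancels out and no claim ever does, which is one way of cashing out "a higher number just means a higher probability of manipulation."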
The problem, as stated, seems to me like it can be solved by precommitting not to negotiate with terrorists—this seems like a textbook case.
That can work when the mugger is a terrorist. Unfortunately most muggers aren’t. They’re businessmen. Since the ‘threat’ issue isn’t intended to be the salient feature of the question, we can perhaps specify that the mugger would be paid $3 to run the simulation and is just talking to you in the hope of getting a better offer. You do negotiate under those circumstances.
For my part I don’t like the specification of the problem as found on the wiki at all:
Now suppose someone comes to me and says, “Give me five dollars, or I’ll use my magic powers from outside the Matrix to run a Turing machine that simulates and kills 3^^^^3 people.”
Quite aside from the ‘threat’ issue, I just don’t care what some schmuck simulates on a Turing machine outside the Matrix. That is a distraction.
No responses and a downvote. Clearly I’m missing something obvious.

I wasn’t the downvoter (nor the upvoter), and wouldn’t have downvoted; but I would suggest considering the abstract version of the problem:
Given that, in general, a Turing machine can increase in utility vastly faster than it increases in complexity, how should an Occam-abiding mind avoid being dominated by tiny probabilities of vast utilities?
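For what it’s worth, here is a toy numerical illustration of that mismatch. A decimal numeral stands in for the “program”, its character count for the “complexity”, and a crude 2^-length penalty for the Occam prior; all three are simplistic stand-ins for program length and a Solomonoff-style prior, chosen only to make the growth rates visible.

```python
import math

# Toy illustration only: a decimal numeral stands in for the "program", its
# length in characters for the "complexity", and 2^-length for the Occam prior.
# These are crude stand-ins for program length and a Solomonoff-style prior.

for digits in (10, 100, 1000):
    complexity = digits                      # description length, in characters
    log2_prior = -complexity                 # prior penalty: halved per extra character
    log2_utility = digits * math.log2(10)    # a `digits`-digit payoff, measured in bits
    log2_term = log2_prior + log2_utility    # log of (prior * utility)
    print(f"{digits}-digit claim: log2(prior * utility) ~ {log2_term:.0f}")
```

Even in this feeble notation the expected-utility term grows without bound as the claim gets longer, and compact notations like 3^^^^3 make the mismatch astronomically worse; that is the sense in which utility can outrun complexity.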