“For those who would pick SPECKS, would you pay a single penny to avoid the dust specks?”
Yes. Note that, for the obvious next question, I cannot think of an amount of money large enough that I would rather keep it than use it to save a person from torture (assuming that this is post-Singularity money which I cannot spend on other life-saving or torture-stopping efforts).
“You probably wouldn’t blind everyone on earth to save that one person from being tortured, and yet, there are (3^^^3)/(10^17) >> 7*10^9 people being blinded for each person you have saved from torture.”
This is cheating, to put it bluntly: my utility function does not assign the same value to blinding one person as to putting a single dust speck in each of six billion people’s eyes, even though six billion specks would be enough to blind someone if you forced them all into one eye at once.
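A toy numerical sketch of that non-additivity (the harm function, the blinding threshold, and all numbers below are made up for illustration; this is not a claim about anyone’s actual utility function): the same total number of specks yields wildly different aggregate disutility depending on whether they are spread one per eye or concentrated in a single eye.

```python
# Toy illustration only: per-eye harm is assumed to be non-additive in the
# number of specks, so equal speck totals need not mean equal disutility.

def per_eye_disutility(specks_in_eye: int) -> float:
    """Hypothetical harm from putting `specks_in_eye` dust specks into one eye.

    Assumption for illustration: a lone speck is a trivial nuisance, but harm
    jumps to a qualitatively different level once the eye is actually blinded.
    """
    BLINDING_THRESHOLD = 10**4  # made-up number of specks that blinds an eye
    if specks_in_eye <= 0:
        return 0.0
    if specks_in_eye < BLINDING_THRESHOLD:
        return 1e-9 * specks_in_eye  # barely-noticeable irritation
    return 1e6                       # blindness

POPULATION = 6 * 10**9  # roughly everyone on Earth

# Scenario A: one speck in each of six billion eyes.
distributed = POPULATION * per_eye_disutility(1)

# Scenario B: the same six billion specks forced into a single eye.
concentrated = per_eye_disutility(POPULATION)

print(f"one speck each, six billion people: {distributed:.2f}")  # about 6
print(f"six billion specks, one person:     {concentrated:.2f}") # 1000000.00
```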
“I’d still take the former. (10^(10^100))/(3^^^3) is still so close to zero that there’s no way I can tell the difference without getting a larger universe for storing my memory first.”
The probability is effectively much greater than that, because of complexity compression. If you have 3^^^3 people with dust specks, almost all of them will be identical copies of each other, greatly reducing abs(U(specks)). abs(U(torture)) would also be reduced, but by a much smaller factor, because the number of people involved is much smaller to begin with.
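A toy sketch of that compression effect, with made-up stand-in numbers (3^^^3 itself cannot be represented, and the specific magnitudes below are purely illustrative): the only point is that counting identical copies once shrinks the speck aggregate by an enormous factor while leaving the torture aggregate essentially untouched.

```python
# Toy illustration only: how much an aggregate shrinks if identical
# person-moments are counted once instead of once per copy.

def dedup_factor(n_people: int, n_distinct_states: int) -> float:
    """Shrinkage factor when identical copies contribute only once."""
    return n_people / min(n_people, n_distinct_states)

# Hypothetical stand-ins: vastly more speck-victims than distinguishable
# "person with a dust speck" experiences, but every torture victim distinct.
SPECK_PEOPLE, DISTINCT_SPECK_STATES = 10**80, 10**20
TORTURE_PEOPLE, DISTINCT_TORTURE_STATES = 1, 1

print(dedup_factor(SPECK_PEOPLE, DISTINCT_SPECK_STATES))      # 1e+60
print(dedup_factor(TORTURE_PEOPLE, DISTINCT_TORTURE_STATES))  # 1.0
```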
People are being tortured, and it wouldn’t take too much money to prevent some of it. Obviously, there is already a price on torture.