Is it? 3^^^3 isn’t all that much of a ridiculous number. Larger than the number of atoms in the universe, certainly, but not so much so that certain people’s non-linear valuations of disutility per speck couldn’t make that kind of difference matter. (I tend to prefer at least 3^^^^3 for my stupid-large-numbers.)
Larger than the number of atoms in the universe, certainly,
That’s quite a bit of an understatement. 3^^4 (~10^3638334640024) is already vastly larger than the number of atoms in the universe (~10^80), 3^^5 in turn is incomprehensibly larger again, and 3^^^3 = 3^^7625597484987.
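For anyone who wants to check those magnitudes, here is a minimal Python sketch using nothing beyond the standard up-arrow identities and the ~10^80 atom figure already quoted:

```python
import math

# Standard up-arrow identities:
#   3^^3  = 3**(3**3) = 3**27
#   3^^4  = 3**(3^^3), so log10(3^^4) = 3^^3 * log10(3)
#   3^^^3 = 3^^(3^^3) = 3^^7625597484987 (a power tower far too tall to evaluate)
t3 = 3 ** (3 ** 3)                    # 3^^3 = 7,625,597,484,987
log10_t4 = t3 * math.log10(3)         # base-10 exponent of 3^^4
print(f"3^^3 = {t3:,}")
print(f"3^^4 ~ 10^{log10_t4:,.0f}")   # ~10^3,638,334,640,024
print("atoms in the observable universe ~ 10^80")
```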
80 orders of magnitude is an extremely narrow band for a balance point to fall into when one of the numbers involved is greater by so many orders of magnitude that it can’t even reasonably be expressed without a paradigm for writing big numbers about as strong as up-arrow notation. Hitting the space between one human and 10^80 humans would take truly extraordinary precision. (Or making yourself money-pumpable by choosing different sides of the same deal when it is split into 10^80 sub-deals.)
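Spelled out under a linear-aggregation reading of that argument (the per-speck weight d and the linearity are my own framing, not anything asserted above): for the balance point to land between one torture and 10^80 tortures, the per-speck disutility d, measured in units of one torture, has to satisfy

```latex
\[
  1 \;<\; (3\uparrow\uparrow\uparrow 3)\,d \;<\; 10^{80}
  \quad\Longleftrightarrow\quad
  \frac{1}{3\uparrow\uparrow\uparrow 3} \;<\; d \;<\; \frac{10^{80}}{3\uparrow\uparrow\uparrow 3},
\]
```

a window only 80 orders of magnitude wide, pinned to the reciprocal of a number that needs up-arrow notation to write down.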
80 orders of magnitude is an extremely narrow band for a balance point to fall into when one of the numbers involved is greater by so many orders of magnitude that it can’t even reasonably be expressed without a paradigm for writing big numbers about as strong as up-arrow notation. Hitting the space between one human and 10^80 humans would take truly extraordinary precision.
Not necessarily. For a lot of people the limit of the disutility of the scenario as the number of dust specks approaches infinity is not even infinite. In such cases it is perfectly plausible, and even likely, that it is considered worse than torturing one person but not as bad as torturing 10^80 people. (In which case the extra Knuth arrow obviously doesn’t help either.)
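A minimal sketch of the kind of bounded valuation being described, in units of one torture; the saturation level and the scale below are arbitrary choices of mine, since the comment only commits to the limit being finite:

```python
import math

SATURATION = 1e40   # assumed finite limit of total speck disutility (anywhere between 1 and 1e80 works)
SCALE = 1e12        # assumed rate at which that limit is approached

def speck_disutility(n_specks: float) -> float:
    """Aggregate dust-speck disutility that approaches SATURATION as n_specks grows."""
    return SATURATION * (1.0 - math.exp(-n_specks / SCALE))

def torture_disutility(n_tortures: float) -> float:
    """Torture counted linearly here, purely for contrast."""
    return float(n_tortures)

# For any astronomically large speck count (3^^^3 included, though it will not fit in a float)
# the total sits at ~1e40: worse than one torture, nowhere near 1e80 tortures.
huge = 1e300
print(speck_disutility(huge) > torture_disutility(1))      # True
print(speck_disutility(huge) < torture_disutility(1e80))   # True
```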
See the comment in the parentheses. Choosing torture over 3^^^^3 dust specks, but not over 3^^^3*10^-80 dust specks, takes extraordinary precision. Choosing one torture over 3^^^3*10^-80 dust specks but not 10^80 tortures over 3^^^3 dust specks implies inconsistent preferences.
Your comment in the parentheses (if you were referring to the one saying it requires you to be money-pumpable) was false, but I was letting it pass. If you are telling me to see my own comment in parentheses, which says about the same thing as your second sentence, then, well, yes, we are mostly in agreement about that part, albeit not quite to the same degree.
Choosing one torture over 3^^^3*10^-80 dust specks but not 10^80 tortures over 3^^^3 dust specks implies inconsistent preferences.
Just not true. It implies preferences in which 10^80 tortures is not 10^80 times worse than 1 torture. There isn’t anything inconsistent about valuing each additional instance of the same thing differently from the previous instance. In fact it is the usual case. It is also not exploitable: anything you can make an agent with those preferences do based on its own preferences will be something it agrees in hindsight was a good thing to do.
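For concreteness, one curve with the property just described; the logarithm is an illustrative pick of mine, not something the comment commits to:

```python
import math

def torture_disutility(n: float) -> float:
    """Concave and strictly increasing: each extra torture adds less disutility than the one before."""
    return math.log1p(n)

one  = torture_disutility(1)
many = torture_disutility(1e80)
print(many < 1e80 * one)   # True: 1e80 tortures is not 1e80 times as bad as one torture
print(many > one)          # True: more tortures is still always judged worse
```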
Can you explain how such a preference can be consistent? The total incidence of both torture and dust specks is unknown in either case. On what basis would an agent that trades one torture for avoiding 3^^^3*10^-80 dust specks refuse the same deal a second time? Or the 10^80th time? Given that 3^^^3*10^-80 people are involved, it seems astronomically unlikely that the rate of torture changed noticeably, even assuming only the knowledge available to the agent. In any case, 10^80 separate instances of the agent with no knowledge of each other would make the same deal 10^80 times, and can’t complain about being deceived, since no information about the incidence of torture was assumed. Even assuming the agent makes the deal only a single time, consistency would then require that the agent prefer trading 3^^^3 dust specks for avoiding 10^80 instances of torture over trading 3^^^3*(1+10^-80) dust specks for 10^80 + 1 instances of torture, which seems implausible.
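One way to make the repeated-deal worry concrete, reusing the bounded-speck valuation sketched earlier with toy numbers of my own (3^^^3 itself cannot be represented, so stand-in floats do the work):

```python
import math

CAP   = 1e40    # assumed saturation of total speck disutility, in units of one torture
SCALE = 1e12    # assumed rate of approach to the cap

def speck_disutility(n_specks: float) -> float:
    return CAP * (1.0 - math.exp(-n_specks / SCALE))

# Each of the 1e80 mutually ignorant copies sees a single deal: accept one torture to
# prevent 3^^^3 * 1e-80 specks. That count is still astronomically large, so the
# saturated valuation prices it at ~CAP and every copy accepts.
specks_per_deal = 1e200    # stand-in for 3^^^3 * 1e-80
print("single deal accepted:", speck_disutility(specks_per_deal) > 1.0)    # True

# Summed over all copies: 1e80 tortures taken to prevent 3^^^3 specks, which the very
# same valuation prices at only ~CAP. Judged as one aggregate deal it would be refused.
total_specks = 1e300       # stand-in for 3^^^3
print("aggregate deal accepted:", speck_disutility(total_specks) > 1e80)   # False
```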
The total incidence of both torture and dust specks is unknown in either case.
Where was this declared? (Not that it matters for the purpose of this point.) The agent has prior probabilities distributed over the possible incidence of torture and dust specks. It is impossible not to. And after taking one such deal those priors will be different. Sure, restricting access to information about the current tortured population will make it harder for an agent to implement preferences that are not linear with respect to additional units, but it doesn’t make those preferences inconsistent, and it doesn’t stop the agent doing its best to maximise utility despite the difficulty.
There is no information on the total incidence of either included in the problem statement (other than the numbers used), and I have seen no one answer conditionally based on the incidence of either.
The agent has prior probabilities distributed over the possible incidence of torture and dust specks.
Yes, of course, I thought my previous comment clearly implied that?
And after taking one such deal those priors will be different.
Infinitesimally. I thought I addressed that? The problem implies the existence of an enormous number of people. Conditional on there actually being that many people, the expected number of people tortured shifts by the tiniest fraction of the total. If the agent is sensitive to such a tiny shift we are back to requiring extraordinary precision.
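The size of that shift, with placeholder numbers of my own, since the problem statement gives no actual incidence figures:

```python
# If the agent's prior expectation is that some huge number of people are already being
# tortured, conditioning on one accepted deal moves that expectation by one person.
prior_expected_tortures = 1e60      # placeholder prior mean; any huge value behaves the same
posterior_expected_tortures = prior_expected_tortures + 1.0

relative_shift = 1.0 / prior_expected_tortures
print(f"relative shift in expected tortures: {relative_shift:.0e}")   # 1e-60
print(posterior_expected_tortures == prior_expected_tortures)         # True: too small to even register at float precision
```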