This seems like a weird mishmash of other hypotheticals on the site, I’m not really seeing the point of parts of your scenario.
I think the point may be this: LW orthodoxy, insofar as there is such a thing, says to choose SPECKS over TORTURE [EDITED to add:] … no, wait, I mean the exact opposite, TORTURE over SPECKS … and ONE BOX over TWO BOXES; combining these in ike’s rather odd scenario leads to the conclusion that we should prefer “torture everyone in the universe” over “dust-speck everyone in the universe” in that scenario, which might be a big enough bullet to bite to make some readers reconsider their adherence to LW orthodoxy.
My own view on this, for what it’s worth, is that all my ethical intuitions—including the one that says “torture is too awful to be outweighed by any number of dust specks” and the one that says “each of the vastly many transitions by which we get from DUST SPECKS to TORTURE is a strict improvement”—have been formed on the basis of experiences (my own, my ancestors’, and those of earlier people in the civilization I’m part of) that come nowhere near this sort of scenario, and I don’t trust myself to extrapolate. If some incredibly weird sequence of events actually required me to make such a choice for real, then of course I’d have to make it (for what it’s worth, I think I would choose TORTURE and ONE BOX in the separate problems and DUST SPECKS in this one, the apparent inconsistency notwithstanding, not least because I don’t think I could ever have enough evidence to know that something was a truly perfect truthful predictor). But I very much doubt that such a choice could tell me anything insightful about my values, or about the objective moral structure of the universe, if it has one.
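To make the “vastly many transitions” concrete, here is a rough schematic of the standard chain argument; the notation and the step size are mine, purely for illustration, and not anything from ike’s post or the original torture-vs-specks discussion.

```latex
% Illustrative schematic of the chain argument; the parameters are made up.
% S_0 is the SPECKS scenario, S_K the TORTURE scenario; "\succ" means "strictly better than".
\[
  S_0 = \bigl(3\uparrow\uparrow\uparrow 3 \ \text{people, one dust speck each}\bigr),
  \qquad
  S_K = \bigl(1 \ \text{person, 50 years of torture}\bigr),
\]
\[
  S_k \to S_{k+1}: \ \text{divide the number of sufferers by some huge factor (say } 10^{6}\text{)},
  \ \text{make each one's harm only very slightly worse},
\]
\[
  S_1 \succ S_0,\ S_2 \succ S_1,\ \dots,\ S_K \succ S_{K-1}
  \ \Longrightarrow\ S_K \succ S_0
  \qquad \text{(by transitivity).}
\]
```

Accepting every individual step while refusing the conclusion is exactly the tension between the two intuitions mentioned above.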
No, Eliezer and Hanson are anti-specks.
Wow, did I really write that? It’s the exact opposite of what I meant. Will fix.
I think your explanation may be correct, but even so I don’t understand why torture would be the intuitive answer. First, if I select torture, everyone in the universe gets tortured, which means I get tortured; if instead I select dust specks, I get a dust speck, which is vastly preferable. Second, I would prefer a universe with a bunch of me to one with just me, because I’m pretty awesome, so more of me is pretty much just better. Basically, I just fail to see a downside to the dust speck scenario.
The downside to the dust speck scenario is that lots and lots and lots of you get dust-specked. But yes, I think the thought experiment is seriously impaired by the fact that the existence of more copies of you is likely a bigger deal than whether they get dust-specked.
Perhaps we can fix it as follows: Omega has actually set up two toy universes, one containing 3^^^^3 copies of you who may or may not get dust-specked, the other containing just one of you who may or may not get tortured. Omega then tells you the same thing as in ike’s original scenario, except that it’s “everyone sharing your toy universe” who will be either tortured or dust-specked.
The idea was that your choice doesn’t change the number of people, so this shouldn’t affect the answer.
That seems, if you don’t mind my saying so, an odd thing to say when discussing a version of Newcomb’s problem. (“Your choice doesn’t change what’s in the boxes, so …”)
In the first version, there’s no causal relation between your choice and the number of people in the world. In the third, there is, and in the middle one, anthropics must also be considered.
I gave multiple scenarios to make this point.
If the predictor in Newcomb’s problem doesn’t touch the boxes but just tells you that it predicts your choice will match what’s in the box, it turns into the Smoking Lesion scenario.
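To illustrate the structural difference being pointed at, here is a minimal sketch of how evidential and causal expected-value calculations come apart in a Newcomb-style setup; the payoffs and the predictor accuracy are made-up numbers, not anything from ike’s scenario.

```python
# A minimal sketch (illustrative numbers only) of how evidential and causal
# decision theory score a Newcomb-style choice differently.

ACCURACY = 0.99            # assumed predictor accuracy
MILLION, THOUSAND = 1_000_000, 1_000

def edt_value(action):
    """Evidential expected value: condition the opaque box's contents on the
    action, because action and prediction are (assumed to be) correlated."""
    p_full = ACCURACY if action == "one-box" else 1 - ACCURACY
    value = MILLION * p_full
    if action == "two-box":
        value += THOUSAND
    return value

def cdt_value(action, p_full=0.5):
    """Causal expected value: the contents were fixed before the choice, so
    the probability the opaque box is full does not depend on the action."""
    value = MILLION * p_full
    if action == "two-box":
        value += THOUSAND
    return value

for action in ("one-box", "two-box"):
    print(action, round(edt_value(action)), round(cdt_value(action)))
# EDT favours one-boxing, CDT favours two-boxing.  In a Smoking Lesion-style
# setup the correlation is real but not created by your choice, which is the
# structural point the comment above is making.
```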
Specks is supposed to be the intuitive answer.
That’s why I gave scenarios where your choice doesn’t causally determine the number of people, which is where the Newcomblike scenarios come in.