The downside to the dust speck scenario is that lots and lots and lots of you get dust-specked. But yes, I think the thought experiment is seriously impaired by the fact that the existence of more copies of you is likely a bigger deal than whether they get dust-specked.
Perhaps we can fix it as follows: Omega has actually set up two toy universes, one with 3^^^^3 copies of you who may or may not get dust-specked, and one with just one of you who may or may not get tortured. Now Omega tells you the same thing as in ike’s original scenario, except that it’s “everyone sharing your toy universe” who will be either tortured or dust-specked.
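To make the intended fix concrete, here is a minimal sketch of why this setup isolates the harm comparison: both universes exist no matter what you choose, so the total number of people is the same either way and only the allocation of harm differs. The harm numbers and the 10^100 stand-in for 3^^^^3 are made up for illustration; nothing here comes from ike’s post.

```python
# Minimal sketch: in the two-universe setup the population is fixed,
# so only the harm allocation can differ between the options.
# All numbers below are assumptions chosen purely for illustration.

N_COPIES = 10**100        # stand-in for 3^^^^3, which is far too large to write out
SPECK_HARM = 1e-9         # assumed disutility of one dust speck
TORTURE_HARM = 1e7        # assumed disutility of the torture

# Both toy universes exist regardless of your choice.
population = N_COPIES + 1  # identical under either option, so it can't drive the answer

# Total harm under each option.
harm_if_specks = N_COPIES * SPECK_HARM   # everyone in the populous universe gets a speck
harm_if_torture = 1 * TORTURE_HARM       # the lone copy gets tortured

print(population)                        # same either way
print(harm_if_specks > harm_if_torture)  # True with these made-up numbers
```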
The idea was that your choice doesn’t change the number of people, so this shouldn’t affect the answer.
That seems, if you don’t mind my saying so, an odd thing to say when discussing a version of Newcomb’s problem. (“Your choice doesn’t change what’s in the boxes, so …”)
In the first version, there’s no causal relation between your choice and the number of people in the world. In the third, there is, and in the middle one, anthropics must also be considered.
I gave multiple scenarios to make this point.
If the predictor in Newcomb’s problem doesn’t touch the boxes, but just tells you that it predicts your choice will match what’s in the box, it turns into the smoking lesion scenario.
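For concreteness, here is a rough expected-value sketch of why that transformation matters: conditioning on your own choice still favors one-boxing, but once the box contents are causally independent of your choice (a mere correlation, as with the lesion in the smoking lesion problem), holding them fixed favors taking both boxes. The payoffs, the 0.99 correlation, and the 0.5 prior are all assumptions for illustration, not part of either problem’s canonical statement.

```python
# Sketch of the evidential vs. causal expected-value contrast when the
# predictor only reports a correlation instead of filling the boxes.
# All figures are assumed for illustration.

SMALL = 1_000        # transparent box
BIG   = 1_000_000    # opaque box, correlated with a "one-box" choice
P     = 0.99         # assumed strength of the reported correlation

# Evidential expected value: condition on your own choice.
# This looks the same whether the correlation comes from a predictor who
# fills the boxes or from a common cause that fixed them in advance.
edt_one_box = P * BIG
edt_two_box = (1 - P) * BIG + SMALL

# Causal expected value: hold the box contents fixed, since in this variant
# they don't causally depend on your choice.
p_big_already_there = 0.5  # assumed prior; your choice can't change it
cdt_one_box = p_big_already_there * BIG
cdt_two_box = p_big_already_there * BIG + SMALL

print(edt_one_box > edt_two_box)  # True: conditioning on the choice favors one-boxing
print(cdt_two_box > cdt_one_box)  # True: with contents causally fixed, two-boxing dominates
```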