Written before reading comments; the answer was decided within or close to the 2-minute window.
I take both boxes. I am uncertain of three things in this scenario: 1) whether the number is prime; 2) whether Omega predicted I would take one box or two; and 3) whether I am the type of agent that takes one box or two. If I take one box, it is highly likely that Omega predicted this correctly, and hence highly likely that the number is prime. If I take two boxes, it is highly likely that Omega predicted this correctly and that the number is composite. I prefer the number to be composite; therefore I take both boxes, anticipating that when I do so I will (correctly) be able to update to a 99.9% probability that the number is composite.
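The update I'm describing can be sketched numerically. This is a minimal sketch, assuming Omega's prediction matches my actual choice with probability 0.999 (the figure behind the 99.9% above) and that the number is composite exactly when Omega predicted two-boxing; the function name is my own for illustration.

```python
def posterior_composite(took_two_boxes, p_accurate=0.999):
    """P(number is composite | my choice), assuming the number is
    composite iff Omega predicted two-boxing, and Omega's prediction
    matches my actual choice with probability p_accurate."""
    # Having taken two boxes, P(Omega predicted two-boxing) = p_accurate,
    # which is exactly P(composite) under the assumption above.
    if took_two_boxes:
        return p_accurate
    # Having taken one box, Omega most likely predicted one-boxing,
    # so the number is composite only if the prediction missed.
    return 1 - p_accurate

print(posterior_composite(took_two_boxes=True))   # → 0.999
```

So observing my own two-boxing licenses the 99.9% update, even though the choice has no causal effect on the number already written down.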
Thinking this through actually led me to a bit of insight about the original Newcomb's problem, namely that last part about updating my beliefs based on which action I choose to take, even when that action has no causal effect on the subject of my beliefs. Taking an action lets you strongly update your belief about which action you would take in that situation; in cases where that fact is causally connected to others (here, Omega's prediction), you can then update through those connections.