If Omega has already left, I open box B first, take whatever is in it, and then open box A.
I guess my cognition just breaks down over the idea of Omega. To me, Newcomb’s problem seems akin to a theological argument. Either we are talking about a purely theoretical idea that is meant to illustrate abstract decision theory, in which case I don’t care how many boxes I take, because it has no bearing on anything tied to reality, or we are actually talking about the real universe, in which case I take both boxes because I don’t believe in alien superintelligences capable of foreseeing my choices any more than I believe in an anthropomorphic deity.
> If Omega has already left, I open box B first, take whatever is in it, and then open box A.
Labeling “I decide to lose” as a snark just seems odd.
> I guess my cognition just breaks down over the idea of Omega. To me, Newcomb’s problem seems akin to a theological argument. Either we are talking about a purely theoretical idea that is meant to illustrate abstract decision theory, in which case I don’t care how many boxes I take, because it has no bearing on anything tied to reality, or we are actually talking about the real universe, in which case I take both boxes because I don’t believe in alien superintelligences capable of foreseeing my choices any more than I believe in an anthropomorphic deity.
You are confused. Using Omega is merely a simplification of real possible situations, that is, of any situation in which you and the other player have some degree of mutual knowledge. Since those situations are complicated, they will sometimes call for cooperation (one-boxing, here), but often other considerations or insufficient mutual knowledge will override that and call for defection (two-boxing).
If you wish to consider the effect of just, say, the mass of a cow, then assuming a spherical cow in a vacuum is useful. If the conclusion you reach about the mass of said cow doesn’t suit you and you say “but there are no spherical cows in vacuums!”, then you are using an excuse to avoid biting the bullet, not showing your superior awareness of reality.
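To make the simplification concrete, here is a minimal expected-value sketch. It assumes the standard payoffs ($1,000 in box A, $1,000,000 in box B iff one-boxing was predicted) and treats the predictor’s accuracy p as a free parameter; the numbers and the break-even point are only illustration, not anything claimed above.

```python
# Minimal sketch: expected value of one-boxing vs. two-boxing in Newcomb's
# problem, as a function of the predictor's accuracy p (the probability it
# correctly predicts your choice). Payoffs assumed: box A always holds
# $1,000; box B holds $1,000,000 iff one-boxing was predicted.

def expected_value(one_box: bool, p: float) -> float:
    if one_box:
        # Box B is full exactly when the predictor was right about you.
        return p * 1_000_000
    # Two-boxing: you always get box A, and box B is full only when the
    # predictor was wrong about you.
    return 1_000 + (1 - p) * 1_000_000

for p in (0.50, 0.5005, 0.6, 0.9, 0.99):
    print(f"p={p}: one-box={expected_value(True, p):,.0f}  "
          f"two-box={expected_value(False, p):,.0f}")
# One-boxing pulls ahead once p exceeds about 0.5005.
```

Under these illustrative numbers, two-boxing only comes out ahead when the predictor is essentially no better than a coin flip, which is the sense in which the mutual-knowledge cases can call for either answer.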
Yeah, that’s generally what “I guess my cognition breaks down” means.
> If you wish to consider the effect of just, say, the mass of a cow, then assuming a spherical cow in a vacuum is useful. If the conclusion you reach about the mass of said cow doesn’t suit you and you say “but there are no spherical cows in vacuums!”
I think you can reasonably expect people to behave in real life as if they expect the laws of physics to approximate reasonably closely what Newtonian mechanics predicts about spherical point masses. What I was saying, however, is that you would be wrong to predict that I defect in prisoners’ dilemmas based on my two-boxing, because for me Newcomb’s problem isn’t connected to those problems, for reasons already stated. I hypothesize that I am not alone in that.
> What I was saying, however, is that you would be wrong to predict that I defect in prisoners’ dilemmas based on my two-boxing, because for me Newcomb’s problem isn’t connected to those problems, for reasons already stated. I hypothesize that I am not alone in that.
And I said you are confused regarding this belief and the stated reasons. I don’t doubt that others are confused as well—it’s a rather common response.