Two boxes, sitting there on the ground, unguarded, no traps, nobody else has a legal claim to the contents? Seriously? You can have the empty one if you’d like, I’ll take the one with the money. If you ask nicely I might even give you half.
I don’t understand what you’re gaining from this “rationality” that won’t let you accept a free lunch when an insane godlike being drops it in your lap.
A million dollars.
No, you’re not. You’re getting an empty box, and hoping that by doing so you’ll convince Omega to put a million dollars in the next box, or in a box presented to you in some alternate universe.
And by this exact reasoning, which Omega has successfully predicted, you will one-box, and thus Omega has successfully predicted that you will one-box and made the correct decision to leave the box empty.
Remember to trace your causal arrows both ways if you want a winning CDT.
Remember also Omega is a superintelligence. The recursive prediction is exactly why it’s rational to “irrationally” one-box.
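(The expected-value arithmetic behind "rationally one-boxing against an accurate predictor" can be sketched in a few lines. The function name, the $1,000,000/$1,000 payoffs, and the 99% prediction accuracy below are illustrative assumptions for a minimal Monte Carlo sketch, not anything specified in this thread.)

```python
import random

def newcomb_payoff(strategy, accuracy=0.99, trials=100_000, seed=0):
    """Average payoff for a fixed strategy against a predictor that
    guesses the agent's strategy correctly with probability `accuracy`.

    strategy: "one-box" (take only the opaque box) or
              "two-box" (take both boxes).
    """
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        # The predictor is right with probability `accuracy`, so it
        # predicts one-boxing iff (correct guess) == (agent one-boxes).
        predicted_one_box = (rng.random() < accuracy) == (strategy == "one-box")
        opaque = 1_000_000 if predicted_one_box else 0
        transparent = 1_000
        total += opaque if strategy == "one-box" else opaque + transparent
    return total / trials
```

Under these assumed numbers, one-boxing averages roughly accuracy × $1,000,000 while two-boxing averages roughly (1 − accuracy) × $1,000,000 + $1,000, which is the sense in which the "irrational" choice wins once the predictor is good enough.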
Yes, that’s why I took the one box with more money in it.
Strictly speaking the scenario being discussed is one in which Omega left a transparent box of money and another transparent box which was empty in front of Wedrifid, then I came by, confirmed Wedrifid’s disinterest in the money, and left the scene marginally richer. I personally have never been offered money by Omega, don’t expect to be any time soon, and am comfortable with the possibility of not being able to outwit something that’s defined as being vastly smarter than me.
Remember also Omega is an insane superintelligence, with unlimited resources but no clear agenda beyond boredom. If appeasing such an entity were my best prospect for survival, I would develop whatever specialized cognitive structures were necessary; it’s not, so I don’t, and consider myself lucky.
Ah, then in that case, you win. With that scenario there’s really nothing you could do better than what you propose. I was under the impression you were discussing a standard transparent Newcomb.