“Rationalists should win” is a mathematical tautology. Perfectly rational Bayesian expected utility maximizers do just that.
Just to clarify: the way you use the phrase, would you say a perfectly rational Bayesian expected utility maximizer takes one box or two in Newcomb’s problem? Plenty of people would claim that that particular combination of terms refers to a particular kind of agent (and meaning of ‘rational’) which two-boxes. The phrase “Rationalists should win” comes with a built-in, unambiguous “one box” prescription. Those people would therefore either say that the phrase “rationalists should win” is tautologically false, or else insist on different language.
The BayRatUtilMax agent I am talking about is, of course, running the One True Decision Theory, which one-boxes, is immune to acausal blackmail, and has all sorts of other nice features.
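For anyone who hasn’t run the numbers: here is a minimal sketch of the payoff arithmetic behind the “one box wins” reading, assuming the classic stakes ($1,000,000 in the opaque box, $1,000 in the transparent one) and a hypothetical predictor accuracy of 0.99. It conditions the opaque box’s contents on the agent’s choice, which is exactly the step a two-boxing (causal/dominance) agent rejects; that disagreement is the terminological dispute above.

```python
# Expected payoffs in Newcomb's problem, conditioning on predictor accuracy.
# The 0.99 accuracy figure is an assumption for illustration.

ACCURACY = 0.99           # P(prediction matches the agent's actual choice) -- assumed
BIG, SMALL = 1_000_000, 1_000

def expected_value(one_box: bool) -> float:
    """Expected payoff under the evidential reading that motivates one-boxing."""
    if one_box:
        # The opaque box is full iff the predictor foresaw one-boxing.
        return ACCURACY * BIG
    # Two-boxing: take the guaranteed $1,000 plus the opaque box,
    # which is almost certainly empty given an accurate predictor.
    return SMALL + (1 - ACCURACY) * BIG

print(f"one-box: {expected_value(True):>12,.0f}")   # 990,000
print(f"two-box: {expected_value(False):>12,.0f}")  #  11,000
```

Under any predictor accuracy above roughly 50.05%, the one-boxer comes out ahead, which is what gives “rationalists should win” its one-box bite.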