I never liked the comparison of the Prisoner’s Dilemma with Newcomb, and the Ultimatum Game seems even less like Newcomb.
If you’re up against an agent from a species that you know has evolved traits like fairness and spite, then the rational course of action is certainly not to offer a penny. That should be true on any sane theory of rational action.
(For the record, I one-box on Newcomb, defect on the true Prisoner’s Dilemma (unless the opponent is somehow using the very same decision-making process as me), and offer a fair deal against a human in the Ultimatum Game.)
But is the rational course of action to accept a penny?
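To put a rough number on the "don't offer a penny" point, here is a minimal sketch. The linear rejection curve and the function names are my own assumptions for illustration, not anything established about human responders: if the chance of rejection rises as the offer shrinks, the proposer's expected payoff peaks near an even split rather than at a penny.

```python
# Toy model of the Ultimatum Game against a spiteful/fairness-sensitive
# responder. The linear rejection curve below is an assumption made purely
# for illustration; real human rejection behavior is messier.

def rejection_probability(offer_fraction: float) -> float:
    """Assumed spite model: a 50% offer is never rejected, and the
    rejection probability rises linearly as the offer shrinks toward 0."""
    return max(0.0, 1.0 - 2.0 * offer_fraction)


def expected_proposer_payoff(offer_fraction: float, pot: float = 100.0) -> float:
    """The proposer keeps (pot - offer) if the responder accepts, else 0."""
    p_accept = 1.0 - rejection_probability(offer_fraction)
    return p_accept * (pot - offer_fraction * pot)


if __name__ == "__main__":
    # Sweep offers from 1% to 50% of the pot and find the best one.
    offers = [i / 100 for i in range(1, 51)]
    best = max(offers, key=expected_proposer_payoff)
    print(f"best offer under this model: {best:.0%} of the pot")
    print(f"expected payoff at a 1% offer: {expected_proposer_payoff(0.01):.2f}")
    print(f"expected payoff at the best offer: {expected_proposer_payoff(best):.2f}")
```

Under this toy model a 1% offer earns the proposer about 2 out of 100 in expectation, versus 50 at the even split; the linear curve is just the simplest way to encode "stingier offers get rejected more often."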