Anti-Parfit’s Hitchhiker
While thinking about Parfit’s Hitchhiker, I came up with an alternative example:
You’re lost in the desert, and this time Aul Peckman drives up and tells you, “I will give you a ride back to town iff you would have stiffed my nemesis Paul Eckman.” Having read Parfit’s Hitchhiker, you pre-committed to pay Paul Eckman if you ever found yourself in his scenario, or adopted a decision theory that would cause you to pay him. So you try telling Aul Peckman that you would stiff his nemesis in that situation, but he knows you’re lying and drives off. If only you weren’t so timelessly rational!
Obviously, one can argue that you’re more likely to encounter agents who want to be paid than agents who want you to stiff someone, and in a world where that holds, you still have positive expected value from running TDT/UDT. But is this an example of regretting TDT rationality?
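To make that argument concrete, here is a back-of-the-envelope version (the symbols p, V, and c are my own labels, not anything from the post or the literature): suppose a stranded agent meets Paul Eckman with probability p and Aul Peckman with probability 1 − p, that a rescue is worth V, that the promised fare is c < V, and that being left in the desert is worth 0. A committed payer is rescued only by Eckman; a would-be stiffer only by Peckman. Then

EV(payer) = p(V − c)
EV(stiffer) = (1 − p)V

and committing to pay wins whenever p(V − c) > (1 − p)V, i.e. whenever p > V/(2V − c), a threshold that approaches 1/2 as the fare c becomes negligible next to V. So as long as Eckman-types are even slightly more common than Peckman-types, the pre-commitment comes out ahead in expectation, even though it loses in this particular encounter.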