That might happen, but they wouldn’t be doing it because they’re maximizing their utility via acausal trade; they’d be doing it because they value reciprocity.
why wouldn’t it be because they’re maximizing their utility via acausal trade?
do you also think people who don’t-intrinsically-value-reciprocity are doomed to never get picked up by rational agents in parfit’s hitchhiker? or doomed to two-box in newcomb?
to take an example: i would expect that even if he didn’t value reciprocity at all, yudkowsky would reliably cooperate as the hitchhiker in parfit’s hitchhiker, or one-box in newcomb, or retroactively-give-utility-function-shares-to-people-who-helped-if-he-grabbed-the-lightcone. he seems like the-kind-of-person-who-tries-to-reliably-implement-LDT.
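(to make the "maximizing utility, not valuing reciprocity" point concrete, here's a minimal sketch of the newcomb expected-utility calculation as an LDT-ish agent would run it. the 99%-accurate predictor and the $1M/$1k payoffs are made-up illustration numbers, and this isn't anyone's actual formalism, just the usual toy version:)

```python
# minimal sketch: expected utility in newcomb's problem for an agent whose
# choice is treated as logically correlated with the prediction, the way
# LDT-style reasoning treats it. accuracy and payoffs are illustrative only.

PREDICTOR_ACCURACY = 0.99   # assumed: predictor matches the agent's actual choice this often
BIG_BOX = 1_000_000         # contents of the opaque box if the agent was predicted to one-box
SMALL_BOX = 1_000           # contents of the transparent box

def expected_utility(one_box: bool) -> float:
    """Expected payoff, conditioning the prediction on the choice itself
    (the decision-prediction correlation an LDT-ish agent takes into account)."""
    p_predicted_one_box = PREDICTOR_ACCURACY if one_box else 1 - PREDICTOR_ACCURACY
    big = p_predicted_one_box * BIG_BOX      # expected contents of the opaque box
    small = 0 if one_box else SMALL_BOX      # two-boxers also take the transparent box
    return big + small

if __name__ == "__main__":
    print("one-box :", expected_utility(True))    # ~990,000
    print("two-box :", expected_utility(False))   # ~11,000
    # note there's no "reciprocity" term anywhere in the utility function;
    # one-boxing wins purely on expected payoff, given the correlation.
```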