If you want a reason to prefer dust specks for others over torture for yourself, consistently egocentric values can supply it. They will also lead you to prefer torture for others over torture for yourself. What about preferring torture for others over a dust speck for yourself? It’s psychologically possible, but the true threshold (the level of personal cost beyond which one would choose torture for others) seems to lie somewhere between mere inconvenience for oneself and torture for oneself.
It seems that LW has never had a serious discussion of the likely fact that the actual human value system is basically egocentric, with altruism sharply bounded by the personal cost incurred; nor of what this implies for CEV and FAI.
ETA: OK, I see I didn’t say how a person would choose between dust specks for 3^^^3 others versus torture for one other. Will recently mentioned that you should take the preferences of the 3^^^3 into account: would they want someone to be tortured for fifty years, so that none of them got a dust speck in the eye? “Renormalizing” in this way is probably the best way to get a sensible and consistent decision procedure here, if one employs the model of humans as “basically egocentric but with a personal threshold of cost below which altruism is allowed”.
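Here is a minimal Python sketch of the “basically egocentric, but with a personal threshold of cost below which altruism is allowed” model, together with the renormalizing step. The EgocentricAgent class, the numeric costs, and the threshold value are all illustrative assumptions of mine, not anything fixed by the thought experiment; the point is only to show how such a preference rule orders the choices discussed above.

```python
# Toy sketch of "basically egocentric, with a personal cost threshold below
# which altruism is allowed".  All names and numbers are illustrative
# assumptions, not a real proposal.

DUST_SPECK = 1        # momentary inconvenience
TORTURE = 10**9       # stand-in for fifty years of torture (arbitrarily large)

class EgocentricAgent:
    def __init__(self, altruism_threshold=1000):
        # Personal costs below this threshold can be traded away altruistically;
        # above it, the agent simply minimizes its own cost.
        self.altruism_threshold = altruism_threshold

    def choose(self, option_a, option_b):
        """Each option is a pair (cost_to_self, total_cost_to_others);
        returns whichever option the agent prefers."""
        self_a, others_a = option_a
        self_b, others_b = option_b
        if max(self_a, self_b) > self.altruism_threshold:
            # At least one option is too costly to the self: egocentrism dominates.
            return option_a if self_a <= self_b else option_b
        # Both options are below the threshold: altruism is "allowed",
        # so minimize the harm to others.
        return option_a if others_a <= others_b else option_b

agent = EgocentricAgent()

# Dust speck for me vs. torture for someone else: the speck is below the
# threshold, so the agent altruistically takes the speck.
print(agent.choose((DUST_SPECK, 0), (0, TORTURE)))   # -> (1, 0)

# Torture for me vs. torture for someone else: both exceed the threshold,
# so egocentrism wins and the other person is tortured.
print(agent.choose((TORTURE, 0), (0, TORTURE)))      # -> (0, 1000000000)

# The "renormalizing" move from the ETA: when the chooser bears no cost,
# defer to what a typical affected party would choose for itself.  Each of
# the 3^^^3 faces (a speck for me) vs. (no speck, one person tortured); for
# an agent of this type the speck is below threshold, so each accepts it,
# and the aggregated verdict is specks rather than torture.
def renormalized_choice(agent):
    per_victim = agent.choose((DUST_SPECK, 0), (0, TORTURE))
    return "specks" if per_victim == (DUST_SPECK, 0) else "torture"

print(renormalized_choice(agent))                    # -> specks
```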