Looking forward to your next post, but in the meantime:
AI—Seems like it would be easier to build an AI that helps me get what I want, if “what I want” had various nice properties and I wasn’t in “crossing that bridge when we come to it” mode all the time.
meta-ethical uncertainty—I can’t be sure there is no territory.
ethics/philosophy as a status game—I can’t get status from this game if I opt out of it.
morality as coordination—I’m motivated to make my morality have various nice properties because it helps other people coordinate with me (by letting them better predict what I would do in various situations/counterfactuals).