Wait… if you base your morality on what other agents judge to be moral, and some of those agents are likewise basing their morality on what other agents judge to be moral… aren't you kind of SOL? Seems a little akin to Eliezer's calculator that calculates what it calculates.