My problem with CEV is the arbitrariness of what it means to “know more”. My brain cannot hold all the knowledge about the universe, so the AI has to somehow choose what information to impart and in what order, and that choice would significantly influence the outcome. For example, hearing 100 heartwarming stories might make me care more about others, while hearing 100 stories about people being bastards to each other might make me care less; hearing all the evidence supporting some political theory would sway me toward it, et cetera.