I agree with what you’re saying, but just to complicate things a bit: what if humans have two terminal values that directly conflict? Would it be justifiable to modify one to satisfy the other, or would we just have to learn to live with the contradiction? (I honestly don’t know what I think.)
Ah… If you or I knew what to think, we’d be working on CEV right now, and we’d all be much less fucked than we currently are.