No. I think that would cause value drift, and I’d rather my values not drift in that fashion, because it would cause me to be less likely to steer reality towards world-states which maximize my current values.
Would values that aren’t stable without constant emotional feedback be worth preserving? You might evolve to better ones. E.g., psychopaths have been shown to make more utilitarian judgements and not be swayed by emotive descriptions.
I’m interested in this. Do you have a link or citation?