Our values do not change as a result of reflection
Values, like biology and culture, evolve. That doesn’t mean getting “better” over time. It means becoming more adaptive.
Take any moral advance you like. Study its history, and you’ll find people adopted it when it became economically advantageous for those in power to do so.
The arguments here seem completely orthogonal to the heading they are supposed to support. In fact, that seems to be representative of the whole post.
Yes, values are unstable over time. Go far enough back and you will not even find creatures with anything that could be described as ‘values’. The same could be expected if humanity somehow managed to evolve towards the Malthusian equilibrium that current evolutionary payoffs would reward. Our current values are completely unstable.
CEV has nothing to do with predicting what humans would value in the future. It is about capturing the values we have right now and adjusting them only in the ways we would want them adjusted if we had the time, power, and resources to think it through. That doesn’t mean simulating the future; it means taking a closer look at inconsistencies and competing desires and resolving them in whichever fashion seems best. It means ironing out problems with respect to want/approve/like/would-want. It means extrapolating current preferences into a best-effort evaluation of how to value scenarios that are too complex or unfamiliar for us to evaluate usefully if faced with them now.