What if what they really want, deep down, is a sense of importance or social interaction or whatnot?
This sounds a bit like religious people saying "But what if it turns out that there is no morality? That would be bad!" What part of you thinks that this is bad? Because that is exactly what CEV is extrapolating. CEV takes the deepest and most important values we have and figures out what to do next. In principle, you couldn't care about anything else.
If human values included a desire to self-modify, then CEV would recognise this. CEV wants to do what we want most, and this is what we call 'right'.
The only non-arbitrary “we” is the community of all minds/consciousnesses.
This is what you value, what you chose. Don't lose sight of invisible frameworks. If we're including all decision procedures, then why not computers too? This is part of the human intuition of 'fairness' and 'equality', not the hamster's.
Yes. We want utilitarianism. You want CEV. It’s not clear where to go from there.
FWIW, hamsters probably have a sense of fairness too. At least rats do.