Why must value drift eventually produce unfriendly values? Do you just define “friendly” values as close values?
Basically, yes. If the values of two species/minds/groups/whatever are different enough, each sees the other as resources that could be reorganized into more valuable structures.
To borrow a UFAI example: an upload might not hate you, but your atoms could be reorganized into computronium running thousands of upload copies/children.
“Friendly” values simply means our values (or very close to them, closer than the value spread among us). Preservation of preference means that the agency of the far future will prefer (and do) the kinds of things that we would currently prefer to be done in the far future (on reflection, if we knew more, given the specific situation in the future, etc.). In other words, value drift is the absence of reflective consistency, and Friendliness is reflective consistency in following our preference. Value drift results in far-future agency having preferences very different from ours, and so not doing the things we would prefer to be done. This turns the far future into a moral wasteland from the point of view of our preference, little different from what would remain after unleashing a paperclip maximizer or exterminating all life and mind.
(Standard disclaimer: values/preference have little to do with apparent wants or likes.)