Firstly, I apologize if this has already been addressed, but I didn’t put in the time to read all the comments.
I still feel like Eliezer is passing the buck here. The computation to produce rightness is given by:
“Did everyone survive? How many people are happy? Are people in control of their own lives? …”
Ignore for the moment the issue of coherence. Is this supposed to be a list of all of my terminal values? Does that mean that, since I follow my own planning algorithm, the morally correct action in any given situation will always be exactly what I would do given infinite time to deliberate? This doesn’t seem to add back up to normality for me. I feel like my actual planning algorithm assigns significantly more weight to things that affect me personally than my algorithm for finding the moral course of action does. Do you expect this to go away after sufficient deliberation?
If the above list of moral values is not supposed to be a complete list of my terminal values, can you describe for me exactly which values are supposed to be on this list? I understand that godshatter may well make writing down a complete list impractical, but can you at least distinguish between the values on this list and the ones not on it?