The error is this: the reasoning assumes that human desires are designed in a way that makes sense with respect to the way reality is. In other words, it assumes that we’re not inherently deluded or misled by our basic nature in some (subjectively) unacceptable way.
Interestingly, this is the exact opposite of Zen, in which it’s considered a premise that we are inherently deluded and misled by our basic nature… and in large part due to our need to label things. As in How An Algorithm Feels From Inside, Zen attempts to point out that our basic nature is delusion: we feel as though questions like “Does the tree make a sound?” and “What is the nature of objective morality?” actually have some sort of sensible meaning.
(Of course, I have to say that Eliezer’s writing on the subject did a lot more for allowing me to really grasp that idea than my Zen studies ever did. OTOH, Zen provides more opportunities to feel as though the world is an undifferentiated whole, its own self with no labels needed.)