“Efficiency” at achieving something other than what you should work towards is harmful. … Otherwise, you let that Blind Idiot Azathoth pick your purposes for you, trusting it more than you trust yourself.
The purpose of solving friendly AI is to protect the purposes picked for us by the blind idiot god.
Our psychological adaptations are not our purposes; we don’t want to protect them, even though they contribute to determining what it is we want to protect. See Evolutionary Psychology.