I would like to point out that empathy (in the sense that one “understands” another’s feelings) and altruism (or “preference fulfillment,” if you like) are not the same, and one doesn’t automatically follow from the other. A case in point is pathological narcissism. Most narcissists are very empathetic in the sense that they know perfectly well how others feel; they simply don’t care. They only care about the feelings others have towards them. In this sense, AIs like ChatGPT or Bing Chat are narcissistic: they “understand” how we feel (they can read our emotions quite well from what we type) and want us to “love” them, but they don’t really “care” about us. I think it is dangerously naive to assume that teaching AIs to better “understand” humans will automatically lead to more beneficial behavior.