Given two people A and B, and given a choice between A breaking ten bones and B breaking nine bones, you prefer A breaking ten bones if you’re A (because you have empathy). Yes?
Similarly, presumably if you’re B, you prefer that B break nine bones, again because you have empathy. Yes?
And in both of those cases you consider that the morally correct choice — an AI that arrives at a different answer is “unfriendly”/immoral. Yes?
So, OK. Some additional questions.
If you are neither A nor B and are given that choice, which do you prefer? Which do you consider the morally correct choice?
If you are A, and you have a choice between A breaking ten bones, B breaking nine bones, and letting B make the choice instead, which do you prefer? Which do you consider the morally correct choice? Does it affect your choice if you know that B shares your preferences and has empathy, and will therefore predictably choose ten broken bones for themselves?