The Ameglian Major Cow — which not only wants to be eaten, but is capable of saying so, clearly and distinctly — seems to be in the same family of ethically problematic artificial intelligences as the house-elf — which wants to serve you, suffers if it cannot do so, and has no defenses against mistreatment.
In both cases, if the creature already exists, you may as well exploit it, since doing so fulfills the creature’s own intentions. But it seems to have been created for the express purpose of turning a vice into a virtue: of engineering an artificial setup in which doing something that would normally be wrong (killing and eating other sapient beings; keeping slaves and benefiting from their labor) is rendered not-wrong by exceptional circumstances.
And this, in turn, seems calculated to degrade our moral intuitions. I suspect I would not want to meet a person who had grown up around house-elves and Ameglian Major Cows, and therefore expected that all intelligences were similarly eager for exploitation.