The capacity to abide by morality carves out the right cluster in thingspace for me, though I’d hesitate to call it the determining factor. If a thing has this capacity, we care about its preferences in proportion to it.
People, with the probable exception of infants, are in principle fully capable of understanding and abiding by morality. Most animals are not. Those more capable of doing so (domestic pets and beasts of burden, for example) receive some protection. Those who lack this capacity are generally less protected, and what is done to them is less morally relevant.
I don’t fully endorse this view, but it feels like it explains a lot.
This is a vicious circle. Morality itself is the handmaiden of humans (and of similar creatures in fantasy and SF). Morality has value only insofar as we find it important to care about human and quasi-human interests. This does not answer the question “Why do we care about human and quasi-human interests?”
One could try to find an answer in the prisoner’s dilemma, in the logic of Kant’s categorical imperative, in the cooperation of rational agents, and the like. But then I should sympathize with any system that cares about my interests, even if that system is otherwise like the Paperclipmaker and completely devoid of “unproductive” self-reflection. Great. There is some cynical common sense in this, but I feel a little disappointed.
Which cluster is that: Agents currently acknowledging a morality similar to yours (with “capacity” referring to their choice of whether or not to act according to those nominal beliefs at any given time)? Agents who would be moved by your moral arguments (even if those arguments haven’t yet been presented to them)? Anything Turing-complete (even if not currently running an algorithm that has anything to do with morality)?
“Agents capable of being moral” corresponds very closely with my intuitive set “agents whose desires we should have some degree of respect for.” Thus, it captures my personal sense of what morality is quite well, though it doesn’t really capture why that’s my sense of it.