Unconvinced. Bottom line seems to be an equation of Personal Care with Moral Worth.
But I don’t see how the text really supports that: the fact that we feel more attached to entities we interact with doesn’t inherently elevate their sentience, i.e. their objective moral worth.
Example: our weaker emotional attachment to, or physical distance from, chickens in factory farms does not diminish their sentience or moral worth, I’d think. The same goes for (future) AIs.
At best I could see this equation more or less working out in a perfectly illusionist reality, where there is no objective moral relevance. But then I’d rather not invoke the concept of moral relevance at all; instead we’d be left with mere subjective care as the only thing there might be.