My own thoughts and reactions are somewhat illegible to me, so I’m not certain this is my true objection. But I think our disagreement is what I mentioned above: utility functions and cost-benefit calculations are tools for decisions and predictions, whereas “altruism” and moral judgements are orthogonal and not really measurable with the same tools.
I do consider myself somewhat altruistic, in that I’ll sacrifice a bit of my own comfort to (I hope and imagine) help near and distant strangers. And I want to encourage others to be that way as well. I don’t think framing it as “because my utility function includes terms for strangers” is any more helpful or true than “because virtuous people help strangers”. And in the back of my mind I suspect there’s a fair bit of self-deception here: I mostly prefer that framing because the belief-agreement (or at least apparent agreement) makes my life easier and maintains my status in my main communities.
I do agree with your (and Tim Urban’s) observation that “emotional distance” is a real thing, and that its importance varies from person to person. I’ve often modeled it (for myself) as an inverse-square relationship: my emotional investment in someone falls off with informational distance (how often I interact with them), though that’s not quite right. I don’t agree with using this observation to measure altruism or moral judgement.
My own thoughts and reactions are somewhat illegible to me, so I’m not certain this is my true objection.
That makes sense. I feel like that happens to me sometimes as well.
But I think our disagreement is what I mentioned above: utility functions and cost-benefit calculations are tools for decisions and predictions, whereas “altruism” and moral judgements are orthogonal and not really measurable with the same tools.
I see. That sounds correct. (And also probably isn’t worth diving into here.)
I do agree with your (and Tim Urban’s) observation that “emotional distance” is a real thing, and that its importance varies from person to person. I’ve often modeled it (for myself) as an inverse-square relationship: my emotional investment in someone falls off with informational distance (how often I interact with them), though that’s not quite right. I don’t agree with using this observation to measure altruism or moral judgement.
Gotcha. After posting and discussing in the comments a bit, this is something that I wish I had hit on in the post. That even if “altruism” isn’t quite the right concept, there’s probably some related concept (like “emotional distance”) that maps to what I discussed in the post.
Gotcha. If so, I’m not seeing it. Do you have any thoughts on where specifically we disagree?