This still bothers me; I feel like you should have just increased H without decreasing h.
Why would you say that when you have no idea what his H or his h were in the first place?
It’s intuitively difficult for us to accept, or at least to say, that having too much concern for a person is as possible as having too little.
Well, I don’t have “no idea”—I have a probability distribution informed by experience.
Having too much concern for an individual is theoretically possible I suppose, but it’s not a problem anyone is terribly likely to suffer from. The reason most people don’t care about most other people is not the fact that the human population is large; it’s the fact that most of that large population isn’t psychologically close enough for them to care.
It’s possible that utilitarian calculations could argue for downgrading one’s level of concern for e.g. Amanda Knox—but I’m far more inclined to suspect rationalization of pre-existing natural indifference on the part of someone who makes a claim like that.
Actually, h has increased on average; it’s just that h has decreased for the immediately available examples. That is, I care much less about Amanda Fox or any single salient example, but more about general, systematic effects that might cause great harm to people I don’t hear about.
I assume you mean Amanda Knox.
Also, do you really care less about (i.e. assign less utility to the welfare of) someone like Amanda than previously, or is it just that you try to avoid strong emotional reactions to such individual cases?
Let’s look at it this way: if I had cash to hand and were given the option to pay X to solve this particular salient injustice, I’d be less inclined to do it than before.
On the other hand, if I were given the option to pay X to solve this particular class of injustices, I’d be more inclined to do it than before.
Emotional involvement follows a similar trend.