If you don’t feel like you care about billions of people, and you recognize that the part of your brain that cares about small numbers of people has scope sensitivity, what observation causes you to believe that you do care about everyone equally?
I can think of two categories of responses.
One is something like “I care by induction”. Over the course of your life, you have presumably had multiple experiences of meeting new people and ending up caring about them. You can reasonably predict that, if you meet more people, you will end up caring about them too. From there, it’s not much of a leap to “I should just start caring about people before I meet them”. After all, rational agents should not be able to predict changes in their own beliefs; you might as well update now.
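To make the “update now” step concrete: it is leaning on the standard conservation-of-expected-evidence identity. In ordinary Bayesian notation (hypothesis H, anticipated observation E; the notation is mine, not the thread’s), the posterior you expect to end up with already equals your prior:

\[
\mathbb{E}\big[P(H \mid E)\big] \;=\; \sum_{e} P(E = e)\,P(H \mid E = e) \;=\; P(H).
\]

So if you can already predict that meeting people will reliably shift you toward caring about them, that predictable shift should already be reflected in how much you care now.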
The other is something like “The caring is much better calibrated than the not-caring”. Let me use an analogy to physics. My everyday intuition says that clocks tick at the same rate for everybody, no matter how fast they move; my knowledge of relativity says clocks slow down significantly near c. The problem is that my intuition on the matter is baseless; I’ve never traveled at relativistic speeds. When my baseless intuition collides with rigorously verified physics, I have to throw out my intuition.
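For concreteness, the physics being invoked is ordinary time dilation from special relativity (standard textbook material; the symbols are mine, not the thread’s): a clock moving at speed v relative to you, ticking off proper time \Delta t_0, is measured by you to take

\[
\Delta t = \gamma\,\Delta t_0, \qquad \gamma = \frac{1}{\sqrt{1 - v^2/c^2}}.
\]

At everyday speeds \gamma is indistinguishable from 1, which is exactly why the “clocks tick at the same rate for everybody” intuition never collects any contradicting evidence; as v approaches c, \gamma grows without bound.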
I’ve also never had direct interaction with or made meaningful decisions about billions of people at a time, but I have lots of experience with individual people. “I don’t care much about billions of people” is an almost totally unfounded wild guess, but “I care lots about individual people” has lots of solid evidence, so when they collide, the latter wins.
(Neither of these is ironclad, at least not as I’ve presented them, but hopefully I’ve managed to gesture in a useful direction.)
Your second category of response seems to say “my intuitions about a group of people, considered billions at a time, aren’t reliable, but my intuitions about the same group of people, considered one at a time, are”. You then conclude that you do care, because taking the billions of people one at a time implies that you care about each of them.
But it seems that I could apply the same argument a little differently: instead of applying it to how many people you consider at a time, apply it to the total size of the group. “My intuitions about how much I care about a group of billions are bad, even though my intuitions about how much I care about a small group are good.” The second argument, applied this way, would then imply that it is wrong to use your intuitions about small groups to generalize to large groups; that is, the second argument refutes the first. Going from “I care about the people in my life” to “I would care about everyone if I met them” is as inappropriate as going from “I know what happens to clocks at slow speeds” to “I know what happens to clocks at near-light speeds”.
I’ll go a more direct route:
The next time you are in a queue with strangers, imagine the two people behind you (people you haven’t met before, don’t expect to meet again, and didn’t really interact with, but who are /concrete/). Put them on one track in the trolley problem, and one of the people that you know and care about on the other track.
If you would prefer to save the two strangers rather than the one tribesman, you are different enough from me that we will have trouble talking about the subject, and you will probably find me to be a morally horrible person in hypothetical situations.
To address your first category: when I meet new people and interact with them, I do more than gain information; I do things that move them out of the group of “people I’ve never met”, whom I don’t care about, and into the group of people that I do care about.
Addressing your second: I have found that a very effective way to probe my intuition is to imagine a group of X people that I have never met (or specific strangers) on one minecart track, and a specific person that I know on the other. I care so little about small groups of strangers, compared to people that I know, that my intuition about billions looks like a roughly proportional scaling-up of that; the dominant factor in my caring about strangers is that some of the people who are strangers to me are important to people who are important to me, and are therefore indirectly important to me.