Excessive selfishness, sure. Some degree of selfishness is currently required as self-defense; otherwise all your own needs are subsumed by supplying others’ wants. Even a completely symmetric society in which everybody acts more for others’ good than their own is worse than one where everybody takes care of their own needs first, because each individual generally knows their own needs and wants better than anyone else does.
I don’t know the needs and wants of the future. I can’t know them particularly well, and my uncertainty only gets worse the farther out in time we go. Unless we’re talking about species-extinction-level events, I damn well should punt to those better informed, those closer to the problems.
It also is intuitive that we would like to care more about future people.
Not to me. Heck, I’m not entirely sure what it means to care about a person who doesn’t exist yet, when my choices will influence which of many possible versions of them will exist.
each individual generally knows their own needs and wants better than anyone else does.
I don’t know the needs and wants of the future.
Expected-utility calculation already takes that into account. Uncertainty about whether an action will be beneficial translates into a lower expected utility. Discounting on top of that is double counting.
Knowledge is a fact about probabilities, not utilities.
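To make the double-counting point concrete, here is a minimal numerical sketch (my own toy numbers, not anything from the thread): uncertainty about whether a far-future intervention pays off already enters the expected-utility calculation as a probability, so layering an exponential time discount on top penalizes the same ignorance twice.

```python
# Toy sketch of the double-counting point (illustrative numbers only).

def expected_utility(payoff, p_success):
    # Uncertainty about whether the action helps is already priced in here:
    # a lower probability of success means a lower expected utility.
    return p_success * payoff

def discounted_expected_utility(payoff, p_success, discount_factor, years):
    # An extra time discount multiplied on top of the probability term
    # penalizes the same uncertainty a second time.
    return (discount_factor ** years) * expected_utility(payoff, p_success)

payoff = 100.0     # utility if a far-future intervention works (made up)
p_success = 0.2    # credence that it works (made up)

print(expected_utility(payoff, p_success))                       # 20.0
print(discounted_expected_utility(payoff, p_success, 0.97, 50))  # ~4.4
```

If the extra discount factor is meant to capture something other than that uncertainty, it needs its own justification; as written, it just multiplies a penalty the probability term has already imposed.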
Not to me.
Let’s hope our different intuitions are resolvable.
I’m not entirely sure what it means to care about a person who doesn’t exist yet, when my choices will influence which of many possible versions of them will exist.
Surely it’s not much more difficult than caring about a person whom your choices will dramatically change?