Moral Mountains
Suppose that someone asks you to help them move into their new apartment. Should you help them?
It depends, right?
Depends on what?
My answer: moral weights.
Suppose it’s your brother who asked you. And suppose you really love your brother. You guys are really close. You care about him just as much as you care about yourself. You assign a “moral weight” of 1-to-1. Suppose that helping him move will bring him 10 utilons. If so, you should help him move as long as it costs you less than 10 utilons.[1]
Now let’s suppose that it’s your cousin instead of your brother. You like your cousin, but not as much as your brother. You care about yourself four times as much as you care about your cousin. Let’s call this a “moral weight” of 4-to-1, or just “4”. In this situation, you should help him move as long as it costs you less than 2.5 utilons.
Suppose it’s a medium-good friend of yours. You care about yourself 10x as much as you care about this friend. Here, the breakeven point would be when it costs you 1 utilon to help your friend move.[2]
What if it’s a coworker who is cool but also isn’t your favorite person in the world? You assign a moral weight to them of 100. The breakeven point is 0.1 utilons.
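To make the arithmetic concrete, here’s a minimal sketch of that breakeven rule, assuming the simple model above where a moral weight of N means you care about yourself N times as much as the other person. The benefits, weights, and breakeven points are the ones from the last few paragraphs; the specific costs plugged in are made up just to show both sides of the threshold.

```python
def should_help(benefit_to_them: float, cost_to_you: float, moral_weight: float) -> bool:
    """Help whenever the benefit to them, discounted by the moral weight
    (the self-to-them ratio), exceeds the cost to you."""
    return benefit_to_them / moral_weight > cost_to_you

# Helping someone move brings them 10 utilons.
print(should_help(10, 9.0, 1))    # brother, 1-to-1: help as long as it costs you < 10
print(should_help(10, 2.0, 4))    # cousin, 4-to-1: breakeven at 2.5
print(should_help(10, 1.5, 10))   # medium-good friend, 10-to-1: breakeven at 1
print(should_help(10, 0.5, 100))  # coworker, 100-to-1: breakeven at 0.1
```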
Framing things this way makes me think back to the following diagram from 10 Types of Odd Friendships You’re Probably Part Of:
In that blog post, Tim uses the diagrams to represent how close you are to various people. I think it can also be used to represent how much moral weight you assign to various people.[3] For example, maybe Tier 1 includes people for whom you’d assign moral weights up to 5, Tier 2 up to 30, and Tier 3 up to 1,000.
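Here’s a tiny sketch of that mapping. The cutoffs of 5, 30, and 1,000 are just the made-up numbers from the previous sentence, not anything from Tim’s post.

```python
def tier(moral_weight: float) -> int:
    """Map a moral weight (self-to-them ratio) onto a tier of the mountain,
    using the illustrative cutoffs above."""
    if moral_weight <= 5:
        return 1
    if moral_weight <= 30:
        return 2
    if moral_weight <= 1000:
        return 3
    return 4  # everyone else is way out in stranger territory

print(tier(1), tier(4), tier(10), tier(100))  # -> 1 1 2 3
```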
Different people have different “moral mountains”. People like Near-and-Dear Nick care a lot about the people who are close to them, but below that things get a little sparse.
For others, like Individualistic Ian, there isn’t anyone they care about nearly as much as they care about themselves, but there are still a decent number of people in Tier 2 and Tier 3.
Then you’ve got your Selfish Shelbys, who pretty much just think about themselves.
If I were able to draw like Tim Urban I would include more examples, but unfortunately I don’t have that skill. Still, for fun, let’s briefly think about some other examples and ask some other questions.
What would Peter Singer’s mountain look like? Is everyone in Tier 1?
What about vegans? Where do they place animals on their mountains?
Longtermists? Where do they place that guy Tron Landale who is potentially living in the year 31,402?
What about nationalism in politics? Consider a nationalistic American. Would they bump someone down a tier for moving from Vermont to Montreal? Two tiers? And what if they moved to China? Or what if they were born in China?
I suppose it’s not only sentient beings that can be placed on the mountain. Some people value abstract things like art and knowledge. So then, how high up on the mountain do those things get placed?

I’ll end with this thought: I think you can probably use these ideas of moral weights and moral mountains to quantify how altruistic someone is. Maybe just add up all of the moral weights you assign (or rather their reciprocals, since under this convention a bigger number means you care less)? I dunno.
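Here’s one very rough way that could be cashed out, purely my own sketch: score each person by the reciprocal of the moral weight you assign them, so that caring more counts for more, and sum. The example mountains below are hypothetical numbers, not anything from the post.

```python
def altruism_score(moral_weights: list[float]) -> float:
    """Sum of reciprocal moral weights: someone you weigh 1-to-1 adds 1,
    someone you weigh 100-to-1 adds 0.01, and so on."""
    return sum(1 / w for w in moral_weights)

# Hypothetical mountains, purely for illustration.
near_and_dear_nick = [1, 1, 2, 2, 5] + [10_000] * 1_000    # cozy summit, strangers barely register
longtermist_lauren = [20] * 5 + [500] * 1_000_000          # empty summit, but a vast fourth tier
print(altruism_score(near_and_dear_nick))   # ~3.3
print(altruism_score(longtermist_lauren))   # ~2,000
```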
I prefer to think about it visually. Someone like Selfish Shelby isn’t very altruistic because her mountain is pretty empty. On the other hand, someone like Near-and-Dear Nick looks like his mountain is reasonably crowded.
It gets interesting though when you compare people with similarly crowded, but differently shaped mountains. For example, how does Near-and-Dear Nick’s mountain compare to Longtermist Lauren’s mountain? Nick’s mountain looks pretty cozy with all of those people joining him at the summit.
But he’s also got a ton of people on the white parts of the land who he basically couldn’t give a shit about. On the other hand, Longtermist Lauren’s mountain looks a little sparse at first glance. There are definitely no parties going on up at the summit.
But upon closer inspection, that red-tinted fourth tier spans out seemingly forever. And the white section labeled “Strangers” is miles and miles away.
[1] This is an overly simplified toy example. Let’s assume for the sake of this discussion that there are no other consequences at play here. It’s a clean, one-time, simple trade-off.

[2] In practice, helping your friend move would often be a win-win situation. Instead of costing you utilons, it’s likely that you generate utilons for yourself. I.e. it’s likely that it’s a pleasant experience. Warm fuzzies and whatnot.

[3] “Closeness” and “moral weight” are probably pretty related. But still, they are two distinct concepts.