Valid point, thanks. Although I'm not very fond of this kind of utility calculation, your point is well made.
In my case, I probably wouldn't give my life for less than the lives of a billion strangers, so that ratio would have to be extremely high, to the point where it's probably incalculable.
I mean, to be clear, making this call doesn't require you to be incredibly altruistic; it just requires you to care at all about trading with the people around you, and to act at all with something like the principle of generalizability in mind (or TDT, or UDT, or whatever other flavor of game theory helps you describe principles that enable positive-sum trades and avoid negative-sum equilibria).
Okay, in that case, your position is actually consistent and your question valid. I’m pretty sure that’s a minority position on LW, though.
Why?