I’m reminded of the post a while back on whether an Atheist/Rationalist society would be effective in war.
I have trouble understanding why they wouldn’t be (though that seems to be the opinion of most of the others here). In an objective moral sense, if Truth doesn’t matter more than Winning, then what does? Implicitly, most here behave in accordance with that statement—I’d suggest that the amount of time devoted to this site exceeds what mere winning in contemporary society would require—but most seem to balk at the concept that Truth might require the sacrifice of life.
Maybe it’s scope insensitivity. Risking 1 utilon for 10 utilons (at fifty/fifty odds) is a gamble everyone here would take—but when the risk is 1000 utilons for 10 000 utilons, even though it’s the same gamble, it’s harder to see it as such (this being the major hesitation that Yudkowsky’s dust-speck vs. torture analogy brought out).
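To make the arithmetic explicit, here is a minimal sketch (reading “risking 1 for 10” as: lose the stake on a loss, net the prize on a win, which is my interpretation rather than anything stated above). The expected value scales linearly, so the two gambles really are the same gamble:

```python
# Expected value of a fifty/fifty "risk the stake, win the prize" gamble.
# Interpretation assumed: a loss costs the stake, a win nets the prize.
def expected_value(stake, prize, p_win=0.5):
    return p_win * prize - (1 - p_win) * stake

print(expected_value(1, 10))         # +4.5 utilons
print(expected_value(1000, 10_000))  # +4500.0 utilons: the same gamble, scaled
```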
If we are, in fact, advocating Truth over mere Winning, there are going to be casualties along the way; in concrete terms, if my goal is an equal and just society, then I will be called upon to intervene in any gay-bashings I witness, at the risk of my own life.
So yes, the Atheist/Rationalist society—assuming they have that meta level of moral awareness—will go to war and be more viciously stalwart than any religious group could possibly hope to be. And if Wednesday must choose between Truth and Winning—as long as she isn’t a lecherous societal leech, concerned only with besting her opponents, rules be damned—she’ll choose the former, regardless of the expense to herself.
Risking 1 utilon for 10 utilons (at fifty/fifty odds) is a gamble everyone here would take—but when the risk is 1000 utilons for 10 000 utilons, even though it’s the same gamble, it’s harder to see it as such
I think the standard reply here is that utilons (or utils, or whatever your favored terminology is) are a standardized measure of whatever-it-is-you-care-about. You might not want to risk 1000 (say) dollars at even odds of winning 10 000 dollars—that all depends on your personal marginal utility of money. But if you don’t think you’d want to risk 1000 utilons for 10 000 utilons at even odds, that just means you’re defining utilons incorrectly. By definition, if I understand correctly.
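To illustrate the marginal-utility point, here is a minimal sketch assuming logarithmic utility of wealth (my illustrative choice; the comment above doesn’t commit to any particular curve, and the wealth figures are invented). The same dollar gamble is rejected or accepted depending on how much money you already have, while the utilon version is favorable by construction:

```python
import math

# Sketch only: log utility is one standard model of diminishing
# marginal utility of money.
def takes_dollar_gamble(wealth, stake=1000, prize=10_000, p_win=0.5):
    eu_gamble = (p_win * math.log(wealth + prize)
                 + (1 - p_win) * math.log(wealth - stake))
    return eu_gamble > math.log(wealth)  # better than declining the bet?

print(takes_dollar_gamble(1100))  # False: a loss would nearly ruin you
print(takes_dollar_gamble(5000))  # True: comfortable enough to accept

# Utilons are the units of utility itself, so the utilon gamble is just
# 0.5 * 10_000 - 0.5 * 1000 = +4500, whatever your bank balance.
```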
IAWYC, but I don’t think Aurini was necessarily making that mistake.
I read their comment as stating that, even when the “shut up and multiply” answer would or should be the same, people are wired to behave differently towards gambles when the stakes are higher. Not that they should, but that they do.
For example, my conscious dollars-to-utility function is nearly linear in small increments from my present position; if I had a 1-in-5 chance of turning $10 into $100, I’d go for it. However, while my conscious (lives saved)-to-utility function is also practically linear for small populations, if I had a chance to gamble 10 lives against 100 at 1-in-5 odds, it would be psychologically much harder to make the clearly correct choice. Or any choice at all; decision paralysis is a probable actual outcome.
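For concreteness, the arithmetic behind both gambles (assuming the sure 10 is given up in order to take the gamble, which is my reading of the setup):

```python
# Both gambles trade a sure 10 (dollars or lives) for a 1-in-5 shot at 100.
p_win, prize, sure_thing = 0.2, 100, 10
print(p_win * prize)               # 20.0: expected value of the gamble
print(p_win * prize > sure_thing)  # True: the gamble wins on paper either way
```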
There are sensible evolutionary reasons for this to be the case, but it raises the question of what people in positions of power should do about it.
On a deeper level, I’m suggesting that we over-estimate the utilon value of our own lives. Personally, I think your average North American considers their own life far more valuable than it actually is.
Honestly, I can’t point to factual evidence when it comes to ‘the value of human life’—but back in university, I honestly thought that Latin was a more accurate representation of human-life value than Christian English was—and at the present day, knowledge of transhumanism seems to justify it. We are expendable: truth and justice matter.
I honestly thought that Latin was a more accurate representation of human-life value than Christian English was—and at the present day, knowledge of transhumanism seems to justify it.
I can’t parse this. What does Latin or English have to do with the value of life? The ways the concept is expressed in the two languages? What does transhumanism have to say about Latin?
I would argue that people actually take the larger gamble when they enter romantic relationships, certainly when they get married, and probably with some other decisions like that.