I suggest evaluating a point of view by its best proponents, not its worst or even its average proponents.
Right. That’s an obvious failure mode that occurred here. Unfortunately, it isn’t always clear which proponents of a view are actually the best. Moreover, sometimes the best proponents get lost in the noise of the less intelligent/rational/informed proponents.
This makes me worry how often this occurs. To use a really extreme example: maybe the Young Earth Creationists have some really slam-dunk argument, but I’m not noticing it because it is so rarely used? The failure that occurred in this context doesn’t seem to be getting reality wrong on that large a scale, but it does create those sorts of worries.
The scale of the post-9/11 failure, not just by me but by lots of people, some quite smart, is frightening. I can look back and see specific things that went wrong, but how much of even that is hindsight bias? How many big decisions are we making even now that I support that in a decade will seem incredibly wrong and stupid?

I feel much the same way about the financial crisis.
How many big decisions are we making even now that I support
What does this mean, exactly?
Hopefully, the number of big decisions you support for which you estimate a probability of one that things will be better than the counterfactual without the decision is zero.
There’s more to it than the probability that things will be better: the worst thing that can happen is a lot worse than the best thing that can happen is good.
Those last sentences are both atrocious. If I think of a better way to say that, I will edit it.
You have more to lose than you stand to gain, maybe?
Edit: Well yes, I meant ‘you’ in the generic sense. “There is always more to lose than stands to be gained,” perhaps. (That’s a horribly depressing worldview, incidentally. Which isn’t to say it’s wrong, just that… it’s not the kind of thought you could use to cast the True Patronus Charm, if you know what I mean.)
A good recent article.

That’s an interesting article, and I thank you for linking it, but the issue was never the truth value of any particular factual claim. It’s just that when I offer “You have more to lose than you stand to gain” as a synthesis of your point, and your response amounts to “Too specific”, I have to think you’re actually saying something along the lines of “All altruism is counterproductive”, which is horribly depressing whether or not it’s true.
Again, I’m just remarking on how something appears to me. And maybe implicitly asking you to refute the point or explain how you deal with it.
I have to think you’re actually saying something along the lines of “All altruism is counterproductive”
It’s not all counterproductive; I’m not saying that, for two reasons.
First, I was only speaking about decision making and considering the odds of various future outcomes. Obviously, no matter what one’s intentions are or how poorly decisions are made, things may work out very well.
Second, I am claiming that it is usually the case that there is more to lose than to gain, that building things takes more work than destroying things. It can still be best to be altruistic.
Consider two six-sided dice, one with sides numbered 6-5-4-2-1-1 and one with sides numbered 3-2-2-2-2-2. If I offered you dollars equal to the result of the roll of a die, and the opportunity to roll either, you would probably choose to roll the first, even though its worst-case scenario is worse.
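A quick expected-value calculation makes the comparison concrete. This is a minimal sketch, assuming the payoff is simply the face value of a single fair roll:

```python
from statistics import mean

die_a = [6, 5, 4, 2, 1, 1]  # upside-heavy die: best face 6, worst face 1
die_b = [3, 2, 2, 2, 2, 2]  # flat die: worst face 2, best face 3

# For a fair die, the expected dollar payoff is just the mean of the faces.
print(f"die A: EV = {mean(die_a):.2f}, worst case = {min(die_a)}")  # EV ≈ 3.17, worst 1
print(f"die B: EV = {mean(die_b):.2f}, worst case = {min(die_b)}")  # EV ≈ 2.17, worst 2
```

The first die maximizes expected value even though it exposes you to the worse floor, which is exactly the trade-off at issue.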
That’s too personal; I’m trying to say something that applies at every scale and every level of selflessness.
Under Saddam, hundreds of Iraqis annually were tortured, raped, and/or murdered for intimidation, for the crimes of their relatives, for fun, as punishment for losing international sports games, etc. They were made into amputees, put into sausage machines alive, and on and on.
But the population was over 30,000,000, and even the best plausible replacement government wouldn’t have had the best justice system. So war put all those millions of lives at risk. Saddam wasn’t killing millions annually; even without the low-intensity war of the no-fly zones, he probably wouldn’t have killed more than hundreds of thousands, as he had done in the past.
If he had had many chemical weapons, it could have been really, really bad.