If those four people who downvoted this would enlighten me as to why this is a bad quote, that would be much appreciated.
I have a general policy of downvoting anything in rot13. No, I’m not going to work to read your comment!
Instead, put your spoiler text in the hover text of a fake URL, like this.
Syntax:
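(A sketch of what that presumably looks like: standard Markdown lets you attach a quoted title to a link, and the title shows up as hover text when readers mouse over it. The URL and wording below are just placeholders, not the actual example.)

[innocuous link text](http://example.com "rot13-free spoiler text would go here")

Hovering over the rendered link reveals the spoiler, and nobody has to decode anything.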
Ah. I just picked up that technique from MinibearRex up there. I see you said it first, so kudos to you, then. It’s a useful trick. I’ll remember it.
...incidentally, if it’s too much work to click the link, copy-paste the text and click the button, then you might save yourself even more time and effort by just scrolling on without bothering to click the thumbs-down button either. There are friendlier ways to express disapproval, too. But thanks for the advice, I’ll try to be less of a bother next time.
This is kind of funny. I learned this trick from Grognor’s comment when I saw it in the recent comments section. And then I decided to try it out when I noticed the misspelling, not realizing it was on the same post.
First, it is an appeal to consequences against honor. Worse, it is an appeal to fictional consequences.
Second, honor is not the opposite of rationality. An argument against honor would not automatically be a rationality quote, even if it were a good argument.
Third, it was encrypted, which made me spend more than three times as long reading it as I would have if it had been in plain text. When it turned out to be bad, that made the disappointment much worse.
Jeez, you guys. You miss the point.

-- Eliezer Yudkowsky
The point isn’t that honour is bad; the point is (much more generally) that rational agents shouldn’t follow the Rules and lose anyway, they should WIN. Whether the Rules are the rules of honour, of mainstream science, of traditional rationalism, or whatever: if they don’t get you to win, find a way that does. And it’s futile to complain about unfairness after you’ve lost, or after the guy you were rooting for has.
The only part that appeals to fictional consequences is the additional implication that oftentimes, an ounce of down-to-earth pragmatism beats any amount of lofty ideals if you need to actually achieve concrete goals.
I thought adding that “rational agents should win” reference would make the intended idea clear enough. But I’ll take my own advice and just make a mental note to be clearer next time.
I dunno, I think all of that is overstated. I mean, sure, perfectly rational agents will always win, where “win” is defined as “achieving the best possible outcome under the circumstances.”
But aspiring rationalists will sometimes lose, and therefore be forced to choose the lesser of two evils, and, in making that choice, they may very rationally decide that the pain of not achieving their (stated, proactive) goal is easier to bear than the pain of transgressing their (implicit, background) code of morality.
And if by “win” you mean not “achieve the best possible outcome under the circumstances,” but “achieve your stated, proactive goal,” then no, rationalists won’t and shouldn’t always win. Sometimes rationalists will correctly note that the best possible outcome under the circumstances is to suffer a negative consequence in order to uphold an ideal. Sometimes your competitors are significantly more talented and better-equipped than you, and only a little less rational than you, such that you can’t outwit your way to an honorable upset victory. If you value winning more than honor, fine, and if you value honor more than winning, fine, but don’t prod yourself to cheat simply because you have some misguided sense that rationalists never lose.
EDIT: Anyone care to comment on the downvotes?
P.S.: Regarding your third point, is there a less bothersome way to handle spoilers? I’ve only seen rot13 being used for that purpose here. I’d gladly make it less cumbersome to read if I could do so without risking diminishing the fun of other people who watch or intend to watch this series.
(Or maybe the annoyance caused by the encryption is worse than the risk of spoiling just one scene in case there’s anyone reading this who watches the series and is a season and a half behind… I dunno. Neither course of action should be a big deal.)
It could be more than four. Someone might have upvoted you.
To the extent that honor encodes valid ethical injunctions, ignoring it will cause you to lose in the long run.
Exactly—compare Protected from Myself to “rationalists should win!”.
Would your opinion of the quote change if “fighting dishonorably” were replaced by “violating the Geneva convention”?
Perhaps. I’d say that should depend on the price for failure and how that compares to the violation. But point taken.