This is an excerpt from a comment I wrote on the EA forum, extracted and crossposted here by request:
There’s a phenomenon where a gambler places their money on 32, and then the roulette wheel comes up 23, and they say “I’m such a fool; I should have bet 23”.
More useful would be to say “I’m such a fool; I should have noticed that the EV of this gamble is negative.” Now at least you aren’t asking for magic lottery powers.
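(For concreteness, a quick sketch of the arithmetic behind that "negative EV" claim, assuming an American-style wheel with 38 pockets and a 35-to-1 payout on a single number, which the anecdote doesn't specify:

$$\mathrm{EV} = \frac{1}{38}(+35) + \frac{37}{38}(-1) = -\frac{2}{38} \approx -0.053 \text{ per unit staked.}$$

Every standard bet on such a wheel has negative expectation, so which number came up never mattered.)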
Even more useful would be to say “I’m such a fool; I had three chances to notice that this bet was bad: when my partner was trying to explain EV to me; when I snuck out of the house and ignored a sense of guilt; and when I suppressed a qualm right before placing the bet. I should have paid attention in at least one of those cases and internalized the arguments about negative EV, before gambling my money.” Now at least you aren’t asking for magic cognitive powers.
My impression is that various EAs respond to crises in a manner that kinda rhymes with saying “I wish I had bet 23”, or at best “I wish I had noticed this bet was negative EV”, and in particular does not rhyme with saying “my second-to-last chance to do better (as far as I currently recall) was the moment that I suppressed the guilt from sneaking out of the house”.
(I think this is also true of the general population, to be clear. Perhaps even more so.)
I have a vague impression that various EAs perform self-flagellation, while making no visible attempt to trace down where, in their own mind, they made a misstep. (Not where they made a good step that turned out in this instance to have a bitter consequence, but where they made a wrong step of the general variety that they could realistically avoid in the future.)
(Though I haven’t gone digging up examples, and in lieu of examples, for all I know this impression is twisted by influence from the zeitgeist.)
When I see or hear a piece of advice, I check what happens if the advice is reversed. Often the reverse is also good advice, which means all we can do is take both into account as we try to live a balanced life. For example, if the advice is "be brave!", the reverse is "be more careful", which is good advice too.
This advice is unusual in that it is non-reversible.
I’ve referred to “I should have bet on 23-type errors” several times over the past year. Having this shorthand and an explanation I can link to has sped up those conversations.
I’m not an EA (though I do agree with most of the motte: I care about other humans, and I try to be effective), and I'm not part of the rationalist “community”, so take this as an outside view.
There’s a ton of “standard human social drama” in EA and in rationalist communities, and really anywhere where “work” and “regular life” overlap significantly. Some of this takes the form of noticing flaws in other people’s rationality (or, just as often, flaws in kindness/empathy being justified by rationality).
Especially when one doesn’t want to identify and address specific examples, I think there’s a very high risk of misidentifying the cause of a disagreement or of disapproval of someone's behavior. In this case, I don’t notice much of the flagellation or wishing: either I don’t hang out in the right places, or I bounce off those posts and don’t pay them much mind. But things that might fit that pattern strike me as a failure of personal responsibility, not a failure of modeling wishes. Your term “self-flagellation” is interesting from that standpoint: the historical practice was penance for generalized sin and a way to share suffering, not a direct correction for anything. It’s clearly social, not rational.
IMO, rationalism must be first and foremost individual. I am trying to be less wrong in my private beliefs and in my goal-directed behaviors. Group rationality is a category error: I don’t have access to group beliefs (if there is such a thing). I do have some influence over group behaviors and shared statements, but I recognize that these are ALWAYS compromises, the negotiated results of individual beliefs and behaviors, and don’t necessarily match any individual in the group.
I’m surprised every time I see a rationalist assume otherwise and then be disappointed that other members of the group don’t share all their beliefs and motivations.
From my perspective, “group rationality” means the degree to which the group provides an environment conducive to becoming more rational (for those who have the propensity to do so).
It is probably easier to describe the opposite—a group where religion and woo are high status, skepticism is considered a form of stupidity, members are encouraged to think in slogans rather than inspect the details of their actual experience, etc.
A rational group would then be one where, as an individual rationalist, you can do the right thing without getting socially punished for it, and where you are gently called out when you do something stupid.
And it is a “more or less” thing rather than a “yes or no” thing. (Though the same is true of individual rationality.) I would not expect any group to share all my beliefs and motivations. But the difference between sharing 20% of them and sharing 80% means a lot to me.