But part of my point is that LW isn’t “focusing on rationality”, or rather, it is focusing on fun theoretical discussions of rationality rather than practical exercises that are hard work to implement but actually make you more rational. The self-help / life hacking / personal development community is actually better (in my opinion) at helping people become more rational than this site ostensibly devoted to rationality.
The self-help / life hacking / personal development community is actually better (in my opinion) at helping people become more rational than this site ostensibly devoted to rationality.
Hmm. The self-help / life hacking / personal development community may well be better than LW at focussing on practice, on concrete life-improvements, and on eliciting deep-seated motivation. But AFAICT these communities are not aiming at epistemic rationality in our sense, and are consequently not hitting it even as well as we are. LW, for all its faults, has had fair success at teaching folks how to think usefully about abstract, tricky subjects on which human discussions often tend rapidly toward nonsense (e.g. existential risk, optimal philanthropy, or ethics). It has done so by teaching such subskills as:
Never attempting to prove empirical facts from definitions;
Never saying or implying “but decent people shouldn’t believe X, so X is false”;
Being curious; participating in conversations with intent to update opinions, rather than merely to defend one’s prior beliefs;
Asking what potential evidence would move you, or would move the other person;
Not expecting all sides of a policy discussion to line up;
Aspiring to have true beliefs, rather than to make up rationalizations that back the group’s notions of virtue.
By all means, let’s copy the more effective, doing-oriented aspects of life hacking communities. But let’s do so while continuing to distinguish epistemic rationality as one of our key goals, since, as Steven notes, this goal seems almost unique to LW, is achieved here more than elsewhere, and is necessary for tackling e.g. existential risk reduction.
The self-help / life hacking / personal development community is actually better (in my opinion) at helping people become more rational than this site ostensibly devoted to rationality.
Could you elaborate on what you mean by that claim, or why you believe it?
I love most of your recent comments, but on this point my impression differs. Yes, folks often learn more from practice, exercises, and deep-seated motivation than from having fun discussions. Yes, some self-help communities are better than LW at focussing on practice and life-improvement. But, AFAICT: no, that doesn’t mean these communities do more to boost their participants’ epistemic rationality. LW tries to teach folks skills for thinking usefully about abstract, tricky subjects on which human discussions often tend rapidly toward nonsense (e.g. existential risk, optimal philanthropy, or ethics). And LW, for all its flaws, seems to have had a fair amount of success in teaching its longer-term members (judging from my discussions with many such, in person and online) such skills as:
Never attempting to prove empirical facts from definitions;
Never saying or implying “but decent people shouldn’t believe X, so X is false”;
Being curious; participating in conversations with intent to update opinions, rather than merely to defend one’s prior beliefs;
Asking what potential evidence would move you, or would move the other person;
Not expecting all sides of a policy discussion to line up;
Aspiring to have true beliefs, rather than to make up rationalizations that back the group’s notions of virtue.
Do you mean: (1) self-help sites are more successful than LW at teaching the above, and similar, subskills; (2) the above subskills do not in fact boost folks’ ability to think non-nonsensically about abstract and tricky issues; or (3) LW may better boost folks’ ability to think through abstract issues, but that ability should not be called “rationality”?
It’s because they take less continued attention/effort and provide more immediate/satisfying results. LW is almost purely theoretical and isn’t designed to be efficient. It’s an attempt to logically override bias rather than exploit the quirks of human neurochemistry to automate the process.
Computer scientists are notorious for this. They know how brains make thoughts happen, but they don’t have a clue how people think, so ego drives them to rationalize a framework that casts the flaws of others as incuriosity and lack of dedication. This happens because they’re just as human as the rest of us, made of the same biological approximation of inherited “good-enoughness.” And the smarter they are, the more complex and well-reasoned that rationalization will be.
We all seek to affirm our current beliefs and blame others for discrepancies. It’s instinct, physics, chemistry. No amount of logic and reason can override the instinct to defend one’s perception of reality, or any other instinct. Examples are everywhere. Every fat person in the world has been thoroughly educated on which lifestyle changes will cause them to lose weight, yet the obesity epidemic still grows.
Therefore, we study “rationality” to see ourselves as the good-guy protagonists who strive to be “less wrong,” have “accurate beliefs,” and “be effective at achieving our goals.”
It’s important work… for computers. For humanity, you’re better off consulting a monk.