Quoting Eliezer2009:
Yet there is, I think, more absent than present in this “art of rationality”—defeating akrasia and coordinating groups are two of the deficits I feel most keenly. I’ve concentrated more heavily on epistemic rationality than instrumental rationality, in general. And then there’s training, teaching, verification, and becoming a proper experimental science based on that. And if you generalize a bit further, then building the Art could also be taken to include issues like developing better introductory literature, developing better slogans for public relations, establishing common cause with other Enlightenment subtasks, analyzing and addressing the gender imbalance problem...
There were a few articles about akrasia; and CFAR is working on a curriculum for teaching rationality.
I have an idea about a sequence I would love to see, but only if it is written well (because it would be very easy and tempting!!! to make it wrong in various ways): Starting with scientifically describing human emotions, social behavior, and sexual behavior. Progressing to social skills. And culminating with community building.
I believe that these issues are so interconnected that it is almost meaningless to discuss them separately. They also happen to be places where the popular stereotypes say the nerds have blind spots. I suspect these may be areas where being “half correct” may harm you a lot. Which poses a problem, because non-nerds usually don’t care much about their maps matching the territory, and nerds talking about these topics will probably suffer from the Dunning-Kruger effect; so we need to be extra careful here. However, without mastering this part of the Art we will never achieve larger-scale rationalist communities.
(My beliefs here are a result of many pieces of evidence coming from different sources, so I am not able to disentangle them all within a comment. But generally, there are two levels here: The more superficial is that people are naturally drawn towards attractive people, and repelled from “creepy” people. This is a strong force to be ignored only at one’s own peril, and it’s even more true on the community level; even the wannabe rationalists are no exception to this. Some degree of social savviness is necessary for merely not having a group fall apart; an even higher level is required to attract new members. These skills are learnable, but fresh students tend to be overconfident and creepy. We also need communication skills and norms that are conducive to solving problems, because sooner or later the problems will appear. On a deeper level, many uniquely human skills are probably a result of sexual selection, even if the connection is not conscious. Understanding this may help to better understand “prestige”, which is a required component of social success. Yeah, I am aware that this explanation is probably not very helpful.)
Another thing: we need some good introduction for newbies. At this moment, giving them Rationality: from AI to Zombies is my favorite choice, but something shorter and easier to read would be even better. I imagine 50-100 pages of text written for an above-average-IQ high-school audience. (At the end, the text should point them towards further literature, including Rationality: from AI to Zombies but also many other books. Simply imagine that you are looking at a smart high-school bookworm who is going to read 20 or 50 books in the following years anyway; if your only influence on their life were changing their reading list, which specific books would you recommend?) This shorter text could then be printed and distributed among smart high-school or college students.
My notion for outreach is to start with the planning fallacy because it’s straightforward and something a lot of people have experience with, and see how much can be built from there.
straightforward and something a lot of people have experience with
I like this. Then I would also suggest selection bias. (Specifically to students, I’d say that for every school dropout who became famous, there are a thousand others who didn’t.)
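The dropout arithmetic above can be made concrete with a quick sketch (all numbers are hypothetical, chosen only to mirror the “one in a thousand” figure): the visible sample of famous dropouts tells you nothing about the base rate of success among dropouts in general.

```python
# Toy illustration of selection (survivorship) bias, with made-up numbers:
# we only ever hear about the successful dropouts, never the base rates.
dropouts = 1_000_000        # hypothetical number of school dropouts
famous_dropouts = 1_000     # the visible successes we read about

# The actual base rate of success among dropouts:
success_rate = famous_dropouts / dropouts
print(f"P(famous | dropout) = {success_rate:.4%}")  # 0.1000%

# Judging only by the visible sample would suggest a 100% success rate,
# because the sample was selected on the outcome:
visible_sample_rate = famous_dropouts / famous_dropouts
print(f"Rate within the visible sample = {visible_sample_rate:.0%}")
```

The gap between the two printed rates is exactly the bias: conditioning on fame throws away the thousand invisible failures per visible success.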
That’s a tautology
Well, the non-tautological aspect is that this goes far beyond explicitly sexual contexts. Even if you are e.g. giving a lecture on Bayes’ Theorem, your audience keeps evaluating your attractiveness, and it is reflected in how much attention they give you, etc.
(This is a specific application of the “halo effect”. Attractiveness is perceived automatically all the time, so it strongly contributes to the “halo”.)
Like this?
Something like that, yeah, but instead of reading one full book, I would prefer reading a compilation of the important ideas from multiple books by multiple authors.
Similarly to how e.g. one LW post tells you about the concept of “belief in belief” instead of giving you a link to a several-hundred-page book by Dennett. Most of the Sequences could be replaced by a list of 20 or 50 books to read, and those would certainly provide much more information, but reading all those books would take me a few years, while the whole Sequences can be read in a month or two. And if I want to communicate the whole thing to someone else, giving them one e-book is better than telling them to read 20 or 50 books.
Just like that, I would like to see 10 or 20 books on human social behavior compiled into a single book (of course, with links to the original sources for those who want to read more). To have the option of giving someone one book and saying: “This is the kind of community I want to create; would you like to join?”
It is also important (I haven’t read the specific book you linked, so I am not sure if this applies) to have some ideas actionable.
To explain what I mean, think about books like “Predictably Irrational”, which convey that human behavior is often wrong, but all you get from them is the idea that “humans are stupid”. Possibly true, but it doesn’t help you become stronger. On the other hand, in the Sequences there is often a hint towards a strategy that could reduce a specific bias. Because the spirit of LW is not pointing fingers at others and laughing, but looking in the mirror and thinking strategically.
Then of course there is the whole PUA industry that tries to be actionable, and that specific aspect I admire, but it is horrible from the epistemic point of view: “schools proliferating without evidence”, overgeneralizations, confirmation bias, etc. What I’d like to see is someone who gives a correct explanation, supported by science, and then a few actionable ideas for people who want to improve themselves; focusing on general human interaction (not just seeking one-night stands in loud pubs), and also considering long-term consequences. But then I’d like to see people taking those ideas seriously.
Another important part would be debunking the existing myths; e.g. showing that human morality is more than mere cooperation in iterated prisoners’ dilemmas (why do tourists tip in foreign restaurants even if they are unlikely to ever return there again?).
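The tipping puzzle in the parentheses can be sketched as a one-shot game (payoff numbers are hypothetical; only their ordering matters): if the diner will never return, not tipping strictly dominates, so narrowly reciprocal self-interest predicts no tip. That tourists tip anyway is the evidence that morality is more than iterated reciprocity.

```python
# One-shot interaction between diner and waiter, modeled as a simple game.
# Payoffs are made up; the point is only the dominance argument.
# Diner's utility as a function of (action, service quality received):
diner_payoff = {
    ("tip", "good"): 3,      # nice meal, minus the cost of the tip
    ("tip", "bad"): -1,
    ("no_tip", "good"): 4,   # same meal, tip money kept
    ("no_tip", "bad"): 0,
}

# With no future visits, check the diner's best response to each
# possible service level:
for service in ("good", "bad"):
    best = max(("tip", "no_tip"), key=lambda a: diner_payoff[(a, service)])
    print(f"service={service}: best response is {best}")
# "no_tip" wins in both cases: it strictly dominates in the one-shot game,
# so a purely self-interested reciprocity account predicts no tipping.
```

Under these (assumed) payoffs, any explanation of tourist tipping has to appeal to something beyond repeated-game self-interest: internalized norms, emotions shaped by sexual or group selection, or habit.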
A few books that come to my mind are “The Theory of Moral Sentiments”, “The Mating Mind”, “Nonviolent Communication”, “Games People Play”, “Creating a Life Together”, and maybe even “Don’t Shoot the Dog”. I probably forgot something important.
I have an idea about a sequence I would love to see, but only if it is written well (because it would be very easy and tempting!!! to make it wrong in various ways): Starting with scientifically describing human emotions, social behavior, and sexual behavior.
This seems way too broad. If it was done right, I don’t think it would end up being a single sequence.
How much have you looked into this already, and do you have any more concrete ideas on where you would want the sequence to go?
Progressing to social skills.
I think there are two main problems with trying to do a sequence on this:
I think that the problems that people have with social skills are more individual than the heuristics and biases that are normally discussed on this site. They are largely due to ingrained perspectival and strategic problems, in contrast to, say, the representativeness heuristic, which we can all relate to. A consequence of this is that most people are going to find a lot of the content not applicable to them. I would guess that this problem could be overcome by creating more of a debugging guide (e.g. “if you have problems expressing yourself, go to post 3”), or you could go meta and let people identify their own problems.
As you mention, it is important “to have some ideas actionable”, but I think it is especially important for these types of skills that they can easily be put into action and that opportunities to put them into action are readily available. In this area, I would think that most of the ideas that would make an impact in people’s lives are not going to seem particularly insightful or revolutionary; in fact, they may at first appear to be incredibly obvious. It is only when they are put into practice that people come to recognize their importance. I would think that simulating these skills is not as useful or as interesting as it is for the other content on this site.
Yeah, properly done, this would be a lot of text. My wish is to create rationalist communities which are emotionally healthy, capable of action, and successful in life. So the content of the book would be “all knowledge that is important for wannabe rationalists to become this kind of community”. But I suspect that it would be a lot of material; not just to give people the necessary skills, but also to combat existing myths.
By emotionally healthy I mean that the group wouldn’t fall apart immediately because of some inconsequential squabble, but also wouldn’t become some kind of a cult. That the people inside the group would be happy and would achieve their personal dreams, and the people outside of the group would be mostly positively impressed. That the group as a whole and its individuals would be rational, but not suffering from akrasia.
When you look at the existing Sequences, you see that Eliezer not only spends a lot of time arguing against supernatural stuff, but he also needs to turn around and argue against the overenthusiastic “fans of evolution”, and explain that evolution is actually stupid, that it can lead to extinction, and that group selectionism mostly doesn’t work. I expect that a similar kind of hedging in both directions would also be necessary for this topic. That people often try to avoid an extreme by running into the opposite extreme—“if I agree with everything, it means I am a sheep; therefore I will disagree with everything” or “acting without thinking is stupid; therefore I will always think and never act” or “people often disregard their own reason because they want to fit in the group; therefore I need to be abrasive and cynical about everything”—and it will be necessary not only to navigate them properly, but also make them notice when someone else promotes an extreme form of behavior.
I agree that the palette of the most frequent social/emotional mistakes is probably wider than the palette of the most frequent cognitive errors. But people are less unique than they imagine. For example, read Games People Play—this book describes about a dozen patterns of dysfunctional human interaction, and most people are shocked to find their own story described there. A therapist or a religious confessor probably mostly hears the same few stories over and over again. Also, after each pattern, Games People Play offers advice on “what to do if you find yourself stuck in this pattern”.
Something like this?
In that direction, yeah. The guide contains a lot of dense “do this, do that, don’t do this, don’t do that”, which I understand is compressed to keep a long text short, but I would like to read the longer version, with explanations and stories that would make it easier to memorize. (Reading all the recommended literature on the last page would probably do it.) Also, the guide is about running a successful meetup, which is a great thing, but “meetup” still feels to me like “inside the lab”, and I would like something that goes beyond the time and space limitations of the meetup. Similar to how the Sequences are not about “how to think rationally during a meetup”, but how to think rationally in general.
Well, there are still some things I am not sure about, so… that’s one of the reasons why I am not trying to write that text myself right now. I don’t see exactly how the things are connected. I was never a community leader, so this would be outside my experience. There are still skills that I don’t have, and problems that I can’t solve even theoretically. (For example one of my big concerns is that I think that successful groups sooner or later attract psychopaths who will try to get into positions of power and exploit the whole thing for themselves. No idea how to prevent this, though. I think I have a heuristic for detecting that it has already happened: it’s when people are afraid to speak openly with each other; but that is too late if your goal is to prevent it from happening in the first place. I suspect that some seemingly irrational traditional rituals, such as “getting drunk together”, could actually have been designed for this purpose; but maybe I’m completely wrong.)
My wish is to create rationalist communities which are emotionally healthy, capable of action, and successful in life. […] Similar to how the Sequences are not about “how to think rationally during a meetup”, but how to think rationally in general.
Is your wish actually to create rationalist communities which are emotionally healthy, capable of action, and successful in life so that you can become these things?
That people often try to avoid an extreme by running into the opposite extreme […] and it will be necessary not only to navigate them properly, but also make them notice when someone else promotes an extreme form of behavior.
I think the problem here is that the underlying problem that caused these people to take on an extreme view in the first place is still there, and so when they do change their view they just tend to adopt another, similarly extreme view. For example, you mentioned PUA before; I have noticed that there are some men who have a strong level of neediness and so become ‘nice guys’ in the worst sense. They then read some of the PUA stuff and, instead of doing good things like developing genuine confidence, self-respect, a healthy sense of boundaries, etc., they take a shortcut: they just change their perspective on women. The neediness is still there, however, and so they adopt a view where they objectify women. They essentially become ‘assholes’. In both the ‘asshole’ and the ‘nice guy’ cases there is a sense in which these people are giving up or altering part of who they are. This is what I think these people would actually need to solve if they were going to make an improvement in their situation.
But people are less unique than they imagine. For example, read Games People Play—this book describes about a dozen patterns of dysfunctional human interaction, and most people are shocked to find their own story described there.
That’s probably true, but I was trying to say that unlike with cognitive errors, which most people can relate to, most people would only be able to relate to a few of the games and would find the others to be largely irrelevant for them.
one of my big concerns is that I think that successful groups sooner or later attract psychopaths who will try to get into positions of power and exploit the whole thing for themselves. No idea how to prevent this
I would expect that any group like the one you propose would have a very flat hierarchy of power. In fact, if people become dependent on the group or find themselves seeming to need the group to improve then the group probably isn’t working too well. You reduce dependency by maximizing the free exchange of information and the ability for people to improve outside of the group.
Is your wish actually to create rationalist communities which are emotionally healthy, capable of action, and successful in life so that you can become these things?
Good insight. Yeah, it’s so that I can become these things permanently.
People usually do what people around them do. Trying to do something that no one around you does is possible but exhausting, like swimming against the current. On the other hand, doing what people around you do is easy and literally instinctive.
Many people report that their choice of environment is either (a) rational people who suffer from akrasia and are mildly neurotic, or (b) emotionally healthy and highly active people who often believe obviously stupid ideas, but thanks to lucky compartmentalization it doesn’t ruin their lives. Sometimes they perceive this as a false dilemma: should I try to become more rational, but akratic and neurotic; or should I throw rationality away and become a happy and healthy human?
To me it seems obvious that a third way is possible, but it would be much easier when surrounded by a group of humans doing the same. After a LW meetup, I become much more reasonable and active, but only for a few days; then it wears off. Maybe I am more sensitive to my surroundings than the average person (I do have some evidence for this), but I believe that this effect is universal or close to universal; it’s just a question of degree. Humans are a social species; peer pressure and cultural learning exist.
In both the ‘asshole’ and the ‘nice guy’ cases there is a sense in which these people are giving up or altering part of who they are.
I interpret this as “the beginners are doing it wrong”.
most people would only be able to relate to a few of the games and would find the others to be largely irrelevant for them.
Yeah, I get it now. We all suffer from all (major) cognitive biases, only to a different degree, but each of us only has a subset of the emotional problems. Yeah, it seems so.
Well, the benefits of promoting rationality have been widely recognized ever since EY published his post on raising the sanity waterline.
But let’s be clear—rationality outreach is not LW outreach. The goal is to spread good ideas from rationality and raise the sanity waterline, not get people to engage with LW necessarily. I myself would like to wait until more of LW 2.0 comes into being, including the newbie section, before inviting newcomers to engage with LW actively.
EY was attempting to spread his ideas from his first post on Overcoming Bias. This pattern continued throughout the entire Sequences. Do you regard this as different from then?
Means and ends. LW was the means of “spreading his ideas” as you put it. Whereas Gleb is promoting the idea that we should do outreach for LW. LW as the end.
Ok, this is something I have been thinking every time I see an Outreach Thread, and now I can’t resist asking it:
When did LW become a proselytizing community?
And are we sure that it is a good idea to do a lot of outreach when the majority of discussion on the site is about why LW sucks?
Does commentary and opinion that LW “sucks” mean that it can’t proselytize? Everybody from terrorists to politicians to businesses proselytize, and what most of them are selling looks to me to be a whole lot less useful than what LW is selling.
Am I missing something?
Yeah, at this moment I would rather tell people to download and read Rationality: from AI to Zombies.
We’re still up against the challenge of finding interesting things to write about.
Yeah, properly done, this would be a lot of text.
My wish is to create rationalist communities which are emotionally healthy, capable of action, and successful in life. So the content of the book would be “all knowledge that is important for wannabe rationalists to become this kind of community”. But I suspect that it would be a lot of material; not just to give people the necessary skills, but also to combat existing myths.
By emotionally healthy I mean that the group wouldn’t fall apart immediately because of some inconsequential squabble, but also wouldn’t become some kind of a cult. That the people inside the group would be happy and would achieve their personal dreams, and the people outside of the group would be mostly positively impressed. That the group as a whole and its individuals would be rational, but not suffering from akrasia.
When you look at the existing Sequences, you see that Eliezer not only spends a lot of time arguing against supernatural stuff, but he also needs to turn around and argue against the overenthusiastic “fans of evolution”, and explain that actually evolution is stupid, it can lead to extinction, and the group selectionism mostly doesn’t work. I expect that the similar kind of hedging both ways would also be necessary for this topic. That people often try to avoid an extreme by running into the opposite extreme—“if I agree with everything, it means I am a sheep; therefore I will disagree with everything” or “acting without thinking is stupid; therefore I will always think and never act” or “people often disregard their own reason because they want to fit in the group; therefore I need to be abrasive and cynical about everything”—and it will be necessary not only to navigate them properly, but also make them notice when someone else promotes an extreme form of behavior.
I agree that the palette of the most frequent social/emotional mistakes is probably wider that the palette of the most frequent cognitive errors. But people are less unique than they imagine. For example, read Games People Play—this book describes about a dozen patterns of dysfunctional human interaction, and most people are shocked to find their own story described there. A therapist or a religious confessor probably mostly hears the same few stories over and over again. Also, Games People Play after each pattern contains an advice “what to do if you find youself stuck in this pattern”.
In that direction, yeah. The guide contains a lot of dense “do this, do that, don’t do this, don’t do that”, which I get is compressed to make a long text short, but I would like to read the longer version with explanations and stories that would make it easier to memorize. (Reading all the recommended literature on the last page would probably do it.) Also, the guide is about making a successful meetup, which is a great thing, but “meetup” still feels to me like “inside the lab”, and I would like something that goes beyond the time and space limitations of the meetup. Similarly how the Sequences are not about “how to think rationally during a meetup”, but how to think rationally in general.
Well, there are still some things I am not sure about, so… that’s one of the reasons why I am not trying to write that text myself right now. I don’t see exactly how the things are connected. I never was a community leader, so this would be outside my experience. There are still skills that I don’t have, and problems that I can’t solve even theoretically. (For example one of my big concerns is that I think that successful groups sooner or later attract psychopaths who will try to get into positions of power and exploit the whole thing for themselves. No idea how to prevent this, though. I think I have a heuristic for detecting that it already happened: it’s when people are afraid to speak openly with each other; but that is too late, if your goal is to prevent that thing from happening in the first place. I suspect that some of the seemingly irrational traditional rituals, such as “getting drunk together” could actually have been designed for this purpose; but maybe I’m completely wrong.)
Is your wish actually to create rationalist communities which are emotionally healthy, capable of action, and successful in life so that you can become these things?
I think the problem here is that the underlying issue that caused these people to take on an extreme view in the first place is still there, so when they do change their view, they just tend to adopt another, similarly extreme view. For example, you mentioned PUA before; I have noticed that there are some men who have a strong level of neediness and so become ‘nice guys’ in the worst sense. They then read some of the PUA material, and instead of doing the good things (developing genuine confidence, self-respect, a healthy sense of boundaries, etc.), they take a shortcut: they just change their perspective on women. The neediness is still there, however, so they adopt a view in which they objectify women. They essentially become ‘assholes’. In both the ‘asshole’ and the ‘nice guy’ cases there is a sense in which these people are giving up or altering part of who they are. This is what I think these people would actually need to address if they were going to improve their situation.
That’s probably true, but I was trying to say that unlike cognitive errors, which most people can relate to, most people would only be able to relate to a few of the games and would find the others largely irrelevant to them.
I would expect that any group like the one you propose would have a very flat hierarchy of power. In fact, if people become dependent on the group or find themselves seeming to need the group to improve then the group probably isn’t working too well. You reduce dependency by maximizing the free exchange of information and the ability for people to improve outside of the group.
Good insight. Yeah, it’s so that I can become these things permanently.
People usually do what people around them do. Trying to do something that no one around you does is possible but exhausting, like swimming against the current. On the other hand, doing what people around you do is easy and literally instinctive.
Many people report that their choice of environment is between (a) rational people who suffer from akrasia and are mildly neurotic, and (b) emotionally healthy and highly active people who often believe obviously stupid ideas, but thanks to lucky compartmentalization it doesn’t ruin their lives. Sometimes they perceive this as a dilemma: should I try to become more rational, but akratic and neurotic; or should I throw rationality away and become a happy and healthy human?
To me it seems obvious that a third way is possible, but it would be much easier when surrounded by a group of people doing the same. I see this in myself: after a LW meetup, I become much more reasonable and active, but only for a few days; then it wears off. Maybe I am more sensitive to my surroundings than the average person (I do have some evidence for this), but I believe that this effect is universal or close to universal; it’s just a question of degree. Humans are a social species; peer pressure and cultural learning exist.
I interpret this as “the beginners are doing it wrong”.
Yeah, I get it now. We all suffer from all the (major) cognitive biases, only to different degrees, but each of us has only a subset of the emotional problems. Yeah, it seems so.
Well, the benefits of promoting rationality have been widely recognized ever since EY published his post on raising the sanity waterline.
But let’s be clear—rationality outreach is not LW outreach. The goal is to spread good ideas from rationality and raise the sanity waterline, not get people to engage with LW necessarily. I myself would like to wait until more of LW 2.0 comes into being, including the newbie section, before inviting newcomers to engage with LW actively.
EY had been trying to spread his ideas since his first post on overcomingbias, and this pattern continued throughout the entire Sequences. Do you regard this as different from then?
Means and ends. LW was the means of “spreading his ideas” as you put it. Whereas Gleb is promoting the idea that we should do outreach for LW. LW as the end.
Responded to your earlier point above—as you see, the point is to raise the sanity waterline, not do LW outreach.
Huh, there have been several? I’m glad I missed the other ones. ;-)