LessWrong’s focus on the bay-area/software-programmer/secular/transhumanist crowd seems to me unnecessary.
I don’t understand what this means. LW is composed mostly of people from these backgrounds. Are you saying that this is a problem?
But when people here tie rationality to being part of that subset, or to high-IQ in general, it seems a bit silly.
If by rationality you mean systematic winning (where winning can be either truth-seeking (epistemic rationality) or goal-achieving (instrumental rationality)), then no one is claiming that we have a monopoly on it. But if by rationality you are referring to the group of people who have decided to study it and form a community around it, then yes, most of us are high-IQ and in technical fields. And if you think this is a problem, I’d be interested in why.
I also find the near-obsession with IQ a bit unsettling
In other words, my opponent believes something which is kind of like being obsessed with it, and obsession is bad. If you have a beef with a particular view or argument, then say so.
Some of the material is essentially millennia old; self-knowledge, self-awareness, and introspection aren’t new inventions.
Eliezer has responded to this (very common) criticism here.
I don’t know why you want LW to be packaged for a wide audience. I suspect this would do more harm than good, both to us and to the wider audience. It would harm the wider audience because of the sophistication bias, which would lead them to look for errors mostly in others’ thinking rather than their own. It takes a certain amount of introspectiveness (which LW seems to self-select for) not to become half-a-rationalist.
I don’t understand what this means. LW is composed mostly of people from these backgrounds. Are you saying that this is a problem?
If it creates an exclusionary atmosphere, or prevents people outside that group from reading and absorbing the ideas, or closes this community to outside ideas, then yes. But mostly I think that focusing on presenting these ideas only to that group is unnecessary.
If by rationality you mean systematic winning (where winning can be either truth-seeking (epistemic rationality) or goal-achieving (instrumental rationality)), then no one is claiming that we have a monopoly on it. But if by rationality you are referring to the group of people who have decided to study it and form a community around it, then yes, most of us are high-IQ and in technical fields. And if you think this is a problem, I’d be interested in why.
I am really thinking of posts like this, where many commenters agonize over how hard it would be to bring rationality to the masses.
In other words, my opponent believes something which is kind of like being obsessed with it, and obsession is bad. If you have a beef with a particular view or argument, then say so.
I did say what I have a beef with: the attitude that deliberate application of rationality is only for high-IQ people, or that only high-IQ people are likely to make real contributions.
Eliezer has responded to this (very common) criticism here.
It’s not a criticism; it’s an explanation of why I don’t believe it would be that difficult to package the content of the sequences for a general audience. None of it needs to be presented as revelatory. Instead of calling rationality systematic winning, just call it a laundry list of methods for being clear-eyed and avoiding self-deception.
I don’t know why you want LW to be packaged for a wide audience. I suspect this would do more harm than good, both to us and to the wider audience. It would harm the wider audience because of the sophistication bias, which would lead them to look for errors mostly in others’ thinking rather than their own. It takes a certain amount of introspectiveness (which LW seems to self-select for) not to become half-a-rationalist.
Several responses bring up the “half-a-rationalist” criticism, but I think that can be avoided by presentation. Instead of “here’s a bunch of tools to be cleverer than other people”, present it as “here’s a bunch of tools to occasionally catch yourself before you make a dumb mistake”. In any case, it’s no excuse not to think about how a more broadly targeted presentation of the sequences could be put together.
And really, what’s the worst-case scenario? That articles here sometimes get cited vacuously, the way those fallacy lists are? Not that bad.
Inclusiveness is not a terminal value for me. Certain types of people are attracted to a community such as LW, as with every other type of community. I do not see this as a problem.
Which of the following statements, if any, would you endorse?
1) LW should change in such a way as to be more inclusive to a wider variety of people.
1 a) LW members should change how they comment (perhaps avoiding jargon?) so as to be more inclusive.
1 b) LW members should change the topics that they discuss in order to be more inclusive.
2) LW should compose a rewritten set of sequences to replace the current sequences as a way of making the community more inclusive.
3) LW should compose a rewritten set of sequences and publish it somewhere (perhaps a book or a different website) to spread the tools of rationality.
4) LW should try to actively recruit different types of people than the ones that are naturally inclined to read it already.
I don’t think LW needs to change dramatically (though more activity would be nice); I just think it should be acknowledged that the demographic focus is narrow. A wider focus could mean a new community, a growth of LW, or something else.
Mainly #3 and to an extent #4.
I’d modify and combine #4 and #1a/1b into:
5) We should have inclusive, jargon-free explanations and examples at the ready to express almost any idea on rationality that we understand within LW’s context, especially ideas that have mainstream analogues, which is most of them. This has many potential uses, including #1 and #4.