I’m at a loss regarding what you must consider ‘horrible’. About the worst example I can think of is the JoshElders saga of pedophilia posts, and it only took two days to downvote everything he posted into oblivion and get it removed from the lists—and even that contained a lot of good discussion in the comments.
If you truly see that much horrible stuff here, perhaps your bar is too low, or perhaps mine is too high. Can you provide examples that haven’t been downvoted, that are actually considered mainstream opinion here?
Most of these are not dominant on LW, but come up often enough to make me twitchy. I am not interested in debating or discussing the merits of these points here because that’s a one-way track to a flamewar this thread doesn’t need.
The stronger forms of evolutionary psychology and human-diversity stuff. High confidence that most/all demographic disparities are down to genes. The belief that LessWrong being dominated by white male technophiles is more indicative of the superior rationality of white male technophiles than any shortcomings of the LW community or society-at-large.
Any and all neoreactionary stuff.
High-confidence predictions about the medium-to-far future (especially ones that suggest sending money).
Throwing the term “eugenics” around cavalierly and assuming that everyone knows you’re talking about benevolent genetic engineering and not forcibly-sterilizing-people-who-don’t-look-like-me.
There should be a place to discuss these things, but it probably shouldn’t be on a message board dedicated to spreading and refining the art of human rationality. LessWrong could easily be three communities:
a rationality forum (based on the sequences and similar, focused on technique and practice rather than applying to particular issues)
a transhumanist forum (for existential risk, cryonics, FAI and similar)
an object-level discussion/debate forum (for specific topics like feminism, genetic engineering, neoreaction, etc.).
I am not sure how much these opinions really are that extreme, and how much of it is just a reflection of how political debates push people into “all or nothing” positions. Like, if you admit that genes have any influence on a population, you are automatically misinterpreted as believing that every aspect of the population is caused by genes. Because, you know, there are just two camps, the “genes, boo” camp and the “genes, yay” camp, and you have already proved you don’t belong to the former camp, therefore...
At least this is how I often feel in similar debates. It’s as if there were no “genes affect 50% of something” position: there is a “genes don’t influence anything significant, ever” camp where all the good guys are; and there is the “other” camp, with everyone else, including me and Hitler. If we divide a continuous scale into “zero” and “nonzero” subsets, then of course 0.1 and 0.5 and 1 and infinity all land in the same subset. But that’s looking through the mind-killing glasses. I could start explaining how believing that genes have some influence on thinking and behavior is not the same as attributing everything to genes, and is nothing like advocating genocide... but I can already see all the good guys looking at me and thinking: “Nice try, but you are not going to fool us. We know what you really believe.” Well, the idea is that I actually don’t.
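To make the lumping concrete, here is a toy sketch (mine, not part of the original comment; the numbers are arbitrary) of what the zero/nonzero split does to a continuous estimate:

```python
# Toy illustration (arbitrary numbers): collapsing a continuous
# "fraction attributable to genes" estimate into two camps via a
# zero/nonzero test destroys almost all of the information.
estimates = [0.0, 0.1, 0.5, 1.0]

def camp(x):
    """Assign an estimate to a camp using only the zero/nonzero test."""
    return "'genes, yay' camp" if x > 0 else "'genes, boo' camp"

for x in estimates:
    print(f"{x:.1f} -> {camp(x)}")
# 0.1, 0.5 and 1.0 all land in the same camp, even though they
# support very different conclusions.
```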
I don’t even think that having a white male majority at this moment is some failure of the LW community. I mean, just try to imagine a parallel universe where someone else started LW. How likely is it that in that parallel universe LW is perfectly balanced by ethnicity and gender? What exactly does your model of reality predict?
Imagine that you are a visitor from an alien species and you are told the following facts:
1) Most humans are irrational, and rationality is associated with various negative things, like Straw Vulcans. Saying good things about rationality will get you laughed at; but paradoxically, telling others that they are not very rational is offensive. So it’s best to avoid this topic, which most people do.
2) Asch’s conformity experiments suggest that women are a bit more likely than men to conform.
3) Asian cultures tend to discourage standing out from the crowd.
4) Black people mostly live in the poorest countries, and those living in developed countries were historically oppressed.
Now that you know these facts, you are told that there is a new group of people trying to promote rationality, science, and technology. As the alien visitor, based on the given data, which gender and which race would you bet would be most represented in this group?
If LW remains forever a group of mostly white males, then yes, that would mean we have failed; specifically, that we have failed to spread rationality, to raise the sanity waterline. But the fact that LW started with such demographics is completely unsurprising to me. So, is the proportion of other groups on LW increasing? Looking at two years of surveys, it seems to me that it is. Then the only question is whether it is increasing fast enough. Well, fast enough compared with what? Sure, we could do more about it. We are not automatically strategic, and we have surely missed some opportunities. Let’s try harder. But there is no point in obsessing over the fact that LW started as a predominantly white male group, or that we didn’t fix society’s disparities within a few years.
There are other options. I think there exist possible worlds where LW is less off-putting to people outside the uppermiddleclasstechnophilewhitemaleosphere, with demographics that are closer to, but probably not identical to, the broader population’s. Like you said, there’s no reason for us to split the world into all-or-nothing sides: it’s entirely possible (and I think likely) both that statistical differences exist between demographics and that we have a suboptimal community and broader culture which skew those differences more than would otherwise be the case.
Edit: I had only skimmed your comment when writing this reply; on a reread, I think we mostly agree.
I’ve definitely experienced strong adverse reactions to discussing eugenics ‘cavalierly’ if you don’t spend at least ten to fifteen minutes covering the inferential steps and sanitising the perceived later uses of the concept.
Good point about the possible three communities. I haven’t posted here much, as I found myself standing too far outside the concepts whilst I worked my way through the sequences. Regardless of that, the more I read the more I feel I have to learn, especially about patterned thinking and reframes. To a certain extent I see this community as a more scientifically minded Maybe Logic group, when thinking about priors and updating information.
A lot of the transhumanist material has garnered very strong responses from friends, though; I’ve stocked up on Istvan paperbacks to hopefully disseminate soon.
I can’t see this as part of the problem. You don’t have to discuss it, but I’m bewildered that it’s on the list.
I should probably have generalized this to “community-accepted norms that trigger absurdity heuristic alarms in the general population”.
Again, there should be a place to discuss that, but it shouldn’t be the same place that’s trying to raise the sanity waterline.
I don’t think this hypothesis is supported by the evidence, specifically past LW discussions.
My vague recollections of LW-past disagreements, but I don’t have any readily available examples. It’s possible my model is drawing too much on the-rest-of-the-Internet experiences and I should upgrade my assessment of LW accordingly.
Yes, I am specifically talking about LW. With respect to the usual ’net forums I agree with you.
I don’t mind #3; in fact, the discussions of futurism are a big draw of LessWrong for me (though I suppose there are general reasons to be cautious about one’s confidence regarding the future). But I would be very happy to see #1, #2, and #4 go away.
I find stuff like “if you don’t sign up your kids for cryonics then you are a lousy parent” more problematic than a sizeable fraction of what reactionaries say.
What if you qualified it, “If you believe the claims of cryonicists, are signed up for cryonics yourself, but don’t sign your kids up, then you are a lousy parent”?
I would agree with it, but that’s a horse of a different colour.
In discussing vaccinations, how many people choose to say something as conditional as “if you believe the claims of doctors, have had your own vaccinations, but don’t let your kids be vaccinated, then you are a lousy parent”?
No, the argument is that you should believe the value of vaccinations, and that disbelieving the value of vaccinations itself makes your parenting lousy.
Well, I think Eliezer feels the same about cryonics as pretty much all the rest of us feel about vaccines—they help protect your kids from several possible causes of death.
Which is pretty much the same argument as saying that you should baptize your children and that disbelieving the value of baptism itself makes your parenting lousy.
If the belief-set you’re subtly implying were accurate, then yes, it would be.
However, I think we have a “sound” vs “sound” tree-falling-in-the-woods issue here. Is “lousy parenting” a virtue-ethics style moral judgement, or a judgement of your effectiveness as a parent?
Taboo “lousy”, people. We’re supposed to be rationalists.
Exactly, it all depends on the actual value of the thing in question. I believe baptism has zero value, I believe vaccines have lots of value, I’m highly uncertain about the value of cryonics (compared to other things the money could be going to).
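The comparison being gestured at here is just an expected-value one. A minimal way to write it down (every symbol is a placeholder, not a number anyone in this thread has committed to):

$$\mathrm{EV}(\text{cryonics}) = p_{\text{revival}} \cdot V_{\text{revival}} - C$$

where $p_{\text{revival}}$ is your probability that preservation and revival actually work, $V_{\text{revival}}$ is the value you assign to that outcome, and $C$ is the cost of membership. The disagreement over baptism, vaccines, and cryonics is then just a disagreement about whether this quantity beats the best alternative use of $C$.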
A person is expected to say that about X if they believe X has lots of value. So why is it so very problematic for Eliezer to say it about cryonics when he believes cryonics has lots of value?
It’s impolitic and I don’t know how effective it is in changing minds. But then again it’s the same thing we say about vaccinations, so who knows: perhaps shaming parents does work in convincing them. I’d like to see research about that.
My prior is that the results will be bimodal: some parents can be shamed into adjusting their ways, while for others it will only force them into a bunker mindset and make them more resistant to change.
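As a purely hypothetical sketch of what that bimodal prediction would look like (all numbers invented; this illustrates the claim, it is not evidence for it):

```python
# Model parents as a mixture of two groups: one updates toward
# compliance when shamed, the other digs in.  All numbers invented.
import random

random.seed(0)

def attitude_shift():
    """Sample one parent's hypothetical shift (+ = more compliant)."""
    if random.random() < 0.6:           # "shameable" group
        return random.gauss(+1.0, 0.3)
    else:                               # "bunker mindset" group
        return random.gauss(-0.8, 0.3)

shifts = [attitude_shift() for _ in range(10_000)]
share_up = sum(s > 0 for s in shifts) / len(shifts)
print(f"share shifting toward compliance: {share_up:.0%}")
# A histogram of `shifts` would show two separate peaks: the
# bimodal pattern predicted above.
```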
I’m not sure that three-way split would work. After all, Bayes’s rule has fairly obvious un-PC consequences when applied to race or gender, and thinking seriously about transhumanism will require dealing with eugenics-like issues.
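For reference, the rule in question, stated abstractly (this is standard probability theory, not anything specific to the examples above):

$$P(H \mid E) = \frac{P(E \mid H)\,P(H)}{P(E)}$$

The purely formal point is that the posterior tracks the base rate $P(H)$: whenever base rates differ between groups, conditioning on group membership mechanically yields different posteriors. Whether any particular application of that is warranted is exactly the kind of object-level question the proposed third forum would handle.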
“rather than applying to particular issues”
That would simply result in people treating Bayesianism as if it’s a separate magisterium from everyday life.
Think of it as the no-politics rule turned up to 11. The point is not that these things can’t be reasoned about, but that the strong (negative/positive) affect attached to certain things makes them ill-suited to rationalist pedagogy.
Lowering the barrier to entry doesn’t mean you can’t have other things further up the incline, though.
Datapoint: I find that I spend more time reading the politically-charged threads and subthreads than other content, but get much less out of them. They’re like junk food; interesting but not useful. On the other hand, just about anywhere other than LW, they’re not even interesting.
(On running a memory-check, I find that observation applies mostly to comment threads. There have been a couple of top-level political articles that I genuinely learned something from.)