Having certain topics discussed too openly on LessWrong could result in several unfortunate things happening:

1. It could deter certain potential rationalists from participating in the community.
2. It could attract the attention of certain contrarians who are less-than-rational and, for various reasons, should not necessarily be considered potential rationalists.
3. Most importantly, from the standpoint of the Singularity Institute (or, at least, what I think is its standpoint), it could increase the probability of human extinction by harming the SI’s reputation.
Mm, those reasons do make some sense.
As far as 1 goes, it seems to me that’s already happening. I know a few people not on this site (who discovered it independently of me, none of whom know each other), and many more I’ve encountered online, who explicitly view LW as essentially compromised by 2, and hence have no interest in being here. YMMV on how reachable or desirable those people are, of course, but it’s difficult for me to disagree with their basic perception that this place is already full of contrarian-cluster types who are intellectuals but still quite biased.
I also wonder about signalling now, re: “less-than-rational”. Given what I understand of rationality as it’s described here, and the reasons humans don’t tend to display that trait most of the time, it seems like it’s only asymptotically reachable: you can reduce the frequency of incorrect decisions and amend certain biases in short- or long-term ways, but you probably can’t get rid of bias altogether. Who here is truly “rational”? Even Eliezer Yudkowsky still has his own biases. The most you can hope for is, well, “less wrong”, and that takes work to achieve.
So assuming (big assumption here!) that I understand rationality as LW views it, why it’s desirable, and how much a human being can realistically self-optimize for that trait, it seems like the phrase “less than rational” should probably be avoided. Aren’t we all less than rational? Aren’t we all going to be, until such time as we come up with some kind of game-breaking thing that allows a person to really just run rationality full-time if that’s what they want?
I’m not sure I understood you correctly.
You seem to be suggesting that, since the community already falls short of its stated goal, there’s no particular reason to avoid a practice that makes that goal less likely.
Confirm?
Deny.
I am suggesting two things, somewhat separate:
First, that “we might draw people less-than-rational, and that’s undesirable” seems to suggest, in a Sapir-Whorf kind of way, that the utterers consider themselves to be rational, rather than treating rationality as a trait that is valuable to increase in oneself. That suggests to me a degree of reflective incoherence in anyone whose mental model can be described that way, which conflicts with the goal of being less wrong.
Second, that members of this community should probably not give themselves too much credit for rationality, or presume that any given proficiency in the methods of rationality has adequately compensated for their biases. At any given point it is still overwhelmingly likely that their cognition is affected by some unnoticed, unaddressed, and significant bias, which may not be obvious to other members of this largely homogeneous community. This also amounts to reflective incoherence.
Corollary: That this state of affairs is obvious to an unknown but possibly significant number of people who might be supportive of the community’s aggregate goals and methods, but who are put off by the perception of such blind spots. That is, not everyone who looks at LW and rejects it is rejecting rationality, or unsuited for it, or incapable of learning it; nobody here, even the seasoned and highly-upvoted contributors, is without bias.
“Less than rational” isn’t the phrase I’d use; as you say, rationality really shouldn’t be understood as a discrete state but as an asymptotic goal, and even then it’s probably preferable to speak in terms of individual biases or cognitive skills as appropriate. But J_Taylor’s second point doesn’t lose much of its force if you cast it in terms of individuals seeking company in their specific contrarian beliefs, for whom this whole “rationality” business might be little more than a group-identifying label or a justifying habit of thought. Granted, it might eventually be possible to bring such a demographic around to actual truth-seeking, but it’ll take more work than debiasing someone who’s already posting in good faith—and this site isn’t so large or so stable that it can afford to spend a lot of time dragging people out of self-constructed ideological labyrinths in which they’re quite comfortable.
It’s a particularly nasty problem, though: ideology looks like common sense from the inside, and so it’s hard to tell to what extent the site culture’s already corrupted by arational ideas that’ve just happened to achieve local hegemony. I’d like to say that a careful and fearless examination of any beliefs that look like common sense to us should turn up the major problems, but frankly I don’t think we’re there yet—and an outside view, unfortunately, isn’t necessarily going to be helpful. There’s plenty of motivated cognition out there, too.
Nornagest defended the point better than I probably could. Nonetheless, I would like to clarify that “less-than-rational” was slightly too euphemistic on my part. I meant that some contrarians are contrarians for highly problematic reasons. Some of them should not even be considered contrarians, but merely individuals who retain the beliefs of tribes which are not respected within mainstream intellectual culture. These individuals, due to opportunity costs if nothing else, should probably not be considered potential rationalists at this time.
*nods* My assertion that some nontrivial number of such people are already visible contributors here still stands.
Gotcha—thanks for clarifying.
For the record, I agree with your last two paragraphs, and I might agree with your first suggestion as well. I agree that “rational” constantly runs the risk of becoming a mere tribal marker used to enforce in-group/out-group boundaries, detached from any actual improvement in decision-making skills, and that different people here succumb to that temptation to different degrees at different times.
I’m less confident about the idea that being concerned about the quality of people attracted to the site, or endorsing decisions on the basis of such concern, is particularly reliable evidence that the speaker is succumbing to that temptation… but I’m no longer confident you’re even suggesting that.
Oh, I was just chiming in about how Vladimir_M claims that his positions are too unacceptable to be voiced publicly (even though, presumably, he believes they are true), when, given what details I know or have inferred about him, it seems more likely that he has overstated the signalling cost (it is unclear to me what censure or punishment he expects to suffer, apart from reproving comments by people who disagree with him on the internet). I was trying to explain a broader social pattern into which I see his behavior falling, to the person who’d expressed skepticism about his concerns.
I was trying to explain a broader social pattern into which I see his [i.e. mine—V.] behavior falling, to the person who’d expressed skepticism about his concerns.

For someone who wields the word “prejudice” as derogatory, you tend to assume an awful lot about people whom you don’t know at all except for a few paragraphs of their writing about impersonal and abstract topics.
I’m not “wielding” the word prejudice; it’s not a weapon. In the above case I’m referring very specifically to prejudice as a phenomenon, and to its being something less acceptable to signal, not claiming that anything I don’t like qualifies as prejudice. I’m using a specific noun with a pretty basic definition, not suggesting that any particular set of statements is a case example.
It is a weapon. It is routinely used to destroy people’s lives, work, and careers.