Hello.
I’ve been reading Less Wrong since its beginning; I stumbled upon Overcoming Bias just as LW was being launched. I’m a young mathematician (an analyst, to be more specific) currently working towards a PhD, and I’m very interested in epistemic rationality and the theory of altruistic instrumental rationality. I’ve been very impressed with the general quality of discussion here about the theory and general practice of truth-seeking, even though I can think of places where I disagree with the ideas that I gather are widely accepted here. The most interesting discussions seem to be quite old, though, so reviving them out of the blue hasn’t felt like, for lack of a better word, a proper thing to do.
There are many discussions here that I don’t care about. A large proportion of people here are programmers or otherwise from a CS background, and that colors the discussions a lot. Or maybe it’s just that the prospect of an AGI in the near future doesn’t seem at all likely to me. In any case, I just don’t care about the AI/singularity stuff, the tangentially related topics that I bunch together with it, or the habit of approaching rationality topics from a programmer’s point of view. Not very much, at least.
The self-help stuff, the “winning is everything” attitude, and related topics I’d rather not read. Well, I do my best not to. The apparent lack of concern for altruism in those discussions even makes me wish they wouldn’t take place here in the first place.
And then there are the true failings of this community. I had been thinking of registering and posting in some threads about the more abstract sides of rationality, but I must admit I eventually got around to registering and posting because of the gender threads. But there’s just so much bullshit going on! Evolutionary psychology is grossly misapplied (1). The obvious existence of oppressive cultural constructs (2) is flatly denied. The validity of anecdotes and speculation as evidence is hardly even questioned. The topics that started the flaming have no business even being here in the first place. This post pretty well sums up the failures of rationality here at Less Wrong; and that post has been upvoted to 25! Now, the failings and attitudes that surfaced in the gender debate have, of course, been visible for quite some time. But the fact that these failures of thought seem so common has made me wonder whether this community as a whole is actually worth spending my time on.
So, in case you’re still wondering: what has generously been termed “exclusionary speech” really does drive people away (3). I’m still hoping that the professed rationality here is enough to overcome the failure modes that are currently so common (4). But unfortunately, if I rid myself of wishful thinking and accept that it’s not going to happen, I don’t think my potential contributions will be missed.
It’s quite a shame that a community with such good original intentions is failing after a good start. Maybe humans simply can’t overcome their biases (5) yet, even in this day and age.
So. I’d really like to participate in thoughtful discussions with rationalists I can respect. For quite a long time, Less Wrong seemed like the place, but I just couldn’t find a proper point to start from (I dislike introductions). But now, as I’m losing my respect for this community and with it the will to participate here, I’ve started posting. I hope I can regain my confidence that the sanity waterline here is high.
(Now, a proper rationalist in my position would naturally reconsider his own attitudes and beliefs. It might not be surprising that I didn’t find all that much to correct. So I might just as well assume that I haven’t been mind-killed quite yet, and simply make the post I wanted to.)
EDIT: In case you felt I was generalizing with too much confidence—and as I wrote here, I agree I was—see my reply to Vladimir Nesov’s reply.
(1) I think failing to control for cultural influences in evolutionary psychology should be considered at least as much of a fail as postulating group selection. Probably more so.
(2) Somehow I think phrases like “cultural construct”, especially when combined with qualifiers like “oppressive”, trigger immediate bullshit alarms for some. To a certain extent, it’s forgivable, as they certainly have been used in conjunction with some of the most well-known anti-epistemologies of our age. But remember: reversing stupidity doesn’t make you any better off.
(3) This might be a good place to remind the reader that [our kind can’t cooperate](http://lesswrong.com/lw/3h/why_our_kind_cant_cooperate/). (This is actually referring to many aspects of the recent debate, not just one.)
(4) Yes, I know, I can’t cooperate either.
(5) Overcoming Bias is quite an ironic name for that blog. EDIT: This refers exclusively to many of Robin Hanson’s posts about gender differences that I have read. I think I saw a post linking to some of these recently, but I couldn’t find the link just now. Anyway, this footnote probably went a bit too far.
I think I understand your point about overconfidence. I had thought about the post for a day or two, but I wrote it in one go, so I probably didn’t end up expressing myself as well as I could have. I had originally intended to include a disclaimer in my post, but for reasons that now seem obscure I left it out. When making statements as strong and generalizing as mine, one should minimize their ambiguity far more thoroughly than I did.
So, to explain myself a little better: I don’t hold the opinion that what I called “bullshit” is common enough here to make it, in itself, a “failing of this community”. The “bullshit” was, after all, limited to certain threads and to certain individuals. What I’m lamenting and attributing to the whole community is a failure to react to the “bullshit” properly. Of course, that’s a sweeping generalization in itself; certainly not everyone here failed to react in what I consider a proper way. But the broadest consensus among the multitude of opinions seemed to be that the reaction might be hypersensitivity, and that the “bullshit” should be discouraged only because it offends and excludes people (and not because it offends and excludes people for irrational reasons).
And as for overconfidence in my assessment of the “bullshit” itself, I don’t really want to argue about that, any more than I’d want to argue with people who think atheists should be excluded from public office. (Can you imagine an alternate LW in which the general consensus was that that’s a reasonable, though extreme, position to take? That might give an only slightly exaggerated sense of how bizarrely out of place I considered the gender debate to be.) If pressed, I will naturally agree to defend my statements. But I wouldn’t really want to have to, and restarting the debate probably isn’t in anyone else’s best interest either. So I’ll just have to leave the matter as something that, from my perspective, lessens my appreciation of the level of discourse here in quite a disturbing way. Still, that doesn’t mean that LW wouldn’t get the best marks from me as far as the rationality of the internet communities I know is concerned, or that a lower overall rating for “the level of discourse” has lessened my perception of the value of other contributions here.
Now, the latest top-level post critiquing Bayesianism looks quite interesting; I think I’d like to take a closer look at that...