I’m a bit confused by RationalWiki. Is it maintained by anyone? I saw the page for EY, and it seemed to be either genuinely harsh/scathing/dismissive, or a poorly executed inside joke.
RationalWiki is maintained by people who really dislike Less Wrong in general and Eliezer personally.
My own view is that RationalWiki is a terrible, terrible source for anything.
Thanks, will take that into account. Did it start as a LW project, and then shifted, or was it that way from the beginning?
RationalWiki is older than LW, and their definition of “rationality” is quite different from the one used here.
To put it simply, their “rationality” means science as taught at universities + politically correct ideas; and their “irrationality” means pseudoscience + religion + politically incorrect ideas + everything that feels weird (such as the many-worlds interpretation).
Also, their idea of rational discussion is that after you have decided that something is irrational, you should describe it in a snarky way, and feel free to exaggerate if it makes the page funnier. So when anyone later points out a factual error in your article, you can defend it by saying “it was obviously a joke, moron”.
In my understanding, this is how they most likely got obsessed with Eliezer and LessWrong:
1) How does a high-school dropout dare to write a series of articles about quantum physics? Only university professors are allowed to have opinions on such a topic. Obviously, he must be a crackpot. And he even identifies as a libertarian, which makes him a perfect target for RationalWiki: attack pseudoscience and right-wing politics in a single strike!
2) Oops, a debate on Stack Exchange confirmed that the articles about quantum physics… contain a few minor technical mistakes, but the explanation is correct in general, and about half of the people who do quantum physics professionally actually accept some variant of the many-worlds interpretation. So… not a crackpot? But we already have a snarky article about him, so there must be something wrong with him… keep digging…
3) Oh, there was once a deleted thread about an idea that sounded genuinely crazy and dangerous. Also, he is polyamorous… and although judging other people’s sexuality is generally considered politically incorrect, it does not necessarily apply to people we already decided are bad. Therefore: he is a leader of a dangerous cult, sexually abusing his followers. There; we knew there was something fishy going on!
4) Then a really long debate followed on RationalWiki’s talk pages, and some statements about Eliezer and LessWrong were rephrased to a milder form, which is what you see there today.
5) Currently, David Gerard from RationalWiki is camping on the Wikipedia article about Less Wrong, and works hard to keep it as bad as possible, mostly by adding citations from “reliable sources” which happen to be journalists using information they found on RationalWiki, and removing anything that would put Less Wrong in a more positive light, such as any mention of effective altruism.
¯\_(ツ)_/¯
It’s worth noting that David Gerard contributed a lot to LessWrong in its early days as well, so he’s not simply an outsider.
The Wikipedia page on LW doesn’t seem particularly awful at the moment. (And in particular it does in fact mention effective altruism.)
Slightly better than the last time I saw it.
Still, the “Neoreaction” section is 3x longer than the “Effective Altruism” section. Does anyone other than David Gerard believe this impartially describes Less Wrong? (And where are the sections for the other political alignments mentioned in LW surveys? Oh, we are cherry-picking, of course.)
No mention of the Sequences, other than “seed material to create the community blog”. I guess truly no one reads them anymore. :(
Not true!
ReadTheSequences.com has gotten a steady ~20k–25k monthly page views (edit: excluding bots/crawlers, of course!) for 11 months and counting, and I am aware of half a dozen rationality reading groups around the world which are doing Sequence readings (and that’s just the ones using my site).
(And that doesn’t, of course, count people who are reading the Sequences via LW/GW, or by downloading and reading the e-book.)
We are getting about 20k page hits per month on the /rationality page on LessWrong, and something in the 100k range on all sequences posts combined.
Cherry-picking indeed! The NRx section is about 2.5x the length of the EA section (less if you ignore the citations), and about a quarter of it is the statement “Eliezer Yudkowsky has strongly repudiated neoreaction”. Neoreaction is more interesting because in most places there would be (to a good approximation) zero support for it, rather than the small amount found on LW.
I mean, I don’t want to claim that the WP page is good, and I too would shed no tears if the section on neoreaction vanished, but it’s markedly less terrible than suggested in this thread.
If Jehovah’s Witnesses come to my door, I spend a few minutes talking with them, and then ask them to leave and never return, will I also get a subsection “Jehovah’s Witnesses” on Wikipedia? I wouldn’t consider that okay even if the subsection contained the words “then Viliam told them to go away”. Like, why mention it at all, if that’s not what I am about?
I suppose if there were a longer article about LW, I wouldn’t mind spending a sentence or two on NR. It’s just that in the current version, the mention is disproportionately long—and it has its own subsection to make it even more salient. Compare with how much space the Sequences get; actually, they are not mentioned at all. But there is a whole paragraph about the purpose of Less Wrong. One paragraph about everything LW is about, and one paragraph mentioning that NR was here. Fair and balanced.
What if a bunch of JWs camped out in your garden for a month, and that was one of the places where more JWs congregated than anywhere else nearby? I think then you’d be in danger of being known as “that guy who had the JWs in his garden”, and if you had a Wikipedia page then it might well mention that. It would suck, it would doubtless give a wrong impression of you, but I don’t think you’d have much grounds for complaint about the WP page.
LW had neoreactionaries camped out in its garden for a while. It kinda sucked (though some of them were fairly smart and interesting when they weren’t explaining smartly and interestingly all about how black people are stupid and we ought to bring back slavery; it’s not like there was no reason at all why they weren’t all just downvoted to oblivion and banned from the site) and the perception of LW as a hive of neoreaction is a shame—and yes, there are people maliciously promoting that perception and I wish they wouldn’t—but I’m not convinced that that WP article is either unfair or harmful. It says “neoreactionaries have taken an interest in LW” rather than “LW has taken an interest in neoreaction” and the only specific LW attitude to neoreaction mentioned is that the guy who founded the site thinks NRx is terrible. I don’t think anyone is going to be persuaded by the WP article that LW is full of neoreactionaries, and if someone who has that impression reads the article they might even be persuaded that they’re wrong.
Again, for the avoidance of doubt, I’m not claiming that the WP article is good. But it’s hardly “as bad as possible” either. That’s all.
I mostly agree, except for:
I don’t think anyone is going to be persuaded by the WP article that LW is full of neoreactionaries, and if someone who has that impression reads the article they might even be persuaded that they’re wrong.
I believe this is not how most people think. The default human mode is thinking in associations. Most people will read the article and remember that LW is associated with something weird right-wing. Especially when “neoreaction” is a section header, which makes it hard to miss. The details about who took interest in whom, if they notice them at all, will be quickly forgotten. (Just like when you publicly debunk some myths, it can actually make people believe them more, because they will later remember they heard it, and forget it was in the context of debunking.)
If the article instead had a section called “politics on LW” mentioning the ‘politics is the mindkiller’ slogan, how Eliezer is a libertarian, and then the complete results of a political poll (including the NR)… most people would not remember that NR was mentioned there.
Similarly, the length of sections is instinctively perceived as a measure of how closely things are related. Effective altruism is reduced to one very unspecific sentence, while Roko’s basilisk gets a relatively longer explanation. Of course (availability bias), this creates the impression that the basilisk is more relevant than effective altruism.
If the article instead had a longer text on effective altruism (for example, a short paragraph outlining the idea, preceded by a link “Main article: Effective altruism”), people would get the impression that LW and EA are significantly connected.
The same truth can be described in ways that leave completely opposite impressions. I think David understands quite well how this game works, which is why he keeps certain sections shorter and certain sections longer, etc.
Fair point about association versus actual thinking. (Though at least some versions of the backfire effect are doubtful...)
I don’t think this is all David Gerard’s fault (at least, not the fault of his activities on Wikipedia). Wikipedia is explicitly meant to be a summary of information available in “reliable sources” elsewhere, and unfortunately I think it really is true that most of the stuff about LW in such sources is about things one can point at and laugh or sneer, like Roko’s basilisk and neoreaction. That may be a state of affairs that David Gerard and RationalWiki have deliberately fostered (it certainly doesn’t seem to be one they’ve discouraged!) but I think the Wikipedia article might well look just the way it does now if there were some entirely impartial but Wikipedia-rules-lawyering third party watching it closely instead of DG. E.g., however informative the LW poll results might be, it’s true that they’re not found in a “reliable source” in the Wikipedia sense. And however marginal Roko’s basilisk might be, it’s true that it’s attracted outside attention and been written about by “reliable sources”.
This is a good point. The Wikipedia pages for other sites, like Reddit, also focus unduly on controversy.
Wikipedia is explicitly meant to be a summary of information available in “reliable sources” elsewhere
So there seems to be an upstream problem that the line between “reliable sources” and “clickbait” is quite blurred these days.
This is probably not true for things that are typically written about in textbooks; but true for things that are typically written about in mainstream press.
How does a high-school dropout dare to write a series of articles about quantum physics? Only university professors are allowed to have opinions on such a topic. Obviously, he must be a crackpot.
Have you noticed that most writings by laypeople on QM actually are crackpottery? RW’s priors are in the right place, at least.
I fully agree (about the priors on QM). The problem is somewhere else. I see two major flaws:
First, the “rationality” of RW lacks self-reflection. They sternly judge others, but consider themselves flawless. To explain what I mean, imagine that I knew nothing about QM other than the fact that 99% of online writings about QM are crackpottery, and then found an article about QM that sounds weird. Would I trust the article? No; that’s what priors are for. Would I write my own article denouncing its author as a crackpot? Also no, because I would be aware that I know nothing about QM, and that despite the 99% probability of crackpottery, there is also a 1% probability that it is correct; my lack of knowledge would not allow me to update after reading the article itself, so I would be stuck with my priors. I would leave writing the denunciation to someone who actually understands the topic—someone who can say “X is wrong, because it is actually Y”, instead of merely “X is wrong, because, uhm, my priors” or even “X is wrong, trust me, I am the expert”. The latter is closest to what RW does. They pretend to be experts at science and pseudoscience, but in fact they are not; they merely follow a few simple heuristics which are correct in most cases, and sometimes misfire. In this specific case, they followed a (good) heuristic about QM writings instead of actually understanding QM and judging Eliezer’s articles by their content—but they made it sound like there is a problem specifically with the articles.
Second, it is difficult to update if you burn the bridges after making your first estimate. If I publicly say I disagree with you, I may later change my mind and say that after giving it more thought I actually agree with you; and I will not lose much face, especially among rational people. But if instead I publicly call you a crackpot or worse, and then it turns out that maybe you were right… it will cost me a lot of face to admit it. Being human, the natural reaction is to double down regardless of evidence, cherry-pick in favor of my original judgment, and try to move the goalposts. And you can hardly avoid burning the bridges if you make everything political (is Eliezer’s libertarianism really relevant for evaluating whether he is wrong about QM? no, but it was so tempting to connect libertarianism with supposed crackpottery), and if your culture of communication is “snarky”, i.e. being an asshole and proud of it.
To make a mistake when the priors point the wrong way is unavoidable. To resist further evidence so strongly that a few years down the line you are spending your nights editing your opponent’s Wikipedia page just to prove that you were right… that’s insane.
Are you quite sure “they” are a cohesive group?
Are you quite sure “they” couldn’t possibly include any actual physicists?
So LW never makes sweeping denunciations?
I suppose that people who disagree with the snarky way of looking at political opponents will not stay there for long.
There is also a difference between a forum and a wiki. (The medium is the message, kind of.) In a forum, you can write an article expressing your opinions, and then I can write an article about why I disagree with your opinions. In a wiki, I will simply revert your edit. Thus, wikis are more likely to converge on a unified view.
(deleted)
No, RationalWiki never had anything to do with Less Wrong.
It started as the leftist alternative to Conservapedia.
Really? Do you have any links on that? I wasn’t aware.
The Wikipedia article on it does.
You’re right, literally says it in the second line.