Slightly better than the last time I saw it.
Still, the “Neoreaction” section is 3x longer than the “Effective Altruism” section. Does anyone other than David Gerard believe this impartially describes Less Wrong? (And where are the sections for the other political alignments mentioned in LW surveys? Oh, we are cherry picking, of course.)
No mention of the Sequences, other than “seed material to create the community blog”. I guess truly no one reads them anymore. :(
Not true!
ReadTheSequences.com has been getting a steady ~20k–25k monthly page views (edit: excluding bots/crawlers, of course!) for 11 months and counting, and I am aware of a half-dozen rationality reading groups around the world which are doing Sequence readings (and that’s just those using my site).
(And that doesn’t, of course, count people who are reading the Sequences via LW/GW, or by downloading and reading the e-book.)
We are getting about 20k page hits per month on the /rationality page on LessWrong, and something in the 100k range on all sequences posts combined.
Cherry-picking indeed! The NRx section is about 2.5x the length of the EA section (less if you ignore the citations), and about a quarter of it is the statement “Eliezer Yudkowsky has strongly repudiated neoreaction”. Neoreaction is more interesting because in most places there would be (to a good approximation) zero support for it, rather than the small amount found on LW.
I mean, I don’t want to claim that the WP page is good, and I too would shed no tears if the section on neoreaction vanished, but it’s markedly less terrible than suggested in this thread.
If Jehovah’s Witnesses come to my door, and I spend a few minutes talking with them and then ask them to leave and never return, will I also get a subsection “Jehovah’s Witnesses” on Wikipedia? I wouldn’t consider that okay even if the subsection contained the words “then Viliam told them to go away”. Like, why mention it at all, if that’s not what I am about?
I suppose if there were a longer article about LW, I wouldn’t mind it spending a sentence or two on NR. It’s just that in the current version, the mention is disproportionately long—and it has its own subsection, making it even more salient. Compare this with how much space the Sequences get: none at all, actually. But there is a whole paragraph about the purpose of Less Wrong. One paragraph about everything LW is about, and one paragraph mentioning that NR was here. Fair and balanced.
What if a bunch of JWs camped out in your garden for a month, and that was one of the places where more JWs congregated than anywhere else nearby? I think then you’d be in danger of being known as “that guy who had the JWs in his garden”, and if you had a Wikipedia page then it might well mention that. It would suck, it would doubtless give a wrong impression of you, but I don’t think you’d have much grounds for complaint about the WP page.
LW had neoreactionaries camped out in its garden for a while. It kinda sucked (though some of them were fairly smart and interesting when they weren’t explaining smartly and interestingly all about how black people are stupid and we ought to bring back slavery; it’s not like there was no reason at all why they weren’t all just downvoted to oblivion and banned from the site), and the perception of LW as a hive of neoreaction is a shame—and yes, there are people maliciously promoting that perception and I wish they wouldn’t—but I’m not convinced that the WP article is either unfair or harmful. It says “neoreactionaries have taken an interest in LW” rather than “LW has taken an interest in neoreaction”, and the only specific LW attitude to neoreaction mentioned is that the guy who founded the site thinks NRx is terrible. I don’t think anyone is going to be persuaded by the WP article that LW is full of neoreactionaries, and if someone who has that impression reads the article they might even be persuaded that they’re wrong.
Again, for the avoidance of doubt, I’m not claiming that the WP article is good. But it’s hardly “as bad as possible” either. That’s all.
I mostly agree, except for the claim that no one will be persuaded by the article that LW is full of neoreactionaries.
I believe this is not how most people think. The default human mode is thinking in associations. Most people will read the article and remember that LW is associated with something weird and right-wing—especially when “neoreaction” is a section header, which makes it hard to miss. The details about who took interest in whom, if they notice them at all, will be quickly forgotten. (Just as publicly debunking a myth can actually make people believe it more, because they will later remember having heard it and forget it was in the context of debunking.)
If the article instead had a section called “politics on LW” mentioning the ‘politics is the mindkiller’ slogan, how Eliezer is a libertarian, and then the complete results of a political poll (including NR)… most people would not remember that NR was mentioned there.
Similarly, the length of sections is instinctively perceived as a measure of how closely related the things are. Effective altruism is reduced to one very unspecific sentence, while Roko’s basilisk gets a relatively longer explanation. Of course (availability bias) this creates the impression that the basilisk is more relevant than effective altruism.
If the article instead had a longer text on effective altruism (for example, a short paragraph outlining the idea, preceded by a link “Main article: Effective altruism”), people would get the impression that LW and EA are significantly connected.
The same truth can be described in ways that leave completely opposite impressions. I think David understands quite well how this game works, which is why he keeps certain sections shorter and certain sections longer, etc.
Fair point about association versus actual thinking. (Though at least some versions of the backfire effect are doubtful...)
I don’t think this is all David Gerard’s fault (at least, not the fault of his activities on Wikipedia). Wikipedia is explicitly meant to be a summary of information available in “reliable sources” elsewhere, and unfortunately I think it really is true that most of the stuff about LW in such sources is about things one can point at and laugh or sneer at, like Roko’s basilisk and neoreaction. That may be a state of affairs that David Gerard and RationalWiki have deliberately fostered—it certainly doesn’t seem to be one they’ve discouraged!—but I think the Wikipedia article might well look just the way it does now if there were some entirely impartial but Wikipedia-rules-lawyering third party watching it closely instead of DG. E.g., however informative the LW poll results might be, it’s true that they’re not found in a “reliable source” in the Wikipedia sense. And however marginal Roko’s basilisk might be, it’s true that it’s attracted outside attention and been written about by “reliable sources”.
This is a good point. The Wikipedia pages for other sites, like Reddit, also focus unduly on controversy.
So there seems to be an upstream problem: the line between “reliable sources” and “clickbait” is quite blurred these days.
This is probably not true for things that are typically written about in textbooks, but it is true for things that are typically written about in the mainstream press.