David Chapman has said himself that when he refers to rationality, what he is talking about has nothing to do with LessWrong. He is referring to the much older philosophical movement of “Rationalism”. The whole thing with Chapman is literally just an annoying semantic misunderstanding. He also has some specific critiques of things that Eliezer said, but 95% of the time his critiques of rationalism have absolutely nothing to do with what is written on this site.
Also, having 220 karma on the site is really not much evidence you understand what rationality is about. David Gerard has over 1000 karma and very clearly doesn’t understand what the site is about either.
I am pretty sure Chapman has also said he hasn’t read the sequences, though generally I think he understands most content on the site fine. The problem is again not that he doesn’t understand the site, but just that he is using the word rationality to mean something completely different. I like a bunch of his critiques, and indeed Eliezer made 90% of the same critiques when he talked about “old Rationality” in the sequences.
See this tweet:
https://twitter.com/Meaningness/status/1298019579978014720
Not sure to what extent I’m subtweeted here, but in case clarification is helpful, by “rationalism” I do NOT mean LW. It’s weird that the Berkeley people think “rationalism” is something they invented, when it’s been around for 2600+ years.
Prototypical rationalists are Plato, Kant, Bertrand Russell, not Yudkowsky. If you want a serious living figure, Chomsky or Dennett maybe.
The problem is again not that he doesn’t understand the site, but just that he is using the word rationality to mean something completely different.
Yes, but he does use the word Bayesianism to talk about the paradigm of the sequences. He has written a substantial criticism of Bayesianism (which is Yudkowsky, and not Plato, Kant, or Bertrand Russell).
David Gerard has over 1000 karma and very clearly doesn’t understand what the site is about either.
David Gerard not only has 1000 karma but also had, for a long time, admin rights on at least our Wiki. I think it’s strawmanning him to say that he just doesn’t understand LessWrong when he spent years in our community and then decided that it’s not the right place for him anymore.
I also think there’s an issue here with saying that people who spent most of their time on LessWrong long before you signed up your account, and who then left and had critiques, simply don’t understand what LessWrong was about.
I think David has a strong sense that it’s important to put faith in established authorities, and he correctly assesses that LessWrong is not about following established authority. It’s the same clash that gets him to write against cryptocurrency.
I looked at how exactly David got the 1000 karma points here, curious whether there were some super popular articles I missed. The answer seems to be that he created lots of Open Threads and similar posts, each of them getting a few upvotes. The second most frequent thing is links to external articles (including two articles about why you shouldn’t buy Bitcoins, written in 2011 and 2014, heh).
Looking for a text he wrote that isn’t a short comment on a linked external page, I found “Attempt to explain Bayes without much maths, please review”, a reasonable short non-technical summary of what Bayesianism means, currently at 17 karma.
Plus lots of comments, most written in 2014 or earlier, and many of those are quite good! (Then he got downvoted in debates about chiropractors and neoreaction, and he quit. He returned briefly in 2017 when I triggered him with my comment on his edits to the Wikipedia page about LessWrong.)
I think David has a strong sense that it’s important to put faith in established authorities, and he correctly assesses that LessWrong is not about following established authority. It’s the same clash that gets him to write against cryptocurrency.
To me, this seems like the essence of RationalWiki. Rationality = what the mainstream authorities approve of. Going beyond that, that’s crazy talk, and needs to be called out. For example, the RW page on LW cannot resist pointing out that the Sequences present “many-worlds (a mainstream, but by no means universally accepted interpretation of quantum mechanics) as if it was proven fact”. Like, the authorities don’t even oppose this, they merely don’t universally accept it, and RationalWiki already needs to warn the reader about the heresy.
Dunno, seems to me that David is really passionate about things, and he means well, and he tried to like LessWrong despite all its faults… but he has a very strong need to be right / to be on the right side, and he takes it really hard when something wrong happens; he can’t tolerate disagreement even if it’s in the past and no one other than him cares anymore. The basilisk is history: it was scary at first, then it was boring, then it got annoying. Neoreaction is history: we flirted with them a bit, then they got boring and annoying, and we showed them the door; we don’t think about them anymore. For David, it’s probably still alive, still worth talking about; he needs closure. It must be documented on Wikipedia that he was right and we were wrong. Even his bullying of Scott is kinda like religious fanatics harassing an apostate; from their own perspective they mean well, it is tough love, they are trying to make him repent and save his soul! (Except, oops, now David is officially banned from editing the Wikipedia article on Scott.)
Of course, this is all mind-reading, and I may be completely wrong. But it’s my current best guess. I know David is a good guy in his own story; now I believe I got a glimpse of it, and it just makes me feel sad. I wish him well, and I also wish he could get over LessWrong and the rationalist community already. Though that seems unlikely to happen; the internet is a small place, and we will keep bumping into each other.
…the RW page on LW cannot resist pointing out that the Sequences present “many-worlds (a mainstream, but by no means universally accepted interpretation of quantum mechanics) as if it was proven fact”. Like, the authorities don’t even oppose this, they merely don’t universally accept it, and RationalWiki already needs to warn the reader about the heresy.
You say that like it’s a bad thing.
Knowing how sure science is about its claims is an important part of science. A member of the reading public who believes that the interpretation of quantum mechanics is a complex subject that even the experts don’t understand has a better understanding than someone who thinks MWI is 99.999% certain… even if MWI is correct.
…warn the reader about the heresy.
Science says there are a number of possible answers; LessWrong says there is one… who is being more religious?
Suppose you have three hundred scientists and three competing interpretations. A hundred scientists believe S1, a hundred believe S2, and a hundred believe S3. LW believes S1.
I guess my point is that I wouldn’t consider it necessary to add a disclaimer to the three hundred scientists. Therefore I don’t consider it necessary to add such a disclaimer to LW.
But of course, adding disclaimers to everyone is also a consistent opinion.
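To make the arithmetic behind this explicit (a minimal sketch of my own, with purely illustrative numbers, not anything from the thread): a reader who weights the three equally sized expert camps equally ends up with a credence of 1/3 in S1, which is nowhere near the 99.999% figure mentioned earlier.

```python
# Illustrative only: naive credence in S1 if you treat each of the
# three equally sized expert camps as equally likely to be right.
experts = {"S1": 100, "S2": 100, "S3": 100}
total = sum(experts.values())

credence_s1 = experts["S1"] / total  # 100 / 300
print(f"P(S1) = {credence_s1:.3f}")  # 0.333, far from the 99.999% above
```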
To me it seems like a funny mission creep, in the context of RationalWiki: start with calling out pseudoscience, end with calling out people who agree with some-but-not-all mainstream scientists.
If you have a hundred scientists scattered through the world who believe S1, and have nothing else in common, that’s one thing. If they all live together, know each other, and go to the same church, then there is reason to believe their acceptance of S1 is groupthink, and not pure scientific objectivity.
No, his critique of Bayesianism is also attacking something very different from the sequences; it is again talking about something much narrower. Indeed, substantial fractions of the sequences overlap with his critique of Bayesianism (in particular all the stuff about embeddedness, logical uncertainty, uncomputability, and TDT-style concerns). I don’t think he agrees with everything in the sequences, but when he writes critiques, I am pretty sure he is responding to something other than the sequences.
David Gerard not only has 1000 karma but also had, for a long time, admin rights on at least our Wiki. I think it’s strawmanning him to say that he just doesn’t understand LessWrong when he spent years in our community and then decided that it’s not the right place for him anymore.
No, just because you spend years here does not mean you understand the core ideas.
I think we have plenty of evidence that David Gerard frequently makes up random strawmen that have nothing to do with us. Maybe there is a small corner of his mind that does have an accurate model of what we are about, but almost always when he writes something, he says random things that have very little to do with what we actually do.
Chapman has also specifically said that he does not understand LW:

I frequently emphasize that by “rationalism” I am specifically NOT referring to the LW usage, and that is not the target of my critique. I gave up on trying to understand LW rationalism ~5 years ago.