Saw this on twitter. Hilarious: “Ballad of Big Yud”
http://www.youtube.com/watch?v=nXARrMadTKk
There is another video from the same author explaining his opinions on LW. It takes two minutes before it even starts talking about LW, so here are the important parts: ---
The Sequences are hundreds and hundreds of blog posts, written by one man. They read like a catechism and teach strange vocabulary like “winning”, “paying rent”, “mindkilling”, “being Bayesian”.
The claim that Bayes theorem, which is just a footnote in a statistics textbook, has the power to reshape your thinking so that you can maximize the outcomes of your life… has no evidence. You can’t simplify the complexity of life into simple probabilities. EY is a high-school dropout and he has no peer-reviewed articles. (For what the footnote actually says, see the note after this summary.)
People on LW say that criticism of LW is upvoted. Actually, that “criticism” does not disagree with anything—it just asks MIRI to be more specific. Is that LW’s best defense against accusations of cultishness?
The LW community believes in the Singularity, which, again, has no evidence and no support from the scientific community. MIRI asks for your money, and does not say specifically how it will be used to save the world.
LW claims that politics is the mindkiller, yet EY admits that he is a libertarian. Most of MIRI’s money comes from Peter Thiel—a right-wing libertarian billionaire.
Roko’s basilisk...
...and these guys pretend to be skeptics?
Now let’s look at CFAR. They have EY on their board, and they force you to read the Sequences if you want to join them.
Julia Galef is a rising star in the skeptical movement; she has a podcast, “Rationally Speaking”. But she is connected with LW, she believes in Bayes theorem, and she only criticizes the political left. She is obviously used as the face of the LW movement because she is pretty! -- This is sexism on LW’s part, because men at LW agree in comments that Julia is pretty. If they weren’t sexist, they would talk about how smart she is.
People like this are not skeptics and should not be invited to Skepticon!
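(For reference, since the video calls it “just a footnote”: Bayes theorem is the rule for updating a probability on new evidence,

    P(H|E) = P(E|H) · P(H) / P(E)

that is, your belief in a hypothesis H after seeing evidence E is your prior belief, scaled by how much better H predicts E than average. As I understand it, the dispute is not about the formula itself but about whether habitually applying it can reshape your life.)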
There’s a user at RationalWiki, one of the dedicated LW critics there, called “Baloney Detection”. I often wondered who it was. The image at 5:45 in this video, and the fact that “Baloney Detection” also edited the “Julia Galef” page at RW to decry her association with LW, tell me this is him…
By the way, the RW article about LW now seems more… rational… than the last time I checked. (Possibly because our hordes of cultists sponsored by the right-wing extremist conspiracy fixed it, hoping to receive the promised 3^^^3 robotic virgins in singularitarian paradise as a reward.) You can’t say the same thing about the talk pages, though.
It’s strange. Now I should probably update towards “a criticism of LW found online probably somehow comes from two or three people on RW”. On their talk pages, Aris Katsaris sounds like a lonely sane voice in a desert of… I guess it’s supposed to be “rationality with a snarky point of view”, which works like this: I can say anything, and if you catch me lying, I say I was exaggerating to make it funnier.
Some interesting bits from the (mostly boring) talk page:
Yudkowsky is an uneducated idiot because there simply can’t be 3^^^3 distinct people
A proper skeptical argument about why “Torture vs Dust Specks” is wrong.
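(A side note on the notation, because the quote turns on it: 3^^^3 is Knuth’s up-arrow notation, not a typo. Here is a minimal sketch in Python; the function is my own illustration, assuming only the standard definition:

    # Knuth's up-arrow: one arrow is exponentiation, each extra arrow
    # iterates the previous operation (towers of towers, and so on).
    def up_arrow(a, n, b):
        """Compute a (up^n) b; only tiny inputs ever finish."""
        if n == 1:
            return a ** b
        if b == 0:
            return 1
        return up_arrow(a, n - 1, up_arrow(a, n, b - 1))

    print(up_arrow(3, 1, 3))  # 3^3  = 27
    print(up_arrow(3, 2, 3))  # 3^^3 = 3^(3^3) = 7625597484987
    # 3^^^3 = 3^^(3^^3): a power tower of 3s of height 7,625,597,484,987.

So yes, 3^^^3 vastly exceeds any possible number of distinct people; the thought experiment uses it precisely because it is unimaginably large, which is why the “argument” above misses the point.)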
what happened is that they hired Luke Muehlhauser who doesn’t know about anything technical but can adequately/objectively research what a research organization would look like, and then push towards outwards appearance of such
This is why LW people care about Löb’s Theorem, in case you (LW cultists not belonging to the inner circle) didn’t know.
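(In case the reference is unfamiliar: Löb’s theorem says that for a theory T extending Peano Arithmetic,

    if T ⊢ Prov_T(⌜P⌝) → P, then T ⊢ P

that is, a theory that proves “if P is provable, then P” already proves P outright. As far as I know, MIRI cares about it because of questions about agents trusting proofs produced by their own future versions, which is a technical reason, whatever one thinks of the snark above.)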
Using Thiel’s money to list yourself as co-author is very weak evidence of competence.
An ad-hoc explanation is being prepared. Criticising Eliezer for being a high school dropout and never publishing in a peer-reviewed journal is so much fun… but if he some day publishes in a peer-reviewed journal and gets citations or other recognition from the scientific establishment, RationalWiki already knows the true explanation—the right-wing conspiracy bribed the scientists. (If the day comes that RW starts criticizing scientists for supporting LW, I will be laughing and munching popcorn.)
Holden Karnofsky’s critique had a significant number of downvotes as well—being high profile, they didn’t want to burn the bridges, so it wasn’t deleted, and a huge number of non-regulars upvoted it.
How do you know what you know? Specifically, where are those data about who upvoted and downvoted Holden coming from? (Or is it another way of explaining things away? LW does not accept criticism and censors everything, but this one time the power of popular opinion prevented them from deleting it.)
And finally a good idea:
This talk page is becoming one of the central coordination points for LW/SI’s critic/stalkers. Maybe that should be mentioned on the page too?
I vote yes.
The article was improved ’cos AD (a RW regular who doesn’t care about LW) rewrote it.
It was disappointing to see Holden’s posts get any downvotes.
I agree, but we are speaking about approximately 13 downvotes from 265 total votes. So we have at least 13 people on LessWrong who oppose a high-quality criticism.
The speculation about regulars downvoting and non-regulars upvoting is without any evidence; could have also been the other way round. We also had a few trolls and crazy people here in the past. And by the way, it’s not like people from RationalWiki couldn’t create throw-away accounts here. So, with the same zero evidence, I could propose an alternative hypothesis that Holden was actually downvoted by people from RW who smartly realized that his “criticism” of LW is actually no criticism. But that would just be silly.
Or there are approximately 13 people who believe the post is worth a mere 250 votes, not 265 and so used their vote to push it in the desired direction. Votes needn’t be made or considered to be made independently of each other.
One data point: I used to do that kind of thing before the “% positive” thing was implemented, but I no longer do that, at least not deliberately.
I am pleasantly surprised that they didn’t get overwhelmed by the one or two LW trolls that swamped them a couple months back.
Looking through the talk pages, it seems those guys partially ran out of steam, which let cooler heads prevail.
My own thoughts:
I wonder how much “hundreds of blog posts written by one man” is the true rejection. I mean, would the reaction be different if it was a book instead of hundreds of blog posts? Would it be different if the Sequences were on a website separate from LessWrong? -- The intuition is that a “website by one man” would seem more natural than a “website mostly by one man”. People do have their personal blogs, and that’s not controversial. Even if a personal blog gets hundreds of comments, it still feels like a personal blog, not like a movement.
(Note: I am not recommending any change here. Just thinking out loud about whether there is something about the format of the website that provokes people, or whether it is mere “I dislike you, therefore I dislike anything you do”.)
Having peer-reviewed articles (not just conference papers) or otherwise being connected with the scientific establishment would obviously be a good argument. I’m not saying it should be high priority for Eliezer, but if there is a PR department in MIRI/CFAR, it should be a priority for them. (Actually, I can imagine some CFAR ideas published in a pedagogical journal—that also counts as official science, and could be easier.)
The cultish stuff is the typical “did you stop beating your wife?” pattern. Anything you respond… is exactly what a cult would do. (Because acting cultish is evidence of being a cult, but not acting cultish is also evidence of being a cult, because cults try to appear non-cultish. And by the way, using the word “evidence” is evidence of being a brainwashed LW follower.)
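(To spell out why that pattern is broken as reasoning: probabilities have to balance. Writing C for “is a cult” and E for “acts cultish”,

    P(C) = P(C|E) · P(E) + P(C|¬E) · P(¬E)

so P(C|E) and P(C|¬E) cannot both exceed P(C). If seeing cultish behavior raises your estimate, seeing non-cultish behavior must lower it; an accusation that treats every possible observation as confirmation conveys no information.)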
What is the relation between politics and skepticism? I mean, do all skeptics have to be perfectly politically neutral? Or is left-wing politics compatible with skepticism and only right-wing politics is incompatible? (I am not sure which of these was the author’s opinion.) How about things like “Atheism Plus”? And here is a horrible thought… if some research showed a non-zero correlation between atheism and a position on the political spectrum, would it mean that atheists are also banned from the skeptical movement?
I appreciate the spin of saying that Julia is just a pretty face, and then suddenly attributing this opinion to LW. I mean, that’s a nice Dark Arts move: say something offensive, and then pretend it was actually your opponent who believes it, not you. (The author is mysteriously silent about his own opinion. Does he believe that Julia is not smart? Or does he believe that she is smart, but that this is completely incidental to her representing LW at Skepticon? Either choice would be very suspicious, so he just does not specify it. And he turns off the comments on YouTube, so we cannot ask.)
If it was a book, it’d be twice the size of Lord Of The Rings.
The only point I feel the need to contest is “EY admits he is libertarian”. What I remember is EY admitting that he was previously a libertarian and then stopped.
Well, and “EY is a high school dropout with no peer reviewed articles”, not because it’s untrue, but because neither of those is all that important.
The rest is sound criticism, so far as I can tell.
Here is a comment (from 2007) about it:
I started my career as a libertarian, and gradually became less political as I realized that (a) my opinions would end up making no difference to policy and (b) I had other fish to fry. My current concern is simply with the rationality of the disputants, not with their issues—I think I have something new to say about rationality.
It could be interpreted as Eliezer no longer being libertarian, but also as Eliezer remaining libertarian, just moving more meta and focusing on more winnable topics.
Sure, but why does it feel (at least to the author) so important? I guess it is the heuristic “if you are not a scientist and you speak a lot about science, you got it wrong”. Which may be generally correct, if people obsessed with science usually become either scientists or pseudoscientists.
The part about Julia didn’t sound fair to me—but perhaps you should see the original, not my interpretation. It starts at 8:50.
Otherwise, yes, he has some good points, he is just very selective about the evidence he considers. I was most impressed by the part about Holden’s non-criticism. (More meta, I wonder how he would interpret this agreement with his criticism. Possibly as something unimportant, or as something a cult would do to try to appear non-cultish.)
In 2011, he describes himself as “a very small-‘l’ libertarian” in this essay at Cato Unbound.
I think what this is really saying is that Galef is socially popular, especially among skeptics (she has a popular blog, co-hosts multiple podcasts, and all that), but she’s not necessarily smarter, or even more involved in LW activities, than many other LW folks, e.g. Eliezer (presumably, MIRI/CFAR has a reputation of very smart folks being involved, hence the confusion). So, the argument goes, it’s not really clear why she should get to be the public face of LW, but it’s certainly convenient in that, again, LW is made to look less like a cult than it really is.
I hope I am not mistaken about this, but it seems to me that MIRI and CFAR were separated because the former focuses on “Friendly AI” and the latter on “raising the sanity waterline”. It’s not just a difference in topic, but the topic also determines tools and strategy. -- To research Friendly AI, you need to find good mathematicians, develop a mathematical theory, convince AI researchers about its seriousness, publish in peer-reviewed journals, and ultimately develop the machine. To raise the sanity waterline, you need to find good teachers, develop a curriculum, educate people, and measure the impact. -- Obviously, Eliezer cares mostly about the former, and I believe even the author of the video would agree with that.
So, pretty likely, Eliezer is not the most involved person in CFAR. I don’t know enough about CFAR’s internals to say precisely who that person is. Perhaps there are many people contributing significantly in ways that can’t be directly compared; is it more important to research the curriculum, write the textbooks, test the curriculum, connect people, or keep everything running smoothly? Maybe it’s not Julia, but that doesn’t mean it’s Eliezer.
I guess CFAR could also send Anna Salamon, Michael Smith, Andrew Critch, or anyone else from their team to Skepticon. Would that be better? Or, unless it is Eliezer personally, will it always seem like the dark overlord Eliezer is hiding behind someone else’s face? (Actually, I wouldn’t mind if Eliezer went to Skepticon, if he thought it was the best way to use his time.) How about all of them going to Skepticon together: would that be acceptable? Or is it: anyone but Julia?
By the way, I really liked Julia’s Straw Vulcan lecture, and sent a few people a hyperlink. So she has some interesting things to say, too. And those things are completely relevant to CFAR goals.
Chorus … We should help him read the sequences … shambles forward
The anti-LW’ers have become quite a community themselves; the video references XiXiDu and others.
It’s thoroughly entertaining, the music especially.
Edit: I must say I found this statement by the video’s author quite illuminating with regard to his strong discounting of Bayesian reasoning:
To his benefit, Dmytry explained it to him, and now all is well again.