Reply to this comment if you found LW through Harry Potter and the Methods of Rationality!
A survey for anyone who cares to respond (edit: specifically for people who did find LW through HPMoR):
Had you already registered an account before seeing this? (Edit: That is, had you already registered an account for a reason other than to reply to this comment?) If not, had you been planning or expecting to?
Have you been reading through the sequences, or just generally looking around and lurking?
What new rationality skills that you learned from HPMoR or LW have you found most useful? Most interesting? Most change-the-way-you-look-at-everything-ly?
Have you referred anyone else to HPMoR? Have you referred anyone else to LW?
Yes, I had registered an account, and had managed ten whole karma points as of this post, of which I am rather proud.
I have been reading through the sequences.
I’ve found a lot of the biases fascinating, particularly when it comes to testing a hypothesis, and I just finished a sequence on words and definitions, which I quite enjoyed.
I’ve attempted to refer a couple people, but found that my brother had already found Less Wrong independently (and hadn’t told me about it!).
I knew of LW’s existence before HPMoR, through the same source that referred me to HPMoR (ESR).
I registered mostly to comment on this post.
I’ve been reading the Sequences.
More stuff about Bayes’ Theorem (the extent of my knowledge before I read the Intuitive Explanation was the idea that there will be many false positives when searching for rare events).
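A minimal worked sketch of that false-positives point, assuming made-up illustrative numbers (nothing here is taken from the Intuitive Explanation itself):

```python
# A quick illustration (made-up numbers) of why a test for a rare event
# yields mostly false positives, via Bayes' theorem.
base_rate = 0.001           # P(event): 1 in 1000 cases are real
sensitivity = 0.99          # P(positive | event)
false_positive_rate = 0.05  # P(positive | no event)

p_positive = sensitivity * base_rate + false_positive_rate * (1 - base_rate)
p_event_given_positive = sensitivity * base_rate / p_positive

print(f"P(event | positive) = {p_event_given_positive:.3f}")
# ~0.019, so roughly 98% of positives are false despite a 99%-sensitive test.
```

Even with a highly sensitive test, the low base rate means the overwhelming majority of positives are false.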
No.
Yes, I made an account shortly after I read HPatMoR.
I’ve been taking peeks here and there. I mean, I was aware of Less Wrong before. I’ve read stuff by Eliezer before, specifically the first contact story, and I found it fun if extremely formulaic and didactic. It was a pleasant surprise that I could find something so stilted so much fun.
I haven’t noticed anything I haven’t heard of before.
I’ve referred people to HPatMoR but not LW.
I don’t exactly fit your set, since I had seen LW before, but there’s good reason to include me in your sample. Explanation follows: I had read most of the sequences before (and frankly didn’t learn that much from them; a handful of cogsci and psych classes along with a fair bit of phil sci covers a lot of the same material) and had previously read some of Eliezer’s fiction. But I hadn’t really taken that detailed a look at LW as a whole until HPMR. That was partially due to a conversation with a friend that went something like this:
Friend: So who is the author of this stuff?
JZ: He’s Eliezer Yudkowsky, who is an all-around very bright guy. He has some ideas about the Singularity that are a bit off.
Friend: What evidence do you have that he’s bright and not just a good fiction writer? The one thing you’ve mentioned is something you disagree with.
JZ: Um, let me get back to you.
Then, when reading, I felt a need to register an account to make a comment, and it has been downhill from there. (I just linked an LW post to a friend who said that she refused to read it because “I’m not sure I’m willing to let myself -oh god oh god- be sucked into Less Wrong. I have heard it wastes time like tvtropes on crack.” I’m not sure if that’s a good or a bad thing.)
I’ve linked HPMR to a fair number of people, and it seems to be having some impact on some of them. Indeed, it seems to be quite effective at getting past the defense mechanisms some people have against becoming more rational, because the arguments aren’t couched as an obvious attempt to point out what is wrong with their thinking processes. I’m running into concerns about whether linking HPMR to people without telling them that is ethical.
That which can be destroyed by the truth should be.
(On the other hand, Michael Vassar often claims that this quote is as disingenuous as a strong man saying “That which can be destroyed by lions should be.”)
I’m not sure I understand. Lions can destroy any human, no matter how strong, right? Is the implication that truth is a weapon? Or that the only people who support truth are the ones who think they’re right? But people frequently think they’re right when they’re not.
If you are rational, you are both already more likely to believe things that are true (or less wrong than your competitors’) and more able to defend your false beliefs using knowledge of argument and cognitive biases.
Substitute “well-armed” for “strong” if you like.
“Well-armed” makes a little more sense, but I still don’t think it’s a good analogy. Lions destroy people who aren’t well-armed, so it’s disingenuous for a well-armed person to say that a fair procedure for deciding who lives is to let the lions attack and see who survives. Truth destroys false ideas, not people, and people frequently don’t know in advance which ideas will be destroyed by the truth. People, even rational ones, are often wrong in their predictions, unlike the well-armed man.
A precommitment to letting experiments and truth decide what ideas will survive doesn’t stack the deck in your favor, unlike in the lions example. The whole point is that you are willing to take the chance of having your ideas die, as long as the true ideas survive.
I think you could say that the truth does destroy people. You can’t be the same person once you’ve really accepted an entirely new, important idea, and rejected an old belief.
When someone says “that which can be destroyed by the truth should be” to a Christian or a white supremacist or a thousand other people defined by a silly idea they take very seriously, he is often asking them to do something a lot scarier than going up against a lion.
If you’ve already seen the truth and accepted it, the deck is as stacked as it could be. And if you haven’t, but are otherwise making your bet rationally while the other party is not, then you’ve still got a much better chance.
And if that destruction itself requires withholding information? In most contexts I’m pretty sure most people here would think that something of the form “I know I’m right, but they’ll be more likely to believe the truth if I don’t tell them X” is not good rational behavior.
1: No. Most of the time I was just lurking. There’s a lot of stuff on LW.
2: Following links, as if I were on TV Tropes.
3: Nothing yet. Eliezer has a distinct way of expressing himself, which is why I enjoy HPMoR, but most of the ideas he is expressing I have heard before.
4: Yes to HPMoR, no to LW.
You’re not very rational for a bunch of extreme rationalists, are you? It’s only possible to answer this survey if you register for the site, thereby excluding nearly all potential respondents (there is science on this), and presumably an even greater proportion of those who are uninterested in the ideas on LW. So that’s a Big Old Fail.
So, here goes:
No, I have registered purely for the purpose of replying to this comment.
I started to read through the Sequences, but they rapidly set off my nutcase detectors, so you might want to do something about that. But I was interested in who had written the fan fiction, and quickly found this thread.
I have learnt that there is a community of extreme rationalists who believe that humanity will soon use science to cheat death. Actually, I already knew that. But I have found one of their websites. I was already familiar with most of the philosophical tropes explored in HPMoR; I don’t think it would be anything like such a good story otherwise, and I think most of the readership will be people who are already relatively rational. So in terms of ‘raising the sanity waterline’, an endeavour which seems to be entirely worthwhile, I am not sure it will do that.
I have referred people to HPMoR, and will continue to do so. I have not referred anyone to LW.
You really think we’ve never talked about selection bias here? It is a concern every time we do a survey. This is why ata’s questions were directed at those who had registered, not at the entire group that read the fanfiction. If you know of some way we could poll everyone who read the fanfiction without response bias, by all means tell us.
Something about us rubbed you the wrong way. Which is fine; things about us rub me the wrong way too. But I’d much rather you articulate what that was than go searching for random things to criticize us about just because you want us to be irrational.
What specifically?
Please elaborate.
Are you asking because you don’t know, or because you want to know which ones BohemianCoast noticed?
Most of the world is wrong. Formal education is overrated. The world as we know it may cease within a century. Lots Of Math. Simultaneously mentioning the word quantum and talking about psychology. For that matter, mentioning the word quantum.
Those are just the ones off the top of my head, and I’m not BohemianCoast. But a lot of stuff written here (and in the “Sequences”) is true despite setting off nutcase detectors, not without setting them off.
There Isn’t That Much Math, Really. And none of the cargo-cultish use of mathy writing as impressive-looking gibberish that tends to mark nutcase stuff. Agree with the rest, though. Oh, and also: The scientific method is poor and needs to be improved. A central notion in physics held by most practicing physicists is fundamentally misguided.
I’ve seen plenty of nutcase-stuff in which the math wasn’t gibberish—it was correct as math but was simply window-dressing for the nutcase argument. Sometimes, it seems that EY is just using it as garnish for his arguments as well. So, I think that there is a kernel of truth in what GuySrinivasan said about the mathiness of the site. It fits the pattern.
Which is not to say that EY is a nutcase. Those nutcase detectors may be returning false positives. But that doesn’t mean that the nutcase detectors are defective.
CronoDAS definitely knows that Eliezer is not a nutcase. He’s very rational and well-informed.
Yes. I went from LW to the OB archives, and created an account to comment on an old post there.
I’ve been ignoring the Sequences as such, but have been working my way through the OB archives chronologically, which I gather covers the same material.
Hard to answer that question. The cognitive bias stuff is fairly old hat. The timeless-physics stuff is new to me, but isn’t really a skill. I’m currently working my way through the metaethics stuff, which I’m not finding particularly convincing but haven’t finished thinking about.
One friend, to both HPMoR and the OB archives. Not so much LW per se, which (sorry) seems to have a higher noise:signal ratio than the old stuff.
I’ve been paying a little bit of attention to recent posts, but not a lot; mostly I’ve been “time-travelling” through the archives.
I’ve been responding to posts here and there when I have something to say I don’t see in the comments. I do this even though I don’t expect anyone is reading old comments (though sometimes they get upvoted or responded to, so it’s not a complete vacuum), mostly because I often don’t really know what I think about something until I’ve tried to formulate a response to it.
In my observation, replies to old comments and comments on old posts frequently get a fair amount of activity. I think that many users (including myself) operate largely from the “recent comments” list, so we stand a good chance of noticing new material wherever it is.
Yes
Reading through the Sequences. Well, I say reading through... you read through and then there’s a link, and then there’s another link, and another and another... So yes, reading through, but not in exact order.
I suppose... not necessarily what I have found useful yet, but what I anticipate finding most useful in the future: the planning fallacy, the bit about believing the way Spinoza thought we do rather than the way Descartes did, and the conjunction fallacy.
Yes. And yes.
Yes.
More towards looking around and lurking, but I’ve been reading LW long enough that I’ve read a fair number of important articles.
I’m not sure—to some extent, I’ve been working on this sort of thing anyway. I’ll post later if anything comes to mind.
Yes to both, mostly in conversation and on my livejournal.