What’s so intimidating? You don’t need much to post here, just a basic grounding in probability theory, decision theory, metaethics, philosophy of mind, philosophy of science, computer science, cognitive bias, evolutionary psychology, the theory of natural selection, artificial intelligence, existential risk, and quantum mechanics—oh, and of course to read a sequence of more than 600 articles of 3,000+ words each. So long as you can do that and you’re happy with your every word being subject to the anonymous judgment of a fiercely intelligent community, you’re good.
Sounds like a pretty good filter for generating intelligent discussion to me. Why would we want to lower the bar?
Being able to comment smartly, in a style that gets you upvoted, doesn’t really require grounding in any of those subjects. I just crossed 1500 karma and only have a basic grounding in computer science, mathematics, and philosophy.
When I started out, I hadn’t read more than EY’s Bayes for Dummies, The Simple Truth, and one post on Newcomb’s problem.
In my opinion, the following things will help you more than a degree in any of the subjects you mentioned:
Crave the truth
Accept Reality as the source of truth
Learn in small steps
Ask questions when you don’t understand something
Test yourself for growth
Be willing to enter at low status
Be willing to lose karma by asking stupid questions
Ignore the idiots
Another factor:
Being willing to shut up about a subject when people vote it down.
So far as I am aware, the chief reason non-spammers have been banned is for obnoxious evangelism for some unpopular idea. Many people have unpopular ideas but continue to be valued members (e.g. Mitchell_Porter).
Useful data point, thanks. Have you made any more progress with the sequences since you last updated your wiki user info page?
Yeah. I just updated it again. I didn’t realize anyone was actually looking at it… :P
Recently I burned out on the sequences and am taking a break to gorge myself on other subjects. I tend to absorb large scale topics in rotation. It helps me stay focused over longer distances and has an added benefit of making me reread stuff that didn’t stick the first time through. The weekly study group will also help data retention.
Other data points that may be relevant: I have participated in a lot of online discussions; I have moved cross-country into a drastically different cultural zone; I married someone from a non-US culture; I have visited at least one distinct Central American culture. In addition, I grew up in a religious culture but personally lean more toward a scientific/realistic culture. All of these things help build awareness that what I say isn’t what other people hear, and vice versa.
As evidence of this, my conversations here have much better transmission success than my posts. Once I get to talk to someone and hear them respond I can start the translation predictors. I am still learning how to do this before I hear the responses.
Not “and”. “Or”. If you don’t already have it, then reading the sequences will give you a basic grounding in probability theory, decision theory, metaethics, philosophy of mind, philosophy of science, computer science, cognitive bias, evolutionary psychology, the theory of natural selection, artificial intelligence, existential risk, and quantum mechanics.
I actually think this is a little absurd. There is nowhere near enough on these topics in the sequences to give someone the background they need to participate comfortably here. Nearly everyone here has a lot of additional background knowledge. The sequences might be a decent guide for an autodidact to go off and learn more about a topic, but they are nowhere near enough on their own for most people.
The sequences are really kind of confusing… I tried linking people to Eliezer’s quantum physics sequence on Reddit and it got modded highly, but one guy posted saying that he got scared off as soon as he saw complex numbers. I think it’ll help once a professional edits the sequences into Eliezer’s rationality book.
http://www.reddit.com/r/philosophy/comments/b1v1f/thought_waveparticle_duality_is_the_result_of_a/c0kjuno
Which people do we want? What do those people need?
However strongly you catapult a plane from the flight deck, at some point it has to fly by itself.
Without new blood, communities stagnate. The risk of groupthink is higher and assumptions are more likely to go unchecked. An extremely homogeneous group such as this one likely has major blind spots, which we can help remedy by adding members with different kinds of experiences. I would be shocked if a bunch of white male, likely autism-spectrum, CS and hard-science types didn’t have blind spots. These can be corrected by informing our discussions with a more diverse set of experiences. Also, more diverse backgrounds mean more domains we can comfortably apply rationality to.
I also think the world would be a better place if this rationality thing caught on. It is probably impossible (not to mention undesirable) to lower the entry barrier so that everyone can get in. But I think we could lower it enough that youngish, non-religious types at the 80th-85th percentile of IQ and above could make sense of things. Rationality could benefit them, and their being more rational could benefit the world.
Now, we don’t want to be swamped with newbies and just end up rehashing everything over and over. But we’re hardly in any danger of that happening. I could be wrong, but I suspect almost no top-level posts have been created by anyone who didn’t come over from OB. It isn’t like we’re turning people away at the door right now. And you can set it up so that newbie questions are isolated from everything else. The trick is finding a way to do it that isn’t totally patronizing (or targeting children so that we can get away with being patronizing).
What they need is trickier. Let’s start here: a clear, concise, one-stop-shop FAQ would be good. A place where asking basic questions is acceptable and background isn’t assumed. Explanations that don’t rely on concepts from mathematics, CS, or the hard sciences.
Data point to the contrary here. On top of being a data point, I’m also a person, which is convenient: you can ask a person questions. ;)
Actually, I’m something of a partial data point against that as well. I did come here with the split from OB, but I was just a casual reader, had only been there a few weeks, and had never commented.
I did go back and look at some of your early comments, and my initial reaction is that you seem unusually well-read and instinctively rational, even for this crowd. In fact, I wonder if you should be asking me questions about how to make Less Wrong more amenable to people with limited background knowledge.
you can ask a person questions. ;)

You may regret this. I’m a very curious person.
To what extent was it obvious upon coming here that Less Wrong had a kind of affinity with computer science and programming? What effect did this affinity have on your interest? How much of your interest in participating was driven by Eliezer’s writings in particular, compared to the community in general? Should the barrier to participation be lowered? If so, how would you do it? What would have gotten you up to speed with everyone else faster? What would have made it easier? To what extent did you/do you now associate yourself with transhumanism? Did that factor into your interest in Less Wrong?
I could probably keep going. One more for now: What questions should I be asking you that I haven’t?
What questions should I be asking you that I haven’t?

That’s one question I really like. I originally learnt it from Jerry Weinberg as one of the “context-free questions”, a very useful item in my toolkit.
What I’d ask people is “What motivated you to come here in the first place?” Where “here” takes a range of values—what motivated them to become readers; then to create a profile here and become commenters; then to start contributing.
To what extent was it obvious upon coming here that Less Wrong had a kind of affinity with computer science and programming? What effect did this affinity have on your interest?

Not very. What “hooked” me first, as a huge Dennett fan, was the Zombies post, which I came across while browsing random links from my Twitter feed. That led me to the QM sequence, which made sense for me of things that hadn’t made sense before, which motivated me to drill for more. That led me to the Bayes article. Parallel exploration turned up the FAI argument, which (I don’t dare use the word “click” yet) made intuitive sense even though it hadn’t crossed my mind before.
It was only then that I made the connection with CS/programming—I had this fantasy of getting my colleagues to invite Eliezer to keynote at our conference. Interestingly enough, the response I got from musing about that on Twitter was (direct quote) “the singularity will definitely have personality disorders”.
How much of your interest in participating was driven by Eliezer’s writings in particular, compared to the community in general?

Well, I’d taken note that there was such a thing as the LW community blog, and I kept an eye on it, but in parallel I started reading all of the back-content of LW, all the posts by Eliezer ported over from OB. I wanted to catch up before increasing my participation. So initially I pretty much ignored the community, which anyway I couldn’t quite figure out.
What would have gotten you up to speed with everyone else faster? What would have made it easier?

I wish someone had told me, quite plainly, what I was expected to do! Something along the lines of, “this is a rationality dojo, posts are where intermediate students show off their moves, comments are where beginners learn from intermediates and demonstrate their abilities, you will be given cues by people ahead of you when you are ready to move along the path reader->commenter->poster”.
Looking back, I can see some mistakes in the way this community is set up that tend to put it at odds with its stated mission; and I’m not at all sure I’d have done any better, given what people knew pre-launch. Figuring out how to participate was also part of the learning process, consistent with (for instance) Lave and Wenger’s notion of “legitimate peripheral participation”.
I’m guessing that this process could be improved by thinking more explicitly about these kinds of theoretical frameworks when considering what this community is aiming to achieve and how to achieve it. I’ve done a lot of this kind of thinking in my “secret identity”, with some successes.
To what extent did you/do you now associate yourself with transhumanism? Did that factor into your interest in Less Wrong?

No, I’ve been vaguely aware of transhumanist ideas and values for some time, but never explicitly identified as singularitarian, transhumanist, extropian or anything of the sort. I have most of the background reading that seems to be common in these circles (from a very uninformed outsider’s perspective) but I guess I never was in the right place at the right time to become an insider. It feels as if I might have been.
LessWrong is missing “profile pages” of some kind, where the sort of biographical information that we’re discussing could be collected for later reference. Posting a comment to the “Welcome” thread doesn’t really cut it.
There is a wiki, though it sadly uses a different authentication system. Nonetheless, many users do have profile pages there.
I’m going to remember that one.
I wish someone had told me, quite plainly, what I was expected to do! Something along the lines of, “this is a rationality dojo...”

Indeed—the reason we don’t say that explicitly is that it’s unclear how much this is the case. However, if it were possible for LW to become a “rationality dojo”, I think most of us would leap on the opportunity.
There is some previous discussion which suggests that not everyone here would be happy to see LW as a “rationality dojo”.
The term “dojo” has favorable connotations for me, partly because one of my secret identity’s modest claims to fame is as a co-originator of the “Coding Dojo”, an attempt to bring a measure of sanity back to the IT industry’s horrible pedagogy and hiring practices.
However, these connotations might be biasing my thinking about whether using the “dojo” metaphor as a guide to direct the evolution of LW would be for good or ill on balance.
How about starting a discussion at the top of the current Open Thread to ask people what they now think of applying the Dojo metaphor to LW?
I think I’m the only one on that thread who explicitly advised against starting a rationality dojo, and the other concerns were mostly whether it was possible.
Indeed. Eliezer’s post itself, however, seemed mostly to caution against it, and perhaps what he took away from the subsequent discussion, after weighing the various contributions, was that it had too little to recommend it. At any rate, so far as I’m aware, the question wasn’t raised again.
Of course, one issue is that it was never clarified what “it” might be, i.e. what would result from treating LW more explicitly as a “rationality dojo” that would be different from what it is at present.
I believe purely ballistic transportation systems have been proposed at various times, actually.
I’ve long said that Truly Part of You is the article with, by far, the highest ratio of “Less Wrong philosophy content” to length. (Unfortunately, it doesn’t seem to be listed in any sequence despite being a follow-up to two others.)
Jargon aside, that would get people reasonably up to speed, and it should probably be what we’re pointing newcomers to.
Maybe. Except someone who has never looked at program code is going to be really confused.
Well, there were several subjects in that list I knew little about until I started reading the Sequences, so yes, on that point I confess I’m being hyperbolic for humorous effect...