There are many, many things wrong here, but since any of my own early attempts at arriving at such an estimate would be just as flawed, I applaud you for having the courage to post this. I think we can take slow steps toward thinking about this more clearly and arriving at better estimates.
So we are basically searching for the crude number of people who would join the LW community as it is right now if they were exposed to it? Is that a good approximation of what you mean by “upper bound of community size”?
We first need to establish how likely people in various IQ ranges are to understand the concepts discussed on LW. This is necessarily guesstimation, but it almost deserves its own debate. We can’t simply say the minimum IQ is 120; a more accurate approach would be to check the literature on what IQ ranges tell us about the likelihood of academic success and of being good at math (in more quantitative terms than just saying “it helps”). A rough sketch of this kind of threshold arithmetic follows below.
I would say that being an atheist is almost a precondition, but not quite. There have been active posters who identify as theists; I recommend you check out the threads about LW demographics. Considering the huge number of theists in the world, the number of nominally religious LW posters learning to be less wrong may indeed be nontrivial. Also, don’t forget the strong Buddhist subculture that exists here (in mode of thought, if not in outright religiosity); however, we can’t simply lump Buddhists together with atheists, especially not traditional Buddhists.
Also, I don’t think sex is that important. Sure, there are many more men on LW than women, but of that population women are disproportionately active (at least among the top contributors). Rather than factoring in sex directly, I think a better approach would be to estimate conformity and perhaps education in the hard sciences.
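A minimal Python sketch of the threshold arithmetic gestured at above, assuming IQ is distributed roughly Normal(100, 15); the English-speaker pool size is a purely hypothetical placeholder, not a figure from this discussion:

```python
# Rough sketch: what fraction of a population clears a given IQ cutoff,
# assuming IQ ~ Normal(mean=100, sd=15). The pool size below is a
# hypothetical placeholder, not a measured figure.
from math import erf, sqrt

MEAN, SD = 100.0, 15.0
ENGLISH_SPEAKER_POOL = 500_000_000  # hypothetical pool of potential readers

def fraction_above(iq_cutoff: float) -> float:
    """P(IQ > cutoff) under a Normal(100, 15) model."""
    z = (iq_cutoff - MEAN) / SD
    return 1.0 - 0.5 * (1.0 + erf(z / sqrt(2.0)))

for cutoff in (115, 120, 130, 145):
    frac = fraction_above(cutoff)
    print(f"IQ > {cutoff}: {frac:6.2%} of population, "
          f"~{frac * ENGLISH_SPEAKER_POOL:,.0f} people in the assumed pool")
```

This only bounds the “smart enough” pool under those assumptions; it says nothing about who would actually care to read.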
We first need to establish how likely people in various IQ ranges are to understand the concepts discussed on LW
My suspicion is that a lot of LessWrongers have weirder brains than raw IQ will capture. This is part of why we tend to be underachieving geniuses. I took the WAIS-III when I was 17 and got a 150 verbal IQ and a 120 performance IQ. That averages to 135, but that number isn’t actually a very helpful predictor of future performance because the two scores are so different. This kind of thing gets labeled a learning disability; they didn’t even bother writing down the total on my test results. I suspect a lot of people here also have weird brains, with strengths and weaknesses not accurately conveyed by raw IQ.
ETA: Which isn’t to say I have a better way of estimating a potential user base.
I’m quite sure that the average LW brain is weird, and I agree on that point. I considered proposing that rates of highly functional non-neurotypical people be added as a group particularly likely to end up here.
However, I hope you see why I think IQ estimates are very relevant, since there are certain concepts necessary here that become very hard to grasp for those with lower IQs.
It seems like previous exposure to relevant material, including but not limited to the math parts of a college education, would be a much more direct benchmark.
But several of our more productive posters/commenters did little to no college math. That might be a weird exception for philosophers, though.
I’m also not sure college math/science is a sufficiently narrowing criterion.
Well, yes. That was an example; the point I intended, and may not have made clear, was that specific content knowledge might be a more accurate way to narrow the set than a quantified measure of general intelligence. There are probably tons of extremely smart people who have never been exposed to the subjects that would make them productive LW contributors.
So we are basically searching for the crude number of people who would join the LW community as it is right now if they were exposed to it?
Yes, that’s a much better way to put it. Although I clarify my intended goal a bit more below.
I recommend you check out the threads about LW demographics
My analysis has been heavily influenced by data from Yvain’s survey… especially where it matches up with intuition. Atheism (or at least agnosticism) seems to be one of the strongest defining traits of this community (shared by 93.3%). We would expect that.
Perhaps the “crude number of people” I’m looking for is the number of people who would enjoy devoting their time to reading the sequences (for purposes other than to troll them) if they were introduced to them in the right way. I’m assuming that to enjoy the sequences, people would need to be smart enough to mostly understand them and at least be predisposed to caring about having correct beliefs… which unfortunately disqualifies almost everyone. They would also need some free time… and they would have to want to spend that free time reading… which probably disqualifies almost everyone else and reduces the target audience to around the current size of LW. :/ (As an aside, part of my working theory of why this community is so akrasia-filled is that the only people who have the time to read and digest something as long as the LW sequences are people with motivation systems so severely crippled that they prevent their owners from filling their time with the things that most people without akrasia have, like steady jobs, lovers, and rewarding social interactions. Think about it. Most Americans or other English speakers who have tendencies towards basic rationality (read: our entire target audience) and a somewhat functional motivation system win so hard at life that they quickly become wayyyy too busy to allocate any of their precious time to boring, anti-social & low-prestige* tasks like reading online forums.)
Anyway, my personal theories on LW aside, my goal here is to build more rationalists and stronger rationalists. I assume a number of those people who read the sequences would be “involved in the LW community as it is right now” as well, but that’s sort of a secondary consideration in my mind. As I’ve pointed out before, 96% of LW participants are lurkers who only read. So my main goal is to expose interested parties to the sequences, and it is my expectation that 3-4% of those folks will naturally stick around and become more involved after that.
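A minimal Fermi-style sketch of this funnel in Python; every rate except the roughly 3-4% poster share mentioned above is a made-up placeholder, included only to show the shape of the arithmetic, not an actual estimate:

```python
# Funnel sketch of the estimate discussed above. Every number except the
# ~4% "poster" share (from the lurker figure in the comment) is a
# hypothetical placeholder.
smart_and_curious_pool = 5_000_000  # hypothetical: smart enough + care about correct beliefs
p_exposed = 0.10                    # hypothetical: ever introduced to the sequences "the right way"
p_reads_and_enjoys = 0.05           # hypothetical: has the free time and actually reads them
p_becomes_poster = 0.04             # ~3-4% of readers post rather than lurk (per the comment)

readers = smart_and_curious_pool * p_exposed * p_reads_and_enjoys
posters = readers * p_becomes_poster

print(f"Estimated readers of the sequences: ~{readers:,.0f}")
print(f"Estimated active (non-lurking) members: ~{posters:,.0f}")
```

Swapping in better-grounded rates at each stage is where the real work of such an estimate would be.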
the number of nominally religious LW posters learning to be less wrong may indeed be nontrivial
I guess strong to moderately strong theists aren’t really in my imagined Less Wrong target audience. Despite a few extraordinary counter-examples, LessWrong isn’t really equipped for, or devoted to, the matter of personal de-conversion. And I don’t think it’s going too far to suggest that being an atheist is pretty much a prerequisite to being rational. It’s the canonical example of rationality for a reason.