“Study rationality anyway. Work harder to make up for your lack of intelligence.” I don’t think most of LessWrong’s material is out of reach of an average-intelligence person.
“Think about exactly what people mean by words when they use them; there are all kinds of tricks to watch out for involving subtle variations of a word’s meaning.” Read Yvain’s “The Worst Argument in the World”.
“Don’t fall for the sunk cost fallacy, which is what you’re doing when you say ‘This movie I’m watching sucks [absolutely, not just relative to what you’d expect for the cost], but I’m gonna keep watching, because I already paid to get in.’”
“Your brain loves to lie to you about the reason you want to do something. For example, maybe you’re thinking about moving to a new job. You don’t get along well with one of your current coworkers, but you don’t think this is a good reason to get a new job, so you refuse to take that into consideration. Your brain makes a bigger deal of minor advantages of the new job to compensate. Learn to recognize these lies.”
These statements don’t necessarily contradict each other. Even if average-intelligence people don’t read Less Wrong, perhaps they could. Personally, I suspect it’s more because of a lack of interest (and perhaps a constellation of social factors).
I bet the average LessWrong person has a great sense of humour and feels things more than other people, too.
Seriously, every informal IQ survey amongst a group/forum I have seen reports a very high average IQ. My (vague) memories of the LessWrong one include people who seemed to be off the scale (I don’t mean very bright; I mean IQs that are either never awarded in official testing, as opposed to online tests, or possibly can’t be obtained on those tests at all, in which case people were lying).
There’s always a massive bias in self-reporting, and it will only be amplified on an intellectual website that starts the survey post by saying that LessWrongers are, on average, in the top 0.11% for SATs, and gives pre-packaged excuses for not reporting inconvenient results: “Many people would prefer not to have people knowing their scores. That’s great, but please please please do post it anonymously. Especially if it’s a low one, but not if it’s low because you rushed the test” (my emphasis).
If there’s a reason to be interested in average IQ beyond mutual ego-massage, I guess the best way would be to have an IQ test where you logged on as ‘Less Wrong member X’ and then it reported all the results, not just the ones that people chose to share. And where it revealed how many people pulled out halfway through (to account for people bailing if they weren’t doing well).
Selection bias—which groups and forums actually asked about IQ?
Your average knitting/auto maintenance/comic book forum probably has a lower average IQ but doesn’t think to ask. And of course we’re already selecting a little just by taking the figures off of web forums, which are a little on the cerebral side.
True. I don’t think I can define the precise level of inaccuracy or anything. My point is not that I’ve detected the true signal: it’s that there’s too much noise for there to be a useful signal.
Do I think the average LessWronger has a higher IQ? Sure. But that’s nothing remotely to do with this survey. It’s just too flawed to give me any particularly useful information. I would probably update my view of LW intelligence more based on its existence than its results. On that reading, the thread lowers my opinion of LW intelligence, simply because this forum is usually massively more rational and self-questioning than every other forum I’ve been on, which I would guess is associated with high IQ, and people taking the survey seriously is one of the clearest exceptions.
BTW, I’m not sure your assessments of knitting/auto maintenance/comic books/web forums are necessarily accurate. I’m not sure I have enough information on any of them to reasonably guess their intelligence. Forums are particularly exceptional in terms of showing amazing intelligence and incredible stupidity side by side.

People with high IQ have extra power to be exceptionally stupid.
If there’s a reason to be interested in average IQ beyond mutual ego-massage, I guess the best way would be to have an IQ test where you logged on as ‘Less Wrong member X’ and then it reported all the results, not just the ones that people chose to share.
Would still suffer from selection effects. People who thought they might not do so well would be disinclined to do it, and people who knew they were hot shit would be extra inclined to do it. The phrase “anonymous survey” doesn’t really penetrate into our status-aware hindbrains.

Yep! But it’s the best way I can imagine that someone could plausibly create on the forum.
Better: randomly select a group of users (within some minimal activity criteria) and offer the test directly to that group. Publicly state the names of those selected (make it a short list, so that people actually read it, maybe 10-20) and then after a certain amount of time give another public list of those who did or didn’t take it, along with the results (although don’t associate results with names). That will get you better participation, and the fact that you have taken a group of known size makes it much easier to give outer bounds on the size of the selection effect caused by people not participating.
You can also improve participation by giving those users an easily accessible icon on Less Wrong itself which takes them directly to the test, and maybe a popup reminder once a day or so when they log on to the site if they’ve been selected but haven’t done it yet. Requires moderate coding.
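The “known group size” point above can be made concrete. Here’s a minimal sketch (the function name, scale bounds, and example numbers are all hypothetical, not from any actual LW survey): with a fixed sample of n selected users and a bounded score scale, filling in the worst and best cases for non-respondents gives hard outer bounds on the group mean.

```python
# Hedged sketch of bounding the selection effect from non-participation.
# Assumes scores lie on a bounded scale (here 55-145); the bounds and
# example numbers below are illustrative placeholders.
def nonresponse_bounds(observed, n_sampled, lo=55, hi=145):
    """Worst-case/best-case bounds on the full sample's mean score,
    filling every missing respondent with the scale minimum (maximum)."""
    missing = n_sampled - len(observed)
    total = sum(observed)
    lower = (total + missing * lo) / n_sampled
    upper = (total + missing * hi) / n_sampled
    return lower, upper

# If 15 of 20 selected users respond, each scoring 125:
print(nonresponse_bounds([125] * 15, 20))  # → (107.5, 130.0)
```

With all 20 responding, the bounds collapse to a single point, which is exactly why a known-size random sample beats open self-selection: you can see how much the dropouts could possibly move the answer.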
I would find such a feature to be extraordinarily obnoxious, to the point that I’d be inclined to refuse such a test purely out of anger (and my scores are not at all embarrassing). I can’t think of any other examples of a website threatening to publicly shame you for non-compliance.
Wasn’t the average IQ here from the survey something like 130+?
The average self-reported IQ.
If we really wanted to measure LW’s collective IQ, I’d suggest using the education data as a proxy; we have fairly good information about average IQs by degree and major, and people with less educational history will likely be much less reticent to answer than those with a low IQ test result, since there are so many celebrated geniuses who didn’t complete their schooling.
The average tested IQ on the survey was about 125, which is close to my estimate of the true average IQ around here; I don’t entirely trust the testing site that Yvain used, but I think it’s skewing low, and that ought to counteract some of the reporting bias that I’d still expect to see.
125 is pretty much in line with what you’d expect if you assume that everyone here is, or is going to be, a four-year college graduate in math, philosophy, or a math-heavy science or engineering field (source). That’s untrue as stated, of course, but we do skew that way pretty hard, and I’m prepared to assume that the average contributor has that kind of intellectual chops.
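That kind of education-proxy estimate is easy to sketch. The field means and respondent shares below are made-up placeholders (the real numbers would come from the linked source and the survey’s education data), shown only to illustrate the weighted-average mechanics:

```python
# Hypothetical education-proxy estimate: assumed average-IQ-by-field
# figures weighted by the share of respondents in each field.
# All numbers are illustrative placeholders, not real survey data.
fields = {
    # field: (assumed_mean_iq, share_of_respondents)
    "math":       (130, 0.20),
    "cs_eng":     (126, 0.40),
    "philosophy": (127, 0.10),
    "other":      (115, 0.30),
}
assert abs(sum(share for _, share in fields.values()) - 1.0) < 1e-9
estimate = sum(iq * share for iq, share in fields.values())
print(round(estimate, 1))  # → 123.6
```

Plugging in real per-field figures would land this kind of mixture estimate in the low-to-mid 120s for a group skewed toward math-heavy graduates, which is why 125 is plausible.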
I think that’s a fair assessment, although it might be because my guess was around 120 to start with. I never meant to say we’re not smart around here, far from it, but I don’t think we’re all borderline geniuses either. It’s important to keep perspective and very easy to overestimate yourself.
To comprehend a text, a person must:
Become aware of it
Have some reason for reading it
Find those reasons to be sufficient to spend time reading it
Read it
Put forth the cognitive effort to understand it (reading something and putting forth cognitive resources to understand it are not the same thing)
Succeed in understanding it
Intelligence is just one component of knowledge acquisition, and probably less important than affective issues. Often, intelligence acts indirectly by affecting affect, but in such cases those effects can be counteracted. Mistaking performance on cognitive tasks for intelligence is, I believe, often an instance of the fundamental attribution error.
140+

Not anymore, though only just. The 2012 survey reports a mean of 138 and change with an SD of 12.7. It was 140 or higher on the 2011 and 2009 surveys, though.
All the usual self-reporting caveats apply, of course.
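For scale on those caveats: a mean of about 138 would sit deep in the tail of the general population distribution (IQ tests are normed to mean 100, SD 15). A quick standard-library check of just how deep, assuming that norming:

```python
from statistics import NormalDist

# How rare is an IQ of 138 under the standard norming (mean 100, SD 15)?
tail = 1 - NormalDist(mu=100, sigma=15).cdf(138)
print(f"{tail:.2%}")  # roughly 0.6% of the general population
```

A community whose *average* member is in the top ~0.6% of the population would be remarkable, which is part of why the self-reported figure invites skepticism.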
I’m interested in operationalising your advice. By “study rationality”, I assume you mean read rationality blogs and try to put their prescriptions into practice. At the moment I only read LessWrong regularly, but I try to give the blogs mentioned in the sidebar a go once in a while. Robin Hanson opened my mind in this YouTube interview, but I find it hard to understand his blog posts on Overcoming Bias. I thought maybe that’s because the blog posts are very scholastic and I’m not up to date. I don’t find this is the case on SSC, but it is occasionally the case here on LessWrong. If you could describe the intended readership of each rationality blog in a way that lets potential audience members decide which to commit to reading, how would you do it? Could you come up with a scale of academic rigour vs accessibility, or similar?
The thing that struck me most about the sequences was how accessible they were. Minimal domain-specific jargon and a comprehensive (excessive at times, in my opinion) explanation of each concept in turn. I do believe LessWrong is not over-the-top inaccessible, but as the existence of this post implies, it seems that’s not always agreed upon.
I think this underestimates the difficulty average humans have with just reading upwards of 2500 words about abstract ideas. It’s not even a question of getting the explanation; it’s a question of simply being able to pay attention to it.
I keep repeating this: The average human is extremely average. Check your privilege, as the social-justice types might say. You’re assuming a level of comfort with, and interest in, abstraction that just is not the case for most of our species.
Upvoted. Every time I’m tempted to provide a long post aimed at the general public, I’ve found it worthwhile to look at a math or biochemistry paper that is far outside of my knowledge domains—the sort of stuff that requires you to go searching for a glossary to find the name of a symbol.
Sufficiently abstract writing feels like that to a significant portion of the populace, and worse, even Up-Goer 5-level writing looks like it will feel like that, to a lot of people who’ve been trained into thinking they’re not good at this sort of thing.
((I still tend to make a lot of long posts, because apparently I am a terrible person.))
Datum: I know at least one person who refuses to read LW links, explicitly because they are walls of text about abstract ideas that she thinks (incorrectly, IMO) are over her head. This occurs even for topics she’s normally interested in. So things like that do limit LW’s reach.
Whether they do so in a significant fraction of cases, I don’t know. But the impact is non-zero.
This doesn’t seem to me to be about fundamental intelligence, but upbringing/training/priorities.
You say in another response that IQ correlates heavily with conscientiousness (though others dispute it). But even if that’s true, different cultures/jobs/education systems make different sorts of demands, and I don’t think we can assume that most people who aren’t currently inclined to read long, abstract posts can’t do so.
I know from personal experience that it can take quite a long while to get used to a new way of taking in information (lectures rather than lessons, reading rather than lectures, reading different sorts of things: science, then arguments relying on formal or near-formal logic, then broader humanities). And even people who are very competent at focusing on a particular way of gaining information can get out of the habit and find it hard to readjust after a break.
In terms of checking privilege, there is a real risk that those with slightly better training/jargon, or simply those who think/talk more like ourselves, are mistaken for being fundamentally more intelligent/rational.
This doesn’t seem to me to be about fundamental intelligence, but upbringing/training/priorities.
Well, then I have to ask what you think “fundamental intelligence” consists of, if not ability with (and consequently patience for and interest in) abstractions.
Can we taboo ‘intelligence’, perhaps? We are discussing what someone ought to do who is average in something, which I think we are implicitly assuming to be bell-curved-ish distributed. How changeable is that something, and how important is its presence to understanding the Sequences?
I reject the assumption behind ‘ability with (and consequently patience for and interest in)’. You could equally say ‘patience for and interest in (and consequently ability in)’, and it’s entirely plausible that said patience/interest/ability could all be trained.
Lots of people I know went to schools where languages were not prioritised in teaching. These people seem to be less inherently good at languages, to have less patience with languages, and to have less interest in them. If someone asked ‘how can they help the Great Work of Translation without languages?’, I could suggest back-office roles, acting as domestic servants for the linguists, whatever. But my first port of call would be ‘try to see if you can actually get good at languages’.
So my answer to your question is basically that by the time someone is the sort of person who says ‘I am not that intelligent, but I am a utilitarian rationalist seeking advice on how to live a more worthwhile life’, they are either already higher on the bell curve than simple ‘intelligence’ would suggest, or at least highly likely to be able to advance.
Oh no, I don’t expect very many people to read it all. I expect a select few articles to go viral every now and then, though. This wouldn’t be possible if the writing wasn’t clear and accessible.
Sure, but I suggest that “viral on the Internet” for a long text article does not in fact mean that humans of average intelligence are reading it. The Internet skews up in intelligence to start with, but the stuff that goes viral enough to be noticed by mainstream media—which at least in principle reach down to the average human—is cat videos and cute kids, not long articles. Sequence posts may certainly go viral among a Hacker-News-ish, technical, college-educated, Populares-ish sort of crowd, but that’s already well outside the original “average intelligence” demographic.
I think you’re vastly underestimating internet usage here. One of the best things Facebook has done (in my opinion) is massively proliferate the practice of internet arguing. The enforced principle of not getting socked by someone in a fit of rage just makes the internet so irresistible for speaking your mind, you know?
Additionally, every so often I see my siblings scrolling through Facebook or some “funny image collection” linked from Facebook, seeing for the first time images I saw years ago. If the internet has a higher-than-average intelligence, then the internet usage resulting from Facebook is a powerful intelligence boost to the general population.
I suppose I should write my analysis here into a proper post some time, as I do consider it a significant modern event.
I agree that internet usage has led to a massive proliferation of certain types of knowledge and certain types of intelligent thought.
At the same time, it’s important to note that image memes, Twitter, and Tumblr have increasingly replaced LiveJournal and other long-form writing even as popular discussion has expanded, and style guides for internet publishing have increasingly encouraged three-sentence paragraphs over five-sentence ones. There are a few exceptions (fanfiction has been tending to longer and longer forms, often exceeding the length of what previous generations would traditionally consider a doorstopper by orders of magnitude*), but much social media focuses on short and often very short form writing.
*There are at least a dozen Harry Potter fanfictions with a higher wordcount than the entire Harry Potter series, spinoff media included. Several My Little Pony authors have put out similar million-word-plus texts in just a few years, including a couple of the twenty most-read fictions. This may increase tolerance for nonfiction long reads, although I’m uncertain whether the effect will reach the general populace.
I agree that the Internet is a boost to human intelligence, relative to the TV that it is replacing and to whatever-it-was that TV replaced—drinking at the pub, probably. I don’t think the effect is large compared to the selection bias of hanging out in LW-ish parts of the Internet.
My current heuristic is to take special note of the times a well-performing LessWrong post identifies one of the hundreds of point-biases I’ve formalized in my own independent analysis of every person and disagreement I’ve ever seen or imagined.
I’m sure there are better methods to measure that LessWrong can figure out for itself, but mine works pretty well for me.
Not quite sure what you mean here; could you give an example?
But this aside, it seems that you are in some sense discussing the performance of LessWrong, the website, in identifying and talking about biases; while I was discussing the performance of LessWrongers, the people, in applying rationality to their real lives.
A good example would be any of the articles about identity.
It comes down to a question of what frequency of powerful realizations individual rationalists are having that make their way back to LessWrong. I’m estimating it’s high, but I can easily re-assess my data under the assumption that I’m only seeing a small fraction of the realizations individual rationalists are having.
I think the sequences are accessible for how abstract they are and how unfamiliar the ideas are (usually, abstraction and unfamiliarity decrease accessibility). I work as a tutor in a program for young people, and one of the interesting parts of the program is that all of the students are given a variety of tests, including IQ tests, which the tutors have access to as part of an effort to customize teaching approach to best suit students’ interests and abilities. I have all kinds of doubts about how ethical and useful the program is, but it has taught me a lot about how incredibly widely people vary. I don’t believe most of my students would get much out of the sequences, but perhaps I’m too pessimistic. I think even if they understood the basic argument, they would not internalize it or realize its implications. I’d guess that their understanding would be crappily correlated with IQ. I have been spending a lot of time trying to figure out how to communicate those ideas without simplifying away the point.
These ideas are trivial. When I say “accessible,” I mean in terms of the people educated in the world of the past who systematically had their ideas shut down. Anyone who has been able to control their education from an early age is a member of the Singularity already; their genius—the genius that each person possesses—has simply yet to fully shatter the stale ideas of a generation or two of fools who thought they knew much about anything. You really don’t need to waste your time trying to get them to recognize the immense quality of this old-world content to old-world rationalists.
I apologize that this will come across as an extraordinary claim, but I’ve already grown up in the Singularity and derived 99% of the compelling content of LessWrong—sequences and Yudkowsky’s thoughts included—by the age of 20. I’m gonna get downvoted to hell saying this, but really I’m just letting you know this so you don’t get confused by how amazing unrestricted human curiosity is. Basically, I’m only saying this because I want to see your reaction in ten years.
“Study rationality anyway. Work harder to make up for your lack of intelligence.” I don’t think most of LessWrong’s material is out of reach of an average-intelligence person.
“Think about exactly what people mean by words when they use them; there are all kinds of tricks to watch out for involving subtle variations of a word’s meaning.” Read Yvain’s “The Worst Argument in the World”..
“Don’t fall for the sunk cost fallacy, which is what you’re doing when you say ‘This movie I’m watching sucks [absolutely, not just relative to what you’d expect for the cost], but I’m gonna keep watching, because I already payed to get in.’”
“Your brain loves to lie to you about the reason you want to do something. For example, maybe you’re thinking about moving to a new job. You don’t get along well with one of your current coworkers, but you don’t think this is a good reason to get a new job, so you refuse to take that into consideration. Your brain makes a bigger deal of minor advantages of the new job to compensate. Learn to recognize these lies.”
Wasn’t the average IQ here from the survey something like 130+?
These statements don’t necessarily contradict each other. Even if average-intelligence people don’t read Less Wrong, perhaps they could. Personally, I suspect it’s more because of a lack of interest (and perhaps a constellation of social factors).
I bet the average LessWrong person has a great sense of humour and feels things more than other people, too.
Seriously, every informal IQ survey amongst a group/forum I have seen reports very high IQ. My (vague) memories of the LessWrong one included people who seemed to be off the scale (I don’t mean very bright. I mean that such IQs either have never been given out in official testing rather than online tests, or possibly that they just can’t be got on those tests and people were lying).
There’s always a massive bias in self-reporting: those will only be emphasised on an intellectual website that starts the survey post by saying that LessWrongers are, on average, in the top 0.11% for SATs, and gives pre-packaged excuses for not reporting inconvenient results - “Many people would prefer not to have people knowing their scores. That’s great, but please please please do post it anonymously. Especially if it’s a low one, but not if it’s low because you rushed the test”, (my emphasis).
If there’s a reason to be interested in average IQ beyond mutual ego-massage, I guess the best way would be to have an IQ test where you logged on as ‘Less Wrong member X’ and then it reported all the results, not just the ones that people chose to share. And where it revealed how many people pulled out halfway through (to avoid people bailing if they weren’t doing well).
Selection bias—which groups and forums actually asked about IQ?
Your average knitting/auto maintenance/comic book forum probably has a lower average IQ but doesn’t think to ask. And of course we’re already selecting a little just by taking the figures off of web forums, which are a little on the cerebral side.
True. I don’t think I can define the precise level of inaccuracy or anything. My point is not that I’ve detected the true signal: it’s that there’s too much noise for there to be a useful signal.
Do I think the average LessWronger has a higher IQ? Sure. But that’s nothing remotely to do with this survey. It’s just too flawed to give me any particularly useful information. I would probably update my view of LW intelligence more based on its existence than its results. In that reading the thread lowers my opinion of LW intellgence, simply because this forum is usually massively more rational and self-questioning than every other forum I’ve been on, which I would guess is associated with high IQ, and people taking the survey seriously is one of the clearest exceptions.
BTW, I’m not sure your assessments of knitting/auto maintenance/comic books/web forums are necessarily accurate. I’m not sure I have enough information on any of them to reasonably guess their intelligence. Forums are particularly exceptional in terms of showing amazing intelligence and incredible stupidity side by side.
People with high IQ have extra power to be exceptionally stupid.
Would still suffer from selection effects. People that thought they might not do so well would be disinclined to do it, and people who knew they were hot shit would be extra inclined to do it. The phrase “anonymous survey” doesn’t really penetrate into our status-aware hindbrains.
Yep! But it’s the best way I can imagine that someone could plausibly create on the forum.
Better: randomly select a group of users (within some minimal activity criteria) and offer the test directly to that group. Publicly state the names of those selected (make it a short list, so that people actually read it, maybe 10-20) and then after a certain amount of time give another public list of those who did or didn’t take it, along with the results (although don’t associate results with names). That will get you better participation, and the fact that you have taken a group of known size makes it much easier to give outer bounds on the size of the selection effect caused by people not participating.
You can also improve participation by giving those users an easily accessible icon on Less Wrong itself which takes them directly to the test, and maybe a popup reminder once a day or so when they log on to the site if they’ve been selected but haven’t done it yet. Requires moderate coding.
I would find such a feature to be extraordinarily obnoxious, to the point that I’d be inclined to refused such a test purely out of anger (and my scores are not at all embarrassing). I can’t think of any other examples of a website threatening to publicly shame you for non-compliance.
btw, in Markdown use double asterisks at each end for bold, like this **bold text.
with two at the end also.
The average self-reported IQ.
If we really wanted to measure LWs collective IQ, I’d suggest using the education data as a proxy; we have fairly good information about average IQs by degree and major, and people with less educational history will likely be much less reticent to answer than those with a low IQ test result since there are so many celebrated geniuses who didn’t complete their schooling.
The average tested IQ on the survey was about 125, which is close to my estimate of the true average IQ around here; I don’t entirely trust the testing site that Yvain used, but I think it’s skewing low, and that ought to counteract some of the reporting bias that I’d still expect to see.
125 is pretty much in line with what you’d expect if you assume that everyone here is, or is going to be, a four-year college graduate in math, philosophy, or a math-heavy science or engineering field (source). That’s untrue as stated, of course, but we do skew that way pretty hard, and I’m prepared to assume that the average contributor has that kind of intellectual chops.
I think that’s a fair assessment, although it might be because my guess was around 120 to start with. I never meant to say we’re not smart around here, far from it, but I don’t think we’re all borderline geniuses either. It’s important to keep perspective and very easy to overestimate yourself.
To comprehend a text, a person must:
Become aware of it
Have some reason for reading it
Find those reasons to be sufficient to spend time reading it
Read it
Put forth the cognitive effort to understand it (reading something and putting forth cognitive resources to understand it are not the same thing)
Succeed in understanding it
Intelligence is just one component of knowledge acquisition, and probably less important than affective issue. Often, intelligence acts indirectly by affecting affect, but in such cases, those effects can be counteracted. The mistaking of performance of cognitive tasks for intelligence is, I believe, often an aspect of the fundamental attribution error.
140+
Not anymore, though only just. The 2012 survey reports a mean of 138 and change with a SD of 12.7. It was 140 or higher on the 2011 and 2009 surveys, though.
All the usual self-reporting caveats apply, of course.
I’m interested in operationalising your advice. By study rationality, I assume you mean read rationality blogs and try to practice persuasive prescrptions. At the moment I only read Lesswrong regularly, but I try to give the blogs mentioned in the sidebar a go once in a while. Robin Hansin opened my mind in this Youtube interview but I find it hard to understand his blog posts on Overcoming Bias. I thought maybe that’s because the blog posts are very scholastic and I’m not up to date. I don’t find this is the case on SSC, but it is occasionally the case here on Lesswrong. If you could describe the intended readership of each rationality blog in a way for potential audience members to decide which to commit to reading, how would you do it? Could you come up with a scale of academic rigour vs accessibility, or similar?
The thing that struck me most about the sequences was how accessible they were. Minimal domain-specific jargon and a comprehensive (excessive at times, in my opinion) explanation of each concept in turn. I do believe LessWrong is not over-the-top inaccessible, but as the existence of this post implies, it seems that’s not always agreed upon.
I think this underestimates the difficulty average humans have with just reading upwards of 2500 words about abstract ideas. It’s not a question even of getting the explanation, it’s a question of simply being able to pay attention to it.
I keep repeating this: The average human is extremely average. Check your privilege, as the social-justice types might say. You’re assuming a level of comfort with, and interest in, abstraction that just is not the case for most of our species.
Upvoted. Every time I’m tempted to provide a long post aimed at the general public, I’ve found it worthwhile to look at a math or biochemistry paper that is far outside of my knowledge domains—the sort of stuff that requires you to go searching for a glossary to find the name of a symbol.
Sufficiently abstract writing feels like that to a significant amount of the populace, and worse, even Up-Goer 5-level writing looks like it will feel like that, to a lot of people who’ve been trained into thinking they’re not good enough at this matter.
((I still tend to make a lot of long posts, because apparently I am a terrible person.))
Datum: I know at least one person who refuses to read LW links, explicitly because they are walls of text about abstract ideas that she thinks (incorrectly, IMO) are over her head. This occurs even for topics she’s normally interested in. So things like that do limit LW’s reach.
Whether they do so in a significant fraction of cases, I don’t know. But the impact is non-zero.
This doesn’t seem to me to be about fudamental intelligence, but upbringing/training/priorities.
You say in another response that IQ correlates heavily with conscientiousness (though others dispute it). But even if that’s true, different cultures/jobs/education systems make different sort of demands, and I don’t think we can assume that most people who aren’t currently inclined to read long, abstract posts can’t do so.
I know from personal experience that it can take quite a long while to get used to a new sort of taking in information (lectures rather than lessons, reading rather than lectures, reading different sorts of things (science to arguments relying on formal or near-formal logic to broader humanities). And even people who are very competent at focusing on a particular way of gaining information can get out of the habit and find it hard to readjust after a break.
In terms of checking privilege, there is a real risk that those with slightly better training/jargon, or simply those who think/talk more like ourselves are mistaken for being fundamentally more intelligent/rational.
Well, then I have to ask what you think “fundamental intelligence” consists of, if not ability with (and consequently patience for and interest in) abstractions.
Can we taboo ‘intelligence’, perhaps? We are discussing what someone ought to do who is average in something, which I think we are implicitly assuming to be bell-curved-ish distributed. How changeable is that something, and how important is its presence to understanding the Sequences?
I reject the assumption behind ‘ability with (and consequently patience for and interest in)’. You could equally say ‘patience for and interest in (and consequently ability with)’, and it’s entirely plausible that said patience/interest/ability could all be trained.
Lots of people I know went to schools where languages were not prioritised in teaching. These people seem to be less inherently good at languages, to have less patience with languages, and to have less interest in them. If someone said ‘how can they help the Great Work of Translation without languages’, I could suggest back office roles, acting as domestic servants for the linguists, whatever. But my first port of call would be ‘try to see if you can actually get good at languages’.
So my answer to your question is basically that by the time someone is the sort of person who says ‘I am not that intelligent, but I am a utilitarian rationalist seeking advice on how to live a more worthwhile life’, they are either already higher on the bell curve than simple ‘intelligence’ would suggest, or at least highly likely to be able to advance.
Oh no, I don’t expect very many people to read it all. I expect a select few articles to go viral every now and then, though. This wouldn’t be possible if the writing weren’t clear and accessible.
Sure, but I suggest that “viral on the Internet” for a long text article does not in fact mean that humans of average intelligence are reading it. The Internet skews up in intelligence to start with, but the stuff that goes viral enough to be noticed by mainstream media—which at least in principle reach down to the average human—is cat videos and cute kids, not long articles. Sequence posts may certainly go viral among a Hacker-News-ish, technical, college-educated, Populares-ish sort of crowd, but that’s already well outside the original “average intelligence” demographic.
I think you’re vastly underestimating internet usage here. One of the best things Facebook has done (in my opinion) is massively proliferate the practice of internet arguing. The enforced principle of not getting socked by someone in a fit of rage just makes the internet so irresistible for speaking your mind, you know?
Additionally, every so often I see my siblings scrolling through Facebook or some “funny image collection” linked from Facebook, seeing for the first time images I saw years ago. If the internet has a higher-than-average intelligence, then the internet usage resulting from Facebook is a powerful intelligence boost to the general population.
I suppose I should write my analysis here into a proper post some time, as I do consider it a significant modern event.
I agree that internet usage has led to a massive proliferation of certain types of knowledge and certain types of intelligent thought.
At the same time, it’s important to note that image memes, Twitter, and Tumblr have increasingly replaced Livejournal and other long-form writing at the same time that popular discussion has expanded, and style guides have increasingly encouraged three-sentence paragraphs over five-sentence paragraphs for internet publishing. There are a few exceptions—fanfiction has been tending to longer and longer form, often exceeding the length of what previous generations would traditionally consider a doorstopper by orders of magnitude*—but much social media focuses on short and often very short form writing.
*There are at least a dozen Harry Potter fanfictions with a higher wordcount than the entire Harry Potter series, spinoff media included. Several My Little Pony authors have put out similar million-word-plus texts in just a few years, including a couple of the twenty most-read fictions. This may increase tolerance for nonfiction long reads, although I’m uncertain the effects will reach the general populace.
I agree that the Internet is a boost to human intelligence, relative to the TV that it is replacing and to whatever-it-was that TV replaced—drinking at the pub, probably. I don’t think the effect is large compared to the selection bias of hanging out in LW-ish parts of the Internet.
I’d agree if I thought LessWrong performed better than average.
What metric would you propose to measure LW performance?
My current heuristic is to take special note of the times a well-performing LessWrong post identifies one of the hundreds of point-biases I’ve formalized in my own independent analysis of every person and disagreement I’ve ever seen or imagined.
I’m sure there are better methods to measure that LessWrong can figure out for itself, but mine works pretty well for me.
Not quite sure what you mean here; could you give an example?
But this aside, it seems that you are in some sense discussing the performance of LessWrong, the website, in identifying and talking about biases; while I was discussing the performance of LessWrongers, the people, in applying rationality to their real lives.
A good example would be any of the articles about identity.
It comes down to a question of what frequency of powerful realizations individual rationalists are having that make their way back to LessWrong. I’m estimating it’s high, but I can easily re-assess my data under the assumption that I’m only seeing a small fraction of the realizations individual rationalists are having.
I think the sequences are accessible for how abstract they are and how unfamiliar the ideas are (usually, abstraction and unfamiliarity decrease accessibility).

I work as a tutor in a program for young people, and one of the interesting parts of the program is that all of the students are given a variety of tests, including IQ tests, which the tutors have access to as part of an effort to customize the teaching approach to best suit students’ interests and abilities. I have all kinds of doubts about how ethical and useful the program is, but it has taught me a lot about how incredibly widely people vary.

I don’t believe most of my students would get much out of the sequences, but perhaps I’m too pessimistic. I think even if they understood the basic argument, they would not internalize it or realize its implications. I’d guess that their understanding would be crappily correlated with IQ. I have been spending a lot of time trying to figure out how to communicate those ideas without simplifying away the point.
These ideas are trivial. When I say “accessible,” I mean in terms of the people educated in the world of the past who systematically had their ideas shut down. Anyone who has been able to control their education from an early age is a member of the Singularity already; their genius—the genius that each person possesses—has simply yet to fully shatter the stale ideas of a generation or two of fools who thought they knew much about anything. You really don’t need to waste your time trying to get them to recognize the immense quality of this old-world content to old-world rationalists.
I apologize that this will come across as an extraordinary claim, but I’ve already grown up in the Singularity and derived 99% of the compelling content of LessWrong—sequences and Yudkowsky’s thoughts included—by the age of 20. I’m gonna get downvoted to hell saying this, but really I’m just letting you know this so you don’t get confused by how amazing unrestricted human curiosity is. Basically, I’m only saying this because I want to see your reaction in ten years.