I’m doubtful LW will ever become really good at recruiting.
I went to the page with the Recent Posts and crunched some numbers. Of 250 recent posts, 35 were on meetups, 25 were recurring threads like Quotes or Potter Commentary, 28 were things that require you believe in our particular formulation of transhuman singularitarianism before the premise even makes sense, and 13 were meta about the site itself. That leaves about a hundred and fifty that could realistically draw new people in. Of those, 55 had <15 karma; i.e., we couldn’t even find fifteen people on this site willing to say they were any good, so newbies are unlikely to like them much. That leaves 95. Of those, about 25 were about specialized math-y topics that most IQ 130+ rationalists don’t even realize exist and have no background in, like decision theory. So that leaves maybe 70 posts since May that are really in the running for attracting newbies. Many of these posts are, in my opinion, either so soft and vague as to be boring, or else filled with jargon.
Honestly, I’m surprised we’re getting new people at the rate we are.
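Spelled out, the arithmetic behind those numbers runs roughly like this (a minimal sketch; the category counts are just the estimates above, not a fresh tally):

```python
# Rough tally of the estimates above; the figures are eyeballed, not re-counted.
recent_posts = 250
meetups = 35
recurring_threads = 25     # Quotes, Potter Commentary, and similar
singularity_premise = 28   # only make sense if you already buy the premise
site_meta = 13

could_draw_newbies = recent_posts - (meetups + recurring_threads
                                     + singularity_premise + site_meta)
print(could_draw_newbies)  # 149, i.e. "about a hundred and fifty"

low_karma = 55             # posts with <15 karma
specialized_math = 25      # decision theory and other specialized math-y topics

in_the_running = could_draw_newbies - low_karma - specialized_math
print(in_the_running)      # 69, i.e. "maybe 70 posts"
```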
I haven’t looked as closely at this as you have, but from what I can tell the site looks much better if we only examine promoted posts. Meta and AI posts rarely make the front page. What does fill up the front page are the meetup announcements. I’m not sure what to do about that, since promoting that aspect of the site does seem important. It wouldn’t look so bad if we were getting more front-page-worthy rationality/science posts.
Does anyone other than me have a bunch of post ideas they’ve been meaning to write but keep procrastinating on? If there are a couple of us, I have an idea.
28 were things that require you believe in our particular formulation of transhuman singularitarianism before the premise even makes sense

I’ll just say it: for reasons totally unrelated to recruitment, I’d like to see considerably fewer of these. People are pretty good about voting down the FAI ones (which are nearly always horrendously sloppy). But my sense is that, in general, posts on transhumanist/singularity topics are noticeably less rigorous and less insightful than posts on other topics. Why this is so is a rather interesting question. My guess is the lack of a foil. With topics in science, philosophy, and rationality there are established authorities with empirical work to bolster our position and professional studies for us to criticize and engage with. When I did high school debate, my best matches were with debaters much more talented and experienced than I was, because I would rise to their level. Conversely, I would sink to the level of my least talented opponents. The problem with transhumanist/singularity topics is that a) people here already agree with the most talented contributors in the related fields, and b) many of the other contributors in those fields are quite mediocre relative to domain experts in more mainstream fields.
I believe LW is at its best when it is engaging and criticizing mainstream institutions and popular ways of doing things, and trying to provide alternatives. This is particularly the case when the target is a representative of traditional rationality. Now, of course there is a Less Wrong critique of the mainstream regarding singularity and transhumanist topics; it’s just a rather simple and straightforward one which has been covered dozens of times. It is simple because most people, institutions, and systems of thought have made zero effort to engage with transhumanism and Singularitarianism. There is no foil.
I realize, of course, that I have nowhere near the standing to dictate the way Less Wrong should be. But occasional months where AI was forbidden as a topic, as it was in the very beginning, would, I think, rejuvenate our discussions.
And yes, I realize I’m the guy who posted about Pet Cryonics in the discussion section.
Thanks for this analysis, Yvain. I’m glad you’re interested in this even if you are (rightfully) pessimistic.
I agree that the ongoing community dynamic is a terrible place to “jump into” LW. I’ve been wanting someone (any volunteers out there?) to help design a friendlier, more Wikipedia-ish homepage for LessWrong, one that could actually help non-hardcore LW users navigate the site in a way that makes sense. The promoted stream is a horrible resource for that. If someone could make a mock-up of a potential LW homepage (on the LW wiki, perhaps?), I could get it implemented for you if it’s even halfway respectable. Nothing could be worse than what we currently have. A good homepage would probably have a few stable pointers to the important sequences, a portion of the current promoted feed, a video or other tutorial explaining what LW is all about… I dunno. Just anything that isn’t an opaque listing of “Open threads!”, “Cryonics!”, “Torture scenarios!”, “Rationality Quotes!”, “Omega!”, “Meetups!”, “Cake!”
For what it’s worth, the thing that got me visiting here regularly was the list of EY’s OB posts.
I know y’all love the sequences as such, and I can understand why, but the fact remains that I was motivated to work my way through much of that material relatively systematically in a chronological format; once I got to the end of that list—that is, the time of the OB-to-LW migration—I bogged down (1). The sequence/wiki style is less compelling to me than the chronological style, especially given the degree to which the posts themselves really are blog posts and not articles.
I suspect that having some sense of where the process terminates is a key aspect of that. If I don’t know how long the process is going to be, it’s harder to get up the energy to work through it.
Anyway, to the extent that appealing to people like me is a valuable subgoal (2), it seems what you should do is assemble a simple chronological list of LW posts from the dawn of the site that are most representative of what you’d like the site content to look like. (3)
==
(1) To be fair, that was also the start of the Fun Theory sequence, which I am finding less compelling than some of its predecessors, so the content may bear some of the responsibility for my bogged-down-ness… but not all of it, nor even (I think) most of it.
(2) Which isn’t intended as false modesty; I just mean there’s no particular reason to believe that what appeals to me will appeal to anyone else.
(3) It may be sufficient to take the highest-voted tier of posts for each month, say, and string them together chronologically… though you’d probably want to backfill other posts that they depend on.
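For what it’s worth, footnote (3) is mechanical enough to sketch in code. This is only a rough illustration under assumed inputs—the post fields used here (title, date, karma, depends_on) are hypothetical placeholders, not any actual LW data format:

```python
from collections import defaultdict

def build_reading_list(posts, per_month=3):
    """Rough sketch of footnote (3): take the top-voted posts from each
    month, backfill the posts they depend on, and return the whole set
    in chronological order. `posts` is assumed to be a list of dicts
    with 'title', 'date' (a datetime.date), 'karma', and 'depends_on'
    (a list of titles) -- a hypothetical structure, not a real LW export."""
    by_month = defaultdict(list)
    for p in posts:
        by_month[(p["date"].year, p["date"].month)].append(p)

    # Highest-voted tier of posts for each month.
    picked = []
    for month in sorted(by_month):
        top = sorted(by_month[month], key=lambda p: p["karma"], reverse=True)
        picked.extend(top[:per_month])

    # Backfill other posts that the picked ones depend on.
    by_title = {p["title"]: p for p in posts}
    have = {p["title"] for p in picked}
    queue = list(picked)
    while queue:
        for dep in queue.pop().get("depends_on", []):
            if dep in by_title and dep not in have:
                have.add(dep)
                picked.append(by_title[dep])
                queue.append(by_title[dep])

    # String them together chronologically.
    return sorted(picked, key=lambda p: p["date"])
```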
This is one of the few places on the net where cryonics-related topics are discussed in a coherent manner, particularly by people in my age bracket. And I like that while it’s cryonics-friendly, it’s not about cryonics. It’s about the (supposed) rational benefit of cryonics. Not just any cryonics-related topic gets discussed, and not just in any way.
I can see how the community might grow to a point where the different topics cannot all be discussed in the same place. The current division between Discussion and Main is already helpful for distinguishing formal from informal discussions. Perhaps sub-groups for cryonics, decision theory, existential risk, etc. are good ideas. But separating them out runs the risk of creating a trivial inconvenience that impedes the cross-pollination of ideas.
Yes. Who is promoting these things, and why? What is their rationale?
An updated promotion policy that keeps meetup threads, discussion threads and obscure singularity-heavy (and other nominally off-topic) threads from being promoted could improve the situation.
A quick glance at the front page shows that 9 of the articles currently sitting on it are for meetups, 7 of which are over. I really think meetups, at the very least, should be taken off the front page. I’d also favour at least half the horizontal width being used for pointers to places to dive into the sequences, and potentially even a featured article of the week/month.
28 were things that require you believe in our particular formulation of transhuman singularitarianism before the premise even makes sense

This. This is the thing that has to be fixed before LessWrong can claim to primarily be about “refining the art of human rationality.”
AI is interesting, but not about rationality. Cryonics is interesting, but not about rationality. Nanotechnology is interesting, but not about rationality.
Ways of thinking are about rationality. Ways of being smart are about rationality. Ways of being stupid are about rationality. Stories of failing spectacularly are about rationality.
The first group may of course be highly relevant to the second. But it’s not about it, and requiring readers of a site advertised as being on the topic of “rationality” to buy into the standard transhumanist belief cluster is a failure of signaling rationality, and thus for these purposes a failure of instrumental rationality.
Before posting, don’t just think “Is this interesting to the LW readership?” but also “and is it on topic? Is my actual point about rationality?”
Honestly, I’m surprised we’re getting new people at the rate we are.

It’s an interesting site full of really smart people, and (this is a real plus point) the comment quality is consistently high because people buy into the moderation system, i.e. “mod up if you want to see more comments like this.” That’s a BIG WIN. Keeps me reading.
Really. I strongly suggest you just write a main-section post requesting that people stay on-topic in their posts, and noting that not clearly being on-topic about rationality is a reason to downvote. See what the community thinks.
I felt a bit out of place until I started reading MoR; what was all this cryonics/decision theory stuff?
A couple chapters in I thought, “THAT’S the kind of stuff I’m interested in talking about! Now I feel like I’m in the right place.”
I agree that generally most of the newer content we have is less interesting than the older, but maybe that’s because the older content already covers a huge amount of LW’s subject base pretty well. The good thing is, the old content is all still here and good for attracting new readers, if new readers are what we care about. That said, it could be better promoted and made easier to find, navigate, and search.
Of course, speaking selfishly, I’d really like it if we had new content that was as solid and useful as Eliezer’s work and some of the other ‘classic’ content. Perhaps we need more LW contributors from backgrounds we don’t currently cover?
I think the logical next step is to translate the sequences into a book, or several books of some kind. I vaguely remember EY talking about writing a book on rationality, but I could be mistaken. This might make LW very good at recruiting new members compared to now, if the book is successful and widely read for years to come.
Everyone who’s actually read the sequences—and that’s a lot of reading, too much for almost any casual reader—should try summarising and condensing them themselves for their own circle of net-friends. That’s how to get a meme out there. You’ll be a hell of a lot more convincing to people you know if you’re saying something than if you point to someone else saying something.
Eliezer is writing the Sequences into a rationality book, but I agree with David Gerard’s suggestion that LW readers should try summarizing his ideas in their own words. This could be an important topic for discussion here on LW: making sure that some of the core ideas discussed on this site aren’t lost in translation.
Yes, that’s being done.