There seem to be two broad categories of discussion topics on LessWrong: topics that are directly and obviously rationality-related (which seems to me to be an ever-shrinking category), and topics that have come to be incidentally associated with LessWrong because its founders, or its first or highest-status members, chose to use this website to promote them. The latter category includes artificial intelligence and MIRI’s mission along with it; effective altruism; transhumanism; cryonics; utilitarianism, especially in the form of implausible but difficult dilemmas in utilitarian ethics or game theory; start-up culture and libertarianism; polyamory; ideas originating from Overcoming Bias, which, apparently, “is not about” overcoming bias; NRx (a minor if disturbing concern)… I could even say California itself, as a great place to live.
As a person interested in rationality and little else that this website has to offer, I would like for there to be a way to separate the cognitive-improvement discussions from these other topics. Because memes that are affiliated with LessWrong but unrelated to rationality are given more importance here than memes that are related but unaffiliated, I have since begun to migrate to other websites* for my daily dose of debiasing. Obviously it would be all varieties of rude of me to tell everybody else “stop talking about that stuff! Talk about this stuff instead… while I sit here in the audience and enjoy listening to you speak”, and obviously the best thing I could do to further my goal of seeing more rationality material on LessWrong would be to post some high-quality rationality material myself. I do plan on doing that, but I still feel that my ideas have some maturing and polishing to undergo before they’re publishable. So what I intend to do with this post is to poll people for thoughts and opinions on this matter, and perhaps re-raise the old discussions about revamping the Main/Discussion division of LessWrong.
Also, for what it’s worth, it seems to me that most of the bad PR LessWrong gets comes from the topics I mentioned in the first paragraph being more visible to outsiders than the stated mission of “refining the art of human rationality”. People often can’t get past the peculiarities of Bayland to the actual insights for which this community is most valued. To be honest, if I hadn’t read the Sequences first and had instead been hit in the face, on my first visit to LW, with attempts to persuade me to donate to charity, believe in x-risk, or get my head frozen, I’d have politely “No-Thank-You”ed the messengers like I do door-to-door salesmen. To outsiders not predisposed by their demographics to be friendly to transhumanism & co., conflating the two sides of LessWrong devalues the side that champions rationality. Unless, of course, that was the point all along, and LessWrong has less intrinsic value for the founders than its purpose as an attractor of smart, concerned young people.
* Notably SSC, RibbonFarm, TheLastPsychiatrist, and even highly biased but well-written blogs coming from the opposite side of the political spectrum, hopefully for our respective biases to cancel out and for me to be left with a more accurate worldview than I started out with. (I don’t read political material that I agree with, and to be honest it would be difficult to even come across texts prioritizing the same issues that I care about. I sometimes feel like I’m the first one of my political inclination...) I’m not necessarily endorsing any of these for anyone else (except Scott: read Scott, he’s amazing); it’s just that that is where I get my food for thought. They raise issues and put a new spin on things that don’t usually occur to me.
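(To make the “cancel out” hope concrete, here is a toy sketch with numbers I made up purely for illustration: if two sources misestimate the same quantity with roughly equal and opposite biases, averaging them lands closer to the truth than either source alone. The cancellation only works to the extent the biases really are opposite and comparable in size.)

```python
# Toy model of "opposing biases cancel out". All numbers are invented
# for illustration; nothing here is taken from the post itself.
truth = 50.0

my_side_estimate = truth + 8.0     # suppose my usual sources overshoot by 8
other_side_estimate = truth - 8.0  # and the opposite camp undershoots by 8

# Averaging the two estimates recovers the truth exactly in this idealized
# case; each source alone was off by 8.
averaged = (my_side_estimate + other_side_estimate) / 2
print(averaged)  # 50.0
```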
With most of the main contributors having left and no new ones emerging (apart from the occasional post by Swimmer963 or So8res discussing MIRI research), this forum appears, unfortunately, to have jumped the shark. It is still an OK forum to hang out on, but don’t expect great things. Unless you are the one producing them.
I don’t think that’s necessarily the right conclusion. In general, the quality and quantity of posting here seems to wax and wane over time with no strong patterns to it.
As a person interested in rationality and little else that this website has to offer

I’m confused about why you categorize SSC as appropriate for debiasing but not LW; doesn’t SSC have as much of a mix of non-rationality material as LW? Is it a mix you like better? Do you just enjoy SSC for other reasons?
Because:
1) Scott posts about politics, and that’s one of the muddiest areas of debate; with the moratorium on politics around here, one has to look somewhere else for good insights on controversial issues.
2) LessWrong is a collection of people, but Scott is one guy, and as far as I’ve seen he’s one of the most level-headed and reasonable people in recent history, two traits which I consider indispensable to rationality. He also posts a lot about what constitutes reasonableness and how important it is. LessWrong exhibits this as well, naturally, but on SSC, unlike here, there aren’t tendencies in the opposite direction that dilute the collective wisdom of the place, so to speak. (Then again, I don’t read comments.)
3) I just happen to really like his writing style, jokes, way of thinking, just about everything.
but not LW

No, I never said that. It’s just that other sites get updated with material of interest to me more often, whereas the best stuff around here is already old to me.
In theory, the Main or promoted posts should be more focused on rationality-for-its-own-sake topics, while Discussion (and especially the more open threads therein, the literal Open Threads most of all) is going to contain a lot of memes of interest to the rationalist community without actually being about rationalism per se.
On the other hand, the rate of release of rationality material here isn’t terribly high, and some of it does get mixed in with affiliated-but-unrelated topics.
The thing is, Main is like that as well. I went back some three pages on Main to check, and there were a few rationality-related articles, some periodic posts (the survey, rationality quotes), and a whole lot more posts relating to organizations of interest to the people on LessWrong who form its central real-life social circle, including reports on recent activity and calls for donations or for paid participation in events.
Besides, effective altruist organizations have recently been included in the Rationality Blogs list in the sidebar. (And there was this comment of Eliezer’s on a post which, if I remember correctly, called for help in some matter. He said he’s not going to devote time (five minutes, an hour, I don’t remember the interval he gave) to someone who had donated less than $5000 to charity. As a comparison, to get some people out of their American bubble: that’s more than my current yearly income… likely much more. Needless to say, I found it rather unpalatable.)
And there’s the higher bar for posting in Main… Unless you write something obviously good enough to at least break even in terms of karma, you effectively get punished with a few tens of negative karma points for having dared to post there. (At least, I think that’s how the karma multiplier works.) And people are going to respond more positively to common in-group-y topics. So if anything, non-affiliated topics are more likely to be found in Discussion.
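(For concreteness, here is a minimal sketch of the arithmetic as I understand it. The 10× factor for Main posts is the figure I’ve seen cited, but treat it as my assumption rather than confirmed site behavior.)

```python
# Toy model of a per-section karma multiplier. The 10x figure for Main
# is an assumption (the commonly cited value), not confirmed site code.

def post_karma(upvotes: int, downvotes: int, multiplier: int) -> int:
    """Net karma the author gains or loses for one post."""
    return (upvotes - downvotes) * multiplier

votes = (4, 7)  # a post that falls just short of breaking even

print(post_karma(*votes, multiplier=1))   # -3  if posted to Discussion
print(post_karma(*votes, multiplier=10))  # -30 if posted to Main: tens of karma lost
```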
Eh. Difference between theory and practice, I guess. I too wish there were more actual rationality stuff coming out; the archive is big, but it’s hard to engage with people on those topics now, and there’s always more to cover. I don’t mind the side topics as much as you seem to, but I would like to see more of the core topic.
As for the charity thing, that’s EY’s right to exercise if he so chooses. But if income where you live is so low that $5000 is more than your annual income, or even if it’s just temporarily more than that because you’re a student or something (I made about that much per year on summer jobs my first two years of university), then I really doubt he would hold you to that if you were to approach him.
On the other hand, EY isn’t anywhere near a top contributor to LW at this point in time; I barely see him comment anywhere on the site anymore. That’s probably part of the reason for the dearth of good rationality posts, but it also means that his opinions will have less impact on the site as a whole, at least for a while.
This is something that I’ve noticed and been concerned with. I think this is worthy of a top level discussion post.
I think part of the problem is that rationalism is harder than weird and interesting ideas like transhumanism: anyone can dream about the future and fiddle with the implications, but it takes significant study and thought to produce new and worthwhile interventions for how to think better.
My feeling is that Main is for rationality stuff and Discussion is for whatever the members of this community find interesting, but since we don’t have strong leaders who are doing the work and producing novel content on rationality, Main rarely has a new post, so I at least gravitate to Discussion.
Also, keep in mind that many of these secondary ideas sprang from rationalist origins. Cryonics is presented as an “obvious” rational choice once you don’t let your biases get in the way: you have an expressed desire not to die, and this is the only available option for not dying. Polyamory similarly came about as the result of looking at relationships “with fresh eyes.” These secondary topics gain prominence because they are (debatably) examples of rationality applied to specific problems. They are the object level; “rationality” is the meta level. But, like I said, it’s a lot easier to think at the object level, because that can be visualized, so most people do.
From the point of view of someone who doesn’t buy into them, I think it’s only incidental that those specific positions, and not others, are advocated as a logical consequence of more rational thinking. Had the founders not been American programmers, the “natural and obvious” consequences of their rationalism would have looked very different. My point is that these practices are not at all more rational than the alternatives, and very likely less so. But yes, if these ideas gain rationalist adherents, then obviously some of the advocacy for them is going to take a rationalist-friendly form, with rationalist lingo and emphasized connections to rationalism.
Just curious: are there any positions which you regard as “a logical consequence of more rational thinking”?
Yes—atheism. And by extension disbelief in the supernatural. It’s the first consequence of acquiring better thinking practices. However, it is not as if atheism in itself forms a good secondary basis for discussion in a rationalist community, since most of the activity would necessarily take the form of “ha ha, look how stupid these people are!”. I would know; been there, done that. But it gets very old very quickly, and besides isn’t of much use except for novice apostates who need social validation of their new identities. From that point of view I regard atheism as a solved problem and therefore uninteresting.
Nothing else springs to mind, though, or at least no positive rather than negative positions on ideological questions. “Don’t be a fanatic”, “don’t buy snake oil”, “don’t join cults”, “check the prevailing scientific paradigms before denying things left and right [evolution, the moon landing, the Holocaust, global warming, etc.]”… critical thinking 101. Almost all other beliefs and practices that go hand in hand with rationalism seem to be explainable by membership in this particular cluster of Silicon Valley culture.
Clarke’s Third Law: “Any sufficiently advanced technology is indistinguishable from magic”.
I don’t know, I like the option of locking yourself in a vault when you’re about to die so that time travellers can come and rescue you without changing history, since nobody can see into the vault.
Okay, I lied, I don’t like that option, but it’s not worse than cryonics, and it does count as another available option.
I want to emphasize that I neither endorse nor oppose the conclusion that polyamory or cryonics are rational; I am just pointing out that they are included in discussion here, in large part, because of how they impinge, or are presumed to impinge, on rationality.