I won’t be able to put much effort into the new forum, probably, but I’ll express a few hopes for it anyway.
I hope the “Well-kept gardens die by pacifism” warning will be taken very seriously at the EA forum.
I hope the forum’s moderators will take care to squash unproductive and divisive conversations about race, gender, social justice, etc., which seem to have been invading and hindering nearby communities like the atheism/secularism world and the rationality world.
I hope the forum’s participants end up discussing a well-balanced set of topics in EA, so that e.g. we don’t end up with 10% of conversations being about AGI risk mitigation while 1% of conversations are about policy interventions.
Also, I think this “Well-kept gardens die by pacifism” post might be a good illustration of a problem I have with how the Sequences are regarded in general. The epistemological quality of the post seems pretty poor: although it discusses phenomena that are best studied empirically (as opposed to phenomena best studied theoretically, like math), it cites no studies, and it makes no attempt to become a proto-study itself by, say, finding a method for quasi-random sampling of online communities and checking whether each community constitutes evidence for or against its thesis. Instead, its argument rests mainly on personal experience (even concrete examples in the form of specific anecdotes, like 4chan, are few).
This phenomenon could also have been studied productively from a theoretical angle: say, by reasoning about the expected quality of any given post, how frequently users are likely to return to a given forum, how important it is for them to see new/valuable content each time they return, etc. But EY makes no attempt to do that either. (At least MBlume starts to think about modeling things in a more mathematical fashion.)
And yet it’s a featured post voted up 91 points… as far as I can tell, largely on the strength of the author’s charisma. I’m glad the post was written; I found it valuable to read, and it has a couple of novel arguments and insights. But it seems suboptimal when people cite it as if it were the last word on the question it addresses. And it seems weird that, largely on the strength of that post’s advice, posts as epistemologically weak as it are now written much less often on LW. Tossing around novel perspectives can be really valuable even if they aren’t yet supported by strong theoretical or empirical evidence.
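For what it’s worth, the kind of theoretical treatment gestured at above can be sketched in a few lines. The following toy simulation is purely illustrative: every distribution, constant, and function in it is invented, and it only aims to show that “expected post quality plus return-visit dynamics” is a tractable thing to model, not to say anything about real forums.

```python
import random

def simulate_forum(moderation_cutoff, n_steps=200, seed=0):
    """Toy model: each step, active users write posts of random quality,
    moderators delete posts scoring below `moderation_cutoff`, and the
    user base grows or shrinks with the average quality readers see.
    All distributions and constants here are invented for illustration."""
    rng = random.Random(seed)
    users = 100.0
    for _ in range(n_steps):
        # Roughly one post per ten active users, quality ~ Uniform(0, 1).
        posts = [rng.random() for _ in range(max(1, int(users / 10)))]
        visible = [q for q in posts if q >= moderation_cutoff]
        if not visible:
            users *= 0.95  # over-moderation: returning users find nothing new
            continue
        avg_quality = sum(visible) / len(visible)
        users *= 1.0 + 0.1 * (avg_quality - 0.5)  # churn below 0.5, growth above
    return users

for cutoff in (0.0, 0.5, 0.95):
    print(f"cutoff={cutoff}: {simulate_forum(cutoff):.0f} users")
```

In this toy setup, moderate pruning tends to outgrow both pacifism (cutoff 0) and heavy-handed deletion (cutoff 0.95), but that is an artifact of the invented parameters rather than evidence about real communities, which is exactly the gap between a model sketch and the empirical study the post never attempts.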
To play devil’s advocate: Will MacAskill reported that this post of his criticizing the popular ice bucket challenge got lots of attention for the EA movement. Scott Alexander reports that his posts on social justice bring lots of hits to his blog. So it seems plausible to me that a well-reasoned, balanced post that made an important and novel point on a controversial topic could be valuable for attracting attention. Remember that this new EA forum will not have been seeded with content and a community quite the way LW was. Also, there are lots of successful group blogs (Huffington Post, Bleacher Report, Seeking Alpha, Daily Kos, etc.) that seem to have a philosophy of having members post all they want and then filtering the good stuff out of that.
I think the “Well-kept gardens die by pacifism” advice is cargo-culted from a Usenet world where there was no way to filter by quality aside from the binary censor/don’t-censor decision. The important thing is to make it easy for users to find the good stuff; suppressing the bad stuff is only one (rather blunt) way of accomplishing this. Ultimately, the best way to help users find quality content depends on your forum software. It might be interesting to study subreddits to see what successful intellectual ones do that unsuccessful ones don’t, given that the LW userbase and forum software are somewhat similar to reddit’s.
(It’s possible that strategies that work for HuffPo et al. will not transfer well at all to a blog focused more on serious intellectual discussion. So it might be useful to decide whether the new EA forum is more about promoting EA itself or promoting serious intellectual discussion of EA topics.)
(Another caveat: I’ve talked to people who’ve ditched LW because they get seriously annoyed and it ruins their day when they see a comment that they regard as insufficiently rational. I’m not like this and I’m not sure how many people are, but these people seem likely to be worth keeping around and catering to the interests of.)
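To make the censor-versus-filter distinction above concrete, here is a minimal sketch. The post list and scores are invented; the point is only that ranking leaves low-quality content recoverable for anyone who wants it, while deletion removes it for everyone.

```python
# Two moderation regimes on the same (invented) post list:
# binary censorship deletes anything below a cutoff; quality filtering
# keeps everything but shows readers only the highest-scoring posts.
posts = [
    {"title": "Thoughtful critique", "score": 42},
    {"title": "Low-effort meme", "score": -3},
    {"title": "Solid book review", "score": 17},
    {"title": "Off-topic rant", "score": 1},
]

def censor(posts, cutoff=0):
    """Usenet-style binary moderation: below-cutoff posts are gone for everyone."""
    return [p for p in posts if p["score"] >= cutoff]

def rank(posts, k=2):
    """Score-based filtering: nothing is deleted; readers just see the top k."""
    return sorted(posts, key=lambda p: p["score"], reverse=True)[:k]

print([p["title"] for p in censor(posts)])
print([p["title"] for p in rank(posts)])
```

Note that after `rank`, the full `posts` list is untouched, so a reader who wants to dig through the low-scoring material still can; `censor` forecloses that option.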
I like your comment, but this struck me as a bit odd:

(Another caveat: I’ve talked to people who’ve ditched LW because they get seriously annoyed and it ruins their day when they see a comment that they regard as insufficiently rational. I’m not like this and I’m not sure how many people are, but these people seem likely to be worth keeping around and catering to the interests of.)

Having one’s day ruined because of one irrational comment is quite bizarre (and irrational). I don’t think that people with such extreme reactions should be catered to.
I think the “Well-kept gardens die by pacifism” advice is cargo culted from a Usenet world where there weren’t ways to filter by quality aside from the binary censor/don’t censor.

Ah… you just resolved a bit of confusion I didn’t know I had. Eliezer often seems quite wise about “how to manage a community” stuff, but also strikes me as a bit too ban-happy at times. I had thought it was just overcompensation in response to a genuine problem, but it makes a lot more sense as coming from a context where more sophisticated ways of promoting good content aren’t available.
I hope the forum’s participants end up discussing a well-balanced set of topics in EA, so that e.g. we don’t end up with 10% of conversations being about AGI risk mitigation while 1% of conversations are about policy interventions.

I assume balance would imply 99% about AGI risk mitigation, and 0.001% about (non-AGI) policy interventions?