As far as “playing the comments game”, I admit I am guilty of that. At a deeper level it comes from a desire to connect with like-minded people. I may even be doing it right now.
We like to think people post because they are genuinely intellectually engaged with the material we’ve written, but the truth is people post comments for myriad reasons, including wanting to score comment ‘points’ or ‘karma’ or to engage in a back-and-forth with a figure they admire. People like getting attention. [even shy nerdy people who are socially isolated or socially awkward, for whom commenting on an internet blog may count as a significant social engagement] As you point out, the ‘comments game’ motivation isn’t necessarily bad in terms of its consequences: it gets debate and discussion going. Given the importance of the topics discussed on LW and elsewhere, even low-quality discussion is better than no discussion, or shutting people out.
Obviously, though, there is a tension in the ‘rational-sphere’ between wanting to draw in lots of new people and wanting to maintain a sense of community among people who are on the ‘same wavelength’. This tension is not at all unique to rationalism, and it typically leads to some type of fragmentation: people who want to ‘spread rationalism’ and grow the movement go one way, and the people who want to maintain a sense of community and purity go another. I’ve seen the same dynamic at work in the Libertarian Party and in Christian churches. I think we have to accept that both sides have good points.
But getting back to your post, it seems like you are more on the ‘we need to maintain a sense of community’ side. Personally I haven’t been very active in forums or online communities, but from what I have seen, maintaining a community online is possible, though it takes work: it requires considerable organization, active moderators and administrators, etc. Some platforms are more conducive to it than others. I can’t really comment on the viability of LW, since I’m kinda new here, but it seems to be a good place.
As a side note, I’m not sure how much ‘social trust’ is required for commenting. While I might be very hesitant to talk to someone at a cocktail party for fear of annoying them, or because I don’t trust them to take me seriously, I don’t feel that way about commenting, or if I do, it’s to a much lesser extent. There is a difference: talking to someone in real life means interrupting them and taking their time, while writing a comment doesn’t really interrupt anyone, since they can always ignore it if they want to. What you said about more socially privileged people being more trusting or confident is definitely true, though.
I believe the proper solution is like a eukaryotic cell, with an outer circle and inner circle(s). In Christianity, the outer circle is to be formally a Christian and to visit a church on (some) Sundays. The inner circles are the various monastic orders, becoming a priest, and that kind of thing. That way you can provide options for people who want different things. If you just want the warm fuzzy feelings of belonging to a community, here you go. If you want some hardcore stuff, okay, come here.
These two layers need to cooperate: the outer circle must respect the inner circle, and the inner circle, in turn, must provide some services for the outer circle. In the case of LW, such services would mostly be writing articles or making videos.
The outer circle must be vague enough that anyone can join, but the inner circles must be protected from invasion by charlatans; they must cooperate with each other so that they are able to formally declare someone “not one of us” if a charlatan tries to take over the system, or simply to benefit from claiming to be a part of it. In other words, the inner circles need some system to formally recognize who belongs to an inner circle and who does not.
Looking at the rationalist community today, “MIRI representatives” and “CFAR representatives” seem like inner circles, and there are also a few obvious celebrities such as Yvain of SSC. But if the community is going to grow, these people are going to need some common flag to distinguish them from anyone else who decides to make “rationality” their applause light and gather followers.
What, you are not allowed to call yourself a rationalist if you are not affiliated with MIRI, even if you subscribe to the branches of Western philosophy descended from Descartes, Kant, and the Vienna Circle...?
I think there should exist a name for the cluster in thingspace that is currently known here as “the rationalist community”. That is my concern. What specifically it is called is less important; we just have to coordinate on using it.
Generic “subscribing to branches of Western philosophy descended from Descartes, Kant, and the Vienna Circle” is not exactly the same thing.
LW crowd.
“The rationalist community” sounds way too hoity-toity and pretentious to me.
Viliam is right that unless we have a name for the cluster in thingspace that is the rationalist community, it’s difficult to talk about. While I can understand why one might be alarmed, I think MIRI/CFAR representatives mostly want everyone to be able to identify them in a clearly delineated way, so that they and only they can claim to speak on behalf of those organizations on matters such as AI safety, existential risk reduction, or their stance on various parts of the rationality community now that they’re trying to re-engage it. I think everyone can agree that nobody is better off if people, whether they identify with the LW/rationality community or not, are confused about what MIRI/CFAR actually believe regarding their missions and goals.
This is probably more important to MIRI’s/CFAR’s relationship with EA and academia than to people merely involved with LW/the rationalist community, since the perceived positions of these organizations could affect how much funding they receive, as well as their crucial relationships with other organizations working on the same important problems.
The rationality police will come and use the rationality spray on you, leaving you writhing on the floor crying “Oh, my eyes! It burns, IT BURNS!”
LessWrong itself doesn’t have as much activity as it once did, but the first users of LessWrong have pursued their ideas on artificial intelligence and rationality through the Machine Intelligence Research Institute (MIRI) and the Center for Applied Rationality (CFAR), respectively, and they now have a lot more opportunity to impact the world than they did before. If those are the sorts of things you, or anyone really, are passionate about, then keeping abreast of what these organizations are doing and greatly expanding on it on LW itself can lead to jobs. Well, it’d probably help to be able to work in the United States, and also to have a degree, to work at either CFAR or MIRI. I’ve known several people who’ve gone on to collaborate with them after starting on LW. Still, personally I’d find the most exciting part to be shaping the future of ideas, regardless of whether it led to a job or not.
I think it’s much easier to say now that becoming a top contributor on LW can be a springboard to much greater things. Caveat: whether those things are greater depends on what you want. Of course, there are all manner of readers and users on LW who don’t particularly pay attention to what goes on in AI safety, or at CFAR/MIRI. I shouldn’t say building connections through LW is unusually likely to lead to great things if most LessWrongers might not think the outcomes so great after all. If LW became the sort of rationality community that was conducive to other slam-dunk examples of systematic winning, like a string of successful entrepreneurs, that would make the site much more attractive.
I know several CFAR alumni who have credited the rationality skills they learned at CFAR as contributing to their success as entrepreneurs or on other projects. That’s entirely different from finding the beginnings of that sort of success on this website itself. If all manner of aspiring rationalists pursued and won in all manner of domains, with the beginnings of their success attributed to LW, that would really be something.
Oops, went on a random walk there. Anyway, my point is that even shy nerdy people (who are socially isolated or socially awkward) can totally think of LW as significant social engagement if they want to, because I know dozens of people for whom it’s led, down the road, to marriages, families, careers, new passions, and whole new family-like communities. That’s really more common among people who attended LW meetups in the past, when those were more common.
Commenting takes less energy than moderating comments, certainly.