Eliezer recently deleted posts from LW, but on the other hand he doesn’t write much on LW himself. That’s not healthy behavior for this community.
If this community depended on Eliezer’s writing, that would be a huge failure. I think we were supposed to become stronger, try harder, and cooperate. We have had enough time to do that. If a group of people cannot replace one person, then either we don’t care enough, or we failed to learn our lessons.
But actually, I think we are doing quite well. (By “we” I mean the whole community; I personally haven’t contributed an article yet.) When Eliezer was writing the Sequences, he wrote one article per day. We average maybe three articles per day. If that is not enough, then how much would be enough? Five? Ten? Twenty? Are we trying to match Reddit’s output? I know I would like to see twenty new articles on LW every day, but that’s just my inner procrastinator speaking. If I were more effective in real life and spent less time on the internet, I would barely be able to read three articles (and their discussions) every day.
Having more content on LW can be a lost purpose. To some degree it reflects the health of the community: if we were doing many cool things and writing reports about them on LW, there would be many cool articles on LW. I’d like to see us being more awesome. But it doesn’t work the other way round: increasing the number of articles won’t increase our awesomeness in real life.
I didn’t mind the deleted article, but I also didn’t mind that it was deleted. It was a shiny thing for procrastination, with no value besides signalling the author’s smartness and contrarianism, which this particular author does very frequently. That is not something we should reward. I would certainly hate it if other people started writing similar stuff, or if the same author started doing it more often. (And unfortunately, this particular author seems to have some creepy mind-controlling powers he uses to get a lot of upvotes here, so standard moderation fails. I don’t fully understand his strategy, but it includes writing obscure comments which make it seem as if he has something very interesting to say, yet he never says it. He then rewards people for trying to guess what he meant, or simply for giving him attention. Everything he writes serves the ultimate purpose of drawing attention to his person, and he does it quite well. It’s trolling 2.0, optimized for LessWrong. E.g. read this, although he lost a lot of karma there.)
It was more than one article, and there are events more recent than the Will_Newsome episode.
I know I’m not the first person to say this, but in case any of the moderators and people generally in charge are listening: as at best a peripheral member of Less Wrong, I find this a significant factor in my reluctance to become further involved with the site and its community. When I come here and see extremely typical forum drama like the Will_Newsome debacle, it greatly reduces my confidence that this is a good place to learn rationality and find suitable role models for its practice.
I realise that’s an irrational judgment in many ways, but that’s sort of the point. I come here to learn to think rationally, I see poor behaviour from the people who are supposed to possess the fruits of that learning effort, and my gut instinct is to go away again (rather than learn the skills that would allow me to override that gut instinct and seek maximum benefit from Less Wrong in spite of its flaws).
This reminds me of Heinlein’s “Gulf”, which describes intelligence/rationality training and then tests it under stress.
There are also the teams in HPMOR.
However (setting aside that I’m citing fictional evidence), there may be an important difference between maintaining rationality under physical stress and maintaining it while sitting comfortably at a keyboard under social stress.
What were the more recent events?
To cite a Facebook post from last Wednesday by XiXiDu (his LW name):
Yudkowsky is again going all nuts over Roko’s basilisk. All posts pertaining to Roko’s basilisk have been deleted from the LessWrong Facebook group, and several people who participated in the discussions appear to have been banned.
1) I think we can hardly criticize him for not writing enough, given that he’s about to spend a month writing a book for free for all of us.
2) It’d be pretty sad if having a moderator in place of a member were an active harm to the community.
Moderation is about leadership. If you do it from the shadows it’s not as effective.
Leadership is not the essential purpose of moderation. The purpose is to raise the tone of the debate and get the community to adhere to its norms. If the active membership is pretty good at adhering to norms, and you just need to take out the occasional trash, shadowy moderation works fine. LW has a dedicated core with extremely strong norms, and I suspect that even if Eliezer went into a coma for the next year and the site were completely unmoderated, that wouldn’t change.
Sad? I’d say quite the opposite. If the best contribution to the community is that of normal members, and the contribution of moderators is a lesser one, that’s well and good.