I’m strongly in favor of the sequences requirement. If I had been firmly encouraged/pressured into reading the sequences when I joined LW around March/April 2022, my life would have been much better and more successful by now, and I suspect the same would be true for many other people. I’ve spent a lot of time thinking about ways that LW could set people up to steer themselves (and each other) towards self-improvement, like the Battle School in Ender’s Game, but it seems like it’s much easier to just tell people to read the Sequences.
Something I’m worried about is that the reading-the-sequences requirement could make LessWrong too reputable, i.e. turn LW into an obviously elite, exclusive club that people race to join in order to signal status. This scenario is in contrast with the current paradigm, where the people who stay and hang around the most are the ones perceptive enough to notice LW’s value, often at a glance.
It seems to me that trust could be built by somehow confirming that a person understands certain important background knowledge (some might call this knowledge a “religious story”: the stories that inspire a certain social order wherever they’re common knowledge), but I haven’t ever seen a nice, efficient social process for confirming the presence of knowledge within a community. It always seems very ad hoc. The processes I’ve seen either demand very uncritical, entry-level understandings of the religious stories, or just randomly misfire sometimes, or are vulnerable to fakers who have no deep or integrated understanding of the stories; or sometimes there will be random holes in people’s understandings of the stories that cause problems even when everyone’s acting in good faith. Maybe this stuff just inherently requires good old-fashioned time and effort, and I should stop looking for an easy way through.