I’m confused why there is such a focus on understanding Eliezer’s ideas, given that everyone seems to agree the first online community Eliezer had custom-built for him (LW 1.0) was a failure. Having someone custom-build an online community for you is an incredible luxury. I suspect there are a lot of other people who have detailed models of how online communities work and/or large internet followings that could be induced to migrate to a new community.
I do think it’d be valuable if the Arbital team released their design documents so others could draw inspiration—I just don’t see any compelling reason to treat Eliezer as an authority for the purpose of making decisions.
It seems very weird to me to call LW 1.0 a failure. Sure, nobody maintained it, and so it slowly declined, but it still gave rise to a pretty massive and active community and was the hub of a lot of excellent writing for quite a few years (e.g., Scott Alexander, lukeprog).
Yes, I agree all that was valuable and important.
The way I think about online communities, there are two important inputs: your initial endowment of users and attention, and the skill with which you design the community’s culture, rules, software, and so on. These interact in the form of an exponential function.
If you do a great job of designing community software, your community can grow exponentially into something massive and beautiful like Facebook or Wikipedia. Online community success is power-law distributed.
If you don’t do a great job, you get exponential decay instead, and the community gradually fades away. Eliezer put a lot of effort into the exponent for Less Wrong 1.0: a custom platform was built to Eliezer’s specification, and he wrote an entire sequence about the sort of culture he wanted. But none of that prevented exponential decay, even while the rationalist community itself expanded! Instead, LW 1.0 turned into a site Eliezer himself didn’t want to use.
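To make the model concrete (this is my own notation, a rough sketch rather than anything precise): write the initial endowment of users and attention as $N_0$, and let $r$ be a rate determined by design quality. Then community size over time behaves something like

$$N(t) = N_0 \, e^{rt}$$

where $r > 0$ gives the compounding growth of a Facebook or Wikipedia and $r < 0$ gives a gradual fade-out. The endowment only scales the curve; the sign of the exponent decides growth versus decay, which is why the design work matters so much.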
It’s possible that LW 1.0 would have thrived with more maintenance. But I think it still counts as evidence against the idea that Eliezer has unusual ability as an online community designer. First, if people get enthusiastic enough about an online community, some of them will step up to maintain it, so “lack of maintenance” is not cleanly separable from other trends toward decline. Second, by saying that LW 1.0 failed due to lack of maintenance, you’re effectively saying that Eliezer’s strategy of carefully figuring out how the community should work from first principles, without any empirical feedback, is inferior to a more iterative strategy of keeping your hands on the wheel and seeing where things lead. This also appears to be one of Alexei’s takeaways.
Again, I’m not saying we shouldn’t listen to Eliezer’s ideas. If I were creating an online community, I’d love to take a look at the Arbital design document. But I would not follow Eliezer’s advice if it didn’t make sense to me, and it seems Alexei reached the same conclusion. Indeed, I see this as a big takeaway of Inadequate Equilibria: if the advice of high-status people doesn’t make sense to you, consider not following it.