The way I think about online communities, there are two important inputs: there’s your initial endowment of users & attention, and there’s the skill with which you design the community culture, rules, software, etc. These interact like the terms of an exponential function: the endowment sets the starting size, and design skill sets the exponent.
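To make that concrete, here’s a minimal sketch of the model (the function name, rates, and numbers are all mine, purely illustrative):

```python
import math

def community_size(endowment: float, design_skill: float, months: int) -> float:
    """Toy model: community size after `months`.

    `endowment` is the initial stock of users/attention; `design_skill`
    acts as the growth rate in the exponent: positive -> exponential
    growth, negative -> exponential decay.
    """
    return endowment * math.exp(design_skill * months)

# Same endowment, opposite exponents, wildly different outcomes after 2 years:
print(round(community_size(1000, 0.1, 24)))   # ~11023 -- compounding growth
print(round(community_size(1000, -0.1, 24)))  # ~91    -- gradual fade-out
```

A small difference in the exponent compounds into an enormous difference in outcomes, which is part of why the distribution of community sizes ends up so skewed.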
If you do a great job of designing community software, your community can grow exponentially into something massive and beautiful like Facebook or Wikipedia. Online community success is power-law distributed.
If you don’t do a great job, you get exponential decay instead, and the community gradually fades away. Eliezer put a lot of effort into the exponent for Less Wrong 1.0: a custom platform was built to Eliezer’s specification, and he wrote an entire sequence about the sort of culture he wanted. But none of that prevented exponential decay, even while the rationalist community itself expanded! Instead, LW 1.0 turned into a site Eliezer himself didn’t want to use.
It’s possible that LW 1.0 would have thrived with more maintenance. But I think it still counts as evidence against the idea that Eliezer has unusual ability as an online community designer. First, if people get enthusiastic enough about an online community, some of them will step up to maintain it, so “lack of maintenance” is not cleanly separable from other trends toward decline. Second, by saying that LW 1.0 failed due to lack of maintenance, you’re effectively saying that Eliezer’s strategy of carefully figuring out how the community should work from first principles, without any empirical feedback, is inferior to an iterative strategy of keeping your hands on the wheel and seeing where things lead. This also appears to be one of Alexei’s takeaways.
Again, I’m not saying we shouldn’t listen to Eliezer’s ideas. If I were creating an online community, I’d love to take a look at the Arbital design document. But I would not follow Eliezer’s advice if it didn’t make sense to me, and it seems Alexei reached this conclusion as well. Indeed, I see this as a big takeaway of Inadequate Equilibria: if the advice of high-status people doesn’t make sense to you, consider not following it.
Yes, I agree that all of that was valuable and important.