Yeah, I think some of this is true, but while there is a lot of AI content, I actually think that many of the same people would probably write and engage on non-AI content if AI were less urgent, or if the site had less existing AI content.
That counterfactual is hard to evaluate, but many of the people who used to be core contributors to LW 1.0 are now also posting to LW 2.0, though primarily about AI. I think that's evidence of a broader shift among LW users toward viewing AI as really urgent and important, rather than of a very different new user base having been discovered.
I kind of agree that the development of rationality feels somewhat stagnant right now. I think there are still good posts being written, but a lot of cognitive energy is definitely going into AI stuff, more so than rationality stuff.
I would love to be able to stop worrying about AI and go back to improving rationality. Yet another thing to look forward to once we leap this hurdle.