I did have the idea of there being regions with varying standards and barriers, in particular places where new users cannot comment easily and places where they can.
This feels like a natural solution. In particular, what occurred to me was:
1. Make LW about rationality again.
2. Expand the Alignment Forum:
   2.1. By default, everything is as it is currently: a small set of users post, comment, and upvote, and that's what people see by default.
   2.2. There's another section that's open to whoever.
The reasoning being that the influx is specifically about AI, rather than a general increase in new users.
The idea of AF having both a passing-the-current-AF-bar section and a passing-the-current-LW-bar section is intriguing to me. With some thought about labeling etc., it could be a big win for non-alignment people (since LW can suppress alignment content more aggressively by default), and a big win for people trying to get into alignment (since they can host their stuff on a more professional-looking dedicated alignment site), and no harm done to the current AF people (since the LW-bar section would be clearly labeled and lower on the frontpage).
I didn’t think it through very carefully though.
I like this direction, but I'm not sure how broadly one would want to define rationality: Would a post collecting quotes about intracranial ultrasound stimulation for meditation enhancement be rationality-related enough? What about weird quantified-self experiments?
In general I appreciate LessWrong because it is so much broader than other fora, while still staying interesting.
Well, at least we can say, “whatever LW has been, minus most AI stuff”.