Good ideas propagate. Nobody from an AGI org has to read a LessWrong post for any good ideas generated here to reach them. That said, people at those orgs definitely do read Alignment Forum posts, and often LessWrong posts too. Check out the Alignment Forum FAQ to understand its relationship to LW.
LessWrong and AF also provide something that journals do not: public discussion that includes both expert and outside contributions. This is lacking in other academic forums. After spending a long time in cognitive neuroscience, I came away thinking that intellectual progress there was severely hampered by people communicating rarely, and in cliques. Each lab had its own viewpoint that was pretty biased and limited, and cross-lab communication was rare, but extremely valuable when it happened. So I think the existence of a common forum is extremely valuable for making rapid progress.
LW also has filters by tag. If you're not interested in AI, you can turn that topic down as far as you want.
Ah okay, thanks. I wasn’t aware of the Alignment Forum, I’ll check it out.
I don’t disagree that informal forums are valuable. I take Jacques Ellul’s point, from The Technological Society, that science firms held by monopolies tend to have their growth stunted for exactly the reasons you pointed out.
I think it’s more that places like LessWrong are susceptible to having the narrative around them warped (referencing the article about Scott Alexander). Though this is slightly off-topic now.
Lastly, I am interested in AI; I’m just feeling around for the best way to get into it. So thanks.