Also, can I just remind you that for most of LessWrong’s history the top-karma post was Holden’s critique of SingInst, in which he recommended against funding SingInst and argued for Tool AI as the solution? Recently Eliezer’s List-of-Lethalities became the top-karma post, but less than a month later it was overtaken by Paul’s response-and-critique post, which argued that the problem is much more tractable than Eliezer thinks and advocated a very different research strategy for alignment.
Eliezer is the primary person responsible for noticing the alignment problem and getting people to work on it, due to his foresight and writing skill, and he also founded this site, so most people here have read his perspective and understand it somewhat. But any notion that dissent isn’t welcomed here (which I am perhaps over-reading into your comment) seems kind of obviously not the case.