He points out that there's arguably a "missing mood" in the way most people in EA and the AI alignment community communicate with safety-unconcerned researchers. That missing sense of urgency plausibly lowers the chance that persuasion efforts succeed.
Sorry for responding very late, but it's basically because, contra the memes, most LWers do not agree with Eliezer's views on how doomed we are. His level of pessimism is very much a fringe viewpoint on LW, not the mainstream.
So the missing mood is basically because most of LW doesn’t share Eliezer’s views on certain cruxes.