Which list of top posts are you thinking of? If you look at the most-upvoted posts on LW, the only one in the top ten about AI risk is Holden Karnofsky explaining, in 2012, why he thought the Singularity Institute wasn’t worth funding.
I grant that I was speaking from memory; the last time I read the LW material was years ago. The MIRI and CFAR logos up there did not help.