Agree with all of this, but my concern is not that the coupling of [worrying about AGI] and [being anti-social-justice] happens tomorrow. (I did have some separate concerns about people being put off by the post today, but I’ve been convinced somewhere in the comments under this post that the opposite effect is about equally likely.) It’s that this happens when AGI safety is a much bigger deal in the public discourse. (Not sure if you think this will never happen? I think there’s a chance it never happens, but that seems wildly uncertain; I would put maybe 50% on it. Note that even if it happens very late, say 4 years before AGI poses an existential risk, I think that’s still more than enough time for the damage to be done. EY famously argued that there is no fire alarm for AGI; if you buy this, then we can’t rely on “by this point the danger is so obvious that people will take safety seriously no matter what”.)
If your next question is “why worry about this now”, one reason is that I don’t have faith that mods will react in time when the risk increases (I’ve updated upward on how likely I think this is after talking to Ruby, but not to 100%, and who knows who’ll be a mod in 20 years), and I have the opportunity to say something now. But even if I had full authority over how the policy changes in the future, I still wouldn’t have allowed this post, because people can dig out old material if they want to write a hit piece. This post has been archived, so from this point on there will forever be the opportunity to link LW to TBC for anyone who wants to do that. And if you apply the analog of security mindset to this problem (which I think is appropriate), this is not something you would allow to happen. There is precedent for people losing positions over things that happened decades in the past.
One somewhat concrete scenario that seems plausible (but wildly unlikely precisely because it’s concrete) is that Elon Musk manages to make the issue mainstream in 15 years; someone does a deep dive and links the issue to LW, and LW to anti-social-justice (even though LW itself still doesn’t have that many more readers); this gets picked up by a lot of people who think worrying about AGI is bad; and the aforementioned coupling occurs.
The only other thing I’d say is that there is also a substantial element of randomness in what does and doesn’t create a massive backlash. You can’t look at one instance of “person with popularity level x said a thing of controversy level y, and nothing bad happened” and conclude that any other instance (x′, y′) with x′ < x and y′ < y will definitely not lead to anything bad happening.