In ten or twenty or forty years from now, in a way that’s impossible to predict because any specific scenario is extremely unlikely, the position of being worried about AGI gets coupled in the public discourse to being anti-social-justice; as a result it massively loses status, the big labs react by taking safety far less seriously, and maybe we have fewer people writing papers on alignment.
So, I think both that, in the past, 1) people have thought the x-risk folks are weird and low-status and didn’t want to be affiliated with them, and, in the present, 2) people like Phil Torres are going around claiming that EAs and longtermists are white supremacists, because of central aspects of longtermism (like thinking the present matters in large part because of its ability to impact the future). Things like “willingness to read The Bell Curve” no doubt contribute to their case, but I think focusing on that misses the degree to which the core is actually in competition with other ideologies or worldviews.
I think there’s a lot of value in trying to nudge your presentation to not trigger other people’s allergies or defenses, and trying to incorporate criticisms and alternative perspectives. I think we can’t sacrifice the core to do those things. If we disagree with people about whether the long-term matters, then we disagree with them; if they want to call us names accordingly, so much the worse for them.
If we disagree with people about whether the long-term matters, then we disagree with them; if they want to call us names accordingly, so much the worse for them.
I mean, this works until someone in a position of influence bows to the pressure, and I don’t see why that can’t happen.
I think we can’t sacrifice the core to do those things.
The main disagreement seems to come down to how much we would give up by disallowing posts like this. My gears model still says ‘almost nothing’, since all it would take is to extend the norm “let’s not talk about politics” to “let’s not talk about politics or extremely sensitive social-justice-adjacent issues”, and I feel like that would extend the set of interesting-but-taboo topics by something like 10%.
(I’ve said the same here; if you have a response to this, it might make sense to keep it all in one place.)