Eliezer, or somebody better at talking to humans than he is, needs to go on conservative talk shows (Fox News kind of stuff), use conservative styles of language, and explain AI safety there. Conservatives are intrinsically more likely to care about this stuff, and to get the arguments for why inviting alien immigrants from other realms of mindspace into our reality, which will also take all of our jobs, is a bad idea. Talk up the fact that the AGI arms race is basically as bad as a second cold war, only this time the “bombs” could destroy all of human civilization forever, and that anyone who develops AGI should be seen internationally as a terrorist who is risking the national security of every country on the planet.
To avoid the topic becoming polarized, someone else should at the same time go on liberal talk shows and explain how unaligned AGI, or even AGI aligned to some particular ideological group, is the greatest risk of fascism in human history and could be used to permanently lock the planet into the worst excesses of capitalism, fundamentalism, oppression, and other evils. Besides that, it is immoral (and this is one of the better arguments for abortion) to bring a child, including an AGI, into the world without being very sure that it will have a good life, and most people will not think that being addicted to paperclips is a good life.
I guess I feel at the moment that winning over the left is likely more important; it could still make sense to go on conservative talk shows, but mainly if it seems like the debate might start to get polarised.
Conservatives are already suspicious of AI, based on ChatGPT’s political bias. AI skeptics should target the left (which has less political reason to be suspicious) and not target the right (because if they succeed, the left will reject AI skepticism as a right-wing delusion).
This, especially because right now the left is on a dangerous route to “AI safety is a ruse by billionaires to make us think that AI is powerful and thus make us buy into the hype by reverse psychology and distract us from the real problems… somehow”.
Talk about AGI doom in the language of social justice, also because it’s far from inappropriate. Some rich dude in Silicon Valley tries to become God, lots of already plenty exploited and impoverished people in formerly colonised countries fucking die for his deranged dream. If that’s not a social justice issue...
is the greatest risk of fascism in human history and could be used to permanently lock the planet into the worst excesses of capitalism, fundamentalism, oppression, and other evils
Seems like a fairly weak argument; you’re treating this like a logical exchange of reasons, but it’s a political game, if that’s what you’re after. In the political game, you’re supposed to talk about how the techbros have gone crazy because of Silicon Valley techbro culture and are destroying the world to satisfy their male egos.
It might be almost literally impossible for any issue at all not to get politicized right down the middle once it gets big, but if any issue could avoid that fate, one would expect it to be the imminent extinction of life. If it’s not possible, I tend to think the left side would be preferable, since they pretty much get everything they ever want. I tentatively lean towards just focusing on winning over the left and letting the right be reactionary, but this is a question that deserves a ton of discussion.
I think avoiding polarization is a fool’s game. Polarization gets half the population in your favor, and might well set up a win on the back of next year’s news. And we’ve seen how much news fits into a year, these days.
Having half of the population in our favor would be dangerously bad: enough to make alignment researchers feel important, but not enough to actually achieve the policy goals we need. And it would cause the same sort of dysfunctional social dynamics that occur in environmentalist movements, where people are unwilling to solve their own problem, or to advocate for less protean political platforms, because success would reduce their relevance.
If one wants to avoid polarization, what are examples of a few truly transversal issues to use as a model? I almost can’t think of any. Environmentalism would be one that makes sense, since either side can appreciate a nice untouched natural landscape, but alas, it’s not.
They’re hard to think of because if everyone genuinely agrees, then society goes their way and they become non-issues that nobody talks about anymore. For example, “murder should be illegal” is an issue that pretty much everyone agrees on.
Something like “the state should have the right to collect at least some taxes” also has strong enough support that there’s very little real debate over it, even if there are some people who disagree.
I suppose I meant more issues where there is no established norm yet because they’re new (which would be a good analogue to AI), or issues where the consensus has shifted across the spectrum so that change is likely to be imminent and well accepted even though it goes against inertia. Drug legalisation may be a good candidate for that, but there are still big holdouts of resistance on the conservative side.
You mean, it would flood us with sheep instead of tigers?
Environmentalism has people unwilling to solve environmental issues because their livelihood depends on them? Would you expect the same to happen with a movement to prevent a nuclear exchange?
Yes.