I’ve tried to raise the topic with smart physics people I know or encounter whenever the opportunity presents itself. So far, the only ones who actually went on to take steps to try to enter alignment already had prior involvement with EA or LW.
For the others, the main reactions I got seemed to be:
- Sounds interesting, but this is all too hypothetical for me to really take seriously. It hinges on all these concepts and ideas you propose about how AGI is going to work, and I don’t yet buy that all of them are correct.
- Sounds concerning, but I’d rather work on physics.
- Sounds depressing. I already thought climate change would kill us all, and now there’s this too? Let me just work on physics and not think about it any more.
I’m not a mind reader, of course, so maybe their real reaction was “Quick, say something conciliatory to make this person shut up about the pet topic they are insane about.”