Perhaps the broader point here is that public relations is a complex art, of which we are mostly not even practitioners let alone masters. We should probably learn about it and get better.
I also want to note that there are probably psychological as well as societal defense mechanisms against someone trying to change your worldview. I don’t know the name of the phenomenon, but this is essentially why counselors/therapists typically avoid giving advice or stating their opinion plainly; the client is prone to rebel against that advice or worldview. I’d suspect this happens because it’s terribly dangerous to just let other people tell you how to think; you’ll be taken advantage of rather quickly if you do. Obviously there are multiple routes around these defense mechanisms, since people do convince others to change their minds in both subtle and forceful ways. But we should probably learn the theory of how that happens before triggering a bunch of defense mechanisms by going in swinging with amateur enthusiasm (and the unusual perspective of devoted rationalism).
Waiting to speak while polishing our approach seems foolish when time is short. I find very short timelines entirely plausible, but I nonetheless think it would behoove us to collectively gather some clues before arguing loudly in public.
This is in part because I think the last point is very true and very relevant: people who aren’t taking AGI risk seriously largely just aren’t taking AI seriously. They’ll take it more seriously the more it advances, with no convincing needed. A good bit of the work is being done by progress itself, so we’re not as far behind in getting people to pay attention as it seems. That gives us a bit more time to figure out how to work with that societal attention as it continues to grow.
None of this is arguing for shutting up or not speaking the truth. I’m just suggesting we err on the side of speaking gently, to avoid triggering strong defense mechanisms we don’t understand.
Unfortunately, as societal attention to AI ramps up, less and less of that attention will go to “us”.
That is an excellent point. I hate the idea of gathering attention and reputation, but that’s probably a big part of having people listen to you when it’s important.