With what’s happening with the coronavirus, I’d think that people would be particularly receptive to the ideas that:
1) We need to be prepared for long term risks.
2) Things with exponential growth are super scary.
3) We should trust the professionals who predict these sorts of things.
I wouldn’t expect anyone to be willing to open their wallets right now, but it could be a good time to “plant the seed”.
Right now, most people are hyperfocused on COVID-19; this creates an obvious incentive for people to try to tie their pet issues to it, which I expect a variety of groups to try and which I expect to mostly backfire if tried in the short run. (See for example the reception the WHO got when they tried to talk about stigma and discrimination; people interpreted it as the output of an “always tie my pet issue to the topic du jour” algorithm and ridiculed them for it.) Talking about AI risk in the current environment risks provoking the same reaction, because it probably would in fact be coming from a tie-my-pet-topic algorithm.
A month from now, however, will be a different matter. Once people start feeling like they have attention to spare, and have burned out on COVID-19 news, I expect them to be much more receptive to arguments about tail risk and to model-based extrapolation of the future than they were before.
I would wait longer than that. The repercussions of the virus are going to be large and will last a long time, ranging from unemployment and permanent lung damage to the deaths of loved ones. For quite a while, I expect any talk about x-risk to come off to the average person as “we told you so, you should have listened to us” – like rubbing salt in a fresh wound. I would expect this to provoke a hostile reaction, burning social capital for a small shift in public opinion.
I would seriously consider not doing more outreach than you are now – possibly for several years.
In the near-term, I think significantly more people will find x-risk on their own.
From a comment I made on this question:
Given the second part, I still think one should do no more outreach than usual, and definitely should not tie x-risk, or a specific non-pandemic x-risk, to the current pandemic.
...
It just occurred to me that the form of the outreach, and especially the targeted audience of new outreach campaigns, could be decisive for my answer. When I first read your question, I immediately imagined things like ads on popular websites, in YouTube videos, or even on TV. Perhaps that wasn’t what you imagined by “outreach”. You did write:
I think outreach specifically intended to “plant the seed” – but not soliciting for funds or funding – could be very much worth doing now. But, because you’re not looking for money, you should target people that are most likely to ‘spread the seed’, e.g. public intellectuals.
(I still think you’re going to have a hard time selling non-friendly-AI risk, especially given that approximately everyone is going to try to twist stories of the pandemic to their advantage. Global warming / climate change is an obvious likely competitor, as is universal healthcare in the U.S.)