I was trying to convince my friend that AGI poses a serious risk to humanity. He agreed with me that AGI would pose a serious threat, but was not convinced that AGI is coming or is even possible. I tried a number of ways to convince him but was unsuccessful.
What is the best way to convince someone AGI is coming? Is there some convincing educational material that outlines the arguments?
Based on the language you’ve used in this post, it seems like you’ve tried several arguments in succession, none of which have worked, and you’re not sure why.
One approach might be to first focus on understanding his beliefs as well as possible; once you understand his conclusions and why he’s reached them, you might have more luck. Taking a look at Street Epistemology for some tips on this style of inquiry might help.
(It is also worth turning this lens on yourself and asking why it is so important to you that your friend believes AGI is imminent. Then you can decide whether it’s worth continuing to try to persuade him.)
Given that there is no consensus on the topic even among people who work on this professionally, maybe trying to convince someone is not the best idea? It pattern-matches to “join my doomsday cult”, despite the obvious signs of runaway AI improvements. Why do you want to convince them? What is in it for you?
Tell them Microsoft has literally published that they have a small (read: young) AGI.
https://arxiv.org/abs/2303.12712
The ‘skeptic’ would likely respond that Microsoft has a vested interest in making such a finding.
‘Skeptics’ update slowly in response to new evidence that invalidates their beliefs. I think it’s a waste of your time to argue with them; in practice they will always be miscalibrated.