I have the same experience: whenever I try to explain AI X-risk to a “layman,” they want a concrete story about how AGI could take over.