I don’t see how that implies that everyone dies.
It’s like saying, weapons are dangerous, imagine what would happen if they fall into the wrong hands. Well, it does happen, and sometimes that has bad consequences, but there is no logical connection between that and everyone dying, which is what doom means. Do you want to argue that LLMs are dangerous? Fine. No problem with that. But that is not doom.
That’s fair. Edited to reflect that.
I do think it could be a useful way to convince someone who is completely skeptical of any risk from AI.