You don’t know enough to accurately decide whether there is a high risk of extinction. You don’t know enough to accurately decide whether a specific measure you advocate would increase or decrease it. Use epistemic modesty to guide your actions. Being sure of something you cannot derive from first principles, as opposed to parroting select other people’s arguments, is a good sign that you are not qualified.
One classic example is the environmentalist movement accelerating anthropogenic global climate change by being anti-nuclear energy. If you think you are smarter now about AI dangers than they were back then about climate, it is a red flag.
But AI doomers do think there is a high risk of extinction. I am not saying a call to violence is right: I am saying that not discussing it seems inconsistent with their worldview.
Eliezer discussed it multiple times, quite recently on Twitter and on various podcasts. Other people did, too.