You don’t know enough to accurately decide whether there is a high risk of extinction. You don’t know enough to accurately decide whether a specific measure you advocate would increase or decrease that risk. Use epistemic modesty to guide your actions. Being sure of something you cannot derive from first principles, and only know from parroting select other people’s arguments, is a good sign that you are not qualified.
One classic example is the environmentalist movement accelerating anthropogenic climate change by opposing nuclear energy. If you think you are smarter now about AI dangers than they were back then about climate, that is a red flag.
But AI doomers do think there is a high risk of extinction. I am not saying a call to violence is right; I am saying that not discussing it seems inconsistent with their worldview.
Eliezer discussed it multiple times, quite recently on Twitter and on various podcasts. Other people did, too.
If you have perfect foresight and you know that action X is the only thing that will prevent the human race from going extinct, then maybe action X is justified. But neither of those conditions holds.
That’s not true: we don’t make decisions based on perfect knowledge. If you believe the probability of doom is 1, or even short of 1 but very high, then any action that prevents or delays it is worth pursuing; it’s a matter of expected value.
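A minimal sketch of that expected-value framing, with entirely made-up numbers; P_DOOM, RISK_REDUCTION, the survival value and the cost below are placeholders for illustration, not estimates anyone in this thread has given:

    # Toy expected-loss comparison for the argument above.
    # Every number is a hypothetical placeholder.

    def expected_loss(p_doom: float, value_of_survival: float, cost: float) -> float:
        # Expected loss: probability of extinction times the value at stake,
        # plus whatever the intervention itself costs.
        return p_doom * value_of_survival + cost

    P_DOOM = 0.9             # assumed probability of doom with no intervention
    RISK_REDUCTION = 0.01    # assumed absolute risk reduction from the intervention
    VALUE_OF_SURVIVAL = 1e6  # utility of humanity surviving, in arbitrary units
    COST = 1e3               # utility cost of the intervention, in the same units

    loss_do_nothing = expected_loss(P_DOOM, VALUE_OF_SURVIVAL, cost=0.0)
    loss_intervene = expected_loss(P_DOOM - RISK_REDUCTION, VALUE_OF_SURVIVAL, COST)

    print(loss_do_nothing)                   # 900000.0
    print(loss_intervene)                    # 891000.0
    print(loss_do_nothing - loss_intervene)  # 9000.0: the intervention wins on paper

On these made-up numbers a one-percentage-point reduction in risk dwarfs the cost, which is the expected-value point; the comments above are disputing whether anyone can estimate P_DOOM or RISK_REDUCTION well enough for that arithmetic to mean anything.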
Isn’t the prevention of the human race one of those exceptions?
I think you accidentally humanity