“Of course, sure, whatever, it’s killing people, but it might kill all people!”
But this isn’t the actual back-and-forth; the third point should be “no it won’t, you’re distracting from the people currently being killed!” This is all a game to subtly beg the question. If AI is an existential threat, all current mundane threats like misinformation, job loss, and AI bias are rounding errors to the total harm; the only situation where you’d talk about them is if you’ve already granted that the existential risks don’t exist.
If a large asteroid is heading towards Earth, and some group thinks it won’t actually hit Earth but merely pass harmlessly close by, and they start talking about the sun’s reflections off the asteroid making life difficult for people with sensitive eyes… they are trying to get you to assume the conclusion.
Sure, I agree, the asteroid is going to kill us all. But it would be courteous to acknowledge that it’s going to hit a poor area first, and they’ll die a few minutes earlier. Also, uh, all of us are going to die; I think that’s the core thing! We should save the poor area, and also all the other areas!
“rounding errors to the total harm; the only situation where you’d talk about them is if you’ve already granted that the existential risks don’t exist”
It’s possible to consider relatively irrelevant things, such as everything in ordinary human experience, even when there is an apocalypse on the horizon. The implied contextualizing norm demands that we be unable to consider them, or at least raises the cost of doing so.