From a longtermist viewpoint, even the tactic of terrorism to prevent X-Risk cannot and should not be ruled out. If the future of humanity is at stake, any laws and deontological rules can be overruled.
You’re welcome to argue for whatever unlikely positions you wish, but I don’t believe that agreements or morality should be set aside even when everything is at stake.
I mean, what intelligent, self-respecting person would ever work with someone who might threaten them with everything they have, if they decide the stakes are worth it? “I reckon I can get more of what I want if I threaten your family.” ← If you think there’s a chance someone will say that to you, run away from them. The same goes for anything else that falls in the “what I want” category.
Hypothetically, if you knew an AI lab was very close to developing AGI that was very likely to destroy humanity, and the only way to stop it was to carry out a bomb blast on the organization’s premises, would you not do it?
Would it kill anyone, or just disrupt the work? A bomb blast that just freaks people out a bit and delays the work until another solution can be found is, I think, more justifiable. I don’t really know how to imagine the other scenario. In what position would I have no alternative? I would do something else. But suppose it’s a government program protected by the army, nobody can get in or out, and I’m somehow let in. I think I would be more likely to publicly set myself on fire, or make some other costly signal that I believe a moral atrocity is occurring. The coward’s way is to decide that everything is to be sacrificed and that murdering others is therefore worth it.

I don’t really know how I got into this situation; it’s a very hard situation to imagine, and I never expect to be in it. I expect to have closed off this route way ahead of time. To be clear: if there’s a way for me to prevent myself from ever being in this situation, I want to take it. I do think murder is sometimes ethical, but I am very keen to take actions that keep murder off the table. Launching retaliatory nukes is sometimes the right choice, but the primary goal is to remove the possibility of nuclear armageddon in the first place.
Let me put it like this: if you are ever, ever credibly worried that I might break some basic deontological rule around you, I am willing to accept any reasonable mechanism that makes us both unable to violate that rule. Heck, I’m probably happy to do it unilaterally if you suggest it.
It’s hard to rule out any action as something that might be done; the hypothetical is hard to imagine, and I expect I’d be vomiting and dizzy and crying and hating myself. But, as I say, I’m very willing to accept mechanisms that take such possibilities “off the table” so that I can still coordinate with others. And I will not accept any move toward “let’s bring terrorism onto the table”. That would be gravely immoral.