1) If only one AI passes this threshold and it works to end humanity either directly or indirectly, humanity has zero chance of survival.
Zero isn’t a probability. Worse, the argument starts from the premise of a threshold for non-negligible risk, then assumes that any AI past that threshold causes extinction with certainty. That is an internal inconsistency: a non-negligible risk is, by definition, less than certainty, so the conclusion contradicts the premise it rests on. There are other flaws, but an internal inconsistency like this is more than enough to render the argument invalid.
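To make the inconsistency concrete, here is a minimal sketch under a purely illustrative assumption: suppose each of n AIs past the threshold independently poses some per-AI extinction risk p. Even then, the survival probability is positive, not zero — non-negligible risk does not compound into certainty.

```python
def survival_prob(p: float, n: int) -> float:
    """Probability of survival if each of n AIs independently
    causes extinction with probability p (illustrative toy model)."""
    return (1.0 - p) ** n

# Even with a 1% risk per AI across 10 such AIs,
# survival probability stays well above zero (~0.904).
print(survival_prob(0.01, 10))
```

The point of the toy model is only that "non-negligible risk per AI" and "zero chance of survival" are different claims; the original argument conflates them.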
Part (2) is just as incoherent as part (1) since it depends upon the same argument.
The argument in (3) is almost as bad. Why would preventing other AIs from making the leap be “unlikely to result in a net positive return”, if it reduces the probability of extinction? Significantly lowering the odds of extinction seems like a very positive return! The argument offers no reason why prevention wouldn’t reduce the probability of extinction, or produce some other net positive effect.
I could see an argument that it would be difficult to prevent other AIs from reaching such a threshold, but that’s not the same thing as not worthwhile.