At the risk of asking the obvious:
Does the fact that no one has yet succeeded in constructing transhuman AI imply that doing so would necessarily wipe out humanity?
No.
But does it increase the probability of it, and if so, by how much?