If even one AI passes this threshold and works to end humanity, whether directly or indirectly, humanity has zero chance of survival.
No, zero is not a probability.
Eliezer thinks your strategy won’t work because AIs will collude. I think that’s not too likely at critical stages.
I can imagine that having multiple AIs of unclear alignment is bad because race dynamics cause them to do something reckless.
But my best guess is that having multiple AIs is good under the most likely scenarios.