The greatest danger is that an arms race will lead to the creation of a superintelligence which will immediately be used to dominate all others. Speculative threats by an autonomous superintelligence are plausible but are less certain than the first-strike logic inherent in such an arms race.
Here’s what we know from recent history: a) the instinct for domination is alive and well in the human species, and where circumstances allow an intelligent psychopath to reach the pinnacle of power, all available means will be deployed to maintain their (usually his) power. Cf. Stalin, Hitler, Mao, Kim (x3), Saddam, Putin, etc., and b) the logic of this kind of arms race dictates that if you’ve got it, you must use it. Multiple centers of power would almost certainly lead to cyberwar or perhaps outright war. It only makes sense that the first to gain power must use it to suppress all other pretenders.
Collaboration on a precursor project, similar perhaps to the Human Genome Project, might at least point us in the right direction. Perhaps it could focus on the use of AI to build an Internet immune system that might mitigate today’s threats and constrain future ones. Still, better ideas are needed to thwart the various catastrophic scenarios ahead.