A dumb agent could also cause human extinction. "Kill all humans" is a computationally simpler task than creating superintelligence, and it may be simpler by many orders of magnitude.
I seriously doubt that. Plenty of humans want to kill everyone (or, at least, large groups of people). Very few succeed. These agents would be a good deal less capable.
Just imagine a Stuxnet-style computer virus that finds DNA synthesizers and prints a different virus on each of them, calculating the exact DNA mutations for hundreds of different flu strains.
You can't manufacture new flu strains just by hacking a DNA synthesizer. And anyway, most non-intelligently designed flu strains would be non-viable or non-lethal.
I mean that the virus would be as intelligent as a human biologist, maybe an em. That is enough for virus synthesis but not for personal self-improvement.