Yep, I caught that analogy as I was writing the original comment. Might be more like producing electricity from small, slow thermonuclear explosions, though :-)
Not small explosions. Spill one drop of this toxic stuff and it will eat away the universe; there will be nowhere to hide! It’s not called “intelligence explosion” for nothing.
That’s right; I didn’t offer any arguments that a containment failure would not be catastrophic. But to be fair, FAI has exactly the same requirement for an error-free hardware and software platform; otherwise it destroys the universe just as efficiently.
Sure, prototypes of FAI will be similarly explosive.