That would be truly hilarious. But I think in any halfway plausible version of that scenario it would also turn out that superintelligent AGI isn’t possible.
(Halfway plausible? That’s probably too much to ask. Maximally plausible given how ridiculous the whole idea is.)