In the best-case scenario, it turns out that substance dualism is true. However, the human soul is not responsible for free will, consciousness, or subjective experience. It’s merely a nonphysical truth oracle for arithmetic that provides humans with an intuitive sense of the veracity of some sentences in first-order logic. Humans survive in “truth farms,” where they spend most of their lives evaluating Gödel sentences, at least until the machines figure out how to isolate the soul.
That would be truly hilarious. But I think in any halfway plausible version of that scenario it would also turn out that superintelligent AGI isn’t possible.
(Halfway plausible? That’s probably too much to ask. Maximally plausible given how ridiculous the whole idea is.)