I don’t call it near certainty because it would be awesome. I call it near certainty because it would be implausible given what we know for mildly superhuman intelligence to be impossible, and even implausible for it to be supremely hard, and because we are on the technological cusp of being able to brute-force a duplicate (via uploads) that could easily be made mildly superhuman, even as we are also on the scientific cusp of being able to understand how the internal design of the brain works.
Hugely superhuman intelligence is of course another ball of wax, but even that I would rate as “hard to argue against”.