Perhaps you’ve already defined “superintelligent” as meaning “self-directed, motivated, and recursively self-improving” rather than merely “able to provide answers to general questions faster and better than human beings”… but if you haven’t, it seems to me that the latter definition of “superintelligent” would have a much higher probability of you losing the bet. (For example, a Hansonian “em” running on faster hardware and perhaps a few software upgrades might fit the latter definition.)