Depends on how much of a superintelligence, and how it's implemented. I wouldn’t be surprised if somebody got far-superhuman theorem-proving from a mind that didn’t generalize beyond theorems. Presuming you were asking it to prove old-school fancy-math theorems, and not to, e.g., arbitrarily speed up a bunch of real-world computations like asking it what GPT-4 would say about things, etc.