I expect it to be a problem for superintelligence too, and probably one just as serious. The universe will always be bigger and more complex than any model of it, and I'm pretty sure a mind can't fully model itself.
Superintelligences will presumably have epistemic problems we can’t understand, and probably better tools for working on them, but unless I’m missing something, there’s no way to make the problem go away.
Yeah, but at least it shouldn't have all the subconscious signaling problems that compromise conscious reasoning in humans; at least I hope nobody would be dumb enough to build a superintelligence that deceives itself on account of social adaptations that don't update when the context changes...