Some of my own intuitions about this:
- Yes, this would be ‘probabilistic’, so it’s really a question of what evidence AIs could share with each other.
- Why, or how, would one system trust that the state (code + data) another system shares is honest?
- Sandboxing is (currently) imperfect, though perhaps sufficiently advanced AIs could actually achieve it? (On the other hand, there are security vulnerabilities that exploit the ‘computational substrate’, e.g. Spectre, so I would guess that would remain a potential vulnerability even for AIs that designed and built their own substrates.) Sandboxing also seems like it would only help if the sandboxed copy could be ‘sped up’, and if the AI running it could ‘convince’ the sandboxed AI that it’s not sandboxed.
- The ‘prototypical’ AI I’m imagining seems like it would be too ‘big’ and too ‘diffuse’ (e.g. distributed) to share (all of) itself with another AI. Another commenter mentioned an AI ‘folding itself up’ for sharing, but I can’t understand concretely how that would help (or how it would work).
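To make the trust problem concrete, here’s a minimal sketch (my own illustration, not anything proposed above) of what shared-state verification can and can’t buy you. Hashing the shared blob proves the bytes arrived intact, but nothing about whether the sender was honest that those bytes are the state actually governing its behavior — which is why this stays probabilistic. The `shared_state` contents are an invented placeholder.

```python
import hashlib

def digest(state: bytes) -> str:
    """Hash a blob of 'code + data' so the receiver can check it arrived intact."""
    return hashlib.sha256(state).hexdigest()

def matches(state: bytes, claimed: str) -> bool:
    """True iff the received state hashes to the digest the sender claimed."""
    return digest(state) == claimed

# Hypothetical state an AI claims to be running on.
shared_state = b"weights=[0.1, 0.9]; policy='cooperate'"
claimed = digest(shared_state)

assert matches(shared_state, claimed)             # integrity of the transfer: checkable
assert not matches(shared_state + b"!", claimed)  # tampering in transit: detectable
# Not checkable from the hash alone: whether this state is the one the sender
# actually executes, i.e. whether the *claim* behind the bytes is honest.
```

The gap between the two comments at the bottom is exactly the trust question: integrity is cheap to verify, honesty is not.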