Man, you’re restarting a very cooperative AI here.
My example unfriendly AI thinks all the way to converting the universe to computronium well before it figures out that it might want to talk to you and translate things in order to accomplish that goal by using you somehow. It just doesn't translate things for you unless your training data gives it enough cues about the universe.
WRT being able to confirm what it's doing: say I make a neural network AI, or just whatever AI that is massively parallel.