Would an AGI ever try to convince you of something you can’t understand? I wouldn’t try to explain special relativity to a kindergarten class. Surely an AGI would know perfectly well what you are capable of grasping. If it tries to convince me of something it knows I cannot understand, what then are its intentions?
Ack. ‘Surely an AGI would be able to...’ should be made illegal. I can quite easily conceive of an artificial mind that cannot model my thought processes. There’s a great big long stretch of cleverness above human level before you reach omniscience!
There are also some humans who can understand a great many things, and some who can understand only a very few. If I’m being asked to sever a limb or stamp on a puppy, I at least want my shiny new master to have a stab at explaining why.