If you can give the AGI any terminal goal you like, irrespective of how smart it is, that’s orthogonality right there.
No. Orthogonality is about an agent being able to pursue any given goal, not about you being able to give it one. And as my thought experiment shows, it is not intelligent to blindly follow a given goal.