Impossible to influence values, not just very difficult.
Nothing is impossible. Maybe the AI's hardware is faulty (and that is why it computes 2+2=4 every time), which would prompt the AI to investigate the issue more thoroughly, if it has nothing better to do.
(This is more of an out-of-context remark, since I can't place "influencing one's own values". If "values" are not actually values, and are instead something that should be "influenced" for some reason, why do they matter?)
Impossible to influence values, not just very difficult.
Which would also mean doing things that would be bad in other unlikely worlds.
See my comment on your comment.