This is definitively not AGI.
And it lacks the cognitive ability to consider most of these things, because considering them doesn't improve reward during the training phase.
If it lacks the cognitive ability to consider things that humans can consider, then it's not AGI.