As someone who is not an AI researcher, I wonder what we mean when we speak about AGI: will an AGI be able to do all the things a competent adult does? (If, we imagine, we gave it robotic limbs, a means of locomotion, and analogues of the five senses.)
In the Western world, for example, most humans can make detailed transport plans: ensuring there is enough petrol in the car, driving to a certain store to purchase ingredients, and later following a recipe to turn those ingredients into a meal, perhaps in service of a larger goal like ingratiating themselves with a lover or an investor.
In less-developed countries there is stunning ingenuity, too: consider how mechanics in the Sahel keep old Toyotas running.
While arguably many of these sub-tasks are sphexish, this is just one humdrum example of the wide variety of skills the average human adult has mastered. Others include writing in longhand, mastering various videogames, and the muscle coordination and strategic thinking needed for any number of sports, games, or performing arts that require coordination between intent and physicality (playing guitar, playing soccer, operating a Steadicam).
Of course, once you start getting into coordination of body and mind, you get into embodied cognition and debates about what "intelligence" really is: whether it must be representational, or whether anti-representational means of cognition can also count as intelligence. But that's tangential.
Right now ChatGPT (and Claude, and Llama, etc.) do remarkably well for having only a highly verbocentric means of representing the world. However, the details of implementation are often sorely wanting: they continue to speak in broad, abstract brushstrokes whenever I ask "How do I..."
For example, I asked Claude what I should be feeling from my partner when dancing the tango as the 'lead' (even though it is traditionally the woman who actually controls the flow of the dance; the lead must interpret her next moves correctly). It answered: "Notice the level of tension and responsiveness in your partner's muscles, which can indicate their next move." No mention of what that feels like, which muscles, or where I should be feeling it (in my hands? should I feel my weight being 'pushed'?). The only specific cue it offered was:
"Pay attention to small movements, head tilts, or changes in your partner's energy that signal their intention."
Head tilts!
Now, I realize this is partly a reflection of the tacit-to-explicit information bottleneck: people have trouble writing about this kind of knowledge, and an LLM can only be trained on what is written. But the point remains: execution counts!