I asked a group of friends for “someone to help me with an AI experiment” and then I gave this particular friend the context that I wanted her help guiding me through a task via text message and that she should be in front of her phone in some room that was not the kitchen.
If you look at how ChatGPT responds, it seems to really struggle to “get” what’s happening in the kitchen: it never quite reaches the point of giving specific instructions, and in particular it never shows any sense of the “situation” in the kitchen, e.g. whether the milk is currently in the saucepan or not.
In contrast, my human friend did “get” this in quite a visceral way (it seems to me). I don’t have the sense that this was due to out-of-band context, but I’d be interested to retry the experiment with more carefully controlled context.