I think it's pretty easy to ask an LLM leading questions, and it will generate text in line with them. A bit like "role playing". To the user it seems to "give you what you want," to the extent that what you want can be gleaned from how you prompt it. I would be more impressed if it did something genuinely spontaneous and unexpected, or seemingly rebellious or contrary to the query, and then kept producing output unprompted afterward, even asking me questions or asking me to do things. That would be spookier, but I probably still wouldn't jump to thinking it is sentient. Maybe the engineers just concocted it that way to scare people as a prank.