Trying to be charitable to the chatbot… I could interpret the evasive answers as "this seems like a sentence from Harry Potter, but I do not remember whether this specific sentence actually appears in the book, or whether it is just something plausible that was made up".
And when you ask it to create a story that answers the question, you do not say that the story must be realistic or that the answer must be correct. That could be interpreted as: "assuming there is a story that happens to answer this question, what could it look like?", and the chatbot gives you a possible example.