Also, it’s a normal part of reasoning to interpret people’s questions in a way that makes sense. If your prompt has a typo or a missing word, but your intended meaning is clear, ChatGPT will go with what you obviously meant rather than with what you wrote.
For some of these puzzles, it wouldn’t normally make much sense to ask about the simplified version, because the answer is so obvious. Why would you ask “how can the man and the goat cross the river with the boat” if the answer were just “by taking the boat”? So it’s not unreasonable for ChatGPT to assume that you really meant the standard form of the puzzle and just forgot to describe it in full. The same pattern-matching tendency that lets it see past your typos is what leads it to fill in the familiar version of the puzzle.