chatgpt is not a consistent agent; it is strongly inclined to agree with whatever you ask. it can provide insights, but because it is so agreeable, it has far stronger confirmation bias than humans. its guesses seem reasonable, and the constant hedging it insists on outputting is not actually wrong.
Related predictions I made 1.5 years ago.