Another thought I just had: could it be that ChatGPT, because it's trained to be such a people-pleaser, is losing intentionally to make the user happy?
Have you tried telling it to actually try to win? Probably won’t make a difference, but it seems like a really easy thing to rule out.