If you ask ChatGPT to multiply two 4-digit numbers it writes out the reasoning process in natural language and comes to the right answer.
People keep saying such things. Am I missing something? I asked it to calculate 1024 * 2047, and the answer isn’t even close. (Though to my surprise, the first 2 steps are at least correct steps, and not nonsense. And it is actually adding the right numbers together in step 3, again, to my surprise. I’ve seen it perform much, much worse.)
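For reference, the correct product is easy to check by hand, since 2047 is just 2048 − 1 (a quick sanity check I did myself, not something ChatGPT produced):

    1024 * 2047 = 1024 * 2048 - 1024
                = 2,097,152 - 1,024
                = 2,096,128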
I did ask it to multiply numbers early on, and it seems to behave differently now than it did 5 weeks ago: it isn't getting the multiplications right anymore. Unfortunately, I can't access the old chats.
Interesting. I’m having the opposite experience (due to timing, apparently), where at least it’s making some sense now. I’ve seen it using tricks only applicable to addition and pulling numbers out of its ass, so I was surprised that what it did this time wasn’t completely wrong.
Asking the same question again even gives a completely different (but again wrong) result: