“But to assign some probability to the wrong answer is logically equivalent to assigning probability to 0=1.”
Only if you know it is the wrong answer. You say the robot doesn’t know, so what’s the problem? We assign probabilities to propositions that turn out to be wrong all the time, before we know whether they are wrong or not.
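If it helps, here is a toy numerical sketch of what I mean (nothing beyond Bayes’ rule; the hypothesis H and the numbers are made up for illustration):

    # The robot assigns probability to a hypothesis H that, in fact, happens to be false.
    # It does not know this yet, so nothing forces that probability to be 0.
    prior = 0.5            # honest prior for H
    p_e_given_h = 0.2      # made-up likelihood of the observed evidence if H is true
    p_e_given_not_h = 0.9  # made-up likelihood of the same evidence if H is false

    # Bayes' rule: P(H|E) = P(E|H)P(H) / (P(E|H)P(H) + P(E|~H)P(~H))
    posterior = (p_e_given_h * prior) / (
        p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    )
    print(posterior)  # ~0.18: evidence pushes H toward 0, and at no point did anyone
                      # assign probability to "0 = 1"

The robot was “wrong” in the sense that H is false, but it never violated the sum or product rules; it just hadn’t seen the evidence yet.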
I’ll tell you on Saturday!
Was the “Putting in the Numbers” post the one you were referring to? You didn’t post that on Saturday, but now it is Monday and there doesn’t seem to be a third post. Anyway, I did not see this question answered anywhere in “Putting in the Numbers”...
Yeah, sorry, I’ve been delayed by the realization that everything I wrote for the forthcoming post needed a complete re-write. Planning fallacy!
Lol ok, so long as I get my answer eventually :p.
Ok, but do you really mean that sentence as it is written? To me it means the same thing as saying that assigning probability to anything is logically equivalent to assigning probability to 0=1 (which I am perfectly happy to do, so if that is the point then fine, but that doesn’t seem to be your implication).
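To spell out the reading I have in mind (a rough sketch, assuming the kind of claim we’re talking about is either provably true or provably false):

    A \text{ is in fact false} \;\implies\; A \text{ is provably equivalent to } (0 = 1) \;\implies\; P(A) = P(0 = 1).

Since any such proposition the robot entertains might turn out to be the false one, and the robot cannot tell which in advance, the sentence as written seems to cover assigning probability to anything at all, not just to “the wrong answer”.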