Most people don’t usually make these kinds of elaborate things up. Prior probability for that hypothesis is low, even if it might be higher for Tuxedage than it would be for an average person. People do actually try the AI box experiment, and we had a big thread about people potentially volunteering to do it a while back, so prior information suggests that LWers do want to participate in these experiments. Since extraordinary claims are extraordinary evidence (within limits), Tuxedage telling this story is good enough evidence that it really happened.
But on a separate note, I’m not sure the prior probability for this being a lie would necessarily be higher just because Tuxedage has some incentive to lie. If it were found out to be a lie, the cause of FAI might be significantly hurt (“they’re a bunch of nutters who lie to advance their silly religious cause”). Folks on Rational Wiki watch this site for things like that, so Tuxedage also has some incentive not to lie. Also, more than one person would have to be involved in the lie, which adds a complexity penalty. I suppose the only story detail that needs to be a lie to advance FAI is “I almost won,” but then why not choose “I won”?
Most people don’t usually make these kinds of elaborate things up. Prior probability for that hypothesis is low, even if it might be higher for Tuxedage than it would be for an average person.
Most people don’t report these kinds of things either. The correct prior is not the frequency of elaborate lies among all statements of an average person, but the frequency of lies among the relevant class of dubious statements. Of course, what constitutes the relevant class may be disputed.
Anyway, I agree with Hanson that it is not low prior probability which makes a claim dubious in the relevant sense, but rather the fact that the speaker may be motivated to say it for reasons independent of its truth. In such cases, I don’t think the claim is extraordinary evidence, and I consider this to be such a case. Probably not much more can be said without writing down the probabilities, which I’d prefer not to do, but I am willing to if you insist.
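The motivated-speaker point can be made concrete with a toy Bayes calculation. The numbers below are purely illustrative assumptions (not anyone's actual estimates): the same low prior and the same probability of reporting-if-true, with only the probability of a false report varying between an unmotivated and a motivated speaker.

```python
def posterior(prior, p_report_if_true, p_report_if_false):
    """Bayesian update on observing that the speaker reports the claim."""
    num = prior * p_report_if_true
    return num / (num + (1 - prior) * p_report_if_false)

prior = 0.01  # assumed low prior for the underlying event

# Unmotivated speaker: elaborate false reports are very rare.
honest = posterior(prior, p_report_if_true=0.5, p_report_if_false=0.001)

# Motivated speaker: incentives independent of truth raise the
# false-report rate, weakening the likelihood ratio.
motivated = posterior(prior, p_report_if_true=0.5, p_report_if_false=0.05)

print(round(honest, 3), round(motivated, 3))  # 0.835 0.092
```

With these made-up numbers, the same testimony moves the unmotivated case from 1% to roughly 83%, but the motivated case only to about 9%: the claim stops being "extraordinary evidence" precisely when the speaker might make it regardless of its truth.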
I suppose the only story detail that needs to be a lie to advance FAI is “I almost won,” but then why not choose “I won”?
When talking about games without an explicit score, “I almost won” is a very fuzzy phrase which can be translated to “I lost” without real loss of meaning.
I don’t think there’s any point in treating the “almost victory” as anything other than a defeat, for either the people who believe or disbelieve him.
If I am interested in the question of whether winning is possible in the game, “almost victory” and “utter defeat” have very different meanings for me. Why would I need an explicit score?