“To the best of your ability to say at this point, what would have been your
initial probability that the bill was yours?” I said.
“Fifteen percent,” said Nick.
“I would have said twenty percent,” I said.
So we split it $8.57 / $11.43, and went happily on our way, guilt-free.
(This is a 3:4 ratio.)
Now for this to have been fair, you must both have walked away with a 3⁄7
probability that the money belonged to Nick.
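For concreteness, here is that arithmetic as a short Python sketch (nothing beyond the numbers in the story):

```python
from fractions import Fraction

# The split used in the story: each person's share of the $20 is
# proportional to his stated prior probability that the bill was his.
p_nick, p_you = Fraction(15, 100), Fraction(20, 100)

nick_share = 20 * p_nick / (p_nick + p_you)  # 60/7, about $8.57
your_share = 20 * p_you / (p_nick + p_you)   # 80/7, about $11.43

print(float(nick_share), float(your_share))  # 8.571... 11.428...
print(p_nick / (p_nick + p_you))             # 3/7, the implied mutual belief
```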
This algorithm definitely feels like the wrong answer. Taking this ratio
couldn’t possibly be the correct way for both of you to update your beliefs.
Why?
Well, because the correct answer really ought to be invariant under swapping. If you and Nick Bostrom had traded opinions, Nick’s answer would have been eighty percent, your answer would have been eighty-five percent, and you would have split the twenty in a 16:17 ratio. You would also have ended up with 16:17 if you had instead asked “What would have been your initial probability that the bill was not yours?” and credited each answer toward the other person’s share.
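A quick sketch of that check, under my reading that each share is proportional to the number credited to that person:

```python
def split(nick_weight, your_weight, total=20.00):
    """Divide `total` in proportion to the two weights."""
    s = nick_weight + your_weight
    return round(total * nick_weight / s, 2), round(total * your_weight / s, 2)

print(split(0.15, 0.20))  # original question: (8.57, 11.43) -- a 3:4 ratio
print(split(0.80, 0.85))  # opinions traded:   (9.7, 10.3)   -- a 16:17 ratio
# Asking "...that the bill was NOT yours?" and crediting each answer to the
# other person yields the same weights (Nick 0.80, you 0.85): 16:17 again.
```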
It’s the wrong answer because, if X is the proposition that the bill belongs to Eliezer, then setting your mutual belief in X to
f(X) = p_E(X) / (p_E(X) + p_N(~X))
doesn’t look pretty and symmetric. Not only that, but f(X) + f(~X) != 1. What you did only appeared symmetric because X made reference to the parties involved. Granted, when propositions do that, people tend to be biased in their own favor, but taking that into account would be solving a different problem.
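Plugging the story’s numbers into f makes both complaints concrete (a sketch; I’m reading ~X as “the bill belongs to Nick”):

```python
from fractions import Fraction

# X = "the bill belongs to Eliezer".
# Eliezer: p_E(X) = 0.20.  Nick: p_N(~X) = 0.15, hence p_N(X) = 0.85.
p_E = {'X': Fraction(20, 100), '~X': Fraction(80, 100)}
p_N = {'X': Fraction(85, 100), '~X': Fraction(15, 100)}

f_X    = p_E['X']  / (p_E['X']  + p_N['~X'])  # 4/7, the split actually used
f_notX = p_E['~X'] / (p_E['~X'] + p_N['X'])   # 16/33, same rule applied to ~X

print(f_X + f_notX)  # 244/231, about 1.06 -- not 1
```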
Now, I haven’t read Aumann’s paper, so I don’t know the correct answer, if he presents one. But if I had to come up with an answer intuitively, it would be:
p(X) = (p_E(X) + p_N(X)) / 2
One (oversimplified) possible set of assumptions that could lead to this answer
would be:
“Well, presuming we saw the same things, one of our memories must have made a serious mistake. If we ignore the possibility that we both made mistakes, and we know nothing about the relative frequencies at which we each make mistakes like this one, then, given that one of us erred, the principle of indifference suggests that the chances are half that I was the one who reasoned properly and half that you were. So, provided that we think mistakes aren’t more or less likely for either of us depending on the truth of the proposition, averaging our probability estimates makes sense.”
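Spelled out in the notation above, with R_E for “Eliezer reasoned properly” and R_N for “Nick reasoned properly” (labels I’m introducing; the assumptions are exactly the quoted ones: one of R_E, R_N holds, each with probability 1/2, independently of X):
p(X) = p(X | R_E) p(R_E) + p(X | R_N) p(R_N)
     = (1/2) p_E(X) + (1/2) p_N(X)
     = (p_E(X) + p_N(X)) / 2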
Now of course, real mistakes come in degrees, and, more seriously, the two parties observed different evidence, so I don’t think averaging is the correct answer in all cases.
However, that said, this formula gives p(X) = 21⁄40, and thus your fair share is $10.50. While I won’t say this with authority, because I wasn’t there and I don’t know what kinds of mistakes people are likely to make, or how likely each of you was to see and remember what: you owe Nick Bostrom ninety-three cents.
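And the closing arithmetic, checked the same way (the 4/7 is your actual share from the proportional split):

```python
from fractions import Fraction

p_X = (Fraction(20, 100) + Fraction(85, 100)) / 2  # (p_E(X) + p_N(X)) / 2
fair_share = 20 * p_X                              # what you should have kept
actual_share = 20 * Fraction(4, 7)                 # what you kept: ~$11.43

print(p_X)                               # 21/40
print(float(fair_share))                 # 10.5
print(float(actual_share - fair_share))  # 0.928..., i.e. ninety-three cents
```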