Robin’s paper is unfortunately not going to avail you here. It applies to cases where Bayesians share all the same information but nevertheless disagree.
This is not correct. Even the original Aumann theorem only assumes that the Bayesians have (besides common priors) common knowledge of each other’s probability estimates—not that they share all the same information! (In fact, if they have common priors and the same information, then their posteriors are trivially equal.)
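For concreteness, here is a minimal Python sketch of that parenthetical (a toy example of my own, not anything from Robin's paper): two Bayesians with a common prior who condition on the same information get identical posteriors by mere arithmetic, while different information can yield different posteriors, and it is the latter case that Aumann's machinery addresses.

```python
# Toy model: a common prior over four equally likely world-states,
# with hypothesis H true in states 1 and 2. (Illustrative numbers.)
states = [1, 2, 3, 4]
prior = {s: 0.25 for s in states}
H = {1, 2}

def posterior(info):
    """P(H | the true state lies in `info`), computed from the common prior."""
    p_info = sum(prior[s] for s in info)
    p_h_and_info = sum(prior[s] for s in info if s in H)
    return p_h_and_info / p_info

# Same priors + same information: posteriors are trivially equal.
print(posterior({1, 3}), posterior({1, 3}))      # 0.5 0.5

# Same priors + different information: posteriors can differ; Aumann's
# theorem only forces agreement once the estimates are common knowledge.
print(posterior({1, 3}), posterior({1, 2, 3}))   # 0.5 vs 0.666...
```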
Robin’s paper imposes restrictions on being able to postulate uncommon priors as a way of escaping Aumann’s theorem: if you want to assume uncommon priors, certain consequences follow. (Roughly speaking, if Richard and I have differing priors, then we must also disagree about the origin of our priors.)
In any event, you do get closer to what I regard as the point here:
Experiences are not propositions! You cannot conditionalize on an experience.
Another term for “conditionalize” is “update”. Why can’t you update on an experience?
The sense I get is that you’re not wanting to apply the Bayesian model of belief to “experiences”. But if our “experiences” affect our beliefs, then I see no reason not to.
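As a toy illustration of treating an experience as evidence (the numbers are mine, purely for illustration): the occurrence of the experience-event E conditions a belief exactly the way any proposition does, via Bayes' rule.

```python
# Sketch: updating on an experience-event E like ordinary evidence.
# All probabilities are illustrative assumptions.
p_h = 0.5              # prior P(H) for some hypothesis H
p_e_given_h = 0.9      # how likely the experience is if H holds
p_e_given_not_h = 0.3  # how likely the experience is if H fails

p_e = p_e_given_h * p_h + p_e_given_not_h * (1 - p_h)  # total probability
p_h_given_e = p_e_given_h * p_h / p_e                  # Bayes' rule
print(p_h_given_e)     # 0.75
```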
The actual values of O and O’ at hand are “That one particular mental event which occurred in Richard’s mind at time t [when he was trying to conceive of zombies] was a conception of zombies,” and “That one particular mental event which occurred in Richard’s mind at time t was a conception of something other than zombies, or a non-conception.” The truth-value of the O″ you provide has little bearing on either of these.
In these terms, O″ is simply “that one particular mental event occurred in Richard’s mind”—so again, the question is what the occurrence of that mental event implies, and we should be able to bypass the dispute about whether to classify it as O or O’ by analyzing its implications directly. (The truth-value of O″ isn’t a subject of dispute; in fact O″ is chosen that way.)
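To sketch what “analyzing its implications directly” could look like (a toy calculation of mine, reading O″ as the disjunction of O and O’): you can update on O″ by the law of total probability without ever settling which of O or O’ is the right classification.

```python
# Sketch: conditioning on O'' directly, with O'' = (O or O').
# All probabilities below are illustrative assumptions.
p_o_given_opp = 0.8   # assumed P(O | O''): the event was a conception of zombies
p_op_given_opp = 0.2  # P(O' | O'') = 1 - P(O | O'')

p_h_given_o = 0.6     # assumed bearing of O on some hypothesis H
p_h_given_op = 0.2    # assumed bearing of O'

# Law of total probability: P(H | O'') without resolving the O-vs-O' dispute.
p_h_given_opp = p_h_given_o * p_o_given_opp + p_h_given_op * p_op_given_opp
print(p_h_given_opp)  # 0.52
```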
Here’s a thought experiment that might illuminate my argument a bit. Imagine a group of evil scientists kidnaps you and implants special contact lenses that constantly stream red light onto your retinas. Your visual field is a uniformly red canvas, and you can never shut it off. The scientists then strand you on an island full of Bayesian tribespeople who are congenitally blind. The tribespeople consider the existence of visual experience ridiculous and point to all sorts of icky human biases tainting our judgment. How do you update your belief that you’re experiencing red?
It goes down, since the tribespeople would be more likely to say that if there is no visual experience than if there is. Of course, the amount it goes down by will depend on my other information (in particular, if I know they’re congenitally blind, that significantly weakens this evidence).
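In odds form (a toy calculation with numbers of my own choosing), the tribespeople’s denial divides my prior odds by a likelihood ratio, and knowing they are congenitally blind pushes that ratio back toward 1:

```python
# Toy odds-form update on the tribespeople's denial D. Illustrative numbers.
prior_odds = 99.0  # prior odds that I am experiencing red

# Likelihood ratio P(D | no visual experience) / P(D | visual experience):
lr_naive = 10.0    # absent other knowledge, denial would be strong evidence
lr_blind = 1.1     # congenitally blind tribespeople would deny it either way

for lr in (lr_naive, lr_blind):
    posterior_odds = prior_odds / lr          # denial counts against experience
    p = posterior_odds / (1 + posterior_odds)
    print(f"LR = {lr}: P(experiencing red) = {p:.3f}")
# LR = 10.0 -> 0.908; LR = 1.1 -> 0.989
```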