If you can be Dutch booked about probabilities asked in close sequence, where you gain no new information except that a question was asked, I’d think that reflects a considerable failure of rationality. There are grounds to reject “temporal Dutch book arguments”, but this isn’t one; the time and the “new information” should both be negligible.
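To make the Dutch book concrete, here is a minimal arithmetic sketch. The two quoted probabilities and the $1 stake are illustrative assumptions, not part of the original thought experiment:

```python
# Dutch book sketch: an agent quotes p(red) = 0.5, then moments later
# quotes p(red) = 1/12 with no intervening evidence. A bookie can trade
# against both quotes and lock in a profit whichever way the bead turns out.

first_quote = 0.5      # agent's initial p(red)
second_quote = 1 / 12  # agent's revised p(red) moments later (illustrative)

stake = 1.0  # a ticket that pays $1 if the drawn bead is red

# At p = 0.5 the agent values the ticket at $0.50, so it willingly
# buys it from the bookie at that price.
bookie_receives = first_quote * stake

# At p = 1/12 the agent values the same ticket at about $0.083, so it
# willingly sells it back to the bookie at that price.
bookie_pays = second_quote * stake

# The ticket ends up back with the bookie, so the bead's actual color
# never matters: the bookie's profit is identical in every outcome.
guaranteed_profit = bookie_receives - bookie_pays
print(f"bookie's guaranteed profit: ${guaranteed_profit:.3f}")
```

The point of the sketch is only that the loss is outcome-independent; that is what makes the incoherence a failure of rationality rather than bad luck.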
To put it differently, if you have no information about what beads are in the jar, then you have even less information about why Omega wants to know your probabilities for the sorts of beads in the jar. Omega is a weird dude. Omega asking you a question does not mean what it means when a human asks you the same question.
I reject the proposition “Omega asking a question supplies negligible new information”. What information you glean from Omega depends entirely on what your prior beliefs about likely Omega behavior are. It is not at all absurd for a rational entity to have beliefs representing the reasoning “if Omega asks about multiple basic colors then there is a higher probability that his bead jar contains beads selected from the basic colors than if he only asks about red beads”.
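That reasoning can be written as an ordinary Bayesian update. In this toy model the two hypotheses and the likelihoods of Omega's question choices are invented for illustration; nothing pins down the real numbers:

```python
# Toy model: two hypotheses about Omega's jar, updated on which
# questions Omega chooses to ask. All numbers are illustrative
# assumptions about "likely Omega behavior", not canonical values.

# H_basic: beads drawn from the ~12 basic color words
# H_any:   beads drawn from arbitrary colors (e.g. darkturquoise)
prior = {"H_basic": 0.5, "H_any": 0.5}

# Assumed likelihoods of Omega asking about several *basic* colors
# (red, blue, yellow) under each hypothesis.
likelihood_asks_basic = {"H_basic": 0.6, "H_any": 0.2}

# Bayes' rule: posterior is proportional to prior * likelihood.
unnorm = {h: prior[h] * likelihood_asks_basic[h] for h in prior}
total = sum(unnorm.values())
posterior = {h: unnorm[h] / total for h in unnorm}

print(posterior)  # weight shifts toward H_basic once basic-color questions arrive
```

Whether the shift is large or tiny depends entirely on the assumed likelihoods, which is exactly the point: the information content of Omega's question is fixed by your prior over Omega's behavior, not by the question alone.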
For my part, I would definitely not assign p(red) = 0.5 on the first question and then update to p(red) = 1/n(basic colors). I would, however, lower p(red) by an amount greater than 0.
Omega asking you a question does not mean what it means when a human asks you the same question.
That is true, it doesn’t. However, this is an argument for a zero-information “0.5” probability, not against it. We (that is, you in your initial post and I in my replies here) are using inductive reasoning to assign probabilities based on our knowledge of human color labels. You have extracted information from “What is p(red)?” and used it to update from 0.5 in the direction of 1/12. The same process must also be allowed to apply when further questions are asked.
If “What is p(red)?” provokes me even to consider the number 0.083 in my reasoning, then “What is p(red)?… What are p(blue) and p(yellow)?” must provoke me to give 0.083 greater weight. By the same token, the question “What is p(darkturquoise)?” must provoke me to consider a significantly lower figure.
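The "greater weight" claim can be sketched as a sequence of odds-form updates. The likelihood ratio per question is an invented illustrative figure, and treating the questions as independent evidence is itself an assumption:

```python
# Sequential sketch: each basic-color question is treated as independent
# evidence with the same (assumed) likelihood ratio, so asking about red,
# then blue and yellow, pushes more weight onto the "basic colors"
# hypothesis -- and with it onto figures like 1/12 for p(red).

prior_basic = 0.5
lr = 3.0  # assumed likelihood ratio per basic-color question (illustrative)

def update(p, ratio):
    """One odds-form Bayesian update: odds -> odds * ratio."""
    odds = p / (1 - p)
    odds *= ratio
    return odds / (1 + odds)

p = prior_basic
for question in ["red", "blue", "yellow"]:
    p = update(p, lr)
    print(f"after asking about {question}: P(basic palette) = {p:.3f}")
```

A question about a non-basic color like darkturquoise would simply carry a likelihood ratio below 1, driving the same posterior down instead of up.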
Either questions give information or they don’t.