A few thoughts on cousin_it's problem:
1. When you calculate the expected outcome for the “deciders say nay” strategy and the “deciders say yea” strategy, you already know that the deciders will be deciders. So “you are a decider” is not new information relative to that strategy, and you shouldn’t change your answer. (It may be new information relative to other strategies, where the one making the decision is an individual who wasn’t necessarily going to be told “you are the decider” in the original problem. If you’re told “you are the decider”, you should still conclude with 90% probability that the coin is tails.) A sketch of both calculations follows this list.
2. (Possibly a rephrasing of 1.) If the deciders in the tails universe come to the same conclusion as the deciders in the heads universe about the probability of which universe they’re in, one might conclude that they didn’t actually get useful information about which universe they’re in.
3. (Also a rephrasing of 1.) Each decider individually does a pretty good job of predicting which universe they’re in, but the situation is contrived to give the one wrong decider nine times the decision-making power. (Edit: And since you know about that trap in advance, you shouldn’t fall into it.)
4. (Isomorphic?) Perhaps “there’s a 90% probability that I’m in the ‘tails’ universe” is the wrong probability to look at. The relevant probability is, “if nine hypothetical individuals are told ‘you’re a decider’, there’s only a 10% probability that they’re all in the tails universe”.
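To make point 1 concrete, here is a minimal sketch in Python. It assumes the setup and payoffs I remember from the original post (ten people, fair coin; heads makes one of them a decider, tails makes nine; a unanimous “yea” pays $1000 under tails and $100 under heads, a unanimous “nay” pays $700 either way); substitute the actual numbers if they differ.

    # Assumed setup: 10 people, fair coin; heads -> 1 decider, tails -> 9 deciders;
    # unanimous "yea" pays $1000 if tails, $100 if heads; unanimous "nay" pays $700.
    P_HEADS, P_TAILS = 0.5, 0.5
    PAYOFF_YEA_TAILS, PAYOFF_YEA_HEADS, PAYOFF_NAY = 1000, 100, 700

    # Ex-ante expected value of each strategy, computed before anyone is told
    # "you are a decider" (this calculation already accounts for the fact that
    # whoever ends up deciding will have been told exactly that).
    ev_all_yea = P_TAILS * PAYOFF_YEA_TAILS + P_HEADS * PAYOFF_YEA_HEADS  # 550
    ev_all_nay = PAYOFF_NAY                                               # 700

    # Probability the coin is tails given that *you* were told "you are a decider":
    # 9 of 10 people hear it under tails, 1 of 10 under heads.
    p_decider_given_tails = 9 / 10
    p_decider_given_heads = 1 / 10
    p_tails_given_decider = (P_TAILS * p_decider_given_tails) / (
        P_TAILS * p_decider_given_tails + P_HEADS * p_decider_given_heads
    )  # 0.9

    # The naive post-update expected value that generates the apparent paradox:
    ev_yea_after_update = (p_tails_given_decider * PAYOFF_YEA_TAILS
                           + (1 - p_tails_given_decider) * PAYOFF_YEA_HEADS)  # 910

    print(f"ex ante:  yea = {ev_all_yea}, nay = {ev_all_nay}")
    print(f"P(tails | 'you are a decider') = {p_tails_given_decider:.2f}")
    print(f"naive post-update EV of 'yea' = {ev_yea_after_update}")

The 0.9 credence is correct, but (per point 1) it isn’t a new input to the strategy comparison: the ex-ante expected values already counted who would be told “you are a decider”, so under these assumed payoffs the answer (“nay”) shouldn’t change.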