I think this is a confusion of two different types of thinking. One is the classical view that you are responsible only for the consequences of your own individual actions. If you think of yourself as an individual making independent decisions like this, then you are justified in thinking that there is a 90% chance of heads upon seeing a green room: 90% of the people in green rooms, in expectation, are there because the coin came up heads. (Note that if you modify the problem so that the outcomes of the bet only apply to the people making it, the bet becomes favorable even in advance, regardless of whether the agents are altruistic or selfish.)
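For concreteness, here is a minimal sketch of that individual, anthropic-update view. I am assuming the usual setup of this dilemma: 20 copies, 18 in green rooms on heads and 2 on tails (consistent with the 90% figure above); the specific dollar amounts for the “outcomes only apply to the bettors” variant (+$1 per bettor on heads, −$3 per bettor on tails) are my own assumption, not taken from the problem statement.

```python
# Individual / anthropic view: count how many copies end up in green rooms
# in each branch of a fair coin flip (assumed: 18 on heads, 2 on tails, of 20).
p_heads = 0.5
greens_if_heads, greens_if_tails = 18, 2

# P(heads | I am in a green room), by counting expected green-roomers.
p_heads_given_green = (p_heads * greens_if_heads) / (
    p_heads * greens_if_heads + (1 - p_heads) * greens_if_tails
)
print(p_heads_given_green)  # 0.9

# Variant where the bet's outcomes apply only to the people making it
# (assumed payoffs: each bettor gains $1 on heads, loses $3 on tails).
gain_if_heads, loss_if_tails = 1.0, -3.0

# Per-individual expected value after the anthropic update: favorable.
ev_individual = (p_heads_given_green * gain_if_heads
                 + (1 - p_heads_given_green) * loss_if_tails)
print(ev_individual)  # 0.6

# Ex-ante expected value of the same variant, summed over all bettors:
# also favorable, matching the parenthetical above.
ev_advance = (p_heads * greens_if_heads * gain_if_heads
              + (1 - p_heads) * greens_if_tails * loss_if_tails)
print(ev_advance)  # 6.0
```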
However, in this case you cannot claim that your “yes” decision has utility based on the result of the entire group’s response to the question. If you did, then when the coin comes up heads and all 18 people say “yes”, all 18 people will claim that the utility of their action was $18, but only $18 of utility was gained by the group in total as a consequence of all these decisions, so they cannot all be right. (Note that if you modify the problem so that you really are the only one responsible for the decision, say by stipulating that everyone in a green room except one person is mind-controlled to say “yes”, then saying “yes” really is the right decision for that free-willed person, even in advance.)
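The double counting is easy to make explicit, using the $18 group gain on heads from the text above; one consistent alternative, if you insist on individual responsibility, is for each person to claim only their own share:

```python
# Double counting: on heads, 18 green-roomers each credit their own "yes"
# with the full group gain ($18, the figure used above).
group_gain_on_heads = 18
people_claiming = 18

total_claimed = people_claiming * group_gain_on_heads
print(total_claimed, "claimed vs.", group_gain_on_heads, "actually gained")  # 324 vs. 18

# A consistent individual accounting assigns each person only their share,
# so the claims sum to the actual group gain.
share_each = group_gain_on_heads / people_claiming
print(people_claiming * share_each)  # 18.0
```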
It is also possible to reason in a TDT (timeless decision theory) sort of way, acting as though you are making decisions for all identical copies of yourself. This effectively defines you, as an agent, as the set of all identical copies of yourself receiving identical input. In this case, it does make sense to take responsibility for the decision of the entire group, but it no longer makes sense to do an anthropic update: you as an agent, as the set of copies of yourself seeing green rooms, would exist whether or not the coin came up heads, and would make the same observations.
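Here is the same bet sketched from that group, no-update point of view. The payoffs are again my assumption for concreteness (the commonly quoted version: if every green-roomer says “yes”, each green-roomer gains $1 and each red-roomer loses $3); the text above does not state these amounts.

```python
# Group / updateless view: the decision is made once for the whole set of
# copies seeing green rooms, so no anthropic update is applied; the coin
# stays at 50/50.
p_heads = 0.5

# Assumed payoffs if all green-roomers say "yes": +$1 per green, -$3 per red.
group_payoff_heads = 18 * 1 + 2 * (-3)   # 18 greens, 2 reds  -> +12
group_payoff_tails = 2 * 1 + 18 * (-3)   # 2 greens, 18 reds  -> -52

ev_group_no_update = (p_heads * group_payoff_heads
                      + (1 - p_heads) * group_payoff_tails)
print(ev_group_no_update)  # -20.0 -> the group of copies should say "no"
```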
In conclusion: it makes sense to do an anthropic update if you think of yourself as an individual, and it makes sense to take utilitarian responsibility for the whole group’s actions if you think of yourself as a group of identical copies, but in neither case does it make sense to do both, which is what you would need to justify saying “yes” in this situation.
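And the inconsistent combination the conclusion points at, updating to 90% heads while also taking credit for the entire group payoff, is the only one of these calculations that makes “yes” look good (same assumed payoffs as in the previous sketch):

```python
# Inconsistent combination: anthropic update (P(heads | green) = 0.9) *and*
# responsibility for the entire group payoff (assumed: +$1 per green-roomer,
# -$3 per red-roomer, if all green-roomers say "yes").
p_heads_given_green = 0.9
group_payoff_heads = 18 * 1 + 2 * (-3)   # +12
group_payoff_tails = 2 * 1 + 18 * (-3)   # -52

ev_both = (p_heads_given_green * group_payoff_heads
           + (1 - p_heads_given_green) * group_payoff_tails)
print(ev_both)  # 5.6 -> looks favorable, but only by mixing the two stances
```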