The agent in this scenario doesn’t necessarily know whether the driver can read faces; in the original problem, the agent isn’t aware of this information. Surely if FDT advises you to pay him on arrival in the face-reading scenario, you would do the same in the non-face-reading scenario, since the agent can’t tell them apart.
No, the whole premise of the face-reading scenario is that the agent can tell that his face is being read, and that’s why he pays the money. If the agent can’t tell whether his face is being read, then his correct action (under FDT) is to pay the money if and only if (probability of being read) times (utility of returning to civilization) is greater than (utility of the money). Now, if this condition holds but in fact the driver can’t read faces, then FDT does pay the $50, but this is just because it got unlucky, and we shouldn’t hold that against it.
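The decision rule above can be sketched as a quick calculation; the numbers below are purely illustrative, not part of the original problem:

```python
# A minimal sketch of the pay/don't-pay condition described above:
# pay iff P(face is read) * U(returning to civilization) > U(keeping the money).
# All numeric values here are hypothetical examples.

def should_pay(p_read: float, u_return: float, u_money: float) -> bool:
    """Return True iff the expected value of paying exceeds keeping the money."""
    return p_read * u_return > u_money

# With a 10% chance of being read and returning worth 1000 utils,
# paying wins: 0.10 * 1000 = 100 > 50.
print(should_pay(0.10, 1000, 50))

# With only a 1% chance of being read, keeping the $50 wins:
# 0.01 * 1000 = 10 < 50.
print(should_pay(0.01, 1000, 50))
```

On this sketch, an agent who pays under the first set of beliefs and then discovers the driver can’t read faces hasn’t acted irrationally; the expected-value condition held given what it knew.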
Then you violate the accurate beliefs condition.
(If the world is in fact a random mixture in proportions that their beliefs track correctly, then FDT will do better when averaging over the mixture.)
Having true beliefs doesn’t mean omniscience. It is possible to have only true beliefs and still not know everything. In this case, the agent might not know whether the driver can read faces but still have accurate beliefs otherwise.