(I will not try to prove transitivity here, since my goal is to get the overall picture across; I have not checked it, although I expect it to hold.)
Transitivity doesn’t hold; here’s a counterexample.
The intuitive story is: X’s action tells you whether Z failed, Y fails sometimes, and Z fails more rarely.
The full counterexample (all of the following is according to your beliefs P1): Say the available actions are 0 and 1. There is a hidden fair coin, and your utility is high if you match the coin and low if you don’t. Y peeks at the coin and takes the correct action, except when it fails, which happens with probability 1/4. Z does the same, but fails only with probability 1/100. X plays 1 iff Z has failed.
Given X’s and Y’s actions, you always go with Y’s action: X tells you nothing about the coin, while Y gives you some information. Given Z’s and Y’s actions, you always go with Z’s, because it is less likely to have failed (even when they disagree). But given Z’s and X’s actions, some of the time (with probability 1/100) you will see that X played 1, and then you will play the opposite of Z.
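A quick sanity check of all three comparisons, by exact enumeration of the joint distribution over (coin, Y fails, Z fails). This is just my own sketch; the variable names and setup below are mine, not part of the original comment:

```python
from itertools import product

P_Y_FAIL, P_Z_FAIL = 1 / 4, 1 / 100

# Enumerate the eight atoms of the joint distribution.
atoms = []  # (probability, coin, x, y, z)
for coin, yf, zf in product((0, 1), (False, True), (False, True)):
    p = 0.5 * (P_Y_FAIL if yf else 1 - P_Y_FAIL) * (P_Z_FAIL if zf else 1 - P_Z_FAIL)
    y = (1 - coin) if yf else coin  # Y plays correctly unless it failed
    z = (1 - coin) if zf else coin  # Z plays correctly unless it failed
    x = 1 if zf else 0              # X plays 1 iff Z failed
    atoms.append((p, coin, x, y, z))

def optimal_play(observed):
    """Map each value of the observed pair to argmax_a P(coin = a | observation)."""
    table = {}
    for p, coin, x, y, z in atoms:
        obs = observed(x, y, z)
        w = table.setdefault(obs, [0.0, 0.0])
        w[coin] += p
    return {obs: int(w[1] > w[0]) for obs, w in table.items()}

print(optimal_play(lambda x, y, z: (x, y)))  # copies y for every (x, y)
print(optimal_play(lambda x, y, z: (z, y)))  # copies z for every (z, y)
print(optimal_play(lambda x, y, z: (z, x)))  # copies z when x == 0, flips z when x == 1
```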
The same counterexample works for beliefs (or continuous actions) instead of discrete actions (you choose a probability p∈[0,1] to report, instead of an action a∈{0,1}), but it needs a couple of small changes. Now both Z and Y fail with probability 1/4 (independently). Also, Y outputs its guess as 0.75 or 0.25 (instead of 1 or 0), because YOU (that is, P1) will be taking into account the possibility that it has failed (and Y had better output whatever you will want to guess after seeing it). Instead of Z, consider A as the third expert, which outputs 0.5 if Z and Y disagree, 9/10 if they agree on yes, and 1/10 if they agree on no (agreement gives posterior odds of (3/4)² : (1/4)² = 9 : 1 in favor of the agreed answer). X still tells you whether Z failed. Seeing Y and X, you always go with Y’s guess. Seeing A and Y, you always go with A’s guess. But if you see A = 9/10 and X = 1, you know both failed, and guess 0. (In fact, even when you see X = 0, you will guess 1 instead of 9/10.)
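The same enumeration style confirms the belief version, including that A’s outputs on agreement (9/10 and 1/10) are exactly your posteriors, so you never deviate from A given only A and Y. Again, this is my own sketch with my own names:

```python
from itertools import product

P_FAIL = 1 / 4  # both Y and Z now fail with probability 1/4, independently

atoms = []  # (probability, coin, x, y, z)
for coin, yf, zf in product((0, 1), (False, True), (False, True)):
    p = 0.5 * (P_FAIL if yf else 1 - P_FAIL) * (P_FAIL if zf else 1 - P_FAIL)
    y = (1 - coin) if yf else coin
    z = (1 - coin) if zf else coin
    x = 1 if zf else 0
    atoms.append((p, coin, x, y, z))

def a_output(y, z):
    """A's report: 0.5 on disagreement, else the posterior of the agreed answer."""
    if y != z:
        return 0.5
    return 9 / 10 if y == 1 else 1 / 10

def posterior(observed):
    """P(coin = 1 | observation) for every value the observation can take."""
    table = {}
    for p, coin, x, y, z in atoms:
        obs = observed(x, y, z)
        w = table.setdefault(obs, [0.0, 0.0])
        w[coin] += p
    return {obs: w[1] / (w[0] + w[1]) for obs, w in table.items()}

print(posterior(lambda x, y, z: (y, x)))               # always equals Y's report (0.75 or 0.25)
print(posterior(lambda x, y, z: (a_output(y, z), y)))  # always equals A's report
print(posterior(lambda x, y, z: (a_output(y, z), x)))  # (9/10, 1) -> 0.0, (9/10, 0) -> 1.0
```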
Ah, very interesting, thanks! I wonder if there is a different way to measure relative endorsement that could achieve transitivity.
My intuition says the natural thing would be to assume something about the experts not talking about each other (which probably means being independent, which sounds too strong). I feel like whenever they can talk about each other, an example like this will exist. But I’m not sure! Maybe you can have a relative endorsement definition that’s more like “modulo the other information I’m receiving about you from the environment, I treat the additional bits you’re giving me as the best information”.