(midco developed this separately from our project last term, so this is actually my first read)
I have a lot of small questions.
What is your formal definition of the IEU $u_i$? What kinds of goals is it conditioning on (because IEU is what you compute after you view your type in a Bayesian game)?
Multi-agent “impact” seems like it should connect to the Shapley value. Do you have opinions on how this should fit in?
You note that your formalism has some EDT-like properties with respect to impact:
Well, in a sense, they do. The universes where player $i$ shouts “heads” are exactly the universes in which everyone wins. The problem is that of agency: player $i$ doesn’t choose their action, the coin ($\omega$) does. If we condition on the value of $\omega$, then each player’s action becomes deterministic, thus IEU is constant across each player’s (trivial) action space.
This seems weird and not entailed by the definition of IEU, so I’m pretty surprised that IEU would tell you to shout ‘heads.’
Given arbitrary R.V.s $A$, $B$, we define the estimate of $A$ given $B=b$ as

$$e(A, B) := \mathbb{E}_{B=b}[A]$$
Is this supposed to be $e(A, B=b)$? If so, this is more traditionally called the conditional expectation of $A$ given $B=b$.
Answering questions one-by-one:
I played fast and loose with IEU in the intro section. I think it can be consistently defined in the Bayesian game sense of “expected utility given your type”, where the games in the intro section are interpreted as each player having constant type. In the Bayesian Network section, this is explicitly the definition (in particular, player $i$’s IEU varies as a function of their type).
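Concretely, here’s a rough sketch of that reading in my own notation (assume a fixed strategy profile $\sigma = (\sigma_1, \dots, \sigma_n)$, types $T_j$, and state $\omega$; the exact arguments of $u_i$ may not match the post):

$$\mathrm{IEU}_i(t_i) := \mathbb{E}\big[\, u_i\big(\sigma_1(T_1), \dots, \sigma_n(T_n), \omega\big) \,\big|\, T_i = t_i \,\big]$$

With constant types the conditioning is vacuous and this collapses to ordinary expected utility, which is how the intro-section games are recovered.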
Upon reading the Wiki page, it seems like the Shapley value and Impact have a lot of properties in common? I’m not sure of any exact relationship, but I’ll look into connections in the future.
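For reference, the standard Shapley value of player $i$ in a coalitional game $(N, v)$ is

$$\phi_i(v) = \sum_{S \subseteq N \setminus \{i\}} \frac{|S|!\,\big(|N|-|S|-1\big)!}{|N|!}\,\Big( v\big(S \cup \{i\}\big) - v(S) \Big),$$

so the concrete question would be what characteristic function $v$ (if any) the Impact formalism induces; I don’t have a candidate yet.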
I think what’s going on is that the “causal order” of $\omega$ and $a_i$ is switched, which makes $a_i$ “look as though” it controls the value of $\omega$. In terms of game theory the distinction is (I think) definitional; I include it because Impact has to explicitly consider this dynamic.
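Here’s a tiny simulation of how I’m picturing the coin game; the payoff details are my own assumptions (fair coin, every shout forced to equal the coin, everyone wins exactly when it lands heads), so treat it as a sketch rather than the post’s setup. It shows the “EDT-like” number you get by conditioning on the shout versus conditioning on $\omega$ first:

```python
import random

# Toy sketch of the coin game as I'm picturing it (assumptions: fair coin, every
# player's shout is forced to equal the coin, and everyone gets utility 1 exactly
# when the coin lands heads).
rng = random.Random(0)
worlds = []
for _ in range(10_000):
    omega = rng.choice(["heads", "tails"])  # the coin decides everything
    shout = omega                           # player i's "action" just echoes omega
    utility = 1.0 if omega == "heads" else 0.0
    worlds.append((omega, shout, utility))

def cond_mean(pred):
    """Estimate E[utility | pred] from the samples."""
    selected = [u for (omega, shout, u) in worlds if pred(omega, shout)]
    return sum(selected) / len(selected)

# "EDT-like" reading: conditioning on "player i shouts heads" makes expected
# utility look like 1, even though the shout is downstream of the coin.
print(cond_mean(lambda omega, shout: shout == "heads"))   # ~1.0

# Conditioning on omega first: the shout is now deterministic, so it carries no
# extra information and IEU is constant across the (trivial) action space.
print(cond_mean(lambda omega, shout: omega == "heads"))   # ~1.0
print(cond_mean(lambda omega, shout: omega == "tails"))   # ~0.0
```

The first number looks like the shout “controls” the coin, but it’s just that the shout is perfectly informative about $\omega$; once $\omega$ is fixed, varying the (trivial) action changes nothing.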
In retrospect: yep, that’s conditional expectation! My fault for the unnecessary notation. I introduced it to capture the idea of a vector space projection on random variables and didn’t see the connection to pre-existing notation.
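For what it’s worth, the projection intuition carries over to the standard notation: for square-integrable $A$, the conditional expectation $\mathbb{E}[A \mid B]$ is exactly the orthogonal projection of $A$ onto the subspace of $\sigma(B)$-measurable random variables in $L^2$,

$$\mathbb{E}[A \mid B] = \operatorname*{arg\,min}_{Z \in L^2(\sigma(B))} \mathbb{E}\big[(A - Z)^2\big],$$

so nothing is lost by switching notation.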