Shapley values are very cool. Let me mention some cool facts:
They arise in (cooperative) game theory, but also in ML when doing credit allocation for a combined prediction formed by mixing the predictions of different modules of a system.
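For concreteness, here is a minimal Python sketch of exact Shapley attribution by subset enumeration. The value function and the three-module accuracy table are hypothetical stand-ins, and exact enumeration is exponential, so this only scales to a handful of players:

```python
from itertools import combinations
from math import factorial

def shapley_values(players, v):
    """Exact Shapley values: for each player, the weighted average of their
    marginal contribution over all coalitions of the remaining players."""
    n = len(players)
    values = {}
    for p in players:
        others = [q for q in players if q != p]
        total = 0.0
        for k in range(n):
            for coalition in combinations(others, k):
                s = frozenset(coalition)
                # Standard Shapley weight: |S|! (n - |S| - 1)! / n!
                weight = factorial(k) * factorial(n - k - 1) / factorial(n)
                total += weight * (v(s | {p}) - v(s))
        values[p] = total
    return values

# Hypothetical example: three modules whose joint "accuracy" (the value of
# a coalition) is the made-up table below.
accuracy = {
    frozenset(): 0.0,
    frozenset({"a"}): 0.6, frozenset({"b"}): 0.5, frozenset({"c"}): 0.4,
    frozenset({"a", "b"}): 0.8, frozenset({"a", "c"}): 0.7,
    frozenset({"b", "c"}): 0.65,
    frozenset({"a", "b", "c"}): 0.9,
}
print(shapley_values(["a", "b", "c"], lambda s: accuracy[frozenset(s)]))
# The values sum to v(grand coalition) - v(empty) = 0.9 (efficiency axiom).
```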
One piece of evidence of how fundamental they are is that they arise naturally from Hodge theory on the hypercube of a coalition game: https://arxiv.org/abs/1709.08318
Another interesting fact I learned from Davidad: Shapley values are not compositional: a group of actors can increase their total Shapley value by forming a single cabal whose members refuse to cooperate with anyone outside the cabal unless the rest of the cabal joins in. This can be used as a measure of collusion potential (see the toy example below).
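To make the cabal effect concrete, here is a toy illustration reusing `shapley_values` from the sketch above, built on a standard glove game: players a and b each hold a left glove, c holds a right glove, and a coalition's value is the number of matched pairs. The cabal transformation below is my own formalization of "refuse to cooperate unless the whole cabal joins": a coalition containing some but not all cabal members is valued as if the cabal members weren't there. Forming the cabal {a, b} raises their combined Shapley value from 1/3 to 2/3:

```python
def glove(s):
    """Glove game: value = number of left/right glove pairs in coalition s."""
    left = len(s & {"a", "b"})   # a and b each hold a left glove
    right = len(s & {"c"})       # c holds a right glove
    return float(min(left, right))

def with_cabal(v, cabal):
    """Transform a game so cabal members contribute nothing unless the
    entire cabal is present in the coalition."""
    def v_cabal(s):
        if cabal <= s or not (cabal & s):
            return v(s)
        return v(s - cabal)  # partial cabal: value as if its members left
    return v_cabal

players = ["a", "b", "c"]
print(shapley_values(players, glove))
# -> roughly {'a': 1/6, 'b': 1/6, 'c': 2/3}; cabal {a, b} totals 1/3
print(shapley_values(players, with_cabal(glove, frozenset({"a", "b"}))))
# -> {'a': 1/3, 'b': 1/3, 'c': 1/3}; cabal {a, b} now totals 2/3
```

The intuition: in the original game a and b are interchangeable competitors for c's single right glove, which drives their individual values down; acting as a bloc removes that competition.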