Found an example in the wild with mutual information! These equivalent definitions of mutual information undergo concept splintering as you go beyond just 2 variables:
I[X;Y]=H[X]+H[Y]−H[X,Y]
interpretation: common information
… become co-information, the central atom of your I-diagram
I[X;Y]=D(Pr(x,y)∥Pr(x)Pr(y))
interpretation: relative entropy between the joint and the product of marginals
… become total-correlation
I[X;Y]=H[X,Y]−H[X∣Y]−H[Y∣X]
interpretation: joint entropy minus all unshared info
… become bound information (a.k.a. dual total correlation)
… each with different properties (e.g. co-information is a bit too sensitive: if any one variable is independent of the rest, the whole thing collapses to 0; total-correlation seems to overcount shared information a bit; etc.) and so with different uses (e.g. bound information is interesting for time-series).
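A quick numeric sketch of the splintering (my own illustration, not from the comment above): the three multivariate generalizations all reduce to I[X;Y] for two variables, but diverge already for three copies of the same fair bit. The function names are mine; the formulas are the standard ones for co-information (inclusion–exclusion over entropies), total correlation (sum of marginal entropies minus joint entropy), and dual total correlation (joint entropy minus the conditional entropies H[Xᵢ|rest]).

```python
import itertools
import math

def entropy(p):
    """Shannon entropy (bits) of a dict mapping outcomes -> probabilities."""
    return -sum(q * math.log2(q) for q in p.values() if q > 0)

def marginal(joint, axes):
    """Marginalize a joint dict {outcome tuple: prob} onto the given axes."""
    out = {}
    for outcome, q in joint.items():
        key = tuple(outcome[a] for a in axes)
        out[key] = out.get(key, 0.0) + q
    return out

def co_information(joint, n):
    # Inclusion-exclusion over all non-empty subsets of the n variables.
    return sum(
        (-1) ** (r + 1) * entropy(marginal(joint, subset))
        for r in range(1, n + 1)
        for subset in itertools.combinations(range(n), r)
    )

def total_correlation(joint, n):
    # Sum of single-variable entropies minus the joint entropy.
    return sum(entropy(marginal(joint, (i,))) for i in range(n)) - entropy(joint)

def dual_total_correlation(joint, n):
    # Joint entropy minus sum of conditional entropies H[X_i | rest].
    rest = lambda i: tuple(j for j in range(n) if j != i)
    H = entropy(joint)
    return H - sum(H - entropy(marginal(joint, rest(i))) for i in range(n))

# Three perfectly correlated fair bits: X = Y = Z.
copy = {(b, b, b): 0.5 for b in (0, 1)}
print(co_information(copy, 3))          # 1.0
print(total_correlation(copy, 3))       # 2.0 -- counts the one shared bit twice
print(dual_total_correlation(copy, 3))  # 1.0

# Sanity check: for two variables all three reduce to I[X;Y] = 1 bit here.
pair = marginal(copy, (0, 1))
print(co_information(pair, 2), total_correlation(pair, 2),
      dual_total_correlation(pair, 2))  # 1.0 1.0 1.0
```

The redundant-bits example makes the "total-correlation overcounts" intuition concrete: there is only one bit of shared information, but total correlation reports 2.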
Wow, I missed this comment! This is a fantastic example, thank you!
I have been meaning to write the concept splintering megapost—your comment might push me to finish it before the Rapture :D