I wonder if it would be reasonable to use “xentropy” for the broad sense of “entropy” in OP, with the understanding that xentropy is always a two-argument function.
“The length of a codeword is the xentropy between [the delta distribution located at] the state and [the coinflip distribution implied by] the code”
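To make that concrete, here's a small sketch (my own illustration, not from OP): a prefix-free code implies a "coinflip" distribution q(x) = 2^(−length of x's codeword), and the xentropy between the delta distribution at a state and that q recovers the codeword length.

```python
import math

def xentropy(p, q):
    """Cross-entropy H(p, q) = -sum_x p(x) * log2 q(x), in bits."""
    return -sum(px * math.log2(q[x]) for x, px in p.items() if px > 0)

# A prefix-free code with codeword lengths 1, 2, 2 implies the
# coinflip distribution q(x) = 2^(-len(codeword(x))).
code = {"a": "0", "b": "10", "c": "11"}
q = {x: 2.0 ** -len(w) for x, w in code.items()}

# The delta distribution located at the state "b".
delta_b = {"a": 0.0, "b": 1.0, "c": 0.0}

# Xentropy between the delta at "b" and the code's distribution
# equals the length of "b"'s codeword: -log2(1/4) = 2 bits.
print(xentropy(delta_b, q))  # → 2.0
```

And when the first argument is a full distribution rather than a delta, the same two-argument function gives the expected codeword length, which is the usual cross-entropy reading.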