You don’t need any correlation between X and Y to have E[XY] ≠ E[X]E[Y]. Suppose both variables are 1 with probability .5 and 2 with probability .5; then their mean is 1.5, but the mean of their products is 2.25.
Indeed, each has a mean of 1.5, so the product of their means is 1.5 × 1.5 = 2.25, which equals the mean of their product; we do in fact have E[XY] = E[X]E[Y] in this case. More generally, E[XY] = E[X]E[Y] holds iff X and Y are uncorrelated, because that’s just how “uncorrelated” in the technical sense is defined. If you really want to get into fundamentals, E[XY] − E[X]E[Y] isn’t the most fundamental definition of covariance (the usual one is Cov(X, Y) = E[(X − E[X])(Y − E[Y])]), but the two are easily seen to be equivalent. Either way, you then have to show that independent implies uncorrelated. (And then I guess you have to do the analogues for more than two variables, but...)
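As a quick numerical sanity check of the example above, here is a minimal Python sketch, assuming X and Y are independent and each takes the value 1 or 2 with probability .5:

```python
from itertools import product

# Two-point distribution from the example: each variable is 1 or 2
# with probability 0.5, and X and Y are assumed independent.
values = [1, 2]
probs = {1: 0.5, 2: 0.5}

# E[X] = E[Y]: both come out to 1.5
mean = sum(v * probs[v] for v in values)

# E[XY] over the joint distribution
# (independence means the joint probability is the product of the marginals)
mean_of_product = sum(x * y * probs[x] * probs[y]
                      for x, y in product(values, values))

# Covariance via the E[XY] - E[X]E[Y] formula
cov = mean_of_product - mean * mean

print(mean)             # 1.5
print(mean_of_product)  # 2.25, which equals 1.5 * 1.5
print(cov)              # 0.0: independent, hence uncorrelated
```

The covariance comes out to exactly 0, consistent with the point that independence implies uncorrelatedness.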
Gah, of course you’re correct. I can’t imagine how I got so confused but thank you for the correction.