Nitpick: I think you’re mixing up the variance and the second moment. The two agree when the mean is zero, but not in general. The theorem with F2 + 2F1G1 + G2 in it is about second moments, not variances. The corresponding theorem for variances just says that the variance of the sum equals the sum of the variances, assuming independence (and if you’re using the word “variance”, you ought to be talking about probability distributions rather than arbitrary functions). If you use central moments, in which case the second moment really is the same thing as the variance, then the theorem is true, but only in a rather silly way, because then F1 and G1 are necessarily zero.
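A quick way to see the distinction concretely: here is a minimal Python sketch (the two discrete distributions are just made-up illustrative values, not from the original post) checking that for independent X and Y the second moment of the sum is F2 + 2F1G1 + G2, while the variance of the sum is simply Var(X) + Var(Y).

```python
from itertools import product

# Two small discrete distributions with nonzero means: {value: probability}.
# These are arbitrary illustrative numbers.
X = {0: 0.2, 1: 0.5, 3: 0.3}
Y = {1: 0.6, 4: 0.4}

def moment(dist, k):
    """k-th raw moment E[V^k] of a discrete distribution."""
    return sum(p * v**k for v, p in dist.items())

def variance(dist):
    """Variance = second moment minus the square of the first moment."""
    return moment(dist, 2) - moment(dist, 1) ** 2

F1, F2 = moment(X, 1), moment(X, 2)
G1, G2 = moment(Y, 1), moment(Y, 2)

# Distribution of X + Y under independence (convolution of the two pmfs).
sum_dist = {}
for (x, px), (y, py) in product(X.items(), Y.items()):
    sum_dist[x + y] = sum_dist.get(x + y, 0.0) + px * py

# Second-moment identity: E[(X+Y)^2] = F2 + 2*F1*G1 + G2.
print(moment(sum_dist, 2), F2 + 2 * F1 * G1 + G2)   # 16.36  16.36

# Variance identity: Var(X+Y) = Var(X) + Var(Y), with no cross term,
# even though F1 and G1 are nonzero here.
print(variance(sum_dist), variance(X) + variance(Y))  # 3.4  3.4
```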
I definitely was thinking they were literally the same in every case! I corrected that part and learned something. Thanks!