Yes, there was supposed to be a 1/n in the sum, sorry!
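Written out (assuming the sum in question is the empirical average over sample points $x_1, \dots, x_n$ drawn from the distribution $\mu$, which is how I read it), the corrected statement would be

$$\frac{1}{n}\sum_{i=1}^{n} g(x_i) \;\longrightarrow\; \int g(x)\,\mathrm{d}\mu(x) \qquad \text{as } n \to \infty.$$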
Essentially, the g is taking the place of the interval probabilities. For example, if I take g to be the characteristic function of an interval (one on that interval and zero elsewhere), then the sum and the integral should both equal the probability of a point landing in that interval. One can then approximate any measurable function by linear combinations of characteristic functions (simple functions) to get the general equivalence.
In practice (for me) in Fourier analysis you prove this for a basis, such as integer powers of cosine on a closed interval, or simply integer powers on an open interval (these are the moments of a distribution).
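A quick numerical version of the characteristic-function case, assuming (purely as my choice of illustration) that the points are drawn from a standard normal distribution:

```python
# Check that the average of the characteristic function of [a, b] over the
# sample matches the probability of landing in [a, b].
import math
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(100_000)          # sample points x_1, ..., x_n

a, b = -1.0, 0.5                          # an arbitrary interval [a, b]
g = (x >= a) & (x <= b)                   # characteristic function of [a, b], evaluated at each x_i

empirical = g.mean()                      # (1/n) * sum_i g(x_i)
exact = 0.5 * (math.erf(b / math.sqrt(2)) - math.erf(a / math.sqrt(2)))  # P(a <= X <= b) for a standard normal

print(round(empirical, 4), round(exact, 4))  # agree to Monte Carlo accuracy (two or three decimal places here)
```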
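For what it's worth, here is what checking one basis element at a time looks like numerically, taking (again just an illustrative choice, not what you'd do in a proof) a uniform sample on the closed interval [-π, π] and g(x) = cos(x)^k:

```python
# Compare the sample average of cos(x)^k against its exact average over a full period.
import math
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(-math.pi, math.pi, 100_000)

for k in range(1, 6):
    empirical = np.mean(np.cos(x) ** k)                          # (1/n) * sum_i cos(x_i)^k
    exact = math.comb(k, k // 2) / 2**k if k % 2 == 0 else 0.0   # exact average of cos^k over a period
    print(k, round(empirical, 4), round(exact, 4))
```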
I’m not sure what an average sum of powers is; where do you do this in the formula you gave? Is it encapsulated in the function g?
Yes; after you add in the 1/n, hopefully the “average” part makes sense, and then just take g (for a single variable) to be x^k and vary over integers k. And as I mentioned above, yes, I believe it does reduce to just “count the events”; it's just that if you want to prove things, you need to count using a countable basis of the function space rather than looking at intervals.
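The x^k recipe in code, here for a uniform sample on the open interval (0, 1) (once more just my choice of distribution for illustration):

```python
# Compare sample moments (1/n) * sum_i x_i^k against the exact moments of Uniform(0, 1).
import numpy as np

rng = np.random.default_rng(2)
x = rng.uniform(0.0, 1.0, 100_000)

for k in range(1, 6):
    sample_moment = np.mean(x ** k)     # (1/n) * sum_i x_i^k
    exact_moment = 1 / (k + 1)          # k-th moment of Uniform(0, 1)
    print(k, round(sample_moment, 4), round(exact_moment, 4))
```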
It looks to me like we’ve bridged the gap between the approaches. We are doing the same thing, but the physics case is much more specific: We have a generating function in mind and just want to know its parameters, and we look only at the linear average; we don’t vary the powers (*). So we don’t use the tools you mentioned in the comment that started this thread, because they’re adapted to the much more general case. (I’ll put a small sketch of what I mean below the footnote.)
(*) Edit to add: Actually, on further thought, that’s not entirely true. There are cases where we take moments of distributions and whatnot; a friend of mine who was a PhD student at the same time as me worked on such an analysis. It’s just sufficiently rare (or maybe just rare in my experience!) that it didn’t come to my mind right away.
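A minimal sketch of the “specific function, just want its parameters” situation as I understand it, assuming (my made-up example, not any particular analysis) that the events follow an exponential decay with unknown lifetime tau, so the linear average alone pins down the parameter:

```python
# Estimate the lifetime of an assumed exponential decay from the sample mean.
import numpy as np

rng = np.random.default_rng(3)
true_tau = 2.2
events = rng.exponential(scale=true_tau, size=50_000)  # simulated decay times

tau_estimate = events.mean()  # for an exponential, the mean is the lifetime parameter itself
print(true_tau, round(tau_estimate, 3))
```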
Okay, so my hypothesis was essentially right: basically all of the things I care about get swept under the rug because you only care about what I would call the trivial cases.
And it definitely makes sense that if you’ve already restricted to a specific function and just want its parameters, you really don’t need to deal with higher moments.