The central limit theorem is a piece of pure math that mixes together two things that should be pretty separate. One is what the distribution looks like in the center and the other is what the distribution looks like at the tails. Both aspects of the distribution converge to the Gaussian, but if you want to measure how fast they converge, you should probably choose one or the other and define your metric (and your graphs) based on that target.
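One way to make the center/tail split concrete (a rough sketch; the grid step, n = 30, and the cutoffs |z| < 1 and z > 3 are arbitrary illustrative choices, not anything from the thread): convolve a discretized Exponential(1) with itself and measure the center and the tail against the matching Gaussian separately.

```python
import numpy as np
from scipy.stats import norm

# Convolve a discretized Exponential(1) pmf with itself n times, then
# compare to the Gaussian with matching mean and variance using two
# separate metrics: an L1 error over the center (|z| < 1) and a mass
# ratio in the right tail (z > 3).
dx = 0.05
x = np.arange(0, 40, dx)
p = np.exp(-x) * dx
p /= p.sum()                      # pmf of a discretized Exponential(1)

n = 30
pn = p.copy()
for _ in range(n - 1):            # n-fold convolution
    pn = np.convolve(pn, p)

grid = np.arange(len(pn)) * dx
mu = (grid * pn).sum()
var = ((grid - mu) ** 2 * pn).sum()
z = (grid - mu) / np.sqrt(var)            # standardize the sum
g = norm.pdf(z) * dx / np.sqrt(var)       # matching Gaussian pmf on the grid

center = np.abs(z) < 1
tail = z > 3
center_err = np.abs(pn[center] - g[center]).sum()  # L1 error in the center
tail_ratio = pn[tail].sum() / g[tail].sum()        # tail mass ratio
print(center_err, tail_ratio)
```

With these settings the center error is already small while the right tail still carries noticeably more mass than the Gaussian's, which is exactly the kind of split the comment is arguing you should measure separately.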
Skewness is a red herring. You should care about both tails separately and they don’t cancel out.
This is an interesting view. I like the idea of thinking of center and tails separately. At the same time: If the center and tails were best thought of as separate, isn’t it interesting that both go to Gaussian? Are there cases you know of (probably with operations that are not convolution?) where the tails go to Gaussian but the center does not, or vice-versa?
Skew doesn’t capture everything, but it really seems like it captures something, doesn’t it? I’d be interested to see a collection of distributions for which the relationship between skew and Gaussianness-after-n-convolutions does not hold, if you know of one!
If the center goes to a unique limit and the tails go to a unique limit, then the two unique limits must be the same. The CLT applies to anything with finite variance, so if something has Gaussian tails, it just is the Gaussian. But there are infinite-variance examples with fat tails that have limits. What would it mean to ask if the center is Gaussian? Where does the center end and the tails begin? So I don’t think it makes sense to ask whether the center alone is Gaussian. But there are coarser statements you can consider: Is the limit continuous? How fast does a discrete distribution converge to a continuous one? Then you can look at the first and second derivatives of the pdf. This is like an infinitesimal version of variance.
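The infinite-variance case mentioned above can be seen directly with the Cauchy distribution (a quick simulation; the sample sizes and the |x| > 10 cutoff are arbitrary choices for illustration):

```python
import numpy as np
from scipy.stats import cauchy

# The standard Cauchy has infinite variance, and an average of Cauchy
# draws is again standard Cauchy (it is a stable law), so sample means
# never converge to a Gaussian: the fat tails survive averaging.
rng = np.random.default_rng(1)
samples = cauchy.rvs(size=(10_000, 100), random_state=rng)
means = samples.mean(axis=1)            # average 100 Cauchy draws each
tail_frac = np.mean(np.abs(means) > 10)
# For a standard Cauchy, P(|X| > 10) is about 0.063; the averages keep
# roughly that same tail mass instead of concentrating like a Gaussian.
print(tail_frac)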
The central limit theorem is a beautiful theorem that captures something relating multiple phenomena. It is valuable to study that relationship, even if you should ultimately disentangle them. In contrast, skew is an arbitrary formula that mixes things together as a kludge with rough edges. It is adequate for dealing with one tail, but a serious mistake for two. It is easy to give examples of a distribution with zero skew that isn’t gaussian: any symmetric distribution.
The central limit theorem is a piece of pure math that mixes together two things that should be pretty separate. One is what the distribution looks like in the center and the other is what the distribution looks like at the tails. Both aspects of the distribution converge to the Gaussian, but if you want to measure how fast they converge, you should probably choose one or the other and define your metric (and your graphs) based on that target.
Skewness is a red herring. You should care about both tails separately and they don’t cancel out.
This is an interesting view. I like the idea of thinking of center and tails separately. At the same time: If the center and tails were best thought of as separate, isn’t it interesting that both go to Gaussian? Are there cases you know of (probably with operations that are not convolution?) where the tails go to Gaussian but the center does not, or vice-versa?
Skew doesn’t capture everything, but it really seems like it captures something, doesn’t it? I’d be interested to see a collection of distributions for which the relationship between skew and Gaussianness-after-n-convolutions relationship does not hold, if you know of one!
If the center goes to a unique limit and the tails go to a unique limit, then the two unique limits must be the same. CLT applies to anything with finite variance. So if something has gaussian tails, it just is the gaussian. But there are infinite variance examples with fat tails that have limits. What would it mean to ask if the center is gaussian? Where does the center end and the tails begin? So I don’t think it makes sense to ask if the center is gaussian. But there are coarser statements you can consider. Is the limit continuous? How fast does a discrete distribution converge to a continuous one? Then you can look at the first and second derivative of the pdf. This is like an infinitesimal version of variance.
The central limit theorem is a beautiful theorem that captures something relating multiple phenomena. It is valuable to study that relationship, even if you should ultimately disentangle them. In contrast, skew is an arbitrary formula that mixes things together as a kludge with rough edges. It is adequate for dealing with one tail, but a serious mistake for two. It is easy to give examples of a distribution with zero skew that isn’t gaussian: any symmetric distribution.