First, it should be noted that human traits are usually lognormally distributed, with apparent normal distributions being an artifact. E.g. while IQ is normally distributed by construction, per item response theory it has an exponential relationship to the likelihood of success at difficult tasks. E.g. Most of What You Read on the Internet is Written by Insane People. Etc. So it's not really about normal vs lognormal distributions; it's about linear diffusion of lognormals vs exponential interaction[1] of normals[2].
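As a toy illustration of that distinction (mine, not from the post, with arbitrary sizes and scales): both generative stories below produce a long-tailed trait, so the marginal distribution alone can't tell you which one you're looking at.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50_000  # hypothetical population size

# Model A: "linear diffusion of lognormals" -- the trait is an additive sum of
# a few sparse, heavy-tailed (lognormal) contributions.
k_sparse = 5
trait_a = rng.lognormal(mean=0.0, sigma=2.0, size=(n, k_sparse)).sum(axis=1)

# Model B: "exponential interaction of normals" -- many small, short-tailed
# causes combine multiplicatively: the trait is exp of an additive normal score.
k_dense = 200
beta = rng.normal(scale=1.0 / np.sqrt(k_dense), size=k_dense)
x = rng.normal(size=(n, k_dense))
trait_b = np.exp(2.0 * (x @ beta))  # scale picked so the tails are comparable

for name, t in (("sum of sparse lognormals", trait_a), ("exp of a normal score", trait_b)):
    top_share = np.sort(t)[-n // 100:].sum() / t.sum()
    print(f"{name}: top 1% of people hold {top_share:.0%} of the total")
```

Both prints show a heavily concentrated top tail; the difference between the models is in the causal structure (a few big lognormal causes vs many small normal causes combined multiplicatively), not in the shape of the resulting distribution.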
There are several proposed solutions to the missing heritability problem. One proposal is rare variants: they aren't picked up by most sequencing technology, yet the rarer the variant, the larger its effect size can be, which makes rare variants end up as our "sparse lognormals".
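A minimal sketch of why rare variants behave like "sparse lognormals", under the assumed (and here entirely illustrative) parameterization that each variant contributes roughly equal variance, so effect sizes scale like 1/sqrt(2p(1-p)):

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 10_000, 1_000  # hypothetical counts: individuals, rare variants

# Assumed parameterization (not from the post): allele frequencies skewed
# toward rare, with effect sizes scaling like 1/sqrt(2p(1-p)), so rarer
# variants hit harder when they do occur.
p = 10 ** rng.uniform(-4, -2, size=m)
beta = rng.choice([-1.0, 1.0], size=m) / np.sqrt(2 * p * (1 - p))

genotype = rng.binomial(2, p, size=(n, m))   # allele counts, almost all zeros
contrib = np.abs(genotype * beta)            # per-variant contributions

carried = (genotype > 0).sum(axis=1)
dominance = contrib.max(axis=1) / np.maximum(contrib.sum(axis=1), 1e-12)
print(f"rare variants carried per person: {carried.mean():.1f} on average")
print(f"median share of a person's burden from their single largest variant: {np.median(dominance):.0%}")
```

Each person carries only a handful of rare variants, and the ones they do carry have large, widely varying effects: a sparse sum of heavy-tailed contributions.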
But let's say rare variants have negligible effect sizes, so they don't give us linear diffusion of lognormals, and the long-tailedness of human traits is instead due to some sort of exponential interaction.
Then another thing that could give us missing heritability is if the apparent traits aren't actually the true genetic traits; rather, the true genetic traits trigger some dynamics, with e.g. the largest dynamics dominating, and (the logarithm of) those dynamics is what we end up measuring as the trait. But that's just linear diffusion of sparse lognormals at a phenotypic level of analysis.
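A minimal sketch of the kind of construction I read this as describing (my toy version, with made-up numbers): several additive genetic scores each seed an exponentially growing dynamic, the largest dynamic dominates the sum, and the log of the total is what gets measured.

```python
import numpy as np

rng = np.random.default_rng(2)
n, k = 100_000, 10  # hypothetical counts: individuals, independent "dynamics"

# Assumed toy construction: each of k underlying genetic scores is additive
# and normal...
g = rng.normal(size=(n, k))

# ...each score seeds a dynamic that grows exponentially in that score, the
# dynamics add up, and the largest one dominates the sum...
dynamics = np.exp(3.0 * g)
total = dynamics.sum(axis=1)

# ...and what we measure as the "trait" is the log of the total dynamic,
# which is roughly the max of the underlying scores (a log-sum-exp).
measured = np.log(total)
print("corr(measured trait, 3 * max of underlying scores):",
      np.corrcoef(measured, 3.0 * g.max(axis=1))[0, 1])
```

The measured trait is the log of a sum of a few lognormal dynamics, dominated by the largest one, i.e. linear diffusion of sparse lognormals sitting on top of normally distributed genetic scores.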
[1] As in $\exp\left(\sum_i \beta_i x_i\right)$.
[2] Or, well, short-tailed variables; e.g. alleles are usually modelled as Bernoulli.