IQ (intelligence quotient) was originally defined as mental age (as determined by cognitive test scores) divided by chronological age, multiplied by 100. A 6-year-old with the test scores of the average 9-year-old thus has an IQ of 150 under this ratio-IQ definition.
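Written as a formula, with that example plugged in:

$$\mathrm{IQ}_{\text{ratio}} \;=\; \frac{\text{mental age}}{\text{chronological age}} \times 100, \qquad \frac{9}{6} \times 100 = 150.$$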
People then found that IQ scores roughly follow a normal distribution, and subsequent tests defined IQ in terms of standard deviations from the mean (usually 15 points per standard deviation). This deviation-IQ definition is more convenient for evaluating adults, since raw test scores plateau at some point in adulthood (I've seen some tests norm up to age 21). However, once you get far enough from the mean that no norming sample could plausibly contain anyone at that level, the deviation definition loses meaning, and it makes sense to return to the ratio-IQ definition.
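To see why the deviation definition breaks down out there, here's a minimal sketch (assuming the common mean-100, SD-15 scale; the example IQs are illustrative):

```python
from scipy.stats import norm

MEAN, SD = 100, 15  # the most common deviation-IQ scale (some tests use SD 16)

def tail_probability(iq):
    """Fraction of a normal population expected to score at or above `iq`."""
    return norm.sf((iq - MEAN) / SD)  # survival function: P(Z >= z)

for iq in (145, 200, 300):
    z = (iq - MEAN) / SD
    print(f"IQ {iq}: {z:+.2f} SD above the mean, "
          f"expected rarity ~ 1 in {1 / tail_probability(iq):.3g}")
```

On that scale, a deviation IQ of 200 is already about 6.7 standard deviations out, an expected rarity of roughly one in tens of billions, i.e. rarer than the entire human population; a deviation IQ of 300 would be over 13 standard deviations out. No norming sample can anchor scores like that.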
So an IQ 300 human would theoretically, at age 6, have the cognitive test scores of the average 18-year-old. How would we predict what happens in later years? I guess we could compare them to IQ 200 humans (of which we have a few), so that the IQ 300 12-year-old would be like the IQ 200 18-year-old: both correspond to a ratio mental age of 36. But when they reached 18, we wouldn't have anything to compare them against.
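To make the extrapolation concrete, here's a minimal sketch of the ratio model; `mental_age` is just the ratio formula solved for mental age, not anything from an actual test battery:

```python
def mental_age(ratio_iq, chronological_age):
    """Invert the ratio-IQ formula: implied mental age at a given chronological age."""
    return ratio_iq * chronological_age / 100

# At age 6, ratio IQ 300 implies the test scores of an average 18-year-old.
assert mental_age(300, 6) == 18

# The comparison above: both profiles imply a mental age of 36.
assert mental_age(300, 12) == mental_age(200, 18) == 36

# At age 18 the implied mental age is 54 -- but adult test scores plateau,
# so there is no real population of "mental-age-54" scorers to compare against.
print(mental_age(300, 18))  # 54.0
```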
I think that’s the most you can extract from the underlying model.