If the readership of LessWrong has gone up similarly in that time, then I would not expect to see an improvement, even if everyone who reads LessWrong improves.
Yes, I was thinking that. Suppose it takes a certain fixed amount of time for any LessWronger to learn the local official truth. Then if the population grows exponentially, you’d expect the fraction that knows the local official truth to remain constant, right? But I’m not sure the population has been growing exponentially, and even so you might have expected the local official truth to become more accurate over time, and you might have expected the community to get better over time at imparting the local official truth.
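To make that step explicit, here's a minimal derivation under two idealizations of mine (exponential growth, a fixed learning lag T, nothing from the survey itself): if the population is N(t) = N_0 e^{rt} and every member learns the local official truth exactly T after joining, then the informed fraction at time t is

```latex
% Informed members at time t = everyone who joined by t - T
% (assuming nobody leaves), i.e. N(t - T). The informed fraction is then
\[
  \frac{N(t-T)}{N(t)}
  = \frac{N_0\,e^{r(t-T)}}{N_0\,e^{rt}}
  = e^{-rT},
\]
% which does not depend on t: under exponential growth with a fixed
% learning lag, the informed fraction stays constant, as claimed.
```

(If growth is faster than exponential the fraction falls over time; if it's slower, the fraction rises.)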
Regardless of what we should have expected, my impression is that LessWrong as a whole tends to assume it's getting closer to the truth over time. If an influx of newcomers is preventing that from happening, it's worth worrying about.
Note that it is possible for newcomers to hold the same inaccurate beliefs as their predecessors while the core improves its knowledge or expands in size. In fact, as LW grows it will have to recruit from, say, Hacker News (where I first heard of LW) rather than Singularity mailing lists, producing newcomers less in tune with the local truth.
(Unnamed’s comment shows interesting differences in opinion between a “core” and the rest, but (s)he seems to have skipped the only question with an easily-verified answer, i.e. Newton.)
The calibration question was more complicated to analyze, but now I've looked at it, and it seems that core members were slightly more accurate at estimating the correct year (p=.05 for the size of the error, p=.12 for whether the answer fell within the 20-year range), though there was no difference in calibration.
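For concreteness, here is a minimal sketch of the kind of comparison described above, not the actual analysis: the grouping, the assumption that the question targets the 1687 publication of Newton's Principia, and the reading of "within the 20-year range" as within 20 years of the true date are all mine.

```python
import numpy as np
from scipy import stats

TRUE_YEAR = 1687  # assumed target: publication of Newton's Principia

def compare_groups(core_guesses, rest_guesses):
    """Compare year-estimate accuracy between core members and the rest,
    mirroring the two tests described above (size of the error, and hit
    rate within the 20-year range)."""
    core_err = np.abs(np.asarray(core_guesses) - TRUE_YEAR)
    rest_err = np.abs(np.asarray(rest_guesses) - TRUE_YEAR)

    # Size of the error: Welch's t-test on the absolute errors.
    _, p_size = stats.ttest_ind(core_err, rest_err, equal_var=False)

    # Within the 20-year range (assumed to mean +/- 20 years):
    # compare hit rates between the groups with Fisher's exact test.
    table = [[(core_err <= 20).sum(), (core_err > 20).sum()],
             [(rest_err <= 20).sum(), (rest_err > 20).sum()]]
    _, p_window = stats.fisher_exact(table)

    return p_size, p_window
```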
(“He”, btw.)
Couldn’t the current or future data be correlated with length of readership to determine this?
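(A minimal sketch of that check, again with made-up inputs: correlate the size of each respondent's error with how long they've been reading; a significant negative rank correlation would suggest longer-time readers are more accurate.)

```python
import numpy as np
from scipy import stats

def accuracy_vs_tenure(months_reading, year_guesses, true_year=1687):
    """Spearman rank correlation between length of readership and the
    size of the error in the year estimate."""
    err = np.abs(np.asarray(year_guesses) - true_year)
    rho, p = stats.spearmanr(months_reading, err)
    return rho, p  # rho < 0 would mean longer readership, smaller error
```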