Why would you suspect this is true? This sounds like one of those feel-good ideas that is morally satisfying but could just as easily be false.
When people do something, they tend to become better at that thing by picking up tricks relevant to it. If the thing they are doing is learning lots of random things, presumably some of the tricks they pick up would be tricks for learning lots of random things.
How big of an effect are we talking?
I don’t know. I’ve talked with some people who are interested in intelligence research about how to measure learning ability. It would essentially require measuring people’s ability to do lots of things, then teaching them those things, then measuring their ability on those things again, and looking at something like the difference in ability. The trouble is that such measurements are simultaneously very expensive (having to teach people things makes them orders of magnitude more expensive than ordinary psychometrics) and still too noisy at realistic scales to be useful.
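To make the noise problem concrete, here’s a toy simulation of the teach-then-retest design. This is my own sketch, not something from the research; the sample size, test noise, and hypothesized effect size are all illustrative assumptions.

```python
# Toy model of a teach-then-retest study of learning ability.
# All numbers (n, noise_sd, effect) are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

n = 200          # subjects per group; already expensive if each must be taught
noise_sd = 1.0   # measurement noise of each test, in SDs of true ability
effect = 0.1     # hypothesized group difference in learning ability

def measured_gain(true_learning):
    """Pre-test, teach, post-test; return the noisy gain score."""
    pre = rng.normal(0.0, noise_sd, true_learning.shape)
    post = true_learning + rng.normal(0.0, noise_sd, true_learning.shape)
    return post - pre  # the gain score inherits noise from *both* tests

control = measured_gain(rng.normal(0.0, 1.0, n))
treated = measured_gain(rng.normal(effect, 1.0, n))

diff = treated.mean() - control.mean()
se = np.sqrt(treated.var(ddof=1) / n + control.var(ddof=1) / n)
print(f"observed difference: {diff:.3f}, standard error: {se:.3f}")
```

With these numbers the standard error comes out around 0.17, larger than the 0.1 effect we were trying to detect, so a study of this size mostly returns noise; pushing the standard error well below the effect would take thousands of taught-and-retested subjects.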
So measuring learning ability would be difficult. And even if we found out how to do that, we would still need some sort of randomized trial or natural experiment to test school’s effect on learning ability.
The price is 12 high-quality years, so even a 10% improvement in ability to learn wouldn’t come close to justifying the cost. Also, your neuroplasticity will probably drop by more than that over the course of those 12 years, so the net effect would be to take 12 years and leave you with a reduced ability to learn.
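To spell out that arithmetic with made-up numbers (the 50 remaining learning-years and the flat 10% boost are assumptions purely for illustration):

```python
# Back-of-envelope version of the cost argument above.
# Every number here is an illustrative assumption.
school_years = 12      # years spent in school
remaining_years = 50   # assumed productive learning years afterwards
boost = 1.10           # assumed 10% improvement in learning ability

without_school = school_years + remaining_years  # 62 learning-years
with_school = boost * remaining_years            # 55 effective learning-years
print(with_school, without_school)  # 55.0 vs 62
```

Under these assumptions the boosted 50 years are worth 55 plain learning-years, which doesn’t recover the 62 you’d have had otherwise, and that’s before counting any age-related decline in learning ability.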
Maybe. This assumes ability to learn when younger is as valuable as ability to learn when older, which might not be true, because you have much more information about what you need to learn when you are older. For instance, at my job I had to learn KQL, but KQL did not exist when I was a child, so in order to teach it to me as a child, we would have to be able to accurately forecast the invention of KQL, which seems impossible.
If “getting taught a bundle of random things” is valuable, is it more valuable than doing whatever you would do by default? Even the most wasteful activities you would realistically do—watching TV, playing videogames, surfing the net, talking to friends—all have some benefits. All of them would improve literacy, numeracy, and your knowledge of the world, and all of them would require you to learn a bundle of random things, which (following your suggestion) may be valuable in itself.
I suspect it depends on the person.
The sort of person who watches science documentaries on TV, who builds redstone computers in Minecraft, who reads LessWrong and scientific papers when surfing the web, and who talks with friends about topics like the theoretical arguments for and against school would probably have a much more intellectually stimulating environment outside of school than within it.
But such people are extremely rare, so to a good approximation we can say they don’t exist. I’m less sure about how it would work out for the median person, who spends their time on other stuff. I suspect they might tend to learn things that are less intellectually varied, specializing deeply in keeping track of social relations, doing exciting things, or similar? But I don’t know very much about the median person.