So the usual story about Easter Island is that construction and transportation of the statues used up all their palm trees, and the island’s ecosystem depended on the palm trees, so they starved to death or something. (I think there’s some doubt about whether that’s actually right, but it’s a plausible enough story and a useful analogy even if it turns out not to be literally true.)
What resource are universities in danger of consuming all of?
Money? US spending on universities seems to be on the order of a couple of percent of GDP.
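For a rough sense of scale (this is purely my own back-of-envelope check, and both figures are assumptions of roughly the right magnitude rather than numbers from the discussion), a quick calculation lands in the low single digits of GDP:

```python
# Back-of-envelope check of the "couple of percent of GDP" claim.
# Both figures below are rough assumptions, used for illustration only.
us_gdp = 25e12                # assumed US GDP, ~$25 trillion
higher_ed_spending = 0.7e12   # assumed total US higher-education spending, ~$700 billion

share = higher_ed_spending / us_gdp
print(f"Universities consume roughly {share:.1%} of GDP")  # ~2.8%
```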
Researcher time? Even supposing that university research is worthless, there’s a lot of research being done by corporate R&D departments, and OP here gives some examples to suggest that some of it’s pretty good. (And, for what it’s worth, I don’t find it plausible that university research is worthless, though no doubt some of it is.)
Students’ youth? Even supposing that time spent at university is worthless, it’s only a few years per person. (And, for what it’s worth, I don’t find it plausible that time spent at university is worthless. I know that at university I both had fun and learned things I am still glad to know; maybe I was exceptionally lucky but I don’t know of any good reason to think so.)
Students’ youth? Even supposing that time spent at university is worthless, it’s only a few years per person.
Is that period not important, though?
As in, even assuming universities are not a "magical tower that removes 4 years of life to validate IQ > 100 and conscientiousness in the 80th percentile", you could hardly argue that what they teach is perfect.
But those “few years” are basically the most critical years of development we have, as in, the brain is developed enough to actually do stuff yet still plastic.
I won’t go into myelination, because I’m lazy and finding good references is hard. As far as I know, Giedd has a few studies on grey matter changes that everyone cites, but maybe there are better references:
https://www.pnas.org/content/101/21/8174
The gist of it is: we lose neuronal bodies as we age, starting around the age of 5. The loss doesn’t happen in the prefrontal cortex until we enter our teens, and it seems to keep happening until we reach 20.
I don’t know of any good studies going past 20; there are a lot of meh studies, and if you aggregate them you get this: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3004040/ (see fig. 2 and fig. 3). Though note that many of these use secondary markers or rather outdated imaging methods, and basically nobody is doing brain biopsies on living humans… the best you can get is diffusion MRI, fMRI, and postmortem biopsies (which are probably very biased, because only the very poor or the very educated will be fine with their recently dead child’s brain being quickly removed and analyzed for the sake of neuroscience… come to think of it, the other two methods probably are too, either in the same way or by selecting for people with mental disorders).
This process is roughly associated with pruning, essentially making networks more efficient and/or optimizing for resource consumption. It goes in tandem with myelination: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3982854/
I.e. if a neuron is not pruned, the likelihood of its various axons being heavily myelinated is increased, and vice versa.
So now, assume that the frontal cortex is indeed “what makes us different from animals”, what gives us most of our ability to be intelligent in the “do math and write and gather evidence” sense.
Assume that people with pruned cortexes are indeed noticeably smarter at everything related to engineering/science (think people over the age of 10 vs people under 10).
Assume that the studies above are indeed true, and take into account the fact that we have empirical cultural evidence (see stereotypes about learning new things as people age, underpayment of older workers in non-tenured thinking-related jobs like programming and accounting, “can’t teach an old dog new tricks”, etc.).
I think these are all pretty safe assumptions; not true in the sense of scientific truth in physics, but true in the sense of “safe to operate using them as rough guidelines”. Or at least, if they don’t hold in your model, then I also invite you to throw out all of psychology with them.
Youth is indeed very important; the 15-25 (+/- 3 years) age range is critical for the development required to become a scientist, engineer, doctor, or a member of any other profession where unusual intelligence is required.
So university time might be “only 3 or 4 years per person”, though, let’s be honest, things like med school take 6 to 10 years depending on location, and an alarming number of people are putting in an extra 1-3 years getting a master’s. But those are 3-10 years of a person’s most valuable time in life as far as brain plasticity goes.
<And yes, one could make the same argument about high school, but that would basically be arguing that high schools serve the triple role of counter-biasing aggressive tendencies in people who would otherwise basically be criminals, cultural indoctrination, and learning… and that’s a much more taboo argument to make, so I’m not making it>
***
That’s just answering your question, though; it’s not the point I’m making in the article. The point of the article is that universities basically have a lot of signaling power for “if you are smart and want to self-actualize, this is the place”. So if you want to think in terms of a scarce resource being wasted, that’s how I’d put it: universities are wasting critical signaling mechanisms.
I agree that you can make a case that sending a lot of people to university is wasteful; maybe you can make a case that sending anyone to university is wasteful (though, for what it’s worth, that feels entirely wrong to me). But shminux was making a different claim: that our universities are so wasteful that they imperil our civilization’s survival. That claim seems absurdly overblown to me.
Yes, the age at which people go to university is a good age for learning new things. That would be why people of that age are often encouraged to go off to a place designed for learning new things: a university.
Maybe just getting a job will (on average) actually result in learning more valuable things, but frankly I don’t see any reason to believe that. (More things valuable for becoming a cog in someone else’s industrial machine, maybe, though even that isn’t obvious.)
Maybe all young people (or at least all fairly bright young people?) should be trying to start their own businesses, but again I see no reason to believe that either. Starting a business is hard; most new businesses fail; most 18-year-olds lack knowledge and experience that would greatly improve their chances of starting a successful business. (There are other reasons why I think this would be a bad idea, but since I’m not even sure it’s what you have in mind I’ll leave it there.)
Maybe the learning people currently do at universities, or the learning they’re meant to be doing at universities, or whatever other learning should replace it, should be done “in the background” while they are working a job; but I see no reason to think that’s even possible in most cases. Their jobs are likely to be too demanding in time, effort and mental focus. For sure some people can do it, but if you want it to be the general case then I’d like to see evidence that it’s feasible.
Maybe we need different ways of optimizing 18-20-year-olds’ lives for learning new and valuable things. I’d be interested to see concrete proposals. An obvious question I hope they’d address: why expect that in practice this will end up better than universities?
Apprenticeship seems promising to me. It’s died out in most of the world, but there are still formal apprenticeship programs in Germany that seem to work pretty well.
Also, it’s a surprisingly common position among very successful people I know that young people would benefit from 2 years of national service after high school. It wouldn’t have to be military service — it could be environmental conservation, poverty relief, Peace Corps type activities, etc.
We actually have reasonable control groups for this, both in countries with mandatory national service and in the Mormon Church, where the majority of members go on a 2-year mission. I haven’t looked at hard numbers or anything, but my sense is that both countries with national service and Mormons tend to be more successful than similar cohorts that don’t undergo such experiences.
Maybe just getting a job will (on average) actually result in learning more valuable things, but frankly I don’t see any reason to believe that. (More things valuable for becoming a cog in someone else’s industrial machine, maybe, though even that isn’t obvious.)
OK, well, I certainly wouldn’t argue that a generic alternative exists; I mean, that’s my original point: they are wasteful in that they steal signal strength from any alternative that would crop up.
In my personal experience, getting a job is on average better for learning, if you look for jobs that can provide de facto mentors/teachers, but that might be because so few young people get a job. Or maybe I and the people I know who took my advice and quit university are just very good at learning from other practitioners rather than professors.
Maybe we need different ways of optimizing 18-20-year-olds’ lives for learning new and valuable things. I’d be interested to see concrete proposals. An obvious question I hope they’d address: why expect that in practice this will end up better than universities?
Well, my proposal in the article is basically that we had such a system; it was called a university, but it got slowly eroded as it turned into a safety/community-provision institution (or at least one providing an illusion of those two things).
My arguments for why it worked better in the past are points 1 and 2, and arguably 3 and 4.
I can well believe that universities used to work well and worsened over time. The point of my question at the end there is that I would expect any New Improved University Replacement to suffer the same process.
(Of course it might be worth it anyway, if it works better for long enough.)
That seems reasonable, I’d assume the same.
As in, if I could think of an implementable solution I’d have tried implementing it.
My point here was to describe the problem from a certain angle, which is easy; I lay no claim to the harder task of prescribing a solution.