dynamic balancing of self-assertiveness vs. deference to authority
The proportion of “deference to authority” is too high, in my opinion.
Knowledge acquisition, on the other hand, can be done via Wikipedia etc. and does not need to occupy school time. People who want to acquire knowledge can do this easily in their bedroom at night.
This isn’t application-based knowledge. I mentioned that students can learn concepts on their own, but what society currently lacks is a path to do something useful with it from a younger age.
Also, I agree that learning social behavior is one of the primary purposes of school, and I’d like to stress that I’m not advocating for the removal of the school system.
The proportion of “deference to authority” is too high, in my opinion.
In school, or in the real world? And if the latter, what context in particular? In a career context, for example, lower deference to authority (when carefully executed) tends to lead to more rapid promotion, where at the terminus (CEO) everyone in an organisation defers to you. It doesn’t seem there’s a huge supply/demand imbalance for senior roles, which suggests to me that the self-assertiveness vs. deference balance in working-age society is more or less optimal.
what society currently lacks is a path to do something useful with it from a younger age.
Agreed, but why should teenagers being ‘useful’ be a goal? A century ago, most teenagers did actually do useful things (work in factories etc.) but we’ve moved away from that these days. Being a teenager is fun, with low responsibility, a lot of free time for self-discovery, etc. We have a lifetime after that to be ‘useful’. Why should we cut our young years short?
I meant that school generally tries to instill deference to authority. It fades in the real world for certain jobs, though.
Why should we cut our young years short?
Brain myelination is still in full swing and information-processing speed peaks around then. Time is ticking if you want creative, innovative work to come easily and quickly. It is, of course, entirely possible to be successful as an adult, with lower neuroplasticity, slower processing, and more “crystallized” intelligence; however, adolescents have that particular advantage, which differentiates them and makes them valuable in a unique way.
This is getting into more subjective, philosophical territory, but is relaxing and having “fun” necessarily better than intellectual stimulation and learning from challenges? And won’t experiences like that speed up self-discovery?
but is relaxing and having “fun” necessarily better than intellectual stimulation and learning from challenges? And won’t experiences like that speed up self-discovery?
I think it speeds up self-discovery, at the expense of narrowing the domain within which that self-discovery takes place. So if you spend a lot of time as a teenager developing software, you certainly learn more about yourself in terms of your aptitude for developing software. But there’s an opportunity cost. I favour unguided self-discovery (a.k.a. “having fun”) for longer, because I view self-discovery during teenage years as a global optimisation, for which algorithms like simulated annealing tend to find better optima with a higher temperature, albeit taking longer to do so. As a result, I do not favour cutting short ‘childhood’ so people can be ‘useful’ sooner.
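To make the annealing analogy concrete, here is a minimal, illustrative sketch (the objective, neighbour function, and parameters are all invented for the example, not a claim about how self-discovery “really” works): the temperature controls how willing the walk is to accept locally worse moves, so a hot start explores more of the landscape before committing to a basin, while a cold start commits early to whatever is nearby.

```python
import math
import random

def simulated_annealing(objective, state, neighbour, temperature,
                        cooling=0.999, steps=20_000):
    """Minimise `objective` with simulated annealing. A higher temperature
    means more uphill moves get accepted, i.e. broader exploration of the
    search space before the walk settles into one basin."""
    energy = objective(state)
    best_state, best_energy = state, energy
    for _ in range(steps):
        candidate = neighbour(state)
        delta = objective(candidate) - energy
        # Always accept improvements; accept a worse move with probability
        # exp(-delta / temperature), which stays close to 1 while temperature is high.
        if delta <= 0 or random.random() < math.exp(-delta / temperature):
            state, energy = candidate, energy + delta
            if energy < best_energy:
                best_state, best_energy = state, energy
        temperature *= cooling  # cool: exploration gradually gives way to exploitation
    return best_state, best_energy

# Toy one-dimensional objective with many local minima.
bumpy = lambda x: x * x + 10 * math.sin(3 * x)
step = lambda x: x + random.uniform(-0.25, 0.25)

# Starting hot explores widely and is more likely to escape the local minimum
# nearest the starting point; starting cold behaves almost greedily and tends
# to get stuck in it, but converges sooner.
print(simulated_annealing(bumpy, state=8.0, neighbour=step, temperature=10.0))
print(simulated_annealing(bumpy, state=8.0, neighbour=step, temperature=0.01))
```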
Also, it may well be that the LessWrong demographic favours intellectual stimulation as “better” than many other things, but for the general population, I don’t see evidence that this is the case. I know plenty of highly satisfied people who are not driven by intellectual stimulation but are nonetheless doing things most would regard as valuable to society. But yes, this comes down to subjective philosophy on what is “better” in terms of one’s own utility function, and what we should be optimising for.
I’d argue that working earlier and having fun are not necessarily mutually exclusive; look at university life, for example. A lot of students do research and other work while going through probably the most intense self-discovery of their lives. I also don’t think specialization significantly limits the forms of self-discovery someone can engage in: software engineering alone covers a broad range of things, from working with people to problem solving, time management, creativity, and pitching your work.