you shouldn’t have students learn things that they don’t have the proper foundation for.
I think that in education the problem usually isn’t students learning things they don’t have the proper foundation for, but rather people disagreeing about the goal of the learning. You don’t elaborate your point, so I’m not sure precisely what you mean; what follows may be a tangent.
Almost all academic subjects—in math, computing, natural sciences, history, literature, economics—have ‘foundations’ one could learn. In math, these are axiomatic formulations and proofs (as opposed to results), and often complex underlying theories. In computing, these are engineering artifacts—software and hardware designs, implementations, and algorithms. In the sciences, these are the more fundamental laws of nature (which may be described by a slightly different science), as well as a great deal of organized facts, especially in biology. In history and literature, these are even more history and literature, which provide the necessary background context.
One can always point towards a lesson (which isn’t at a graduate level) and claim students are missing some fundamental background. So does that mean we shouldn’t teach calculus without the underlying proofs, or programming 101 before operating system design, or classical physics before modern theories? That depends on what you want to accomplish.
When I studied undergrad CS, most courses had several versions you could choose from. All freshmen in the sciences & engineering faculties had to take a course on real-number calculus, but there were as many as five variations, billed as calculus for math, physics, CS, engineering, and biology/chemistry majors, respectively. The math majors spent most of their time on (and were tested mostly on) proving theorems. The engineering students practiced solving really hard instances of the equations. And the biology students did the minimum and skipped some of the hard proofs entirely.
Did the biology students not “learn the proper foundation” for calculus? That depends entirely on the use they expect to make of it. The tradeoffs of the harder courses are clear: more time and effort spent studying, and more students failing the course and not graduating. (Also, things one learns but never uses are usually forgotten after a few years.)
In many cases it’s harder to match the lesson to the goal. In grade school we studied history. Which historical subjects should be taught? How should the available time be spread across them—studying a few eras in depth, or many at a shallow level? How useful is it, and against what goal should it be judged, to learn about any one historical setting without understanding its causes and effects—what came before and after, what was happening elsewhere at the same time, why people behaved as they did? But any mandatory education system must make centralized decisions about the accepted minimum that everyone should know.
You’re right, I wasn’t clear enough about what I mean. Sorry.
1) Dependencies often carry the connotation of getting deep into theories and proofs, but strictly speaking that’s only a subset of dependencies. Like you said, what’s relevant is what you’re trying to accomplish. I’m completely on board with the idea that a bio major may not need to know all the theory behind, say, calculus. But whatever it is that the bio major is learning, it still has dependencies.
2) Try thinking about it like this: say the bio major doesn’t understand something and is in office hours with the professor. The professor does something like this: “You don’t know A? Ok, well do you know B? You do, good. What about C? You do, good. What about D? No, ok, well do you know E? You do, good. What about F? You don’t? Ok, let me explain it to you.” Teachers (good ones, anyway) are implicitly traversing the dependency tree in order to diagnose the holes in students’ knowledge. I think you could use software to approximate this.
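For what it’s worth, here’s a minimal sketch of how software could approximate that office-hours dialogue, assuming someone has already mapped concepts to their prerequisites. The concept names, the toy prerequisite graph, and the knows() check are all hypothetical placeholders, not a real system:

```python
# A toy prerequisite graph: concept -> list of prerequisite concepts.
PREREQS = {
    "interpreting dN/dt in a growth model": ["derivative as rate of change"],
    "derivative as rate of change": ["slope of a line", "function notation"],
    "slope of a line": ["graphing points"],
    "function notation": [],
    "graphing points": [],
}

def knows(student, concept):
    """Stand-in for asking the student a quick diagnostic question."""
    return concept in student["known"]

def missing_prereqs(student, concept, seen=None):
    """Walk the prerequisite graph depth-first and return the deepest
    concepts the student is missing -- the things to explain first."""
    if seen is None:
        seen = set()
    if concept in seen:
        return []
    seen.add(concept)

    gaps = []
    for prereq in PREREQS.get(concept, []):
        if knows(student, prereq):
            continue  # "You know B? Good." -- no need to dig deeper here.
        deeper = missing_prereqs(student, prereq, seen)
        # If the unknown prerequisite's own prerequisites are all fine,
        # this prerequisite itself is the gap to address.
        gaps.extend(deeper if deeper else [prereq])
    return gaps

student = {"known": {"graphing points", "function notation"}}
print(missing_prereqs(student, "interpreting dN/dt in a growth model"))
# -> ['slope of a line']  (explain slope first, then rates of change)
```

The diagnostic check could of course be a real quiz item rather than a set lookup; the point is just that the “do you know B?” loop is a graph traversal, which is exactly the kind of thing software can do.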
Since you placed this in your list of problems with the current state of education, do you have concrete examples in mind of study programs missing important prerequisites? How prevalent do you feel this is? (Grade school and college probably should be evaluated separately here, since colleges have entrance exams and explicit requirements that are supposed to be pretty much exhaustive wrt. required prior knowledge.)
It’s not always clear, and people often disagree, what students are supposed to accomplish by learning. In practice, most of the time, they’re learning to pass the graduation or college-entrance exam; most students don’t go on to use most of the knowledge they were taught. Also, subjects in the humanities have no clear goal at all. How can you quantify the use of studying history, and compare the utility of different historical subjects, for young people who haven’t even chosen a profession yet? Yet most people agree that learning history, arts, etc. is worthwhile. (And we know it’s not simply for fun—otherwise you could just point people at the library and save a lot of money on history classes.)
1) I think it’s very prevalent, and I think that most lessons do a pretty bad job of addressing prerequisites. To be clear, I’m not talking about prerequisites like “you need algebra before calculus”. I’m talking about a much smaller level than that.
As for examples… unfortunately I don’t really have great examples. I’m a student and am constantly reading and learning, and am constantly thinking to myself, “They’re trying to explain X but X depends on me knowing Y. I don’t know Y, and there isn’t any reason for them to have assumed that I did know Y. At the very least they should recognize that Y is likely to be something that trips students up (they’d see this if they did ‘user research’ and iterated) and provide convenient reference to material explaining Y.”
2) Right. The question of what students should be learning is another separate and huge topic. I don’t think this is the right place to argue it properly, but rest assured, I’ve got my opinions :)
I believe strongly that rationality should be a big part of the curriculum.
I think a big question is “to what extent do you let kids choose what they want, and to what extent do you force them to learn certain things”. I don’t think kids are mature enough to make great decisions. I think that they should be forced to learn a lot of “fundamental” things. The reasons for this are a) it’ll give them a better basis for making a career choice and b) it’ll make them “more well-rounded people”. But I think that there should be much less of a focus on details. E.g. don’t make kids memorize things; just have them understand the fundamental concepts. Memorizing details doesn’t help achieve goals a) or b).
I see a lot of things like languages and music and literature as hobbies. It’s very unlikely you end up using these things in your career or life. They should at least largely be electives rather than requirements. In general, I’m somewhat disapproving of the humanities.
I think that the basics of computer science should be a requirement. Same with the core ideas of economics, psychology and design (and probably some other things I’m not thinking of).
I would emphasize writing. Students should be able to make logical arguments and to write clearly and concisely.
I don’t think kids are mature enough to make great decisions.
Even most undergrad students are not informed enough to make great decisions. They may know what they want, but they often misjudge how hard it will be to study something, how much fun it would be to work in a profession, or their chances of finding a good job.
On the other hand, the bureaucrats and politicians who do write school programs don’t always have the students’ best interest in mind. They may discount subjects they don’t understand well or aren’t interested in personally. They may make decisions for political reasons, like making school easier to raise graduation rates. And, of course, many fields of study are excellent indoctrination and covert political propaganda tools and are chosen mostly for these reasons.
On balance, I would trust students to influence their studies more than they do today, but I’m biased in favor of academically good students.
I see a lot of things like languages and music and literature as hobbies. It’s very unlikely you end up using these things in your career or life. They should at least largely be electives rather than requirements. In general, I’m somewhat disapproving of the humanities.
Isn’t that at odds with your desire to make students “more well-rounded people”? If you think the humanities aren’t valuable, all I can say is that most people disagree (me included). It’s a difference in values, and isn’t eclectic learning according to different values necessary to make a person well-rounded?
Also, and perhaps more importantly, most technical and scientific (i.e. non-humanities) subjects taught in school are also unlikely to be used by most people in most careers. Going purely by how many people actually use something they learned in later life, most mathematics should be an elective (especially geometry and trigonometry), as should natural sciences (physics, chemistry, biology) and general subjects like history, geology, most of geography and economics, etc. Do away with the humanities too, and what’s left for the core curriculum? You’d be back to the basics—reading, writing, and arithmetic.
E.g. don’t make kids memorize things; just have them understand the fundamental concepts. Memorizing details doesn’t help achieve goals a) or b).
I agree, understanding is much more important than memorizing. And memorized but poorly understood facts are usually forgotten later in life anyway.
However, most subjects do require memorization of a bunch of useful facts if they’re to be taught at all. The precise date of a battle isn’t important, but knowing who won and why it matters is.
I think that the basics of computer science should be a requirement. Same with the core ideas of economics, psychology and design (and probably some other things I’m not thinking of).
How many people do you think are going to use computer science (as opposed to programming), economics, or psychology? I think very few. Here too, I feel this doesn’t align well with your position that subjects few people will use should be electives.
It may be that you are projecting your own love of e.g. compsci and dislike of e.g. literature onto others. But forcing all students to learn compsci is very unlikely to make more of them like it. On the contrary, people sometimes report hating subjects like history because they were tainted by forced, badly taught lessons in school, even when they might otherwise have enjoyed them as adults.
I wonder. In this day and age, is film theory as relevant as literary theory? (As in, studying the storytelling techniques that are unique to film, as opposed to those unique to prose or poetry.)
I would add computer/video game design theory.