Welcome to LessWrong! I think the other commenters are being a bit harsh to a first-time poster ;p though perhaps you wouldn’t have it any other way, given your subject matter. So I just want to say that I agree with your overall message; lots of university courses have poor epistemic standards. But then, lots of society has poor epistemic standards, so is this a real surprise?
Additional comment: this view would be a bit more controversial among LessWrong users, but I believe this also applies to some math courses. Everyone should be taught formal logic, budgeting, probability, and data analysis (among other things), but for most people calculus and matrix math really aren’t needed.
I agree wrt prioritization of math topics, but this is quite different from the topic of most of your post, which is accuracy. I don’t find this kind of “not commonly useful” argument to be much of a strike against a class, because if you do take those classes, then you can do those technical jobs (at least, if you get far enough).
Too often it’s the kid in math class complaining “this will never be useful to me”. It’s too easy of an out. If you were training an RL system, you would just put it straight to work and have it do the actual thing it needs to learn thousands of times. But humans get a lot of benefit from broad knowledge, including preparedness for unexpected situations, and having analogies to work from. If you want to do really innovative work, it pays to learn a lot of individually “useless” stuff. (If you don’t want to do really innovative work, OK, sure.)
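To make the contrast concrete, here’s a toy sketch of what “put it straight to work and have it do the actual thing thousands of times” looks like for an RL system. This is my own illustrative example, not anything from the post: tabular Q-learning on a hypothetical 5-cell corridor, where the agent learns exactly one narrow skill (walk right) by sheer repetition, with no “broad curriculum” at all.

```python
import random

# Toy environment: cells 0..4 in a corridor; reaching cell 4 ends the
# episode with reward 1. The agent learns this single task by brute
# repetition -- the RL analogue of skipping all "useless" background.
N_STATES = 5
ACTIONS = [-1, +1]  # step left or step right
q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}
alpha, gamma, eps = 0.5, 0.9, 0.1  # learning rate, discount, exploration

random.seed(0)
for episode in range(2000):  # "do the actual thing thousands of times"
    s = 0
    while s != N_STATES - 1:
        # epsilon-greedy action selection
        if random.random() < eps:
            a = random.choice(ACTIONS)
        else:
            a = max(ACTIONS, key=lambda act: q[(s, act)])
        s2 = min(max(s + a, 0), N_STATES - 1)  # clamp to the corridor
        r = 1.0 if s2 == N_STATES - 1 else 0.0
        # standard Q-learning update
        q[(s, a)] += alpha * (r + gamma * max(q[(s2, b)] for b in ACTIONS) - q[(s, a)])
        s = s2

# After training, the greedy policy in every non-terminal state is "go right" (+1).
policy = {s: max(ACTIONS, key=lambda act: q[(s, act)]) for s in range(N_STATES - 1)}
print(policy)
```

The trained system is extremely good at this one corridor and has learned nothing transferable to any other environment, which is roughly the trade-off the comment is gesturing at for humans.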
If someone were approaching school with an attitude of getting as much out of it as possible, it seems like they’d want to identify skills to level up (EG they might identify art history as not being a skill to level up), and then grind those skills as much as they can. Any of those proficiencies, once you grind enough, becomes a marketable skill. (Even if you make an economically poor choice, like law or art, there will be ways to succeed, and potential applications beyond getting a job. Not that I’m recommending making economically poor choices.)
It seems better to have an attitude of collecting as many of those as possible, rather than as few as possible. So long as you’re at school anyway.
This ignores the opportunity costs, and just assumes the problem. The OP is arguing for reform, not questioning the mainstream strategy of success for an individual. A smart person can be, e.g., a productive programmer with a middle-school level of math and two to four years of programming training (i.e., the basics of a programming language, data structures and algorithms, and lots of hands-on toy projects). Doing “really innovative work” also might not be efficient at the individual level, or even the societal one. There is a lot of normal work to be done.
The problem goes beyond epistemics as well: most rigorous courses teach few useful skills, and most of what one learns is forgotten when not actively used.