I think academic math has a problem where it’s more culturally valorized to *be really smart* than to teach well, to the point that effective communication actually gets stigmatized as catering too much to dumb people.
Having left academic math, I am no longer terrified of revealing my stupidity, so I can now admit that I learned intro probability theory from a class in the operations research department (that used an actual syllabus and lecture notes! unlike most math classes!), that I learned more about solving ODEs from my economics classes than from my ODEs class, that I only grokked Fourier analysis when I revisited it in a signal processing context, and that my favorite introduction to representation theory is Shlomo Sternberg’s *Group Theory and Physics.*
Concrete examples are easier for some people to learn from!
> I think academic math has a problem where it’s more culturally valorized to *be really smart* than to teach well
I don’t think that’s the issue exactly. My guess is that academic math has a culture of teaching something quite different from what most applied practitioners actually want. The culture is to focus really hard on how you reliably prove new results, and to get as quickly as possible to the frontier of things that are still a subject of research and aren’t quite “done” just yet. Under this POV, focusing on detailed explanations about existing knowledge, even really effective ones, might just be a waste of time and effort that’s better spent elsewhere!
True story:

One semester in high school, a bunch of my classmates and I were taking a math course (where we were learning linear algebra) and a computer science course (where we were learning C).
Our math teacher had spent a solid two or three weeks on matrices—with most of that time taken up by matrix multiplication. None of us were getting it. We were all smart kids (this was, like, a specialized / magnet / whatever school), and generally reasonably good at math, but this was eluding us; most of the class managed to learn the procedure by rote well enough to pass the tests, but we didn’t really grasp the concept. Eventually, after weeks of frustration, we had more or less “learned” how to multiply matrices sufficiently well that the teacher decided to move on to the next topic in the syllabus.
Not long afterwards, our CS teacher mentioned offhandedly that our next assignment would require us to write matrix multiplication code (it had to do with graphics). As an afterthought, he turned to the class and asked, “Oh, you guys learned this in your math classes, right?”—clearly expecting to get a chorus of “yes”s. When instead his question was greeted by an awkward silence and some “uhh…” and “umm…”, he looked surprised for a moment, then went “Ok, look…”, and started sketching something on the whiteboard.
Five minutes later, the entire class erupted into a collective “ohhhhh!!!”. And that was that.
I’ve had a number of other experiences like this, though that one was the most memorable. So, yes, “computer science departments/instructors teach math better than math departments/instructors” seems to be a trend.
This reminds me of a story that a CS major friend told me: she and a bunch of others had run into summation notation earlier in some math classes, but hadn’t quite understood how it should be interpreted… until they were taking a CS class, where the TA noticed that they seemed to be confused about it.
The TA was like, “well, you guys know for loops, right?”
Them: “… yes …”
The TA: “Okay, so if you’ve got ∑_{n=1}^{5} n, for example, then you could read that as `x = 0; for (int n = 1; n <= 5; n++) { x = x + n; } return x;`”

Them: “OOOOOOOH”
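For anyone who wants to actually run the TA’s translation, here’s a minimal self-contained C version (my own elaboration of the one-liner above; the accumulator name `x` is just illustrative):

```c
#include <stdio.h>

int main(void) {
    /* The sum from n = 1 to 5 of n, read exactly the way the TA read it:
       an accumulator initialized to zero, plus a for loop over the index. */
    int x = 0;
    for (int n = 1; n <= 5; n++) {
        x = x + n;
    }
    printf("%d\n", x); /* prints 15 */
    return 0;
}
```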
I heard a similar story about when Paul Sally visited a grade school classroom. He asked the students what they were learning, and they said “Adding fractions. It’s really hard, you have to find the greatest common denominator…” Sally said “Forget about that, just multiply the numerator of each fraction by the denominator of the other and add them, and that’s your numerator; multiply the denominators, and that’s your denominator.” The students loved this, and called it the Sally method.
That does not always produce a reduced fraction, of course. In order to do that, you need to go find a GCF just like before… but I agree, that should be presented as an *optimization* after teaching the basic idea.
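To make the “optimization” framing concrete, here’s a small C sketch of my own (not from the story): the Sally method is just the cross-multiplication, and reducing by the GCF is a separate step you can bolt on afterward.

```c
#include <stdio.h>

/* Greatest common factor, for the optional reduction step. */
static int gcf(int a, int b) {
    while (b != 0) {
        int t = b;
        b = a % b;
        a = t;
    }
    return a;
}

/* The "Sally method": a/b + c/d = (a*d + c*b) / (b*d). */
static void add_fractions(int a, int b, int c, int d, int *num, int *den) {
    *num = a * d + c * b;
    *den = b * d;
}

int main(void) {
    int num, den;
    add_fractions(1, 4, 1, 4, &num, &den); /* 1/4 + 1/4 */
    printf("%d/%d\n", num, den);           /* 8/16 -- correct, but not reduced */

    int g = gcf(num, den);                 /* the "optimization": reduce by the GCF */
    printf("%d/%d\n", num / g, den / g);   /* 1/2 */
    return 0;
}
```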
Cool, do you remember what the 5-minute explanation was?

Oh gosh, no, sorry; I wish I did. This was many, many years ago, and the last time I had to write matrix code was also (a smaller number, but still) many years ago.

If I had to guess, it would be something like Kaj’s example in the sibling comment—they were doing summations instead of loops, and they hadn’t seen the graphical arrangement of matrices that makes the multiplication obvious. (“Okay, wait, why are we using this index and that index?” → “You put matrix A here, matrix B there, you vector-multiply this row and this column and that creates this cell, and then you do that for all the cells.”) If you look at the Wikipedia page, imagine a class that only did the definition section and not the illustration section.

Yes, it was a graphical explanation, I do remember that. (The one you describe sounds plausible, at least.)
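In case it helps to see the picture as code: here’s a minimal C sketch of my own, assuming the whiteboard explanation was the standard row-times-column one (the actual drawing is, per the above, lost to time). Each cell of the result is the dot product of a row of A with a column of B.

```c
#include <stdio.h>

#define N 2

int main(void) {
    int a[N][N] = {{1, 2}, {3, 4}};
    int b[N][N] = {{5, 6}, {7, 8}};
    int c[N][N];

    /* Each cell c[i][j] is the dot product of row i of A with column j of B --
       the "this row times that column fills this cell" picture. */
    for (int i = 0; i < N; i++) {
        for (int j = 0; j < N; j++) {
            c[i][j] = 0;
            for (int k = 0; k < N; k++) {
                c[i][j] += a[i][k] * b[k][j];
            }
        }
    }

    for (int i = 0; i < N; i++) {
        printf("%d %d\n", c[i][0], c[i][1]); /* prints 19 22 / 43 50 */
    }
    return 0;
}
```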
My experience has also been that the Computer Science department at Berkeley is much better at teaching Math than the Math department.
This matches my experience.