Professors are selected for being good at research, not for being good at teaching. They are also evaluated on being good at research, not at teaching. You are assuming universities primarily care about undergraduate teaching, but that is very wrong.
(I’m not sure why this is the case, but I’m confident that it is)
Being nitpicky: Professors are selected to be legibly good at research.
Which means getting government grants, from which the university takes a cut as overhead.
And the university then brags about how much research funding it brings in, which weighs heavily in the “ranking” of that school within that research area. Which isn’t really wrong, if you’re evaluating its ability to take someone from undergraduate to graduate research in that area, or from grad to post-grad.
That ranking gets the school (or at least is thought to indirectly get it) student dollars and donor dollars. And sometimes state funding dollars.
So following the money and prestige, it’s a simple story, I think.
Agree in general, but there is an ecosystem of mostly-small colleges where teaching has higher priority, and most ambitious American students and their parents know about it. Note for example that Harvard, Yale, Princeton and Stanford do not appear in the following list of about 200 colleges:
https://www.usnews.com/best-colleges/rankings/national-liberal-arts-colleges
I agree that this is the case (and indeed, a quick Google search of even my worst professors yields quite impressive CVs). I don’t understand why that’s the case. Is it, as ErickBall suggests, simply cheaper to hire good researchers than good teachers? I find that a little unlikely. I also find it unlikely that this is more profitable—surely student tuition + higher alumni donations would be worth more than whatever cut of NIH/NSF/etc. funding they’re taking.
My question is: who does this system leave better off? Students get worse professors, good researchers have to waste their time teaching, and good teachers have to waste their time researching. Other than maybe the science journals or something, who has a stake in perpetuating this?
A natural equilibrium of institutions doesn’t have to leave anyone better off. Excellence at research is the most legible prestige-carrying property of professors; being a good teacher is harder to observe. As Viliam points out, the purpose of raising researchers is best served by teachers who are good researchers, and otherwise there is a risk of content drifting away from relevance or sanity. So even for students, orgs with good researchers are more credible sources of learning, given the current state of legible education quality indicators.
I am quite curious about this, too.
I suspect there might be some kind of fallacy involved, something like “if we make a job that is for both research and teaching, we will automatically get people who are good at both research and teaching… even if we actually evaluate and reward them only for the research”. Maybe, if someone sucks at teaching, it is assumed that they would never apply for such a job in the first place—they could get a job at some purely research institution instead. (So why does this not happen? I suppose that even for a researcher without teaching skills, work at a university can be preferable for some selfish reasons. Or they can be overconfident about their teaching skills.)
And the next step is that someone who is good at both research and teaching is obviously better than someone who is merely good at teaching, because such a person will be able to teach the latest science. This ignores the fact that a lot of what is taught at universities is not the latest science. But it is still better to have someone who has the ability to get the latest science right.
To steelman this position, imagine the opposite extreme: a university where all teachers are great at teaching, but suck at research. It would be a pleasant experience for the students, but I would worry that a few decades later what the professors teach could be obsolete, or even outright pseudoscience. Also, teachers who are not themselves good researchers might have a problem bringing up a new generation of researchers; and where else would we get them?
I’d offer a couple of counterpoints:
a) Even at high levels, professors are rarely teaching the absolute cutting edge. With the exception of my AI/ML courses and some of the upper-level CS, I don’t think I’ve learned very much that a professor 10-20 years ago wouldn’t have known. And I would guess that CS is very much the outlier in this regard: I would be mildly surprised if more than 5-10% of undergrads encounter, say, chemistry, economics, or physics that wasn’t already mainstream 50 years ago.
b) Ballpark estimate based on looking at a couple of specific schools—maybe 10% of undergrads at a top university go on to a PhD. Universities can (and should) leverage the fact that very few of their students want to go on to do research, and the ones that do will almost all have 4-5 more years of school to learn how to do good research.
If I were running a university, I would employ somewhat standardized curricula for most courses and stipulate that professors must test their students on that material. For the undergraduate program, I would aim to hire the best teachers (conditioned on a very strong understanding of the material, obviously), while for the graduate school I would aim to hire the best researchers, who would have to teach fewer courses since they would never teach undergrads. Top researchers would be attracted by the benefit of not having to teach any intro courses, top teachers would be attracted by the benefit of not being pressured to constantly put out research, undergrads would be attracted by the benefit of having competent teachers, and PhD students would be attracted by the more individual attention they would get from having research faculty’s full focus. And as a university, the amount of top-tier research being produced would probably increase, since those people wouldn’t have to teach Bio 101 or whatever.
I contend that this leaves all the stakeholders better off without being more expensive, more difficult, or more resource-intensive. Obviously I’m wrong somewhere, or colleges would just do this, but I’m unsure where...
This seems like an obvious solution, so I wonder whether some institutions are already doing it, or whether there is a catch that we didn’t notice.
(This is just a wild guess, but perhaps a university that only does half of that—i.e. hires the best teachers and mediocre researchers, or the best researchers and mediocre teachers—would be just as popular, for half the cost. You cannot get unlimited numbers of students anyway, so if you already get those who want the best teaching, you don’t need to also attract the ones who want the best research, and vice versa.)
I was thinking from the opposite direction: whether it would make sense for professors to pair up—one who wants to teach, plus one who wants to do research—and trade: “I will teach your lessons if you write my thesis and add me as a co-author on your publications”. Not sure if this is legal. (Also, it seems fragile: if one decides to quit or gets hit by a bus, the other’s career is also over.)
There are in fact many universities that have both “research faculty” and “teaching faculty”. Being research faculty has higher prestige, but nowadays it can be the case that teaching faculty have almost the same job security as research faculty. (This is for permanent teaching faculty; sessional instructors have very low job security.)
In my experience, the teaching faculty often do have a greater enthusiasm for teaching than most research faculty, and also often get better student evaluations. I think it’s generally a good idea to have such teaching faculty.
However, my experience has been that there are some attitudinal differences that indicate that letting the teaching faculty have full control of the teaching aspect of the university’s mission isn’t a good idea.
One such difference is a tendency for teaching faculty to start to see the smooth running of the undergraduate program as an end in itself. Research faculty are more likely to have an ideological commitment to the advancement of knowledge, even when promoting that is not as convenient.
A couple of anecdotes (from my time as research faculty at a highly-rated university):
At one point, there was a surge in enrollment in CS. Students enrolled in CS programs found it hard to take all the courses they needed, since the courses were full. This led some teaching faculty to propose that CS courses (after first year) no longer be open to students in any other department, seeing as such students don’t need CS courses to fulfill their degree requirements. Seems logical: students need to smoothly check off degree requirements and graduate. The little matter that knowledge of CS is crucial to cutting-edge research in many important fields like biology and physics seemed less important...
Another time, I somewhat unusually taught an undergrad course a bit outside my area, which I didn’t teach again the next year. I put all the assignments I gave out, with solutions, on my web page. The teaching faculty instructor the next year asked me to take this down, worrying that students might find answers to future assigned questions on my web page. I pointed out that these were all my own original questions, not from the textbook, and asked whether he also wanted the library to remove from circulation all the books on this topic…
Also, some textbooks written by teaching faculty seem more oriented towards moving students through standard material than teaching them what is actually important.
Nevertheless, it is true that many research faculty are not very good at teaching, and often not much interested either. A comment I once got on a course evaluation was “there’s nothing stupid about this course”. I wonder what other experiences this student had had that made that notable!