I don’t think that overfitting is a good metaphor for your problem. Overfitting involves building a model that is more complicated than an optimal model would be. What exactly is the model here, and why do you think that learning just a subset of the course’s material leads to building a more complicated model?
Instead, your example looks like a case of sampling bias. Think of the material of the whole course as the whole distribution, and of the exam topics as a subset of that distribution. “Training” your brain on samples drawn only from that subset will produce a learning outcome that is unlikely to generalize to the whole distribution.
Memorizing disconnected bits of knowledge without understanding the material—that would be a case of overfitting.
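To make the sampling-bias point concrete, here is a minimal sketch (my own toy example, not from the question): a model fitted on a biased slice of the input range can look accurate on that slice while failing badly on the rest of the distribution.

```python
import numpy as np

rng = np.random.default_rng(0)

# The whole "course": inputs across [0, 10]; the "exam" only covers [0, 3].
x_all = rng.uniform(0, 10, 200)
y_all = np.sin(x_all) + rng.normal(0, 0.1, 200)

biased = x_all < 3  # studying only the exam topics

# Fit a cubic polynomial using only the biased subset.
coef = np.polyfit(x_all[biased], y_all[biased], deg=3)

# Low error on the subset we trained on...
err_subset = np.mean((np.polyval(coef, x_all[biased]) - y_all[biased]) ** 2)
# ...but large error on the rest of the distribution we never sampled.
err_rest = np.mean((np.polyval(coef, x_all[~biased]) - y_all[~biased]) ** 2)
```

The model itself isn't too complicated (a cubic is a perfectly reasonable fit for sin(x) on [0, 3]); the failure comes purely from training on a non-representative sample, which is the distinction I'm drawing above.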
That is exactly what most students do. Source: Am student, have watched others learn.