Sometimes it is important that separate portions of a computer program change together whenever any of them changes. An example is batch size: if the literal value 16 is dotted throughout your program, changing the batch size will be slow and error prone.
The canonical solution is “single source of truth”: simply store BATCH_SIZE = 16 at the top of your program and have every other location reference this variable. This solves both the slowness and the error-proneness.
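A minimal sketch of this pattern, using a hypothetical batching helper for illustration:

```python
# Single source of truth: the batch size lives in exactly one place.
BATCH_SIZE = 16  # change it here, and every use below follows automatically

def make_batches(items):
    """Split items into consecutive chunks of BATCH_SIZE."""
    return [items[i:i + BATCH_SIZE] for i in range(0, len(items), BATCH_SIZE)]
```

Every call site now reads the same variable, so there is no second location that can silently drift out of date.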
However, single source of truth has a complexity cost, which ranges from low (a Python variable for a constant) through medium (inheritance, macros) up to catastrophic (C++ template metaprograms).
One case where the cost has historically been catastrophic is keeping Python dependencies in sync between requirements.txt and setup.cfg. Here the key insight is that when updating requirements, speed matters much less than correctness. The solution is to duplicate the requirements manually (slow) and add a unit test that verifies the duplication is exact (correct).
Single source of truth is the more elegant solution and should almost always be tried first, but it needs an escape hatch for when its complexity starts to skyrocket. A good heuristic I’ve found: once the source-of-truth-distribution machinery becomes an independently Turing-complete system, bail out and switch to manual copying plus automatic verification.
It’s all about how the students in your group were selected: either these students were never actually trying, and getting their perfect math scores and making it into this class required no effort on their part, or they stopped trying as soon as they qualified to sign up for the class. The second case would just be bad luck for you. The first case suggests they had some factor X that boosted their performance without meeting your definition of trying. If you could capture that X, it could put you in a class with X and actual trying combined.