I think I weakly disagree with the implication that “distillation” should be thought of as a different category of activity from “original research”.
(I might be wrong, but) I think there is a relatively large group of people who want to become AI alignment researchers but just wouldn’t be good enough to do very effective alignment research, and I think many of those people might be more effective as distillers. (And I think distillation (and teaching AI safety) is currently a very neglected occupation.)
Similarly, there may also be people who think they aren’t good enough for alignment research, but who might be more encouraged to simply learn the material well and then teach it to others.