From a position of uncertainty, there is no single optimal direction, only a model of a good distribution of global effort among all directions. A marginal choice of a spherical researcher without specific preferences should be based on identifying relatively neglected directions. A choice of an actual researcher with specific preferences should give weight to those preferences, which might greatly improve productivity.
A marginal choice of a spherical researcher without specific preferences should be based on identifying relatively neglected directions
Inside-view convincingness of these directions still has to be factored in. E.g., “study the Bible for alignment insights” is a relatively neglected direction (just Unsong on it, really?), but that doesn’t mean it’d be sensible to focus on it just because it’s neglected. And even if your marginal contribution to the correct approach would be small because so many other people are already working on it, that may still be more expected impact than setting off on a neglected (and very likely incorrect) one.
A choice of an actual researcher with specific preferences should give weight to those preferences
Oh, I’m not saying you should entirely ignore your preferences/comparative advantages. But if you’re looking at a bunch of plausible directions, you don’t have to pick between them solely on the basis of your comparative advantages.
A marginal choice of a spherical researcher without specific preferences should be based on identifying relatively neglected directions
Inside-view convincingness of these directions still has to be factored in.
I mean directions neglected relative to an estimated good distribution of global effort. If I estimate the good share of effort for searching The Silmarillion for insights relevant to mechanistic interpretability to be zero, then it’s not a relatively neglected direction.
A choice of an actual researcher with specific preferences should give weight to those preferences
if you’re looking at a bunch of plausible directions, you don’t have to pick between them solely on the basis of your comparative advantages
Sure, by “give weight” I mean take into account, not take as the sole basis for a decision. The other major factor is the relative neglectedness I mentioned (in the sense I’ve hopefully now clarified).
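The tradeoff argued in this thread can be sketched as a toy model (every number and function shape below is an illustrative assumption, not anything from the discussion): the marginal expected impact of joining a direction is roughly the inside-view probability the direction is correct, times the marginal value of one more researcher given how crowded it already is, times a personal-fit multiplier.

```python
import math

def expected_impact(p_correct, n_current, fit=1.0):
    """Toy model of marginal expected impact for one more researcher.

    p_correct: inside-view probability the direction is on the right track
    n_current: researchers already working on it (crowdedness)
    fit: multiplier for comparative advantage (1.0 = average researcher)

    Diminishing returns are assumed logarithmic: total value ~ log(1 + n),
    so the marginal researcher adds log(2 + n) - log(1 + n).
    These functional forms are illustrative only.
    """
    marginal = math.log(2 + n_current) - math.log(1 + n_current)
    return p_correct * marginal * fit

# A crowded but promising approach vs. an empty, very-likely-incorrect one:
crowded = expected_impact(p_correct=0.5, n_current=100)
long_shot = expected_impact(p_correct=0.005, n_current=0)
```

Under these assumed numbers, the thin marginal slice of the crowded-but-promising direction still beats sole ownership of the likely-dead-end one, matching the point made above; a strong enough `fit` multiplier can flip the comparison, which is the sense in which preferences get weight without being the sole basis.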