What level of “general problem” do you have in mind? To a large degree I’m thinking about things like “Gosh, it took (unnecessary) centuries or decades for researchers to launch subfields to study normative uncertainty and intelligence explosion”, and that could be a “lack of cause neutrality” problem. And maybe you’re thinking instead on a smaller scale, and want to say something like “Given that people decide to work on X, they’re relatively efficient in working on X, and exploring the space within X, even if they’re completely missing normative uncertainty and intelligence explosion.”