2. Task uncertainty with reasonable prior on goal drift: the system is unsure about the task it is trying to accomplish and seeks human input about it.
“Task uncertainty with reasonable prior…” sounds to me like an overly specific operationalization, but I think this desideratum is gesturing at visibility/correctability.
To me, “unsure about the task it tries to do” sounds more like applicability to a wide range of problems.