What if X, your off-the-cuff solution, turns out to be a bad idea? You've now baked X in beyond any hope of revocation, even if there's some piece of knowledge that would make you flee from it in horror.
That’s the generic concern. The only way to circumvent the concern seems to be to have all the relevant pieces of knowledge, including about what we really want and how we’d react to getting it. But that’s not knowledge we’re able to have. We’ll be left with critical engineering choices underconstrained by our knowledge. Well worth being nervous about, but also worth suggesting possible improvements to the engineering choice, I suppose. :(