What if Archipelagos, your off-the-cuff solution, turn out to be a bad idea? You’re now baking in the notion of Archipelagos beyond any hope of revocation even if there’s some piece of knowledge that would make you flee in horror from it. Thinking about this for another year isn’t going to make me significantly less nervous about it.
What if X, your off-the-cuff solution, turns out to be a bad idea? You’re now baking in the notion of X beyond any hope of revocation, even if there’s some piece of knowledge that would make you flee in horror from it. (paraphrased quote)
That’s the generic concern. The only way to circumvent the concern seems to be to have all the relevant pieces of knowledge, including about what we really want and how we’d react to getting it. But that’s not knowledge we’re able to have. We’ll be left with critical engineering choices underconstrained by our knowledge. Well worth being nervous about, but also worth suggesting possible improvements to the engineering choice, I suppose. :(