What could instrumental rationality mean without reference to a set of terminal goals (or equivalently: intrinsic values, preferences, a utility function)? Given that we seem to have value uncertainty, asking “What is the meaning of life?” seems perfectly reasonable (as long as one doesn’t assume that there must be an answer). There is no reason why we couldn’t have been designed by some creator to serve the creator’s purposes. I’m not sure if it makes sense to be disappointed when that turns out not to be the case (as seems likely at this point), but it certainly doesn’t make the value uncertainty problem any easier.