What? It doesn’t say any such thing. It says they’re inexplicable in terms of the goal system being examined, but that doesn’t mean they’re inaccessible, in the same way that you can access the parallel postulate within Euclidean geometry but can’t justify it in terms of the other Euclidean axioms.
That said, I think we’re probably good enough at rationalization that inexplicability isn’t a particularly good way to model human terminal goals, insofar as humans have well-defined terminal goals at all.