> The very thing that distinguishes terminal goals is that you don’t “pick” them, you start out with them.
Not really. Terminal goals/values are basically the top-level goals of a recursive search. In other words, terminal goals are a lot like axioms: they are the first things you choose, and the instrumental goals are then generated recursively from them.
Terminal goals are still changeable, but changing a terminal goal changes all the goals derived from it.
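To make the "axioms at the top of a recursive search" picture concrete, here is a minimal toy sketch in Python (the `Goal` type, the `plan` function, and the decomposition rules are all invented for illustration, not anyone's actual planner): the terminal goal sits at the root, the instrumental goals are whatever the recursion generates beneath it, and swapping the root regenerates everything below it.

```python
from dataclasses import dataclass, field

@dataclass
class Goal:
    description: str
    subgoals: list["Goal"] = field(default_factory=list)

def plan(goal: str, decompose, depth: int = 2) -> Goal:
    """Recursively expand a goal into a tree of instrumental subgoals.

    `decompose` is a hypothetical function mapping a goal description
    to subgoal descriptions that would help achieve it.
    """
    node = Goal(goal)
    if depth > 0:
        for sub in decompose(goal):
            node.subgoals.append(plan(sub, decompose, depth - 1))
    return node

# Toy decomposition rules, purely for illustration.
rules = {
    "stay healthy": ["exercise regularly", "eat well"],
    "exercise regularly": ["join a gym"],
    "eat well": ["learn to cook"],
    "run a marathon": ["train weekly", "exercise regularly"],
}
decompose = lambda g: rules.get(g, [])

# The terminal goal is the "axiom"; the instrumental tree is derived from it.
tree = plan("stay healthy", decompose)

# Changing the terminal goal discards the old instrumental structure and
# regenerates a new tree from the new root.
new_tree = plan("run a marathon", decompose)
```

The point of the analogy is just that the instrumental goals have no standing of their own: rerun the search from a different root and you get a different tree.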
And yes, this quote is accurate re terminal goals/values:
> Morality. To me it seems like rationality can tell you how to achieve your goals but not what (terminal) goals to pick. Arguments that try to tell you what terminal goals to pick have just never made sense to me.