A terminal value could be defined as that for which I would be prepared to knowingly enter a situation that carries a strong risk of death or other major loss.
That seems more like a definition of something one cares a lot about; sure, the two are correlated, but I believe “terminal value” usually refers to something you care about “for itself” rather than because it helps you in another way. So you could care more about an instrumental value (e.g. making money) than about a value-you-care-about-for-itself (e.g. smelling nice flowers).
Both attributes (how much you care, and whether it’s instrumental) are important though.
And if I don’t have good introspective access to my own terminal goals, then it is harder for a potential enemy to find out what they are. Moreover, this would also have applied to my ancestors. So not having good introspective access to my own terminal goals may be a general human survival adaptation.
Eh, I’m not sure; I could come up with equally plausible explanations for why it would be good to have introspective access to my terminal goals. And more importantly, humans (including everybody who could blackmail you) have roughly similar terminal goals, so they have a pretty good idea of how you may react to different kinds of threats.
So you could care more about an instrumental value (e.g. making money) than about a value-you-care-about-for-itself (e.g. smelling nice flowers).
Hmmm. Then it seems that I had completely misunderstood the term. My apologies.
If that is the case, then it should be possible to find a terminal value by starting with any value and then repeatedly asking the question “and why do I value that value?” until a terminal value is reached.
For example, I may care about money because it allows me to buy food; I may care about food because it allows me to stay alive; and staying alive might be a terminal value.
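Read literally, that procedure is just following a chain of “why do I value this?” links until it bottoms out. A minimal sketch in Python, using the money → food → staying-alive chain from the example above (the mapping and the names are purely illustrative, not a claim about anyone’s actual values):

```python
# Maps each value to the value it is instrumental for; a value absent
# from the map is treated as terminal (cared about "for itself").
instrumental_for = {
    "money": "food",
    "food": "staying alive",
}

def find_terminal(value: str) -> str:
    """Follow the 'why do I value this?' chain until it bottoms out."""
    while value in instrumental_for:
        value = instrumental_for[value]
    return value

print(find_terminal("money"))  # -> "staying alive"
```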