Like many commenters here, I don’t think we have very good introspective access to our own terminal values, and what we take to be terminal values may not actually be terminal.
A terminal value could be defined as that for which I would be prepared to knowingly enter a situation that carries a strong risk of death or other major loss. Working from that definition, it is clear that other people knowing what my terminal goals are is dangerous—if an enemy finds out that information, then he can threaten a terminal goal to force me to give up a valuable but non-terminal resource. (It’s risky on the enemy’s part, because it leaves open the possibility that I preserve my terminal goals by killing or imprisoning the enemy in question; either way, though, I still lose significantly in the process.)
And if I don’t have good introspective access to my own terminal goals, then it is harder for a potential enemy to find out what they are. Moreover, this would also have applied to my ancestors. So not having good introspective access to my own terminal goals may be a general human survival adaptation.
A terminal value could be defined as that for which I would be prepared to knowingly enter a situation that carries a strong risk of death or other major loss.
That seems more like a definition of something one cares a lot about; sure, the two are correlated, but I believe “terminal value” usually refers to something you care about “for itself” rather than because it helps you in another way. So you could care more about an instrumental value (e.g. making money) than about a value-you-care-about-for-itself (e.g. smelling nice flowers).
Both attributes (how much you care, and whether it’s instrumental) are important though.
And if I don’t have good introspective access to my own terminal goals, then it is harder for a potential enemy to find out what they are. Moreover, this would also have applied to my ancestors. So not having good introspective access to my own terminal goals may be a general human survival adaptation.
Eh, I’m not sure; I could come up with equally plausible explanations for why it would be good to have introspective access to my terminal goals. And more importantly, humans (including everybody who could blackmail you) have roughly similar terminal goals, so they already have a pretty good idea of how you may react to different kinds of threats.
So you could care more about an instrumental value (e.g. making money) than about a value-you-care-about-for-itself (e.g. smelling nice flowers).
Hmmm. Then it seems that I had completely misunderstood the term. My apologies.
If that is the case, then it should be possible to find a terminal value by starting with any value and then repeatedly asking the question “and why do I value that value?” until a terminal value is reached.
For example, I may care about money because it allows me to buy food; I may care about food because it allows me to stay alive; and staying alive might be a terminal value.
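To make that “why do I value that?” regress concrete, here is a minimal sketch in Python. It assumes values form a simple chain; the serves mapping and the value names are purely illustrative, not anything claimed above.

```python
# Illustrative only: each instrumental value maps to the value it serves;
# anything not in the map is treated as terminal (valued "for itself").
serves = {
    "money": "food",
    "food": "staying alive",
}

def find_terminal(value):
    """Repeatedly ask "and why do I value that?" until the chain bottoms out."""
    seen = {value}
    while value in serves:
        value = serves[value]
        if value in seen:  # guard against circular justifications
            raise ValueError("circular chain of values; no terminal value reached")
        seen.add(value)
    return value

print(find_terminal("money"))  # -> staying alive
```

The cycle guard is only there so the sketch terminates; it isn’t meant as a claim about how real chains of values behave.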