This is very interesting, but I was actually thinking about it in a different manner. I like your idea too, but this is more along the lines of what I meant:
Ultimately, I have goals for the purpose of arriving at some desired state of being. Over time, goals should change rationally to better reach desired states. However, what is viewed as a desired state of being also changes over time.
When I was 12 I wanted to be the strongest person in the world; when I was 18 I wanted to be a world-famous comedian. Both of these desired states undoubtedly have goals whose achievement would more readily and potently produce them. If I had adopted the most efficient methods of pursuing those dreams, I would have been making extreme commitments for the sake of something that later turned out to be a false desired state. Until one knows one's final desired state, any goal that consumes more than a certain amount of resources is damaging to the long-term achievement of a desired state. Furthermore, I think people rarely know when to cut their losses. It could be that after investing X amount into desired state Y, the individual is unwilling to abandon it, even if in reality it is no longer their desired state. People get into relationships and are too afraid of having wasted all that time and those resources to get out. I don’t know if I am being clear, but the train of my logic is roughly:
Throughout the progression of time, what a person finds to be a desired state changes. (Perhaps the change is more drastic in some than in others, but I believe this change is normal. Just as through trial and error you refine your methods of goal achievement, through the trials and errors of life you reshape your beliefs and desires.)
If desired states of being are dynamic, then it is not wise to commit to overly extreme goals or methods for the sake of my current desired state of being. (There needs to be some anticipation of the likelihood that my current desired state might not agree with my final, actual desired state of being.)
(nods)
I certainly agree that the goals people can articulate (e.g., “become a world-famous comedian” or “make a trillion dollars” or whatever) are rarely stable over time, and are rarely satisfying once achieved, such that making non-reversible choices (including, as you say, the consumption of resources) to achieve those goals may be something we regret later.
That said, it’s not clear that we have alternatives we’re guaranteed not to regret.
Incidentally, it’s conventional on LW to talk about this dichotomy in terms of “instrumental” and “terminal” goals, with the understanding that terminal goals are stable and worth optimizing for but mostly we just don’t know what they are. That said, I’m not a fan of that convention myself, except in the most metaphorical of senses, as I see no reason for believing terminal goals exist at all.
But do you believe that most people pretty predictably experience shifts in goal orientation over a lifetime?
I’d have to know more clearly what you mean by “goal orientation” to answer that.
I certainly believe that most (actually, all) people, if asked to articulate their goals at various times during their lives, would articulate different goals at different times. And I’m pretty confident that most (and quite likely all, excepting perhaps those who die very young) people express different implicit goals through their choices at different times during their lives.
Are either of those equivalent to “shifts in goal orientation”?
Yes
Then yes, I believe that most people pretty predictably experience shifts in goal orientation over a lifetime.
Ok, me too.
Then, if you believe that, does it seem logical to set up some system of regulation, or some type of limitation on the degree of accuracy you are willing to strive for in any current goal orientation?
Again, I’m not exactly sure I know what you mean.
But it certainly seems reasonable for me to, for example, not consume all available resources in pursuit of my currently articulable goals without some reasonable expectation of more resources being made available as a consequence of achieving those goals.
Is that an example of a system of regulation, or a type of limitation on the degree of accuracy I am willing to strive for in my current goal orientation?
Preventing other people from consuming all available resources in pursuit of their currently articulable goals might also be a good idea, though it depends a lot on the costs of prevention and the likelihood that they would choose to do so and be able to do so in the absence of my preventing them.
Is that an example of a system of regulation, or a type of limitation on the degree of accuracy I am willing to strive for in my current goal orientation?
Yes, in a sense. What I was getting at is that the implementation of rationality, when one’s capacity for rationality is high (i.e., when someone is really rational), is a HUGE consumption of resources. So:
1.) Because goal-orientations are dynamic, and
2.) because the implementation of genuine rational methodology to a goal-orientation consumes a huge amount of the individual’s or group’s resources,
3.) both individuals and groups would benefit from having a system for regulating when to implement rational methodology, and to what degree, in the pursuit of a specific goal.
This is what my essay is about. This is what I call rational irrationality, or being rationally irrational: I see that a truly rational person, for the sake of resource preservation and long-term (terminal) goal achievement, would not want to achieve all their immediate goals in the fullest sense. This, to me, is different from having the goal of losing, because you still want to achieve your goals; you still have immediate goals; you just do not place the efficient achievement of those goals as your top priority.
I certainly agree that sometimes we do best to put off achieving an immediate goal because we’re optimizing for longer-term or larger-scale goals. I’m not sure why you choose to call that “irrational,” but the labels don’t matter to me much.
I call it irrational because, in pursuit of our immediate goals, we are ignoring or avoiding the most effective methodology, and thus doing what is potentially ineffective?
But hell, maybe on a subconscious level I did it to be controversial and attack accepted group norms O_O