Great question! I’m glad you brought it up!
Personally, it’s a bit of an ugh field for me, and something I’m confused about and really wish I had a good answer to.
To me, this gets at a more general question: “what should your terminal values be?”. My understanding is that rationality can help you achieve your terminal values, but not select them. I’ve thought about it a lot and have tried to think of a reason why one terminal value is “better” or “more rational” than another… but I’ve pretty much failed. I keep arriving at the conclusion that “what should your terminal values be?” is a Wrong Question, which becomes pretty obvious once it’s dissolved.
But at the same time… it’s such an important question that the slightest bit of uncertainty really bothers me. Think of it in terms of expected value: a huge magnitude multiplied by a small probability can still be huge. If I misunderstood something and I’m pursuing the wrong terminal goal(s)… well, that would be bad (how bad depends on how different my current goals are from “the real goals”).
I’d love to hear others’ takes on this. It appears that people live their lives as if things other than Your Happiness matter, like Altruism and Truth. I.e., people pursue terminal values other than their own happiness. Is this true? I’d be really interested in seeing a LW survey on terminal goals.