I think utilitarian types need to be honest about what their values are.
Can you dissolve/unpack this a little?
One related thing I would have benefited from in the past (and possibly still would) is being more honest about how difficult it would be to put my values into practice.
“Values” is a tricky word (so tricky, in fact, that I think it would be reasonable to say that “values” aren’t actually a real thing in the first place). I’m using it approximately to mean “things that you care about.”
I want humanity to flourish and unnecessary suffering to end. But there’s a limit to my caring energy, and I have to divide it between “universal utilitarian good” (UUG) and “things I personally want for myself.” Right now, UUG gets about 5-10% of my caring energy.
I would take a pill that increased UUG’s share to 15-20% of my caring energy. (And in real life, this takes the form of investing myself in altruist communities, which reinforces my self-image as someone who does good things.) But I honestly have no interest in becoming 100% altruist.
Part of me wants to be able to say “I’d take a pill that makes me 100% altruist, so that I only feel motivation to do the bare minimum of selfish things to survive, and otherwise direct my energy to whatever accomplishes the most good.” It’s a nice thought to believe that about myself. But it’s not true. (If I became 5% more altruist, I might want to become an additional 5% more altruist, and maybe the cycle would repeat. Not sure. But if I got to take exactly one pill, with no more pills after, I don’t think I’d choose to become more than 50% altruist.)
One related thing I would have benefited from in the past (and possibly still would) is being more honest about how difficult it would be to put my values into practice.
Thanks; your comment implied a concrete example of what I was after: someone who thinks they would take the 100% altruism pill, when in fact they wouldn’t, isn’t being honest about their values. I found this helpful.
I think it would be reasonable to say that “values” aren’t actually a real thing in the first place. I’m using it approximately to mean “things that you care about.”
I’d hazard a guess that calling it something different doesn’t make it any realer, but we don’t need to get into that right now ;-)
EDIT: what I meant by “concrete” in this case was “without reifying values/preferences/caring etc.”
That is how I feel. I chose to test, for two months, how it feels to be the maximum-percent altruist. I got to nearly 70%. But after the two-month precommitment ended, the whole story of this post started to emerge.
Now 0% and 80% sound emotionally alike, because I dug too deep.
One related thing I would have benefited from in the past (and possibly still would) is being more honest about how difficult it would be to put my values into practice.
That is also important, and slightly different.