I feel a similar thing, but with regard to effective altruism and learning intellectual things. I sometimes ask myself, “are my beliefs around EA and utilitarianism just ‘signaling’?”, especially since I’m only in high school and don’t really have any immediate plans. But I’m also not a very social person, and when I do talk to others I don’t usually talk about EA.
I guess I’m not a very conscientious person: I like the idea of “maximizing utility” and learning cool things, but my day-to-day fun things (outside of “addictions”: social media, games) are just reading books and essays and listening to music. It’s as if I “want to like” EA and learning. Like, I don’t really see any point in being rich and famous unless you’re going to do good with it, and so just doing the minimum to enjoy my life and not be a net negative in utils seems fine. (That is usually a thought I have when I’m sad and unmotivated.)
Does it even make sense to say that EA/utilitarianism is just signaling? Is there any reason for me to make my actions line up with my “true” preferences (i.e. egoism)? As in: if it weren’t, “should” I listen to my “true” preferences? grouchymusicologist seems to think not: whether or not we get those preferences from repetition and socialization, they’re still our preferences.