>We all want to save the world, right?
No. This is your first mistake, I think. You take the ideology’s authority for granted. You shouldn’t. Dropping altruism beyond self-interested reciprocity was the single best decision I have ever made. The world is not worth saving. It’s not worth destroying either.
If you’re suffering from being low-status in the EA movement, you should not be a part of the EA movement. EA as an ideology has deep flaws, and as a social dynamic, it’s outright horrible. Politically, it’s parasitic.
The last part is the only part I still care about. I went through an arc: from caring about making the world a better place and therefore supporting EA, to wanting to make the world a better place but being skeptical of EA’s consequences, to not wanting to make the world a better place at all.
If EAs weren’t politically parasitic, we would be free to simply ignore them, and this would be the correct answer. Unfortunately, we can’t ignore them, because they push policies and influence politics in a way that makes us worse off. This is why I’m willing to actively oppose their goals.
I distinguish two aspects of status. One is to feel good about being accepted by others. That’s nice, but I don’t think it’s central. There are many ways to feel good and many options to substitute for acceptance of any particular person or group.
The second aspect is “getting things done”. Unfortunately, we live in a world filled with people who can harm us, and coercing or convincing them not to do so is an important practical necessity. This is why we can’t simply ignore the EA movement, organized religion, neo-Nazis, or any other ideology that wants to extract value from our lives or limit our personal choices.
I really do recommend that you stop supporting the EA movement. Nothing good will come of it.
I think there’s a different sort of conversation where this sort of comment might be helpful (I think there’s plenty of perspectives from which EA, or “A”, doesn’t make sense, that are worth talking about). But it feels a bit outside the scope of this conversation.
(Not 100% sure about toonalfrink’s goals for the conversation.)
I have no idea what toonalfrink’s goals for the conversation are. But when someone writes something like,
>So you find yourself in this volunteering opportunity with some EA’s and they tell you some stuff you can do, and you do it, and you’re left in the dark again. Is this going to steer you into safe waters? Should you do more? Impress more? Maybe spend more time on that Master’s degree to get grades that set you apart, maybe that’ll get you invited with the cool kids?
then the only sensible option from my perspective is to take a step back and consider why you’re seeking status from this community in the first place, and what motivations go into this behavior. At this point, I think it’s well worth reflecting on:
1) Why altruism in the first place?
2) Given 1, why EA?
3) Given 2, why seeking status?
Community norms tend to be self-reinforcing. It’s worth pointing out that there are people with a genuinely different perspective, and that this perspective has its reasons.
I do think it makes sense to step back, but in the opposite order (you can’t rederive your entire ontology and goal structure every time something doesn’t make sense—it’s too much work and you’d never get anything done).
“Why am I seeking status?” and “Why are EA and/or EA organizations the right way to go about A?” seem like plausible steps backward to take, given the questions toon is raising here.
“Why altruism?” is a question every altruist should take seriously at least once, but none of the dilemmas raised in toon’s post seem like the sort of thing that warrants questioning the entire underpinning of your goal structure. (I realize that if you think the entire structure is flawed, you’re going to disagree, but I think it’s important at the meta level for people to be able to think through problems within a given paradigm without every conversation becoming a re-evaluation of that paradigm.)
Happy to talk more in a different top-level post, but not really interested in talking more in this particular comment section.