Premise: people are fundamentally motivated by the “status” rewarded to them by those around them.
I have experienced the phenomenon of demandingness described in your post, and you’ve elucidated it brilliantly. I frequent in-person EA events, and status there is visibly rewarded according to impact, which is very different from how it’s typically rewarded in broader society. (This is not necessarily a bad thing.) The status hierarchy in EA communities goes something like this:
People who’ve dedicated their careers to effective causes. Or philosophers at Oxford.
People who facilitate people who’ve dedicated their careers to effective causes, e.g. research analysts.
People who donate 99% of their income to effective causes.
People who donate 98% of their income to effective causes.
...
People who donate 1% of their income to effective causes.
People who donate their time and money to ineffective causes.
People who don’t donate.
People who think altruism is bad.
This hierarchy is very “visible” within the in-person circles I frequent, and it is enforced by a few core members. I recently convinced a non-EA friend to tag along, and after the event they said, “I felt incredibly unwelcomed”. Within five minutes, one of the organisers asked my friend, “What charities do you donate to?” My friend said, “I volunteer at a local charity, and my SO works in sexual health awareness.” Following a bit of back-and-forth debate, the EA organiser looked disappointed, said “I’m confused”, then turned his back on my friend. [This is my vague recollection of what happened, not an exact description, and my friend had pre-existing anti-EA biases.]
Upholding the core principles of EA is necessary; without upholding particular principles at the expense of others, the organisation ceases to be what it is. However, the thing about optimisation and effectiveness is that if we’re naively and greedily maximising, we’re probably doing it wrong. If we push people away from the cause by rewarding them with low status as soon as we meet them, we will not win many allies.
If we assign low status to people who don’t donate as much as others, we might cause them to halt their donations, quit our game, and instead play a different game in which they are rewarded with relatively more status.
I don’t know how to solve this problem either, and I think it is hard. We can only do so much to “design” culture and influence how status is rewarded within communities. Culture is mostly something that just happens when many agents interact in the world.
I watched an interview with Toby Ord a while back, and during the Q&A session, the interviewer asked Ord:
Given your analysis of existential risks, do you think people should be donating purely to long-term causes?
Ord’s response was fantastic. He said:
No. I do think this is very important, and there is a strong case to be made that this is the central issue of our time. And potentially the most cost-effective as well. Effective Altruism would be much the worse if it specialised completely in one area. Having a breadth of causes that people are interested in, united by their interest in effectiveness is central to the community’s success. [...] We want to be careful not to get into criticising each other for supporting the second-best thing.
Extending this logic, let’s not get into criticising people for doing good. We can argue and debate how we can do good better, but let’s not attack people for doing whatever good they can and are willing to do.
I have seen snide comments about Planned Parenthood floating around rationalist and EA communities, and I find them distasteful. Yeah, sure, donating to anti-malaria charities saves more lives. But again, the thing about optimisation is that if we are pushing people away from our cause by being parochial, then we’re probably doing a lousy job of optimising.
Following a bit of back-and-forth debate, the EA organiser looked disappointed, said “I’m confused”, then turned his back on my friend.
I don’t like analogizing EA to a religious movement, but I think such an analogy is appropriate in this instance. If I went to a Christian gathering, accompanying a devout friend, and someone came up to me and asked, “Oh, I haven’t seen you before, which church do you attend?” I would reply, “Oh, I’m not Christian.” Then if, after a bit of discussion, that person chose to turn and walk away, I wouldn’t be offended. In fact, them turning and walking away is one of the better outcomes. Far better than them continuing to proselytize at me for the rest of the event.
In this case, the organizer encountered a person who was clearly not bought into EA, ascertained that they were not bought into EA after a short discussion, and then chose to walk away. While that’s not the friendliest response, it’s hardly the worst thing in the world.
I agree. I don’t think this kind of behaviour is the worst thing in the world. I just think it is instrumentally irrational.