I was having an EA conversation with some uni group organisers recently, and it was terrifying to me that a substantial portion of them, in response to FTX, wanted to do PR for EA (implied, for example, in supporting putting out messages of the form “EA doesn’t condone fraud” on their uni group’s social media accounts), and also that a couple of them seemed to be running a naive version of consequentialism that endorsed committing fraud or breaking promises if the calculations worked out in favour of doing so for the greater good. Most interesting was that one group organiser was in both camps at once.
I think it is bad vibes that these uni students feel so emotionally compelled to defend EA, the ideology and the community, from attack; this seems plausibly really harmful for their own thinking.
I had this idea in my head of university group organisers modifying what they say to newcomers to be more positive about EA ideas, but I thought this was a scary concern I was mostly making up. After some interactions with uni group organisers outside my bubble, though, it feels more important to me. People explicitly mentioned policing what they said to newcomers so as not to turn them off or give them reasons to doubt EA, and tips like “don’t criticise new people’s ideas in your first interactions with them as an EA community builder, in order to be welcoming” were mentioned.
All this to say: I think some rationality ideas I consider pretty crucial for people doing EA uni group organising are not having the reach they should.
a naive version of consequentialism that endorsed committing fraud or breaking promises if the calculations worked out in favour of doing so for the greater good.
How self-aware was the group organizer about being in both camps?
All this to say: I think some rationality ideas I consider pretty crucial for people doing EA uni group organising are not having the reach they should.
It might be that they are rational at maximizing utility. It can be useful for someone who is okay with fraud to publicly create the image that they aren’t.
You would expect that people who are okay with fraud are also okay with creating a false impression that they are not okay with fraud.
You’re right. When I said “some rationality ideas”, I meant concepts that have been discussed here on LessWrong before, like Eliezer’s Ends Don’t Justify Means (Among Humans) post and Paul Christiano’s Integrity for Consequentialists post, among other things. The above group organiser doesn’t have to agree with those posts, but in this case I found it surprising that they just hadn’t been exposed to the idea of running on corrupted hardware, and certainly hadn’t reflected on it and related ideas that seem pretty crucial to me.
My own view is that in our world, basically every time a smart person, even a well-meaning smart EA (like myself :p), does the rough calculations and they come out in favour of lying where a typical honest person wouldn’t, breaking promises, or committing an act that hurts a lot of people in the short term for the “greater good”, their calculations are almost certainly misguided, and they should aim for honesty and integrity instead.
It’s called utilitarianism!