How self-aware was the group organizer about being in both camps?
All this to say: I think some rationality ideas I consider pretty crucial for EA uni group organisers to be exposed to are not getting the reach they should.
It might be that they are being rational about maximizing utility. It can be useful for someone who is okay with fraud to publicly cultivate the impression that they aren’t.
You would expect that people who are okay with fraud are also okay with creating the false impression that they are not okay with fraud.
You’re right. When I said some rationality ideas, I meant concepts that have been discussed here on LessWrong before, like Eliezer’s Ends Don’t Justify Means (Among Humans) post and Paul Christiano’s Integrity for Consequentialists post, among other things. The above group organiser doesn’t have to agree with those posts, but in this case I found it surprising that they simply hadn’t been exposed to the ideas around running on corrupted hardware, and certainly hadn’t reflected on them or on related ideas that seem pretty crucial to me.
My own view is that in our world, basically every time a smart person, even a well-meaning smart EA (like myself :p), runs the rough calculations and they come out in favour of lying where a typical honest person wouldn’t, of breaking promises, or of committing an act that hurts a lot of people in the short term for the “greater good”, those calculations are almost certainly misguided, and they should aim for honesty and integrity instead.