Effective Altruists & Consequentialists tend to be vain with plausible deniability, always making a show of their beliefs, coming into the room loudly and attracting attention, always repeating “effectiveness”, “consequences”. It gets annoying. I wish some would have taste.
It sounds as if you’re complaining about something someone’s written in this thread, but I’m having trouble working out what (and what you dislike about it, other than maybe a more general grievance against consequentialism or EAism). Would you care to clarify?
On the face of it your complaint is that EAs are attention-seeking and try to hijack other discussions onto their favoured hobby-horse. But I don’t see that that happened here. helldalgo mentioned a common criticism of Zuckerberg’s recent actions and disagreed with it, no part of which seems unreasonable; LessWrong introduced the topic of EA but doesn’t identify as an EA so nothing s/he wrote can possibly be an example of what you describe; I corrected what looked to me like a wrong statement about EAs, which seems like an obviously unobjectionable thing to do.
What am I missing?