I don’t think the answer is super mysterious; a lot of people are in the field for the fuzzies, and it weirds them out that there are some weirdos who seem to be in the field but missing “heart”.
It is definitely a serious problem because it gates a lot of resources that could otherwise come to EA, but I think this might be a case where the cure could be worse than the disease if we’re not careful—how much funding needs to be dangled before you’re willing to risk EA’s assimilation into the current nonprofit industrial complex?
I think being in it for the fuzzies is in some ways actually pretty important to effectiveness, and bridging these viewpoints would unlock more effective reasoning patterns. Of course, don’t give up on effectiveness—but the majority of altruists and solidarity-seekers in the world are motivated by fuzzies or anger at injustice, and I don’t think that’s actually bad. Seeing it as bad strikes me as a very negative consequence of the current shallow-thought version of the effectiveness mindset. Finding the approaches-to-thinking that can combine their benefits reliably seems exciting for a number of reasons, most centrally that it would be compatible with both memeplexes and thereby allow the coordination groups to merge without borging each other. EA has only been around for a few years, but it’s already had some very serious negative impacts under its brand, which I think is in fact a result of having cold calculations at the core. Hmm. “Finally: warm calculations.”