This is why I prefer to frame EA as something exciting, not burdensome.
Exciting vs. burdensome seems to be a matter of how you think about success and failure. If you think “we can actually make things better!”, it’s exciting. If you think “if you haven’t succeeded immediately, it’s all your fault”, it’s burdensome.
This just might have more general application.
If I’m working at my capacity, I don’t see how it’s my fault for not having the world fixed immediately. I can’t do any more than I can do, and I don’t see how I’m responsible for more than what my efforts could change.
From my perspective, it’s “I have to think about all the problems in the world and care about them.” That’s burdensome. So instead I look vaguely around for 100% solutions to these problems, things where I don’t actually need to think about people currently suffering (as I would in order to determine how effective incremental solutions are), things sufficiently nebulous and far in the future that I don’t have to worry about connecting them to people starving in distant lands.
Do we have any data on which EA pitches tend to be most effective?
I’ve read that. It’s definitely been the best argument for convincing me to try EA that I’ve encountered. Not convincing, currently, but more convincing than anything else.