Backing up a step, because I suspect we're working from different knowledge and assumptions (mostly my failing) about the difference between “hard” and “soft” optimizing.
I should acknowledge that I'm not particularly invested in EA as a community or identity. I try to be effective and do some good, but I'm exploring here rather than advocating.
Also, I don’t tend to frame things as “how to care”, so much as “how to model the effects of actions, and how to use those models to choose how to act”. I suspect that’s isomorphic to how you’re using “how to care”, but I’m not sure of that.
All that said, I think of “optimizing hard” as truly taking the “shut up and multiply” results seriously, even where that's epistemically uncomfortable, because that's the only way to do the most possible good: actually optimizing, you know? “Soft” optimizing is almost by definition less ambitious, because it's epistemically more conservative: it gives up expected value in order to increase the modal (most likely) goodness of outcomes in the face of that uncertainty. I don't actually know whether those are the positions those people actually hold. I'd love to hear different definitions of “hard” and “soft”, so I can better understand the claim that they're equal in impact.
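To make that trade-off concrete, here's a minimal sketch in Python with entirely made-up numbers (both options and all probabilities and utilities are hypothetical illustrations, not anyone's actual estimates): a “hard” optimizer ranks options by expected value, while a “soft” optimizer ranks them by the utility of their most likely outcome, and the two can pick different options.

```python
# Toy illustration with made-up numbers: "hard" optimizing maximizes
# expected value; "soft" optimizing maximizes the modal outcome.

# Each option is a list of (probability, utility) outcomes. Both options
# are hypothetical, chosen only to make the two rankings disagree.
options = {
    "long-shot": [(0.01, 10_000), (0.99, 0)],  # high-variance bet
    "sure-thing": [(1.00, 50)],                # safe bet
}

def expected_value(outcomes):
    """Probability-weighted average utility: 'shut up and multiply'."""
    return sum(p * u for p, u in outcomes)

def modal_value(outcomes):
    """Utility of the single most probable outcome."""
    return max(outcomes, key=lambda pu: pu[0])[1]

hard_pick = max(options, key=lambda name: expected_value(options[name]))
soft_pick = max(options, key=lambda name: modal_value(options[name]))

print(hard_pick)  # "long-shot": EV of 100 beats EV of 50
print(soft_pick)  # "sure-thing": modal utility 50 beats modal utility 0
```

The point of the sketch is just that “soft” isn't confused about the arithmetic; it's deliberately paying some expected value to buy a more predictable outcome.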