First, this assumes total utilitarianism. While I don't fully endorse any kind of utilitarianism, average utilitarianism seems more appropriate for this purpose IMO (i.e., it reflects our intrinsic preferences better). I want the world at large to be nicer, not to contain as many minds as possible. I doubt anyone cares much whether there are one zillion or two zillion minds out there; such numbers don't mean much to a person. (And, no, I don't think that's a "bias".) And it seems quite plausible that factory-farmed lives are below average. Moreover, the close association of factory farming with human civilization makes the situation worse, because the average is actually weighted by some kind of "distance" (see the sketch below). To put it simply, factory farming is an ugly, incredibly cruel thing, and I don't want it to exist, much less to exist anywhere in my "vicinity".
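To spell out the "weighted by distance" idea, here is a minimal sketch; the exponential kernel and the distance function are illustrative assumptions of mine, not a worked-out theory:

$$U = \frac{\sum_i w(d_i)\, u_i}{\sum_i w(d_i)}, \qquad w(d) = e^{-d/\lambda}$$

where $u_i$ is the welfare of mind $i$, $d_i$ is some measure of that mind's "distance" from the evaluator (causal, social, or spatial), and $\lambda$ sets how quickly concern decays. Under such a measure, below-average lives created close to human civilization (small $d_i$, hence large weight) drag $U$ down far more than the same lives would if they were remote.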
Second, I don't understand the statement "EA is generally about optimizing your positive impact on the world, not about purifying your personal actions of any possible negative impact." I'm guessing you're using a model where a person has some limited number of "spoons" for altruistic deeds, so spending spoons on veganism takes them away from other things. This does seem to be a popular model in EA, but I also think it's entirely fake. The reality is that we do a limited number of altruistic deeds because we are just not that altruistic.
If judged by intrinsic preferences alone, the tradeoff between selfish and altruistic preferences is plausibly such that going vegan is not worth it individually but is worth it as a society (a sketch of this structure follows below). The reason people go vegan anyway is probably signaling (i.e., reputational gain). And signaling is a good thing! Signaling is the only tool we have for overcoming tragedies of the commons like this one. The role of EA should be, IMO, precisely to create norms that incentivize behavior which makes the world better. Hence, I want EA to award reputation points for veganism.
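To make the commons structure explicit, here is a minimal worked example; the payoff variables $c$, $a$, $N$, and $r$ are assumptions introduced purely for illustration. Suppose going vegan costs an individual $c$ in selfish terms, each person values the resulting reduction in animal suffering at $a$, and there are $N$ people in society. Going vegan unilaterally nets an individual

$$a - c < 0 \quad \text{when } a < c,$$

so no one does it alone; but if everyone goes vegan, each person gains

$$N a - c > 0 \quad \text{when } a > c/N.$$

So for any $a$ in the range $c/N < a < c$, universal veganism makes everyone better off, yet no individual will unilaterally adopt it: the classic tragedy-of-the-commons structure. A reputational reward $r > c - a$ attached to going vegan flips the individual calculation, which is exactly what awarding "reputation points" would accomplish.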