Another problem is that he doesn’t account for the positive (less evil) effect of her donations as a reason not to hire her. EE would only hire her if the value she provides in service of their goals exceeds the disvalue of her donations by at least as much as the next available candidate’s net contribution. Likewise, she would only work for them if the value of her donations to altruism exceeds the disvalue of her service to EE by at least as much as if she took a job at a normal organization. If good and evil are measured on the same scale, so that their goals are exact opposites, there’s no way her employment at EE is a +EV proposition for both of them.
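To put the zero-sum version formally (a sketch; the variable $w$ is my own bookkeeping device, not anything from the original post): let $w$ be the net change in how good the world is if she takes the EE job rather than her best alternative, counting her work for EE, her donations, and the candidate EE would otherwise have hired. Then

$$\text{EE prefers to hire her} \iff w < 0, \qquad \text{she prefers to accept} \iff w > 0,$$

so if their goals really are exact opposites on one scale, any deal that goes through means at least one of them has misjudged $w$.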
Yeah, if it’s a net goal, then they can’t both be right. But strictly speaking he never says he wants to make the world worse on net. She says she wants to change the world for the better, but he just says he wants to change the world, period. They could, on deontological or virtue-ethics grounds, value the skillful doing of evil and the changing of the world from what it would otherwise have been, which is completely consistent with unleashing giant mice to rampage through NYC even as malaria is cured by donations from guilty employees. Everyone gets what they want: Effective Evil gets to do evil while changing the world (all those skyscrapers ruined by overgrown rodents are certainly evil, and a visible change in the world), and the employees know they offset the evil with good elsewhere while keeping a handsome salary for themselves.
They could also easily just desire different things (“have different utility functions”). This is the basis for gains from trade, and, more germane to this example, political parties.
If Effective Evil thinks the most efficient way to do evil is assaulting people’s eyeballs with dust specks, and I think the most effective way to do evil would be increasing torture, I can take the money they give me to engineer aeroplane dust-distribution technology for them while spending my salary to reduce torture. If they think 1,000 specks equals 1 minute of torture, but I think $10^{10}$ specks equals 1 minute of torture, there is wide latitude for us to make a trade where I reduce 10 minutes of torture in expectation and they get more than 10,000 specks-in-eyes. Their conception of evil is maximized, and mine is minimized.
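To spell out that arithmetic (a sketch; the transfer of $10^6$ specks is a number I’m choosing for illustration, not one from the example above): preventing 10 minutes of torture costs EE $10 \times 1000 = 10{,}000$ speck-equivalents of evil at their exchange rate, while at mine those same $10^6$ specks are worth only $10^6/10^{10} = 10^{-4}$ torture-minutes. So

$$\underbrace{10^6 - 10 \times 1000}_{\text{EE's net evil, in specks}} = 990{,}000 > 0, \qquad \underbrace{10 - \tfrac{10^6}{10^{10}}}_{\text{my net good, in torture-minutes}} \approx 10 > 0.$$

Both ledgers come out ahead by their owners’ own accounting, which is exactly the gains-from-trade point from the previous comment.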