With empathy, it turns out that Germans were much more likely to empathize with other Germans than with Juden. With empathy, everyone was cheering as the witches burned.
This first required, basically, deciding that something which looks like a person is actually not one, and so is not worthy of empathy. That is not a trivial barrier to overcome. Without empathy to start with, burning witches is much easier.
Moral progress is the progress of knowledge.
This is a very… contentious statement. There are a lot of interesting implications.
All I’m saying is that whenever you have finally decided that you should make the world a better place, at that point emotional empathy is a bias that you should discard when choosing a course of action.
And that is what I’m strongly disagreeing with.
You are essentially saying that once you’ve decided on a course of action, you should turn yourself into a sociopath.
Sounds terrible! But, wait: “once you’ve decided on a course of action”. The main problem with sociopaths is that they do horrible things and do them very effectively, right? Someone who chooses what to do like a non-sociopath and then executes those plans like a sociopath may sound scary and creepy and all, but it’s not at all clear that it’s actually a bad idea.
(I am not convinced that Jacobian is actually arguing that you decide on a course of action and then turn yourself into a sociopath. But even that strawman version of what he’s saying is, I think, much less terrible than you obviously want readers to think it is.)
But, wait, once you’ve decided on a course of action.
You are misreading Jacobian. Let me quote (emphasis mine):
whenever you have finally decided that you should make the world a better place, at that point emotional empathy is a bias that you should discard when choosing a course of action.
but it’s not at all clear that it’s actually a bad idea.
Such people are commonly called “fanatics”.
Plausible guess, but actually my error was different: I hadn’t noticed the bit of Jacobian’s comment you quote there; I read what you wrote and made the mistake of assuming it was correct.
Those words “once you’ve decided on a course of action” were your words. I just quoted them. It does indeed appear that they don’t quite correspond to what Jacobian wrote, and I should have spotted that, but the original misrepresentation of Jacobian’s position was yours rather than mine.
(But I should make clear that you misrepresented Jacobian’s position by making it look less unreasonable and less easy for you to attack, so there’s something highly creditable about that.)
I am afraid I cannot claim here any particularly noble motives.
In Jacobian’s text there are, basically, two decision points: the first is deciding to do good, and the second is deciding on a course of action. You lose empathy in between them. There are (at least) two ways to interpret this. In one, when you decide to “do good” you make just a very generic decision to do some unspecified good; all the actual choices happen at the “course of action” point. In the other, at the first decision point you already decide what particular good you want to work towards, and the second decision point is just the details of implementation.
I didn’t want to start dissecting Jacobian’s post at this level of detail, so I basically simplified it by saying that you lose your empathy before making some (but not necessarily all) choices. I don’t know if you want to classify it as “technically incorrect” :-/
You still haven’t made a single argument in favor of emotional empathy, other than conflating lack of emotional empathy with, in order of appearance: Stalinism, Nazism, witch hunting, and fanaticism. None of this name-calling was supported by any evidence re: empathy.
The argument that I was making or, maybe, just implying is a version of the argument for deontological ethics. It rests on two lemmas: (1) You will make mistakes; (2) No one is a villain in his own story.
To unroll a bit, people who do large-scale evil do not go home to stroke a white cat and cackle at their own evilness. They think they are the good guys and that they do what’s necessary to achieve their good goals. We think they’re wrong, but that’s an outside view. As has been pointed out, the road to hell is never in need of repair.
Given this, it’s useful to have firebreaks, boundaries which serve to stop really determined people who think they’re doing good from doing too much evil. A major firebreak is emotional empathy—it serves as a check on runaway optimization processes which are, of course, subject to the Law of Unintended Consequences.
And, besides, I like humans more than I like optimization algorithms :-P
How about: doing evil (even inadvertently) requires coercion. Slavery, Nazis, tying a witch to a stake, you name it. Nothing effective altruists currently do is coercive (except to mosquitoes), so we’re probably good. However, if we come up with a world improvement plan that requires coercing somebody, we should A) hear their take on it and B) empathize with them for a bit. This isn’t a 100% perfect plan, but it seems to be a decent framework.
Some argument along these lines may work; but I don’t believe that doing evil requires coercion.
Suppose that for some reason I am filled with malice against you and wish to do you harm. Here are some things I can do that involve no coercion.
I know that you enjoy boating. I drill a small hole in your boat, and the next time you go out on the lake your boat sinks and you die.
I know that you are an alcoholic. I leave bottles of whisky around places you go, in the hope that it will inspire you to get drunk and get your life into a mess.
The law where we live is (as in many places) rather overstrict and I know that you—like almost everyone in the area—have committed a number of minor offences. I watch you carefully, make notes, and file a report with the police.
I get to know your wife, treat her really nicely, try to give her the impression that I have long been nursing a secret yearning for her. I hope that some day if your marriage hits an otherwise-navigable rocky patch, she will come to me for comfort and (entirely consensually) leave you for me.
I discover your political preferences and make a point of voting for candidates whose values and policies are opposed to them.
I put up posters near where you live, accusing you of horrible things that you haven’t in fact done.
I put up posters near where you live, accusing you of horrible things that you have in fact done.
None of these involves coercion unless you interpret that word very broadly. Several of them don’t, so far as I can see, involve coercion no matter how broadly you interpret it.
So if you want to be assured of not doing evil, you probably need more firebreaks besides “no coercion”.
I agree with gjm that evil does not necessarily require coercion. Contemplate, say, instigating a lynching.
The reason EAs don’t do any coercion is that they don’t have any power. But I don’t see anything in their line of reasoning that would stop them from coercing other people if they do get some power. They are not libertarians.