I also agree with MichaelVassar, I think much religious harm comes from using abnormally explicit reasoning.
This is because (I hypothesize that) great moral failures come about when a group of people (often, a religion, but any ideological group) think they’ve hit upon an absolute “truth” and then expect they can apply this truth to wholly develop an ethical code. The evil comes in when they mistakenly think that morality can be described by some set of universal and self-consistent principles, and they apply a principle valid in one context to another with disastrous results. When they apply the principle to the inappropriate domain, they should feel a twinge of conscience, but they override this twinge with their reason—they believe in this original principle, and it deduces this thing here, which is correct, so that thing over there that it also deduces must also be correct. In the end, they use reason to override their natural human morality.
The Nazis are the main example I have in mind, but to take a less painful case, the Catholic Church is another instance of over-extending principles through reasoning. Valuing human life and general societal openness to procreation are good values, but insisting that women not use condoms amidst an AIDS epidemic is requiring too much consistency of moral principles.
(Though apparently, I agree even more with user:cousin_it that it is the result of putting ideals of any kind over instinct. It’s just that in some cases, the ideal is insisting on consistent, universal moral principles, which religions are fond of doing.)
Here I would guess that you’re underestimating the influence of (evolutionarily conditioned) straightforwardly base motivations: cf. the Milgram and Stanford Prison experiments. I recently ran across this fascinating essay by Ron Jones on his experience running an experiment called “The Third Wave” in his high school class. I would guess that the motivation that he describes (of feeling superior to others) played a significantly larger role than abnormally explicit reasoning in the case of the Nazi regime; that (the appearance of?) abnormally explicit reasoning was a result of this underlying motivation rather than the cause.
There may be an issue generalizing from one example here; what you’re describing sounds to me closer to why a LW poster might have become a Nazi during Nazi times than why a typical person might have become a Nazi during Nazi times. On the other hand, I find it likely that the originators of the underlying ideas (“Aryan” nationalism, communism, Catholic doctrines) used explicit reasoning more often than the typical person does in coming to their conclusions.
I recently ran across this fascinating essay by Ron Jones
It really is fascinating. But I don’t believe him. I don’t believe it was ‘kept secret’ and this is most likely some kind of delusion he experienced. (A very small experiment of this kind might make him feel so guilty that the size of the project grew in his mind.) For example, I believe I would have felt the same way as his students, but I’m certain I would not have kept it secret.
Also, I’m confused about his statement
You are no better or worse than the German Nazis we have been studying.
That seems rather ridiculous. Being sent to the library for not wanting to participate in an assignment isn’t beyond the pale.
However, something just clicked in my mind and I realized an evil that we do as a society that we allow, because we sanction it as a community. So, yes, I see now how people can go along with something that their conscience should naturally fight against.
It really is fascinating. But I don’t believe him.
I agree that there are reasons to question the accuracy of Ron Jones’ account.
Also, I’m confused about his statement
You are no better or worse than the German Nazis we have been studying.
Being sent to the library for not wanting to participate in an assignment isn’t beyond the pale.
I think that Jones was not suggesting that the consequences of the students’ actions are comparable to the consequences of Nazis’ actions but rather was claiming that the same tendencies that led the Germans to behave as they did were present in his own students.
This may not literally be true; it’s possible that the early childhood development environment in 1950s Palo Alto was sufficiently different from the environmental factors in early-1900s Germany that the students did not have the same underlying tendencies that the Nazi Germans did, but it’s difficult to tell one way or the other.
However, something just clicked in my mind and I realized an evil that we do as a society that we allow, because we sanction it as a community. So, yes, I see now how people can go along with something that their conscience should naturally fight against.
Right, this is what I was getting at. I think that there are several interrelated things going on here:
• High self-esteem coming from feeling that one is on the right side.
• Desire for acceptance / fear of rejection by one’s peers.
• Desire to reap material and other goods from the oppressed party.
with each point being experienced only on a semi-conscious level.
In the case of the Catholic Church presumably only the first two points are operative.
Of course empathy is mixed in there as well; but it may play a negligible role relative to the other factors on the table.
I have a question regarding the Milgram experiment. Were the teachers under the impression that the learners were continuing to supply answers voluntarily?
Teachers were instructed to treat silence as an incorrect answer and apply the next shock level to the student.
I imagine—perhaps erroneously—that I would have tried to obtain the verbal agreement of the learner before continuing. But this is partly because I know that continuous subject consent is required, whereas this might not have been generally known or true in the early 60s.
Of course, I do see the pattern that this is probably such a case where everyone wants to rate themselves as above average (but they couldn’t possibly all be). Still, I will humor my hero-bone by checking out the book and reading about the heroic exceptions, since those must be interesting.
Don’t know the answer to your question; now that I look at the Wikipedia page I realize that I should only have referred to the Zimbardo Stanford Prison Experiment (the phenomenon in the Milgram experiment is not what I had in mind).
Thanks for your feedback.
Add in desire for something more interesting than school usually is.
The learner was perceived to initially agree to the experiment, but among the recordings of programmed resistance was one demanding to be let out.
Ah, that sentence also helped my understanding.