In gist, if your ingroup does things that harm others, you are likely to subsequently shift your moral attitudes away from principles that tell you that harming others is wrong, and towards principles that value loyalty and obedience.
A quote from near the end:
Although we conceive of morality shifting as motivated by the need to protect one’s identity, and thus as a beneficial mechanism to the individual, we expect it to have much more negative consequences for intergroup relations and for society at large. It can give more leeway in the mistreatment of outgroup members, or lead to their exclusion from the scope of justice (Opotow, 1990), reducing the chance of seeing such mistreatment as violating principles of harm and fairness. Morality shifting can thus be seen as a mechanism that allows people to make a virtue of evil (see Reicher, Haslam, & Rath, 2008). Once the shift occurs, further actions are even more likely to be interpreted from a loyalty/authority perspective rather than from a harm/fairness perspective.
This seems like it may be part of the cult attractor, and it is also a good reason to keep your identity small: it effectively means that your ingroup doing harmful things can act as a murder pill for you.
Maybe this is how “being a member of a group which slowly shifts towards evil” feels from the inside: increasingly realizing the importance of loyalty, and that fairness is not as important as it once seemed.
So when you notice yourself thinking: “well, technically this is not completely fair, but our group is good and we do many good things, so in the long term I can do more good by sticking with my group than by needlessly opposing it on a minor issue”, you have evidence of your group becoming just a little bit more evil.
(To be precise, “a little bit more evil” can still be predominantly good, and can still be your best available choice. It’s just good to notice this feeling, especially if it starts happening rather frequently.)
In gist, if your ingroup does things that harm others, you are likely to subsequently shift your moral attitudes away from principles that tell you that harming others is wrong, and towards principles that value loyalty and obedience.
A more generalized version of this would read: “if your ingroup does [x], you are likely to subsequently shift your moral attitudes away from principles that tell you that [x is bad], and towards principles that [tell you that x is good as long as it’s your ingroup doing it].” The chapters from Cialdini’s Influence on social proof and identity self-modification seem relevant.
Via Reddit: Morality shifting in the context of intergroup violence.
Gah… just when I wasn’t terrified of politics anymore...