If X doesn’t offend you, why would you self-modify so that X offends you, just to stop people from doing X, when X doesn’t offend you in the first place?
It’s a Schellingian idea: in conflict situations, it is often a rational strategy to pre-commit to act irrationally (i.e. without regard to cost and benefit) unless the opponent yields. The idea in this case is that I’ll self-modify to care about X far more than I initially do, and thus pre-commit to lash out if anyone does X.
If we have a dispute and I credibly signal that I’m going to flip out and create drama out of all proportion to the issue at stake, you’re faced with a choice between conceding to my demands or getting into an unpleasant situation that will cost more than the matter of dispute is worth. I’m sure you can think of many examples where people successfully get the upper hand in disputes using this strategy. The only way to disincentivize such behavior is to pre-commit credibly to be defiant in the face of threats of drama. In contrast, if you act like a (naive) utilitarian, you are exceptionally vulnerable to this strategy, since I don’t even need drama to get what I want if I can self-modify to care tremendously about every single thing I want. (Which I won’t do if I’m a good naive utilitarian myself, but the whole point is that it’s not a stable strategy.)
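As a toy illustration of that trade-off (the stake and drama-cost numbers below are made up for the example, not taken from the discussion), here is the choice facing a naive utilitarian once the threat of drama is credible:

```python
# Toy payoffs for the target of a credible "I'll flip out" threat.
# STAKE and DRAMA_COST are invented illustrative numbers.

STAKE = 1        # what the disputed issue is actually worth
DRAMA_COST = 10  # what the threatened blow-up would cost the target

def naive_utilitarian_response(threat_is_credible: bool) -> str:
    """Compare the cost of conceding with the cost of weathering the drama."""
    if threat_is_credible and DRAMA_COST > STAKE:
        return "concede"    # cheaper to give in than to endure the blow-up
    return "stand firm"     # nothing credible to buy off, so keep the stake

print(naive_utilitarian_response(threat_is_credible=True))   # concede
print(naive_utilitarian_response(threat_is_credible=False))  # stand firm
```

The counter-strategy described above amounts to making “stand firm” your answer regardless of the threat, so that manufacturing credible threats stops being profitable in the first place.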
Now, the key point is that such behavior is usually not consciously manipulative and calculated. On the contrary: someone flipping out and creating drama for a seemingly trivial reason is likely to be in honest-to-God severe distress, feeling genuine pain of offense and injustice. This is a common pattern in human social behavior: humans are extremely good at detecting faked emotions and conscious manipulation, and as a result, we have evolved so that our brains lash out with honest, strong emotion that is nevertheless directed by some module that performs a game-theoretic assessment of the situation. This of course prompts strategic responses from others, leading to a strategic arms race without end.
The further crucial point is that these game-theoretic calculators in our brains are usually smart enough to assess whether the flipping out strategy is likely to be successful, given what might be expected in response. Basically, it is a part of the human brain that responds to rational incentives even though it’s not under the control of the conscious mind. With this in mind, you can resolve the seeming contradiction between the sincerity of the pain of offense and the fact that it responds to rational incentives.
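A rough way to state that decision rule (the probability and payoff numbers here are invented placeholders, not anything claimed in the discussion) is that the subconscious calculator triggers the sincere emotion only when flipping out is expected to pay:

```python
# Sketch of the subconscious "calculator": trigger offense only when it is expected to pay.
# p_concede, stake, and drama_cost are invented placeholder values.

def expected_value_of_flipping_out(p_concede: float, stake: float, drama_cost: float) -> float:
    # If the other side concedes I gain the stake; otherwise I eat the cost of the drama.
    return p_concede * stake - (1 - p_concede) * drama_cost

def brain_triggers_offense(p_concede: float, stake: float = 1.0, drama_cost: float = 10.0) -> bool:
    # The felt emotion is sincere; only the trigger tracks the incentives.
    return expected_value_of_flipping_out(p_concede, stake, drama_cost) > 0

print(brain_triggers_offense(p_concede=0.95))  # True: a target expected to yield invites offense
print(brain_triggers_offense(p_concede=0.50))  # False: a credibly defiant target does not
```

Nothing in this sketch requires the calculation to be conscious; it only requires that the emotional trigger be sensitive to the expected response.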
All this is somewhat complicated when we consider issues of group conflict rather than individual conflict, but the same basic principles apply.
Do you have strategies for distinguishing game-theoretic exaggeration of offense from natural offense?
The question is better phrased by asking what the practical consequences will be of treating an offense as legitimate and ceasing the offending action (and perhaps also apologizing) versus treating it as illegitimate and standing your ground (and perhaps even escalating). Clearly, this is a difficult question of great practical value in life, and like every such question, it admits no simple and universally applicable answer. (And of course, even if you know the answer in some concrete situation, you’ll need extraordinary composure and self-control to apply it when it’s contrary to your instinctive reaction.)
I don’t see the distinction you’re trying to make.
Tentatively: game-theoretic exaggeration of offense will simply be followed by more and more demands, whereas natural offense is about a desire that can be satiated.
However, there’s another sort of breakdown of negotiations that just occurred to me. If A asks for less than they want, because they think that’s all they can get and/or because they’re trying to do a utilitarian calculation, they aren’t going to be happy even if they get it. That means they’re likely to keep pushing for more, at which point they start looking like a utility monster.
What do you mean by “satiated”?
From a utilitarian/consequentialist point of view, a desire being “satiated” simply means that the marginal utility gains from pursuing it further are less than the opportunity cost of however much effort it takes.
Note that by this definition, whether a desire is satiated depends on how easy it is to pursue.
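To make that concrete, here is a minimal sketch (the utility curve and effort costs are invented for illustration) of the satiation point as the place where marginal utility drops below the cost of further pursuit:

```python
# Minimal sketch of "satiation" as marginal utility falling below effort cost.
# The utility curve and the cost-per-unit values are invented for illustration.

def marginal_utility(units_already_consumed: int) -> float:
    """Diminishing returns: each additional unit is worth less than the last."""
    return 10.0 / (1 + units_already_consumed)

def satiation_point(effort_cost_per_unit: float) -> int:
    """Keep pursuing the desire while the next unit is still worth its cost."""
    units = 0
    while marginal_utility(units) > effort_cost_per_unit:
        units += 1
    return units

# The same desire is satiated sooner or later depending on how costly pursuit is.
print(satiation_point(effort_cost_per_unit=5.0))  # 1
print(satiation_point(effort_cost_per_unit=0.5))  # 19
```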
If you’re hungry, you might feel as though you could just keep eating and eating. However, if enough food is available, you’ll eventually hit a point where more food would make you feel worse instead of better. You’ll get hungry again, but part of the cycle includes satiation. For purposes of discussion, I’m talking about most people here, not those with eating disorders or unusual metabolisms that affect their ability to feel satiety.
I think most people have a limit on their desire for status, though that might be more like the situation you describe: few would turn down a chance to be the world’s Dictator for Life, but most have hit a point where trying for more status than they’ve got seems like too much trouble.