Right. What you are saying is related to the notion of “credible threats”. If other agents can give you disutility with little disutility for themselves, then they have a credible threat against you. And unless you either change your utility function, or find a way of making it much more difficult and costly for them to harm you, the rational course is to give in to the extortion.
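The payoff logic behind a credible threat can be sketched as follows. The numbers and function names here are illustrative assumptions, not anything from the discussion above; the point is only that when inflicting harm is cheap for the threatener and the harm exceeds the demand, giving in is the victim's best response.

```python
# Illustrative sketch of the credible-threat logic (all numbers are
# assumptions chosen for the example).

def threat_is_credible(cost_to_execute: float, gain_if_paid: float) -> bool:
    """A threat is credible when carrying it out costs the threatener
    little relative to what they stand to gain."""
    return cost_to_execute < gain_if_paid

def victim_best_response(demand: float, harm: float) -> str:
    """Facing a credible threat, the victim pays iff the demand is
    cheaper than suffering the harm."""
    return "pay" if demand < harm else "refuse"

# Example: the threatener can inflict 100 units of disutility at a cost
# of only 1 to themselves, while demanding a payment worth 10.
print(threat_is_credible(cost_to_execute=1, gain_if_paid=10))   # True
print(victim_best_response(demand=10, harm=100))                # pay
```

Changing your utility function (so the "harm" is no longer large) or joining a retaliating coalition (so the threatener's cost of execution becomes large) both work by flipping one of these two inequalities.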
One way to make it costly for others to harm you is to join a large coalition which threatens massive retaliation against anyone practicing extortion against coalition members. But notice that if you join such a coalition, you must be willing to bear your share of the burden should such retaliation be necessary.
The alternative I suggested in the grandparent was to change your utility function so as to make you less vulnerable—only care about things you have control over. Unfortunately, this is advice that may be impossible to carry out. Preferences, as several commentators here have pointed out, tend to be incorrigible.
I took the obvious solution to that difficulty: I self-modified into an agent that behaves exactly as if he had self-modified to have preferences that make him less vulnerable. That is a coherent configuration for my atoms, in terms of physics, and one that benefits me.