Not to be rude, but this article is terminally confused. The principles of rationality do not tell you what your values should be; rather, they guide you in achieving whatever your values actually are. The principles and processes of rational thought and action are the same for all of us, but they lead to different prescriptions for different people. What is a rational action for me is not always a rational action for you, and vice versa, not only because our circumstances are different (and hence we will get different results) but because our values are different.
So, no, risk aversion is not a bias; it is a property of one’s utility function.
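This point has a standard formal illustration (my own sketch, not from the thread): an agent whose utility function is concave in wealth will prefer a sure payoff over a gamble with the same expected money, with no separate "bias" term anywhere in the model. The square-root utility below is an arbitrary example of a concave function.

```python
import math

def utility(wealth):
    # Concave utility: diminishing marginal value of each extra dollar.
    return math.sqrt(wealth)

# Gamble: 50% chance of 0, 50% chance of 100 (expected money: 50).
expected_utility_of_gamble = 0.5 * utility(0) + 0.5 * utility(100)  # 5.0

# Sure thing: 50 with certainty.
utility_of_sure_thing = utility(50)  # ~7.07

# The agent prefers the certain 50 over the gamble with identical
# expected money, i.e. it is risk averse purely as a consequence of
# the shape of its utility function.
assert utility_of_sure_thing > expected_utility_of_gamble
```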
A desire for justice is a bias only if it does not accord with your actual terminal values. If I spend some of my limited time and effort to see that a criminal who has harmed me gets punished, or to see that an innocent person goes free, that is not irrational—I derive some measure of satisfaction from these. On the other hand, if I obsessively pursue a criminal who has harmed me, to the point that I lose my job, use up all of my wealth, and alienate my family, then I have acted irrationally: putting the guy in jail won’t give me that much satisfaction, not enough to compensate for all those losses.
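The comparison being made here is just utility arithmetic; with made-up numbers (purely illustrative, not from the comment), the moderate and obsessive cases come apart like this:

```python
# Hypothetical utilities on an arbitrary scale.
satisfaction_from_justice = 10

# Moderate pursuit: some limited time and effort spent.
moderate_cost = 3
assert satisfaction_from_justice - moderate_cost > 0   # net gain: rational

# Obsessive pursuit: job, savings, and family relationships lost.
obsessive_cost = 50
assert satisfaction_from_justice - obsessive_cost < 0  # net loss: irrational
```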
The feeling that you are responsible for some things and not others, like say, the safety of your family, but not people being tortured in Syria, seems noble and practical. But I take it to be a bias.
And who are you to dictate my utility function? I value the welfare of my own family much more than that of a stranger in Syria; that’s a simple fact about my own utility function. If I had to choose between saving my daughter from being raped and saving ten Syrian men from being tortured, I would choose my daughter, without hesitation, and without apology. It would be quite irrational for me to choose to save the Syrian men instead, out of some misguided notion that my utility function ought to be something other than what it actually is.
What is a rational action for me is not always a rational action for you, and vice versa, not only because our circumstances are different (and hence we will get different results) but because our values are different.
Are they really? Our understandings are certainly different, but that can change.
So, no, risk aversion is not a bias; it is a property of one’s utility function.
it’s a bias if it keeps you from achieving your other values.
A desire for justice is a bias only if it does not accord with your actual terminal values. If I spend some of my limited time and effort to see that a criminal who has harmed me gets punished, or to see that an innocent person goes free, that is not irrational—I derive some measure of satisfaction from these. On the other hand, if I obsessively pursue a criminal who has harmed me, to the point that I lose my job, use up all of my wealth, and alienate my family, then I have acted irrationally: putting the guy in jail won’t give me that much satisfaction, not enough to compensate for all those losses.
Hmm. Nothing I can say to that except that I think a lot of things aren’t worth doing for the satisfaction of punishment, like, for example, ruining someone’s life at a cost of hundreds of thousands of dollars.
And who are you to dictate my utility function? I value the welfare of my own family much more than that of a stranger in Syria; that’s a simple fact about my own utility function. If I had to choose between saving my daughter from being raped and saving ten Syrian men from being tortured, I would choose my daughter, without hesitation, and without apology. It would be quite irrational for me to choose to save the Syrian men instead, out of some misguided notion that my utility function ought to be something other than what it actually is.
That’s a good point: people you have a connection to carry much greater weight in your utility function.
it’s a bias if it keeps you from achieving your other values.
This is nonsense. I have a desire for ice cream, but also a desire to stick to my diet and lose weight. Oh no, my desire to stick to my diet is preventing me from achieving my desire for ice cream, it must be a bias!
Agreed, but nyan_sandwich touches on an interesting point.
Certainly, there are lots of situations where I have one set of cognitive structures that encourage me to behave one way (say, eating ice cream, or experimenting to discover what’s actually true about my environment, or whatever) and a different set encouraging me to behave a different way (say, avoiding empty calories, or having high confidence in what I was taught as a child, or whatever).
It seems to me that when I call one of those structures a “bias” I’m suggesting two things: first, that I don’t endorse it, and second, that it’s relatively broad in applicability.
But that in turn suggests that I can eliminate a bias simply by endorsing its conclusions, which is... not uncontroversial.
If they conflict, one or the other is currently a ‘bias’. You get to decide which one you like more.
Is eating ice-cream more important than your desire to stay healthy? Then you must overcome your desire to stay healthy.
Is staying healthy more important than eating ice-cream? Then you must overcome the desire to eat ice-cream.
‘Bias’ is a fuzzy category referring to the corner of the conflict × value space where value is low and conflict with other values is high. Stretched all the way over to ice-cream and health, it starts to lose its meaning. Just talk about which one you want to overcome.
That’s really not how I would understand a bias. I would think of a bias as a feature of your psychology that distorts your decision-making process away from the rational; that is, optimal pursuit of your goals. The planning fallacy is a bias, having conflicting goals is just a feature of my utility function.