I think the utility company example is fine. Lots of biases can be described as resulting from the use of a pretty good heuristic which leads people astray in that particular case, but that’s still a cost of imperfect thinking. And this was a case where the alternative to the status quo was relatively simple—it was defined precisely and differed on only a small number of easily understandable dimensions—so concerns about swindles, unintended consequences, or limited understanding of complex changes shouldn’t play a big role here.
In real life, social processes might eventually overcome the status quo bias, but there’s still a lot of waste in the interim which the clever (aka more rational) minority would be able to avoid. Actually, in this case the change to utility coverage would probably have to be made for a whole neighborhood at once, so I don’t think that your model of social change would work.
I’d say the utility company example is, in an important sense, the mirror image of the Albanian example. In both cases, we have someone approaching the common folk with a certain air of authority and offering some sort of deal that’s supposed to sound great. In the first case, people reject a favorable deal (though only in the hypothetical) due to the status quo bias, and in the second case, people enthusiastically embrace what turns out to be a pernicious scam. At least superficially, this seems like the same kind of bias, only pointed in opposite directions.
Now, while I can think of situations where the status quo bias has been disastrous for some people, and even situations where this bias might lead to great disasters and existential risks, I’d say that in the huge majority of situations, the reluctance to embrace changes that are supposed to improve what already works tolerably well is an important force that prevents people from falling for various sorts of potentially disastrous scams like those that happened in Albania. This is probably even more true when it comes to the mass appeal of radical politics. Yes, it would be great if people’s intellects were powerful and unbiased enough to analyze every idea with pristine objectivity and crystalline analytical clarity, but since humans are what they are, I’m much happier if it’s harder to convince them to change things that are already functioning adequately.
Therefore, I’m inclined to believe that a considerable dose of status quo bias is optimal from a purely consequentialist perspective. Situations where the status quo bias is gravely dangerous are far from nonexistent, but still exceptional, whereas when it comes to the opposite sort of danger, every human society is sitting on a powder keg all of the time.