How much should I worry about the unilateralist’s curse when making arguments that other people seem likely to have already thought of, and may have avoided making because they anticipated side effects I don’t understand?
In most domains, people refrain from making arguments either because they think the arguments aren’t strong or because making them would cost them social status.
The cases where an argument carries real danger are relatively rare, and in most of those cases it should be possible to tell that you are in a problematic area. In those cases, make the arguments first in private, with people you consider good judges of whether they should be made publicly.
Adding to your first point: or they don’t make arguments simply because, even when the arguments are strong and there are no social costs, it does not pay to make them.
(I’m thinking of certain policy debates where I know plenty of academics who could easily supply very strong, rather obvious arguments that essentially go unmade, because none of them cares to get involved.)
Here is a stylized case of the kind of situation where the question arises, and where my gut feeling tells me it may often be better to release the argument than to hide it, for the sake of long-term social cohesion and advancement:
You’re part of an intellectual elite with its own values and biases, and you consider withholding a sensible argument (say, on a political topic) because commoners, given their different values and biases, might act on it in a way that runs counter to your agenda. So you would likely not release the argument.
In the long run this can backfire. The only way for society to advance is by reducing the gap between the elite and commoners. Commoners notice when they are regularly fed biased information by the elite; and the less seriously the elite engages with them, the less they will trust it, and the harder it becomes to draw them into more nuanced ways of thinking.
In short, in this stylized case: intellectual honesty, even at the risk of immediate harm to your values, is likely enough to pay off in the long term. Raising the level of the discussion by bringing up rational arguments for both ‘sides’ matters, especially in democratic systems, where you ultimately rely on a common understanding of the world anyway.
Maybe this does not generalize well and my hunch is wrong: in the long run we’re all dead, and the level of public discussion is often so low that adding an argument in one direction just becomes ammunition to be exploited, without much impact along other dimensions.