I would not consider it a downside. When society labels an idea “offensive,” it almost certainly biases people against that idea, so making people believe such ideas more would, on average, probably push them closer to the truth.
I’m not sure this is correct. Let us consider a toy model (emphasis on toy, which should be read as “baby-toy”):
There are three opinions: Center, Left and Right (C, L, R). Almost everyone believes C; L is deemed offensive by people on C and R, and likewise R by people on C and L. However, the true position on any issue is actually equally likely (a 1⁄3 chance each) to be C, L or R. So there is a strong bias against “offensive” positions, as you say.
Now say you are on C and take the course, which gives you a certain probability of switching to L or R. Staying on C, you expect to be right 1⁄3 of the time and only “one slot away” from the truth 2⁄3 of the time, for an expected distance of 2⁄3 of a slot. If you switch, you would again expect to be right 1⁄3 of the time, but “one slot away” 1⁄3 of the time and “two slots away” (i.e. on the complete opposite of the truth) 1⁄3 of the time, for an expected distance of a full slot. This is clearly worse.
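To make “clearly worse” concrete, here is a minimal sketch (my own illustration, not part of the original comment) that simulates the toy model in Python. The uniform prior over which position is true, the slot-distance measure, and the two strategies are all assumptions taken straight from the setup above.

```python
import random

POSITIONS = ["L", "C", "R"]

def distance(a, b):
    # Slot distance: L and R are two slots apart, C is one slot from each.
    return abs(POSITIONS.index(a) - POSITIONS.index(b))

def expected_distance(strategy, trials=100_000):
    # Average distance from the truth, with the truth drawn uniformly from L/C/R.
    total = 0
    for _ in range(trials):
        truth = random.choice(POSITIONS)
        total += distance(strategy(), truth)
    return total / trials

def stay():
    # Never take the course: remain on C.
    return "C"

def switch():
    # Take the course and switch; direction is uncorrelated with the truth.
    return random.choice(["L", "R"])

print(expected_distance(stay))    # ~0.67: right 1/3 of the time, one slot away 2/3
print(expected_distance(switch))  # ~1.00: right 1/3, one slot away 1/3, two slots away 1/3
```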
The key assumption here is that your chances of switching to L vs. R are equal and uncorrelated with which one is true. You might think that this is not so when you are exposed to the best arguments for all positions. But I’m afraid this is an optimistic view: it is at least as likely that which way you switch is determined by hidden pre-existing dispositions to react to propaganda in a given direction, or by other psychological and social factors uncorrelated with truth.
I’m somewhat surprised to see that my baby-toy model implies you should be a mushy centrist on all issues, except when you have strong evidence that you know better than other people. On reflection, this seems correct.
I don’t think you’re looking at the right quantity. You’re right that if what matters is how close you are to the truth, and you have no clue what the truth is, then you should stick to the center of mass.
But I think that having a population distribution of beliefs close to the probability distribution of the truth matters more than the accuracy of any individual’s beliefs. If 99% of people believe C, then positions L and R are not going to get any traction even if they are true. However, if 33% of people believe each of L, C, and R, then at least the true position has lots of supporters, whatever it is, and so arguments for the truth are more likely to be discovered.
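Here is a hypothetical sketch of this population-level point (the 99%/0.5% split and the 10% “traction” threshold are my own illustrative numbers, not from the comment): it asks how often the true position, whichever one it is, already has enough supporters to be argued for at all.

```python
# Each position is equally likely to be true, as in the toy model above.
truth_prior = {"L": 1/3, "C": 1/3, "R": 1/3}

def prob_truth_gets_traction(population_share, threshold=0.10):
    # Probability (over which position turns out to be true) that the true
    # position is already held by at least `threshold` of the population.
    return sum(p for pos, p in truth_prior.items()
               if population_share[pos] >= threshold)

consensus = {"L": 0.005, "C": 0.99, "R": 0.005}  # almost everyone believes C
spread    = {"L": 1/3,   "C": 1/3,  "R": 1/3}    # beliefs mirror the prior

print(prob_truth_gets_traction(consensus))  # ~0.33: 2/3 of the time the truth has almost no supporters
print(prob_truth_gets_traction(spread))     # 1.0: the truth always has about a third of people behind it
```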