Can you elaborate? I agree this is a difficult, risky topic to discuss, and I tried to evade the landmines while writing it (such as accidentally implying that this evolutionary instinct is somehow good), but though I very much like and agree with Yes requires the possibility of no, and know what filtered evidence is, I don’t really understand the first part of your comment. I’d also be interested to hear what you think the epistemic landmines are.
If you are about to say something socially neutral or approved, but a salient alternative to what you are saying comes with a cost (or is otherwise a target of appeal to consequences), integrity in making the claim requires a resolve to have said that alternative too if it had (counterfactually) turned out to be what you believe (with some unclear “a priori” weighing that doesn’t take into account your thinking on that particular topic). But that’s not enough if you want others to have a fair opportunity to debate the claim you make, for they would also incur the cost of the alternative claims, so the trial-preregistration pact must be acausally negotiated with them, not just accepted on your own.
See this comment and its parent for a bit more on this. This is a large topic, related to glomarization and (dis)honesty. These contraptions have to be built around anti-epistemology to counteract its distorting effects.
I also saw you saying a similar thing here. I think there’s a top-level post here waiting to be written. I’ll be glad to read it if you write it.