There seems to be a straightforward meaning to “collaborative truth seeking”. Consider two rational agents who share an interest in understanding some part of reality better. The obvious thing for them to do is to share their relevant arguments and evidence with each other, as openly, efficiently, and with as little filtering as their resource constraints allow. That’s the sort of thing I see as the ideal of “collaborative truth seeking”. (ETA: pooling resources to gather new evidence and to come up with new models and arguments is another big part of that ideal.)
The thing where people get attached to their “side” and want to win the argument in order to gain status clearly falls short of that ideal, and also introduces questionable incentives (as you point out). That’s to be expected, humans being what they are, but it seems like we should still try to do better. And I do think humans can and do rise above this attachment-based argumentation style, which seems to be our native mode of handling belief differences, though doing so is hard and takes effort.
That said, I agree it’s suspicious when someone pulls out the “collaborative truth seeking” card in lieu of actually sharing evidence and arguments, since that is an easy opening for attachment and status motivations to come into play. I’m also not particularly sold on things like the principle of charity, steelmanning, or ideological Turing tests: they often seem less like the actual sharing of arguments and evidence, which strikes me as the real underlying principle, and more like a ploy to direct undue attention to a particular position.