(Posted as a comment rather than an answer because all of this is pretty rambling, and I’m not super-confident about any of the stuff I say below, even if my tone or phrasing seems to suggest otherwise.)
For the purposes of a discussion like this, rather than talk about what intellectual honesty is, I think it makes more sense to talk about what intellectual honesty is not. Specifically, I’d suggest that the kinds of behavior we consider “intellectually honest” are simply what human behavior looks like when it’s not being warped by some combination of outside incentives. The reason intellectual honesty is so hard to find, then, is simply that humans tend to find themselves influenced by external incentives almost all of the time. Even absent more obvious factors like money or power, humans are social creatures, and all of us unconsciously track the social status of ourselves and others. Throw in the fact that social status is scarce by definition, and we end up playing all sorts of social games “under the table”.
This affects practically all of our interactions with other people, even interactions ostensibly for some other purpose (such as solving a problem or answering a question). Unless people are in a very specific kind of environment, by default, all interactions have an underlying status component: if I say something wrong and someone corrects me on it, I’m made to seem less knowledgeable in comparison, and so that person gains status at my expense. If you’re in an environment where this sort of thing is happening (and you pretty much always are), naturally you’re going to divert some effort away from accomplishing whatever the actual goal is, and toward maintaining or increasing your social standing. (Of course, this behavior needn’t be conscious at all; we’re perfectly capable of executing status-increasing maneuvers without realizing we’re doing it.)
This would suggest that intellectual honesty is most prevalent in fields that prioritize problem-solving over status, and (although confirmation bias is obviously a thing) I do think this is observably true. For example, when a mathematician finds that they’ve made a mistake, they pretty much always own up to it immediately, and other mathematicians don’t respect them less for doing so. (Ditto physicists.) And this isn’t because mathematicians and physicists have some magical personality trait that makes them immune to status games—it’s simply because they’re focused on actually doing something, and the thing they’re doing is more important to them than showing off their own cleverness.
If you and I are working together to solve a particular problem, and both of us actually care about solving the problem, then there’s no reason for me to feel threatened by you, even if you do something that looks vaguely like a status grab (such as correcting me when I make a mistake). Because I know that we’re fundamentally on the same side, I don’t need to worry nearly as much about what I say or do in front of you, which in turn allows me to voice my actual thoughts and opinions much more freely. The atmosphere is collaborative rather than competitive. In that situation, both of us can act “intellectually honest”, but importantly, there’s not even a need for that term. No one’s going to compliment me on how “intellectually honest” I’m being if I quickly admit that I made a mistake, because, well, why would I be doing anything other than trying to solve the problem I set out to solve? It’s a given that I’d immediately abandon any unpromising or mistaken approaches; there’s nothing special about that kind of behavior, and so there’s no need to give it a special name like “intellectual honesty”.
The only context in which “intellectual honesty” is a useful concept is one that’s already dominated by status games. Only in cases where the incentives are sharply aligned against admitting that you’re wrong does it become something laudable, something unusual, something to be praised whenever someone actually does it. In practice, these kinds of situations crop up all the time, because status is the air humans breathe, but I still think it’s useful to point out that “intellectual honesty” is really just the default mode of behavior, even if that default mode is often corrupted by other incentives.