I think I’m largely (albeit tentatively) with Dagon here: it’s not clear that we don’t _want_ our responses to Bob’s wrongness to back-propagate into his idea generation. Isn’t that part of how a person’s idea generation gets better?
One possible counterargument: a person’s idea-generation process actually consists of (at least) two parts, generation and filtering, and most of us would do better to have more fluent _generation_. But even if so, we want the _filtering_ to work well, and I don’t know how you enable evaluations to propagate back as far as the filtering stage but to stop before affecting the generation stage.
I’m not saying that the suggestion here is definitely wrong. It could well be that if we follow the path of least resistance, the result will be _too much_ idea-suppression. But you can’t just say “if there’s a substantial cost to saying very wrong things then that’s bad because it may make people less willing or even less able to come up with contrarian ideas in future” without acknowledging that there’s an upside too, in making people less inclined to come up with _bad_ ideas in future.
It is important that Bob was surprisingly right about something in the past; this means something was going on in his epistemology that wasn’t going on in the group epistemology, and the group’s attempt to update Bob may fail because it misses that important structure. Epistemic tenure is, in some sense, the group saying to Bob “we don’t really get what’s going on with you, and we like it, so keep it up, and we’ll be tolerant of wackiness that is the inevitable byproduct of keeping it up.”
That is, a typical person should care a lot about not believing bad things, and the typical ‘intellectual venture capitalist’ who backs a lot of crackpot horses should likely end up losing their claim on the group’s attention. But when the intellectual venture capitalist is right, it’s important to keep their strategy around, even if you think it’s luck or that you’ve incorporated all of the technique that went into their first prediction, because maybe you haven’t, and their value comes from their continued ability to be a maverick without losing all of their claim on group attention.
If Bob’s history is that over and over again he’s said things that seem obviously wrong but he’s always turned out to be right, I don’t think we need a notion of “epistemic tenure” to justify taking him seriously the next time he says something that seems obviously wrong: we’ve already established that when he says apparently-obviously-wrong things he’s usually right, so plain old induction will get us where we need to go. I think the OP is making a stronger claim. (And a different one: note that OP says explicitly that he isn’t saying we should take Bob seriously because he might be right, but that we should take Bob seriously so as not to discourage him from thinking original thoughts in future.)
And the OP doesn’t seem (at least as I read it) to stipulate that Bob is strikingly better epistemically than his peers in that sort of way. It says:
> Let Bob be an individual that I have a lot of intellectual respect for. For example, maybe Bob has a history of believing true things long before anyone else, or Bob has discovered or invented some ideas that I have found very useful.
which isn’t quite the same. One of the specific ways in which Bob might have earned that “lot of intellectual respect” is by believing true things long before everyone else, but that’s just one example. And someone can merit a lot of intellectual respect without being so much better than everyone else.
For an “intellectual venture capitalist” who generates a lot of wild ideas, mostly wrong but right more often than you’d expect, I do agree that we want to avoid stifling them. But we also want to avoid letting them get entirely untethered from reality, and it’s not obvious to me what degree of epistemic tenure best strikes that balance.
(Analogy: successful writers get more freedom to ignore the advice of their editors. Sometimes that’s a good thing, but not always.)