I think the truel is a game-theoretic structure suggesting that there are costs to (short-sighted) “winning”, just as there are costs to “truth-seeking”.
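For anyone who hasn't seen the truel result, here is a minimal Monte Carlo sketch of the effect. Everything in it is an illustrative assumption rather than anything from this thread: the accuracies (1.0, 0.8, 0.3), the turn order, the target-the-strongest-rival strategy, and the weakest shooter's deliberate miss while both rivals stand. The point is only qualitative: the most accurate shooter tends to fare worst, because visible skill at "winning" makes him everyone's first target.

```python
import random

# Hypothetical accuracies for three shooters (illustrative assumptions).
ACCURACY = {"A": 1.0, "B": 0.8, "C": 0.3}

def strongest_opponent(shooter, alive):
    """Target the most accurate rival still standing."""
    rivals = [p for p in alive if p != shooter]
    return max(rivals, key=lambda p: ACCURACY[p])

def run_truel(deliberate_miss=True):
    """One sequential truel: C fires first, then B, then A, repeating.
    If deliberate_miss is True, C fires into the air while both stronger
    rivals are alive -- hitting either would only make C the next target."""
    alive = ["C", "B", "A"]  # turn order: weakest shoots first
    while len(alive) > 1:
        for shooter in list(alive):
            if shooter not in alive or len(alive) == 1:
                continue  # already eliminated, or truel is over
            if shooter == "C" and deliberate_miss and len(alive) == 3:
                continue  # C abstains while both rivals remain
            target = strongest_opponent(shooter, alive)
            if random.random() < ACCURACY[shooter]:
                alive.remove(target)
    return alive[0]

def survival_rates(trials=100_000):
    wins = {"A": 0, "B": 0, "C": 0}
    for _ in range(trials):
        wins[run_truel()] += 1
    return {p: wins[p] / trials for p in wins}

if __name__ == "__main__":
    # Under these assumptions A, the perfect marksman, survives least
    # often (roughly 14%), while C's odds are far better than his
    # accuracy alone would suggest: skill draws fire.
    print(survival_rates())
```

The design choice worth noting is C's abstention: under these assumed parameters, the weakest shooter's best move while both rivals live is not to shoot at all, which is exactly the "cost of (short-sighted) winning" the parent comment gestures at.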
I found the post interesting… except for this penultimate paragraph; I don’t think there’s a good analogy here. An evolutionary motive for “choking” or signaling choking is an interesting enough observation on its own.
LW is about refining the art of thinking; there's no need to strain for segues.
(To be specific about where the analogy is strained: one question is whether common human goals are likely to conflict with epistemic rationality; the other is whether signaling short-term failure can be a road to long-term success under clearly defined criteria. The latter is the standard instrumental-versus-terminal-goals distinction, which is not nearly as thorny as alleged instrumental epistemic stupidity.)
Thanks. I’ll try to avoid strained segues in the future.