One of the advantages of remaining agnostic comes from the same argument that users put forth in the comment sections on this very site way back in the age of the Sequences (I can look up the specific links if people really want me to; they were in response to the Doublethink Sequence) for why it’s not necessarily instrumentally rational for limited beings like humans to actually believe in the Litany of Tarski: if you are in a precarious social situation, in which retaining status/support/friends/resources is contingent on successfully signaling to your in-group that you maintain faith in their core teachings, it simply doesn’t suffice to say “acquire all the private truth through regular means and don’t talk/signal publicly about the stuff that would be most dangerous to you,” because you don’t get complete control over what you signal.
If you learn that the in-group is wrong about some critical matter, and you understand that in-group members realizing you no longer agree with them will result in harm to you (directly, or through your resources being cut off), your only option is to act (to some extent) deceptively: to take on the role, QuirrellMort-style, of somebody who does not have access to the information you have actually stumbled upon, and to pretend to be just another happy & clueless member of the community.
This is capital-H Hard. Lying (or even something smaller-scale like lesser deceptions), when done consistently and routinely, to people that you consider(ed) your family/friends/acquaintances, is very hard for (the vast majority of) people. For straightforward evolutionary reasons, we have evolved to be really good at detecting when one of our own is not being fully forthcoming. You can bypass this obstacle if the number of interactions you have is small, or if, as is usually the case in modern life when people get away with lies, nobody actually cares about the lie and it’s all just a game of make-believe where you just have to “utter the magic words.” But when it’s not a game, when people do care about honestly signaling your continued adherence to the group’s beliefs and epistemology, you’re in big trouble.
Indeed, by far the most efficient way of convincing others of your bullshit on a regular basis is to convince yourself first, and by putting yourself in a position where you must do the former, you are increasing the likelihood of the latter with every passing day. Quite the opposite of what you’d like to see happen, if you care about truth-seeking to any large extent.
(Addendum: admittedly, this doesn’t answer the question fully, since it doesn’t deal with the critical distinction between agnosticism and explicit advocacy, but I think it does get at something reasonably important in the vicinity of it anyway.)
Fair point; I was assuming you had the capacity to lie/omit/deceive, and you’re right that we often don’t, at least not fully.
I still prefer my policy to the OP’s, but I accept your argument that mine isn’t a simple Pareto improvement.
Still:
I really don’t like letting social forces put “don’t think about X” flinches into my or my friends’ heads, and the OP’s policy seems to me like an instance of that.
Much less importantly: as an intelligent/self-reflective adult, you may be better at hiding info if you know what you’re hiding, compared to if you have guesses you’re not letting yourself see that your friends might still notice. (The “don’t look into dragons” path often still involves hiding info, since often your brain takes a guess anyhow, and that’s part of how you know not to look into this one. If you acknowledge the whole situation, you can manage your relationships consciously, including taking conscious steps to buy openness-offsets and staying freely and transparently friends wherever you can work out how.)
The “don’t look into dragons” path often still involves hiding info, since often your brain takes a guess anyhow
In many cases I have guesses, but because I just have vague impressions they’re all very speculative. This is consistent with being able to say “I haven’t looked into it” and “I really don’t know,” and because these are all areas where the truth is not decision-relevant, it’s been easy to leave it at that. Perhaps people notice I have doubts, but at least in my social circles that’s acceptable if not made explicit.
Does it feel to you as though your epistemic habits / self-trust / intellectual freedom and autonomy / self-honesty takes a hit here?
I think it’s a pretty weak hit, though not zero. There are so many things I want to look into but don’t have time for that having this as one more factor in my prioritization doesn’t feel very limiting to my intellectual freedom.
I do think it is good to have a range of people in society who are taking a range of approaches, though!