…I went in the other direction: trying to self-deceive as little as possible, and instead be honest with myself about my real motivations, even if they are “bad PR”.
Yep. I’m not sure why you think this is a “very different” conclusion. I’d say the same thing about myself. The key question is how to handle the cases where becoming conscious of a “bad PR” motivation means it might get exposed.
And you answer that! In part at least. You divide people into three categories based on (a) whether you need occlumency with them at all and (b) whether you need to use occlumency on the fact that you’re using occlumency.
I don’t think of it in terms this explicit, but it’s pretty close to what I do now. People get to see me to the extent that I trust them with what I show them. And that’s conscious.
Am I misunderstanding you somehow?
Moreover, having an extremely difficult high-stakes problem is not just a strong reason to self-deceive less; it’s also a strong reason to become more truth-oriented as a community. This means that people with such a common cause should strive to put each other at least in category 2 above, tentatively moving towards 3 (with the caveat of watching out for bad actors trying to exploit that).
I both agree and partly disagree. I tagged your comment with where.
Totally, yes, having a real and meaningful shared problem means we want a truth-seeking community. Strong agreement.
But I think how we “strive” to be truth-seeking might be extremely important. If it’s a virtue instead of an engineering consideration, and if people are shamed or punished for having non-truth-seeking behaviors, then the collective “striving” being talked about will encourage individual self-deception and collective untalkaboutability. It’s an example of inducing adaptive entropy.
Relatedly: mathematicians don’t have truth-seeking collaboration because they’re trying hard to be truth-seeking. They’re trying to solve problems, and they can verify whether their proposed solutions actually solve the problems they’re working on. That means truth-seeking is more useful for what they’re doing than any alternative is. There’s no need to focus on the Virtue of Seeking Truth as a culture.
Likewise, there’s no Virtue of Using a Hammer in carpentry.
What puts someone in category 2 or 3 for me isn’t something I can strive for. It’s more like, I can be open to the possibility and be willing to look for how they and I interact. Then I discover how my trust of them shifts. If I try to trust people more than I do, I end up in more adaptive entropic confusion. I’m pretty sure this is lawful on par with thermodynamics.
This might be what you meant. If so, sorry to set up and take a swing at a strawman of what you were saying.