I’ve been thinking along very similar lines for a while (my inside name for this is “mask theory of the mind”: consciousness is a “mask”). But my personal conclusion is very different. While self-deception is a valid strategy in many circumstances, I think that it’s too costly when trying to solve an extremely difficult high-stakes problem (e.g. stopping the AI apocalypse). Hence, I went in the other direction: trying to self-deceive little, and instead be self-honest about my[1] real motivations, even if they are “bad PR”. In practice, this means never making excuses to myself such as “I wanted to do A, but I didn’t have the willpower, so I did B instead”, but rather owning the fact that I wanted to do B and thinking about how to integrate it into a coherent long-term plan for my life.
My solution to “hostile telepaths” is dividing other people into ~3 categories (a toy sketch of the resulting policy follows the list):
People that are adversarial or untrustworthy, either individually or as representatives of the system on whose behalf they act. With such people, I have no compunction about consciously lying (“the Jews are not in the basement… I packed the suitcase myself...”) or acting adversarially.
People that seem cooperative, so that they deserve my good will even if not my complete trust. With such people, I will be at least metahonest: I will not tell direct lies, and I will be honest about the circumstances in which I’m honest (i.e. in which I reveal all relevant information). More generally, I will act cooperatively towards such people, expecting them to reciprocate. My attitude towards this group is that I don’t need to pretend to be something other than I am to gain cooperation; I can just rely on their civility and/or (super)rationality.
Inner circle: People that have my full trust. With them I have no hostile telepath problem because they are not hostile. My attitude towards this group is that we can resolve any difference by putting all the cards on the table and doing whatever is best for the group in aggregate.
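Read as a decision procedure, the three categories amount to a tiered disclosure policy: the tier fixes whether lying is permitted and how much must be revealed. Here is a minimal Python sketch of that reading; the tier names and the query-and-answer framing are illustrative assumptions, not anything the comment specifies:

```python
from enum import Enum, auto

class Tier(Enum):
    ADVERSARIAL = auto()   # category 1: deception and adversarial action permitted
    COOPERATIVE = auto()   # category 2: metahonest, no direct lies
    INNER_CIRCLE = auto()  # category 3: full disclosure, joint decisions

def disclosure_policy(tier: Tier, claim_is_true: bool) -> str:
    """Toy policy: what you may say about a claim at each trust tier."""
    if tier is Tier.ADVERSARIAL:
        # Category 1: no duty of honesty; say whatever serves your goals.
        return "assert whatever best protects you, true or not"
    if tier is Tier.COOPERATIVE:
        # Category 2 (metahonesty): never assert a falsehood, though you may
        # withhold; separately, be open about when you reveal everything.
        return "assert it" if claim_is_true else "withhold or deflect, but do not lie"
    # Category 3: all cards on the table.
    return "share it, together with your reasoning and motivations"

print(disclosure_policy(Tier.COOPERATIVE, claim_is_true=False))
# -> withhold or deflect, but do not lie
```

The structural point the sketch makes explicit: only category 1 ever licenses asserting falsehoods; categories 2 and 3 differ only in how much is withheld.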
Moreover, having an extremely difficult high-stakes problem is not just a strong reason to self-deceive less; it’s also a strong reason to become more truth-oriented as a community. This means that people with such a common cause should strive to put each other at least in category 2 above, tentatively moving towards category 3 (with the caveat of watching out for bad actors trying to exploit that).

[1] While making sure to use the word “I” to refer to the elephant/unconscious-self and not to the mask/conscious-self.
…I went in the other direction: trying to self-deceive little, and instead be self-honest about my real motivations, even if they are “bad PR”.
Yep. I’m not sure why you think this is a “very different” conclusion. I’d say the same thing about myself. The key question is how to handle the cases where becoming conscious of a “bad PR” motivation means it might get exposed.
And you answer that! In part at least. You divide people into three categories based on (a) whether you need occlumency with them at all and (b) whether you need to use occlumency on the fact that you’re using occlumency.
I don’t think of it in terms this explicit, but it’s pretty close to what I do now. People get to see me to the extent that I trust them with what I show them. And that’s conscious.
Am I misunderstanding you somehow?
Moreover, having an extremely difficult high-stakes problem is not just a strong reason to self-deceive less; it’s also a strong reason to become more truth-oriented as a community. This means that people with such a common cause should strive to put each other at least in category 2 above, tentatively moving towards category 3 (with the caveat of watching out for bad actors trying to exploit that).
I both agree and partly disagree; I’ve marked where in your comment.
Totally, yes, having a real and meaningful shared problem means we want a truth-seeking community. Strong agreement.
But I think how we “strive” to be truth-seeking might be extremely important. If truth-seeking is treated as a virtue instead of as an engineering consideration, and if people are shamed or punished for non-truth-seeking behaviors, then the collective “striving” being talked about will encourage individual self-deception and collective untalkaboutability. It’s an example of inducing adaptive entropy.
Relatedly: mathematicians don’t have truth-seeking collaboration because they’re trying hard to be truth-seeking. They’re trying to solve problems, and they can verify whether their proposed solutions actually solve the problems they’re working on. That means truth-seeking is more useful for what they’re doing than any alternative is. There’s no need to focus on the Virtue of Seeking Truth as a culture.
Likewise, there’s no Virtue of Using a Hammer in carpentry.
What puts someone in category 2 or 3 for me isn’t something I can strive for. It’s more like: I can be open to the possibility and be willing to look at how they and I interact. Then I discover how my trust of them shifts. If I try to trust people more than I actually do, I end up in more adaptive entropic confusion. I’m pretty sure this is lawful, on par with thermodynamics.
This might be what you meant. If so, sorry to set up and take a swing at a strawman of what you were saying.
Agree with the approach, with the caveat that some people in group 2 are naive cooperators and therefore second-order defectors, since they are suckers for group 1. E.g. the person who will tell the truth to the Nazis out of a mistaken theory of ethics, or just out of behavioral conditioning.
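To make the second-order-defection point concrete, here is a toy payoff model; the numbers and policy names are hypothetical, chosen only to show the asymmetry. The unconditional truth-teller doesn’t just fail to resist the adversary; they transfer the whole cost onto the person they were shielding, which is what makes them function as defectors:

```python
# Toy model (hypothetical payoffs): a "naive cooperator" who answers a
# category-1 adversary truthfully shifts the harm onto the hidden third
# party, while a tiered policy absorbs only a small cost of lying.

HARM_TO_PROTECTED = -10  # cost to the hidden third party if exposed
LIE_COST = -1            # small personal cost of lying to an adversary

def total_payoff(policy: str, asker_is_adversary: bool) -> int:
    """Combined payoff (speaker + protected third party) for one query."""
    if policy == "always_truthful":
        # Naive cooperator: truthful even to the Nazi at the door.
        return HARM_TO_PROTECTED if asker_is_adversary else 0
    if policy == "tiered":
        # The three-category policy: lies to category 1, truthful otherwise.
        return LIE_COST if asker_is_adversary else 0
    raise ValueError(f"unknown policy: {policy}")

for policy in ("always_truthful", "tiered"):
    print(policy, total_payoff(policy, asker_is_adversary=True))
# always_truthful -10  <- cooperates with the adversary, defects against
#                         the protected party
# tiered -1
```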
...never making excuses to myself such as “I wanted to do A, but I didn’t have the willpower, so I did B instead”, but rather owning the fact that I wanted to do B and thinking about how to integrate it...
AKA integrating the ego-dystonic into the homunculus