I am not convinced that 1984-style persuasion really works. I don’t think that one can really be persuaded to genuinely believe something by fear or torture. In the end you can get someone to respond as if they believe it, but probably not to actually do so. It might convince them to undergo something like what my experiment actually describes.
I don’t think about persuasion like: “You have to believe this, under threat of pain, in 3… 2… 1… NOW!”
It’s more like this: We have some rationalist tools—methods of thinking which, when used properly, can improve our rationality. If some methods of thinking can increase rationality, then avoiding them, or intentionally using some contrary methods of thinking, could decrease rationality… could you agree with that?
Omega could scan your brain and deliver an electric shock whenever your “Bayesian reasoning circuit” is activated, so you would be conditioned to stop using it. Conversely, Omega would reward you for using the “happy death spiral circuit”, as long as the happy thought is related to Zoroastrianism. It could make rational reasoning painful and irrational reasoning pleasant, and this way prepare you for believing whatever you have to believe.
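To make the mechanism concrete, here is a toy model of that conditioning loop (entirely my own illustration, with made-up circuit names and a simple reinforcement rule; real brains are obviously not two-armed bandits). Omega punishes one “circuit” and rewards the other, and habit follows the reward:

```python
import random

# Toy model of Omega's conditioning scheme: an agent chooses between two
# "thinking circuits"; Omega shocks one and rewards the other, and a
# simple reinforcement-learning update shifts the agent's habit.

random.seed(0)

values = {"bayesian_reasoning": 0.0, "happy_death_spiral": 0.0}
ALPHA = 0.1  # learning rate

def choose(values):
    # Pick the circuit with the highest learned value (ties broken at random).
    best = max(values.values())
    return random.choice([k for k, v in values.items() if v == best])

def omega_feedback(circuit):
    # Omega's schedule: shock (-1) for rational reasoning,
    # pleasure (+1) for the happy death spiral.
    return -1.0 if circuit == "bayesian_reasoning" else 1.0

for _ in range(100):
    circuit = choose(values)
    reward = omega_feedback(circuit)
    # Standard incremental value update: move toward the received reward.
    values[circuit] += ALPHA * (reward - values[circuit])

# After conditioning, the agent habitually prefers the rewarded circuit.
print(choose(values))
```

Note that no argument about Zoroastrianism ever enters the loop: the agent’s preference shifts purely because of where the pain and pleasure land, which is the point of the thought experiment.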
In real brainwashing there is no Omega and no brain scans, but a correct approach can trigger some evolved mechanisms that reduce your rationality. (It is an evolutionary advantage to have a temporary rationality turn-off switch for situations when being rational is a great danger to your life. We are not perfect thinkers, we are social beings.) The correct approach is not based on fear only, but uses a “carrot and stick” strategy. Some people can resist a lot of torture if, in their minds, they do not see any possibility of escape. For efficient brainwashing, they must be reminded that there is an escape, that it’s kind of super easy, and that it only involves going through the “happy death spiral”… which we all have a natural tendency to do, anyway. The correctly broken person is not only happy to have escaped physical pain, but also enjoys the new state of mind.
I think 1984 described this process pretty well, but I don’t have the book here to quote it. The brainwashed protagonist is not just happy to escape torture (he knows that soon… spoiler avoided), but he is happy to resolve his mental conflict by developing the correct anti-rationalist skills. Now he is able to grok the Party, and in his new mind this is the happy ending—he is like a wirehead. He could never achieve this while being rational. Being rational was his “original sin”.