An introduction to Zoroastrianism, by Omega:
“Dear reader, if you have picked up this book first, I recommend that you stop being rational, right now. Ignore all the rational rules and techniques you know, and continue reading this book with an open mind—that is, without critical thinking. Because if you fail to believe in Zoroastrianism, I will torture you for eternity, and I mean it!”
A friendly Omega could write this too, if it already knows that the reader will surrender, so that in the end the reader is not tortured, and the reader’s wish (to believe in Zoroastrianism) is granted.
How can a friendly Omega make such a threat credible? It should precommit itself: if the reader disobeys, the reader really will be tortured. If the reader is already sufficiently rational, this should happen only with probability epsilon, and torture with probability epsilon is not worse than a dust speck in the eye. Before writing the book, Omega should ask the reader whether this dust-speck cost is acceptable for a successful experiment.
I am not convinced that 1984-style persuasion really works. I don’t think one can be persuaded by fear or torture to genuinely believe something. In the end you can get someone to respond as if they believe it, but probably not to actually believe it. It might, however, convince them to undergo something like what my experiment actually describes.
I don’t think of persuasion as: “You have to believe this, under threat of pain, in 3… 2… 1… NOW!”
It’s more like this: we have some rationalist tools—methods of thinking which, when used properly, can improve our rationality. If some methods of thinking can increase rationality, then avoiding them, or intentionally using contrary methods of thinking, could decrease rationality… would you agree with that?
Omega could scan your brain and deliver an electric shock whenever your “Bayesian reasoning circuit” is activated, so you would be conditioned to stop using it. On the other hand, Omega would reward you for using the “happy death spiral circuit”, as long as the happy thought is related to Zoroastrianism. It could make rational reasoning painful and irrational reasoning pleasant, and in this way prepare you to believe whatever you are supposed to believe.
In real brainwashing there is no Omega and there are no brain scans, but the right approach can trigger some evolved mechanisms that reduce your rationality. (It is an evolutionary advantage to have a temporary rationality off-switch for situations where being rational is a great danger to your life. We are not perfect thinkers; we are social beings.) The right approach is not based on fear alone, but uses a “carrot and stick” strategy. Some people can resist a lot of torture if, in their minds, they see no possibility of escape. For efficient brainwashing, they must be reminded that there is an escape, that it’s actually quite easy, and that it only involves going through the “happy death spiral”… which we all have a natural tendency to do anyway. The correctly broken person is not only happy to have escaped physical pain, but also enjoys the new state of mind.
I think 1984 described this process pretty well, but I don’t have it here to quote. The brainwashed protagonist is not just happy to escape torture (he knows that soon… spoiler avoided); he is happy to resolve his mental conflict by developing the correct anti-rationalist skills. Now he is able to grok the Party, and in his new mind this is the happy ending—he is like a wirehead. He could never have achieved this while being rational. Being rational was his “original sin”.