Now, what I don’t get is why he let you force him to change his position. If he really believed that it was impossible for you to create AI, why wouldn’t he have just said “yes,” and then sat back, comfortable in his belief that you would never create an AI?
He didn’t believe his religion was definitely true, even though he believed that he believed that. There is nothing paradoxical about being wrong about things, even one’s own beliefs.
“My religion is true” is a statement with no consequences for his anticipations, so it is isolated from his belief network. He was deeply committed to “I believe in my religion”, and that second belief required him to pretend that the first statement was a member of his belief network — that is, to act as though his religion being true entailed consequences that just happened to be untestable. Once he realized that he had goofed, by saying he anticipated different consequences in a testable area depending on whether his religion was true, he had to backtrack. By never anticipating different consequences if his religion is true than if it isn’t, he protects his belief that he believes his religion is true from falsification.
If you have a whole general concept like “post-colonial alienation”, which does not have specifications bound to any specific experience, you may just have a little bunch of arrows off on the side of your causal graph, not bound to anything at all; and these may well be meaningless.
So he changed his position only insofar as he updated his belief framework, but he didn’t change any core belief. Everything snaps into place once you identify his real belief, the way the motions of the planets make sense once you realize that the Sun, not the Earth, is the center of the Solar System. His religious belief is meaningless, not just wrong, and his belief that he believes is wrong.
People are capable of even more sophisticated levels of self-deception than it seems this guy had.
Welcome to Less Wrong!