Most commenters refer to cultists as if they themselves could never be cultists. Wrong. Understanding how the brain works, how it clings to belief, does not place one outside one's own brain.
It is the emotional core of our thought that leads to irrational attachment to belief. If machine intelligence can precipitate around a different "thought model," it may escape most(?) of the irrationality of its forerunners.
However, it will probably end up with a different set of irrationalities. We haven’t got any examples of a near-human intelligence that’s inherently rational, and I’d conjecture it’s unlikely that our first few attempts will succeed in this.