But all of the foregoing is a lot less interesting than the technology and the quote. The quote itself eloquently sums up the real reason why Zakharov can’t stand Miriam. It also makes clear why he chose the title “For I Have Tasted The Fruit” for his core text. He is referring to the story of the Garden of Eden that accompanied the opening cinematic. The difference is that, instead of mourning his expulsion from Eden, he revels in the acquisition of wrongly forbidden knowledge.
That this is tied to a technology called Intellectual Integrity is quite intriguing, if one is willing to entertain the idea for a moment. What would it mean for a society to have real intellectual integrity? For one, people would be expected to follow their stated beliefs wherever they led. Unprincipled exceptions and an inability or unwillingness to correlate beliefs across different domains would be subject to social sanction. Attempts to persuade would be expected to rest on solid argumentation, meaning that what passes for typical salesmanship nowadays would be considered a grave affront—probably something along the lines of punching someone in the face and stealing their money.
This makes the fact that this technology relies on Ethical Calculus and Doctrine: Loyalty a bit of inspired genius on Reynolds’s part. We know that Ethical Calculus means the colonists are now capable of building valid mathematical models of ethical behavior. Doctrine: Loyalty consists of all the social techniques of reinforcement and punishment that fuse people into coherent teams around core leaders and ideas. A faction that puts the two together is really building fanatical loyalty to the math. Ethical Calculus provides the answers; Doctrine: Loyalty makes a person act like he really believes them. We’re only at the third level of the tech tree, and society is already heading in some wild directions compared to what we’re familiar with.
Beneath My Epistemic Dignity
It is beneath my epistemic dignity to be tribally mind-killed, for one social faction or another. I will not predictably sacrifice my mind on the altar of countersignaling the outgroup.
It is beneath my epistemic dignity to be mind-killed by status flinches—to study or shun any topic because of its social status rather than its instrumental worth. I refuse to be status-killed.
In my medianworld’s culture, it’s beneath everyone’s epistemic dignity to be lessened as people by social flinches or tribal distortions. Status points aren’t bits of Bayesian evidence; they don’t count towards fixing your model of the world, not even a smidgen.
On the Epistemic Scale, Fool’s Gold Weighs Nothing
It’s in the physics of our universe that inference is everywhere. Possible configurations are correlated with possible observations, and algorithms that best leverage that correlation have an edge over algorithms that don’t.
Epistemic scales are applications of Bayesian inference that we, as embedded algorithms, wield. The universe bequeaths to us a certain amount of correlational evidence. We then make what use of it we will.
If we place illegitimate weights on our epistemic scale by factoring in our status or tribal considerations, we predictably see the world less clearly. We turn a blind eye to something the universe is telling us.
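The metaphor of illegitimate weights on the scale can be made concrete with a single Bayesian update. The sketch below (my illustration, not from the original text; all numbers are invented for the example) shows how multiplying an information-free “status” factor into the evidence shifts the posterior away from what the observation actually supports:

```python
def posterior(prior, likelihood_h, likelihood_not_h):
    """Bayes' rule for a binary hypothesis H given one observation."""
    joint_h = prior * likelihood_h
    joint_not_h = (1 - prior) * likelihood_not_h
    return joint_h / (joint_h + joint_not_h)

prior = 0.5               # no initial leaning either way
likelihood_h = 0.8        # P(observation | H)
likelihood_not_h = 0.4    # P(observation | not H)

# Legitimate weighing: the observation is twice as likely under H,
# so the posterior moves toward H.
clean = posterior(prior, likelihood_h, likelihood_not_h)

# A thumb on the scale: discounting the evidence because believing H
# would be embarrassing. The status factor carries no information
# about the world, yet it moves the posterior all the same.
status_discount = 0.5
biased = posterior(prior, likelihood_h * status_discount, likelihood_not_h)

print(f"clean posterior:  {clean:.3f}")   # 0.667
print(f"biased posterior: {biased:.3f}")  # 0.500
```

The biased update lands back at the prior: the status flinch has exactly cancelled out what the universe was telling us.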
When we weigh plans—predicted outcomes conditional on various feasible actions—flinching away from a plan because of its perceived social status will predictably worsen our planning. A slight flinch away from embarrassing or scary plans will slightly deafen us to what the universe is saying. A major hang-up could easily exclude important sections of the action space a priori and doom us from the get-go.