When one person says “I guess we’ll have to agree to disagree” and the second person says “Actually, according to Aumann’s Agreement Theorem, we can’t”, is the second person making a type error?
Making a type error is not easy to distinguish from attempting to shift frame. (If it were, the frame control wouldn’t be very effective.) In the example Eliezer gave from the Sequences, he was shifting frame from one that implicitly acknowledges interpretive labor as a cost, to one that demands unlimited amounts of interpretive labor by assuming that we’re all perfect Bayesians (and therefore have unlimited computational ability, memory, etc.).
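For context, a rough paraphrase of the result being invoked (Aumann, “Agreeing to Disagree”, 1976): two agents who share a common prior, update as perfect Bayesians on their private information, and whose posteriors for some event are common knowledge between them must assign that event equal posterior probability. The sketch below is my own gloss of the standard statement; the idealizing assumptions are exactly what the second frame quietly imports.

```latex
% Aumann's Agreement Theorem (Aumann, "Agreeing to Disagree", 1976), paraphrased.
% Setup: agents 1 and 2 share a common prior P over a state space,
% each observes private information given by a partition I_i, and
% each forms a posterior for an event A by Bayesian conditioning.
\[
  q_i = P(A \mid \mathcal{I}_i), \qquad
  (q_1, q_2)\ \text{common knowledge}
  \;\Longrightarrow\;
  q_1 = q_2 .
\]
% The theorem says nothing about agents with bounded memory or computation,
% or agents who lack a common prior; that is, about actual humans.
```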
This is a big part of the dynamic underlying mistake vs conflict theory.
Eliezer’s behavior in the story you’re alluding to only seems “rational” insofar as we think the other side ends up with a better opinion. I can easily imagine a structurally identical interaction where the protagonist manipulates someone into giving up on a genuine but hard-to-articulate objection, or into proceeding down a conversational path they’re ill-equipped to navigate, thus “closing the sale.”
It’s not at all clear that improving the other person’s opinion was really one of Eliezer’s goals on this occasion, as opposed to demonstrating the other person’s intellectual inferiority. He called the post “Bayesian Judo” and highlighted how his showing off impressed someone of the opposite sex.
He does also suggest that in the end he and the other person came to some sort of agreement—but it seems pretty clear that the thing they agreed on had little to do with the claim the other guy had originally been making, and that the other guy’s opinion on that didn’t actually change. So I think an accurate, though arguably unkind, summary of “Bayesian Judo” goes like this: “I was at a party, I got into an argument with a religious guy who didn’t believe AI was possible, I overwhelmed him with my superior knowledge and intelligence, he submitted to my manifest superiority, and the whole performance impressed a woman”. On this occasion, helping the other party to have better opinions doesn’t seem to have been a high priority.