“I think the Sequences are right about everything” is a pro-Sequences position of roughly the same extremity as “I think the Sequences are wrong about everything.”
As far as I know, nobody claims the latter, and it seems kind of absurd on the face of it. The closest I’ve seen anyone come to it is something like “everything true in the Sequences is unoriginal, everything original in them is false, and they are badly written”… which is a pretty damning criticism, but still allows for the idea that quite a lot of what the text expresses is true.
Its equally extreme counterpart on the positive axis should, then, allow for the idea that quite a lot of what the text expresses is false.
Rejecting overly extreme positive statements more strongly than less extreme negative ones does not necessarily express a rejection of positivity; it might be a rejection of extremism instead.
I wouldn’t say that they’re comparably extreme. A whole lot of the content of the Sequences is completely uncontroversial, and even if you think Eliezer is a total lunatic, it’s very unlikely that they contain no substantively true claims. I’d bet heavily against “The Biggest Secret” by David Icke containing no substantively true claims at all.
I would have some points of disagreement with someone who thinks that everything in the Sequences is correct (personally, I doubt that MWI is a slam dunk, because I’m not convinced Eliezer is accurately framing the implications of collapse, and I think CEV is probably a dead end when it comes to programming an FAI, though I don’t know of any team I’d give better odds of developing an FAI than the SIAI). But I think someone who agrees with the entirety of the contents is being far more reasonable than someone who disagrees with the entirety.
I agree that “I think the Sequences are wrong about everything” would be an absurdly extreme claim, for the reasons you point out.
We disagree about the extremity of “I think the Sequences are right about everything”.
I’m not sure where to go from there, though.
Suppose you separate the claims in the Sequences into “original” and “unoriginal” segments.
The “unoriginal” segment is very likely to be true: agreeing with all of it is fairly straightforward, and disagreeing with all of it is ridiculously extreme.
To a first approximation, we can say that the middle-ground stance on any given point in the “original” segment is uncertainty. That is, accepting that point and rejecting it are equally extreme. Measured against the general population, of course, that is nowhere near correct: even taking seriously the possibility that cryonics might work is a fairly extreme stance, for instance.
But taking the approximation at face value tells us that agreeing with every “original” claim and disagreeing with every “original” claim are equally extreme positions. If we now add the stipulation that both positions accept every “unoriginal” claim, they both move toward the Sequences, but only slightly.
So actually, (1) “I agree with everything in the Sequences” and (2) “everything true in the Sequences is unoriginal, everything original in them is false” are roughly equally extreme. If anything, the approximation errs in favor of (1). On the other hand, (3) “everything in the Sequences is false” is much more extreme, because it also rejects the “unoriginal” claims, each of which is almost certainly true.
P.S. If you are like me, you are now wondering what “extreme” means here. To be extremely technical (ha), I am interpreting it as the probability that a reasonable, boundedly rational person would end up holding a given position on the Sequences. For instance, a post that says “confirmation bias is a thing” is uncontroversial, and you expect reasonable people to believe it with probability close to 1. A post that says “MWI is obviously true” is controversial, and if you are generous you will say there is a probability of 0.5 that a reasonable person agrees with it. The figure might be higher or lower for other posts in the “original” category, but on the whole the approximation of 0.5 is probably favorable to the person who agrees with everything.
So when I conclude that (1) and (2) are roughly equally extreme, I am saying that a “reasonable person” is roughly equally likely to end up at either one of them. This is an approximation, of course, but they are certainly both closer to each other than they are to (3).
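P.P.S. For concreteness, here is a toy calculation of the model above. The claim counts and per-claim probabilities are invented purely for illustration, and treating the claims as independent is obviously a simplification:

```python
# Toy model of "extremity": treat each claim in the Sequences as independent,
# with a reasonable person accepting any given "unoriginal" claim with
# probability ~0.99 and any given "original" claim with probability ~0.5.
# The claim counts below are made up for illustration, not actual tallies.
P_UNORIGINAL = 0.99
P_ORIGINAL = 0.50
N_UNORIGINAL = 50
N_ORIGINAL = 20

# (1) Agree with everything: accept all unoriginal and all original claims.
p1 = P_UNORIGINAL ** N_UNORIGINAL * P_ORIGINAL ** N_ORIGINAL

# (2) Accept every unoriginal claim, reject every original claim.
p2 = P_UNORIGINAL ** N_UNORIGINAL * (1 - P_ORIGINAL) ** N_ORIGINAL

# (3) Reject everything, the almost-certainly-true unoriginal claims included.
p3 = (1 - P_UNORIGINAL) ** N_UNORIGINAL * (1 - P_ORIGINAL) ** N_ORIGINAL

print(f"(1) agree with all:           {p1:.3g}")  # ~5.8e-07
print(f"(2) reject only the original: {p2:.3g}")  # ~5.8e-07, identical to (1)
print(f"(3) reject everything:        {p3:.3g}")  # ~9.5e-107
```

With the per-claim probability for “original” claims set at 0.5, positions (1) and (2) come out exactly equal, and (3) is smaller by a hundred orders of magnitude, which is the whole argument in miniature.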
Yeah, I think I agree with everything here as far as it goes, though I haven’t looked at it carefully. I’m not sure originality is as crisp a concept as you want it to be. But I can imagine us both coming up with a list of propositions meant to capture everything in the Sequences that some reasonable person somewhere might conceivably disagree with, each weighted by how reasonable we think a person could be while still disagreeing with it, and I expect we’d end up with very similar lists (perhaps with fairly different weights).