If you agree with everything Eliezer wrote, you remember him writing about how every cause wants to be a cult.
But also what he said about swinging too far in the opposite direction. It’s bad for a community to reach a point where it’s taboo to profess dissent, but also for it to reach a point where it’s taboo to profess wholehearted agreement.
Wouldn’t it be better if the professed agreement was agreement with ideas rather than with people? The dissent counterpart of this post would say “I disagree with everything this person says”. That’s clearly pathological.
This may sound wrong, but “who said that” is Bayesian evidence, sometimes rather strong evidence. If your experience tells you that a given person is right about 95% of the things they say, it is rational to assign a 95% prior probability to other things they say.
It is said that we should judge ideas by the ideas alone. In Bayes-speak this means that if you update correctly, enough evidence can fix a wrong prior (how much evidence is needed depends on how wrong the prior was). But gathering evidence is costly, and we cannot pay a high cost for every idea around us. Why use a worse prior if a better one is available?
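To make the Bayes-speak concrete, here is a minimal sketch with made-up numbers (the track records, likelihood ratios, and evidence counts are illustrative assumptions, not anything from the thread): the prior comes from the speaker’s track record, gets updated by independent evidence, and is eventually overridden when enough counter-evidence accumulates.

```python
# A minimal sketch (hypothetical numbers): a prior based on the speaker's
# track record, updated by independent evidence via Bayes' rule.

def update(prior, likelihood_if_true, likelihood_if_false):
    """Return P(claim is true | evidence) given a prior and two likelihoods."""
    numerator = prior * likelihood_if_true
    return numerator / (numerator + (1 - prior) * likelihood_if_false)

# Prior from the speaker alone: person C is right about 95% of what they say,
# person A only 20% of the time.
prior_c, prior_a = 0.95, 0.20

# One piece of evidence that is 4x more likely if the claim is true.
p_c = update(prior_c, 0.8, 0.2)   # ~0.987
p_a = update(prior_a, 0.8, 0.2)   # = 0.5

# Enough independent counter-evidence overrides even C's favorable prior.
p = prior_c
for _ in range(5):                 # five pieces of evidence, each 4x more
    p = update(p, 0.2, 0.8)        # likely if the claim is false
print(round(p, 3))                 # ~0.018
```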
Back to human-speak: if person A is a notorious liar (or a mindkilled person who repeats someone else’s lies), person B is careless about their beliefs, and person C examines every idea carefully before telling it to others, then it is rational to react differently to ideas spoken by these three people. The word “everything” is too strong, but saying “if person C said that, I believe it” is OK (assuming that, given enough counter-evidence, the person will update: both on the idea and on the credibility of C).
Disagreeing with everything one says would be trying to reverse stupidity. There are people who do “worse than random”, so doing the opposite of what they say could be a good heuristic; but even for most of them, assigning a 95% probability that they are wrong would be too much.
You are right, but there is probably some misunderstanding. That personal considerations should be ignored when assessing the probability of an idea, and that one shouldn’t express collective agreement with ideas based on their author, are different suggestions. You argue against the former, while I was stating the latter.
It’s important to take the context into account. When an idea X is being questioned, saying “I agree with X, because a very trustworthy person Y agrees with X” is fine with me, although it isn’t the best sort of argument one could provide. Starting the discussion with “I agree with X1, X2, X3, … Xn”, on the other hand, makes any reasonable debate almost impossible, since it is not practical to argue n distinct ideas at once.
Well, the professed agreement of this post is with the Sequences, which are a set of ideas rather than a person, even if they were all written by one person. The dissent counterpart of this post would be “I disagree with the entire content of the sequences.”
Am I misunderstanding you about something?
Professing agreement or disagreement with a broad set of rather unrelated ideas is not conducive to productive discussion, because there is no single topic to concentrate on with object-level arguments. Having the set of ideas defined by their author brings in tribal political instincts, which is also not helpful. You are right that the post was formulated as agreement with the Sequences rather than with everything Yudkowsky ever said, but I don’t see how this distinction is important. “Everything Yudkowsky ever said” would also denote a set of ideas, after all.
Albeit an internally inconsistent set, given that Yudkowsky has occasionally changed his mind about things.
Well, that’s either an exaggeration or an audacious lie.
In any case, I had neither the means nor the desire to list every idea of theirs I agree with.
“I think the Sequences are right about everything” is a pro-Sequences position of roughly the same extremity as “I think the Sequences are wrong about everything.”
As far as I know, nobody claims the latter, and it seems kind of absurd on the face of it. The closest I’ve seen anyone come to it is something like “everything true in the Sequences is unoriginal, everything original in them is false, and they are badly written”… which is a pretty damning criticism, but still allows for the idea that quite a lot of what the text expresses is true.
Its equally extreme counterpart on the positive axis should, then, allow for the idea that quite a lot of what the text expresses is false.
To reject overly-extreme positive statements more strongly than less-extreme negative ones is not necessarily expressing a rejection of positivity; it might be a rejection of extremism instead.
I wouldn’t say that they’re comparably extreme. A whole lot of the content of the sequences is completely uncontroversial, and even if you think Eliezer is a total lunatic, it’s unlikely that they wouldn’t contain any substantively true claims. I’d bet heavily against “The Biggest Secret” by David Icke containing no substantively true claims at all.
I would have some points of disagreement with someone who thinks that everything in the sequences is correct (personally, I doubt that MWI is a slam dunk, because I’m not convinced Eliezer is accurately framing the implications of collapse, and I think CEV is probably a dead end when it comes to programming an FAI, although I don’t know of any team which I think has better odds of developing an FAI than the SIAI.) But I think someone who agrees with the entirety of the contents is being far more reasonable than someone who disagrees with the entirety.
I agree that “I think the Sequences are wrong about everything” would be an absurdly extreme claim, for the reasons you point out.
We disagree about the extremity of “I think the Sequences are right about everything”.
I’m not sure where to go from there, though.
Suppose you separate the Sequences into “original” and “unoriginal”.
The “unoriginal” segment is very likely to be true: agreeing with all of it is fairly straightforward, and disagreeing with all of it is ridiculously extreme.
To a first approximation, we can say that the middle-ground stance on any given point in the “original” segment is uncertainty. That is, accepting that point and rejecting it are equally extreme. If we use the general population for reference, of course, that is nowhere near correct: even considering the possibility that cryonics might work is a fairly extreme stance, for instance.
But taking the approximation at face value tells us that agreeing with every “original” claim, and disagreeing with every “original” claim, are equally extreme positions. If we now add the further stipulation that both positions agree with every “unoriginal” claim, they both move slightly toward the Sequences, but not by much.
So actually (1) “I agree with everything in the sequences” and (2) “Everything true in the Sequences is unoriginal, everything original in them is false” are roughly equally extreme. If anything, we have made an error in favor of (1). On the other hand, (3) “Everything in the Sequences is false” is much more extreme, because it also rejects the “unoriginal” claims, each of which is almost certainly true.
P.S. If you are like me, you are now wondering what “extreme” means. To be extremely technical (ha), I am interpreting it as the probability that a reasonable, boundedly rational person would end up holding that position on the Sequences. For instance, a post that says “Confirmation bias is a thing” is uncontroversial, and you expect that reasonable people will believe it with probability close to 1. A post that says “MWI is obviously true” is controversial, and if you are generous you will say that there is a probability of 0.5 that someone will agree with it. This might be higher or lower for other posts in the “original” category, but on the whole the approximation of 0.5 is probably favorable to the person who agrees with everything.
So when I conclude that (1) and (2) are roughly equally extreme, I am saying that a “reasonable person” is roughly equally likely to end up at either one of them. This is an approximation, of course, but they are certainly both closer to each other than they are to (3).
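As a toy illustration of that comparison (the claim counts, per-claim probabilities, and the assumption that claims are judged independently are all made up for the sake of the sketch, not taken from the Sequences themselves), the approximation can be run directly:

```python
# Toy calculation of the parent comment's notion of "extremity", with made-up
# claim counts and a rough independence assumption between claims.

N_UNORIGINAL = 50   # uncontroversial claims; a reasonable person accepts each
P_UNORIG = 0.99     # with probability ~0.99
N_ORIGINAL = 20     # controversial claims; acceptance probability ~0.5 each
P_ORIG = 0.5

# (1) agree with everything: accept all unoriginal and all original claims
p1 = P_UNORIG ** N_UNORIGINAL * P_ORIG ** N_ORIGINAL

# (2) "everything true is unoriginal, everything original is false":
#     accept all unoriginal claims, reject all original ones
p2 = P_UNORIG ** N_UNORIGINAL * (1 - P_ORIG) ** N_ORIGINAL

# (3) everything is false: also reject every unoriginal claim
p3 = (1 - P_UNORIG) ** N_UNORIGINAL * (1 - P_ORIG) ** N_ORIGINAL

print(p1, p2, p3)   # p1 == p2 under this approximation; p3 is ~10^100 smaller
```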
Yeah, I think I agree with everything here as far as it goes, though I haven’t looked at it carefully. I’m not sure originality is as crisp a concept as you want it to be, but I can imagine us both coming up with a list of propositions that we believe captures everything in the Sequences that some reasonable person somewhere might conceivably disagree with, weighted by how reasonable we think a person could be and still disagree with that proposition, and that we’d end up with very similar lists (perhaps with fairly different weights).
I just read that essay and I disagree with it. Stating one’s points of disagreement amounts to giving the diffs between your mind and that of an author. What’s good practice for scientific papers (in terms of remaining dispassionate) is probably good practice in general. The way to solve the cooperation problem is not to cancel out professing disagreement with professing agreement, it’s to track group members’ beliefs (e.g. by polling them) and act as a group on whatever the group consensus happens to be. In other words, teach people the value of majoritarianism and its ilk and tell them to use this outside view when making decisions.
In terms of epistemic rationality, you can get by fine by raising only points of disagreement and keeping it implicit that you accept everything you do not dispute. But in terms of creating effective group cooperation, which has instrumental value, this strategy performs poorly.