I don’t see what this adds beyond making LW more political. Let’s discuss ideas, not affiliations!
If you agree with everything Eliezer wrote, you remember him writing about how every cause wants to be a cult. This post looks exactly like the sort of cultish entropy that he advised guarding against to me. Can you imagine a similar post on any run-of-the-mill, non-cultish online forum?
It worries me a lot that you relate ideas so strongly to the people who say them, especially since most of the people you refer to are so high status. Perhaps you could experimentally start using the less wrong anti-kibitzer feature to see if your perception of LW changes?
But also what he said about swinging too far in the opposite direction. It’s bad for a community to reach a point where it’s taboo to profess dissent, but also for it to reach a point where it’s taboo to profess wholehearted agreement.
Wouldn’t it be better if the professed agreement was agreement with ideas rather than with people? The dissent counterpart of this post would say “I disagree with everything what this person says”. That’s clearly pathological.
This may sound wrong, but “who said that” is Bayesian evidence, and sometimes rather strong evidence. If your experience tells you that a given person is right about 95% of the things they say, it is rational to assign a 95% prior probability to the other things they say.
It is said that we should judge ideas by the ideas alone. In Bayes-speak, this means that if you update correctly, enough evidence can fix a wrong prior (how much evidence is needed depends on how wrong the prior was). But gathering evidence is costly, and we cannot pay that cost for every idea around us. Why use a worse prior if a better one is available?
Back to human-speak: if person A is a notorious liar (or a mindkilled person who repeats someone else’s lies), person B is careless about their beliefs, and person C examines every idea carefully before telling it to others, then it is rational to react differently to ideas spoken by these three people. The word “everything” is too strong, but saying “if person C said that, I believe it” is OK (assuming that, given enough counter-evidence, you will update both on the idea and on the credibility of C).
Disagreeing with everything one says would be trying to reverse stupidity. There are people who do “worse than random”, so doing the opposite of what they say could be a good heuristic; but even for most of them, assigning a 95% probability that they are wrong would be too much.
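To make the update described above concrete, here is a minimal sketch with made-up numbers: the speaker’s track record sets the prior, and the same piece of counter-evidence moves a 0.95 prior much less far than a 0.5 prior. All the probabilities are hypothetical.

```python
def posterior(prior, p_e_if_true, p_e_if_false):
    """Bayes' rule: P(claim true | evidence)."""
    joint_true = prior * p_e_if_true
    joint_false = (1 - prior) * p_e_if_false
    return joint_true / (joint_true + joint_false)

# Hypothetical counter-evidence: 10x more likely if the claim is false.
p_e_if_true, p_e_if_false = 0.05, 0.5

# Person C's track record justifies a 0.95 prior; an unknown speaker gets 0.5.
from_c = posterior(0.95, p_e_if_true, p_e_if_false)         # ~0.66
from_stranger = posterior(0.50, p_e_if_true, p_e_if_false)  # ~0.09
```

The same evidence drops the unknown speaker’s claim to “probably false” while C’s claim stays near even odds, which is the sense in which a better prior saves evidence-gathering effort.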
You are right, but there is probably some misunderstanding. That personal considerations should be ignored when assessing the probability of an idea, and that one shouldn’t express collective agreement with ideas based on their author, are different suggestions. You argue against the former, while I was stating the latter.
It’s important to take into account the context. When an idea X is being questioned, saying “I agree with X, because a very trustworthy person Y agrees with X” is fine with me, although it isn’t the best sort of argument one could provide. Starting the discussion “I agree with X1, X2, X3, … Xn”, on the other hand, makes any reasonable debate almost impossible, since it is not practical to argue n distinct ideas at once.
Well, the professed agreement of this post is with the Sequences, which are a set of ideas rather than a person, even if they were all written by one person. The dissent counterpart of this post would be “I disagree with the entire content of the sequences.”
Am I misunderstanding you about something?
Professing agreement or disagreement with a broad set of rather unrelated ideas is not conducive to productive discussion, because there is no single topic to concentrate on with object-level arguments. Having the set of ideas defined by its author brings in tribal political instincts, which is also not helpful. You are right that the post was formulated as agreement with the Sequences rather than with everything Yudkowsky ever said, but I don’t see how this distinction is important. “Everything Yudkowsky ever said” would also denote a set of ideas, after all.
Albeit an internally inconsistent set, given that Yudkowsky has occasionally changed his mind about things.
Well, that’s either an exaggeration or an audacious lie.
In any case, I had neither the means nor the desire to list every idea of theirs I agree with.
“I think the Sequences are right about everything” is a pro-Sequences position of roughly the same extremity as “I think the Sequences are wrong about everything.”
As far as I know, nobody claims the latter, and it seems kind of absurd on the face of it. The closest I’ve seen anyone come to it is something like “everything true in the Sequences is unoriginal, everything original in them is false, and they are badly written”… which is a pretty damning criticism, but still allows for the idea that quite a lot of what the text expresses is true.
Its equally extreme counterpart on the positive axis should, then, allow for the idea that quite a lot of what the text expresses is false.
To reject overly-extreme positive statements more strongly than less-extreme negative ones is not necessarily expressing a rejection of positivity; it might be a rejection of extremism instead.
I wouldn’t say that they’re comparably extreme. A whole lot of the content of the sequences is completely uncontroversial, and even if you think Eliezer is a total lunatic, it’s unlikely that they wouldn’t contain any substantively true claims. I’d bet heavily against “The Biggest Secret” by David Icke containing no substantively true claims at all.
I would have some points of disagreement with someone who thinks that everything in the sequences is correct (personally, I doubt that MWI is a slam dunk, because I’m not convinced Eliezer is accurately framing the implications of collapse, and I think CEV is probably a dead end when it comes to programming an FAI, although I don’t know of any team which I think has better odds of developing an FAI than the SIAI.) But I think someone who agrees with the entirety of the contents is being far more reasonable than someone who disagrees with the entirety.
I agree that “I think the Sequences are wrong about everything” would be an absurdly extreme claim, for the reasons you point out.
We disagree about the extremity of “I think the Sequences are right about everything”.
I’m not sure where to go from there, though.
Suppose you separate the Sequences into “original” and “unoriginal”.
The “unoriginal” segment is very likely to be true: agreeing with all of it is fairly straightforward, and disagreeing with all of it is ridiculously extreme.
To a first approximation, we can say that the middle-ground stance on any given point in the “original” segment is uncertainty. That is, accepting that point and rejecting it are equally extreme. If we use the general population for reference, of course, that is nowhere near correct: even considering the possibility that cryonics might work is a fairly extreme stance, for instance.
But taking the approximation at face value tells us that agreeing with every “original” claim, and disagreeing with every “original” claim, are equally extreme positions. If we now add the further stipulation that both positions agree with every “unoriginal” claim, they both move slightly toward the Sequences, but not by much.
So actually, (1) “I agree with everything in the Sequences” and (2) “Everything true in the Sequences is unoriginal, everything original in them is false” are roughly equally extreme. If anything, we have made an error in favor of (1). On the other hand, (3) “Everything in the Sequences is false” is much more extreme, because it also rejects the “unoriginal” claims, each of which is almost certainly true.
P.S. If you are like me, you are now wondering what “extreme” means. To be extremely technical (ha), I am interpreting it as the probability that a reasonable, boundedly-rational person would hold a given position on the Sequences. For instance, a post that says “confirmation bias is a thing” is uncontroversial, and you expect that reasonable people will believe it with probability close to 1. A post that says “MWI is obviously true” is controversial, and if you are generous you will say that there is a probability of 0.5 that a reasonable person will agree with it. This might be higher or lower for other posts in the “original” category, but on the whole the approximation of 0.5 is probably favorable to the person who agrees with everything.
So when I conclude that (1) and (2) are roughly equally extreme, I am saying that a “reasonable person” is roughly equally likely to end up at either one of them. This is an approximation, of course, but they are certainly both closer to each other than they are to (3).
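The arithmetic behind this comparison can be sketched with invented numbers: suppose a reasonable person independently accepts each “original” claim with probability 0.5 and each “unoriginal” claim with probability 0.99, and suppose (purely for illustration) there are 10 original and 20 unoriginal claims. Then positions (1) and (2) come out exactly equally likely, while (3) is astronomically less likely.

```python
P_ORIG, P_UNORIG = 0.5, 0.99   # per-claim acceptance probabilities (assumed)
N_ORIG, N_UNORIG = 10, 20      # hypothetical claim counts

# (1) accepts every claim, original and unoriginal.
p1 = P_ORIG ** N_ORIG * P_UNORIG ** N_UNORIG
# (2) rejects every original claim but accepts every unoriginal one.
p2 = (1 - P_ORIG) ** N_ORIG * P_UNORIG ** N_UNORIG
# (3) rejects everything, including the near-certain unoriginal claims.
p3 = (1 - P_ORIG) ** N_ORIG * (1 - P_UNORIG) ** N_UNORIG

# p1 == p2 exactly (because P_ORIG = 0.5), while p3 is smaller
# by a factor of (0.99 / 0.01) ** 20, roughly 10**40.
```

Under these assumptions, a “reasonable person” lands on (1) or (2) with equal probability, and essentially never on (3), which is the sense in which (1) and (2) are comparably extreme and (3) is far more so.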
Yeah, I think I agree with everything here as far as it goes, though I haven’t looked at it carefully. I’m not sure originality is as crisp a concept as you want it to be, but I can imagine us both coming up with a list of propositions that we believe captures everything in the Sequences that some reasonable person somewhere might conceivably disagree with, weighted by how reasonable we think a person could be and still disagree with that proposition, and that we’d end up with very similar lists (perhaps with fairly different weights).
I just read that essay and I disagree with it. Stating one’s points of disagreement amounts to giving the diffs between your mind and that of an author. What’s good practice for scientific papers (in terms of remaining dispassionate) is probably good practice in general. The way to solve the cooperation problem is not to cancel out professing disagreement with professing agreement, it’s to track group members’ beliefs (e.g. by polling them) and act as a group on whatever the group consensus happens to be. In other words, teach people the value of majoritarianism and its ilk and tell them to use this outside view when making decisions.
What’s good practice for scientific papers (in terms of remaining dispassionate) is probably good practice in general.
In terms of epistemic rationality, you can get by fine by raising only points of disagreement and keeping it implicit that you accept everything you do not dispute. But in terms of creating effective group cooperation, which has instrumental value, this strategy performs poorly.
Oho! But it is not. You know, the nervousness associated with wanting to not be part of a cult is also a cult attractor. Once again I must point out that you are conveying only connotations, not denotations.
Let’s discuss ideas, not affiliations!
No slogans!
Perhaps you could experimentally start using the less wrong anti-kibitzer feature to see if your perception of LW changes?
I tried this for a few weeks; it didn’t change anything.
That sounds wrong to me.
I’m more motivated by making Less Wrong a good place to discuss ideas than any kind of nervousness.
Meta-ideas are ideas too, for example:
An idea: “Many-Worlds Interpretation of quantum physics is correct, because it’s mathematically correct and simplest according to Occam’s Razor (if Occam’s Razor means selecting the interpretation with greatest Solomonoff Prior).”—agree or disagree.
A meta-idea: “Here is this smart guy called Eliezer. He wrote a series of articles about Occam’s razor, the Solomonoff prior, and quantum physics; those articles are relatively easy for a layman to read, and they also explain mistakes frequently made when discussing these topics. Reading those articles before you start discussing your opinions (which would otherwise, with high probability, repeat those frequently made mistakes) is a good way to make the conversation efficient.”—agree or disagree.
This topic is about the meta-idea.
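For the object-level “idea” above, the reading of Occam’s Razor as a Solomonoff prior can be sketched as weighting each interpretation by 2^(-description length in bits). The bit lengths below are invented purely for illustration; nobody knows the real ones.

```python
# Hypothetical description lengths, in bits (invented for illustration).
lengths_bits = {"many_worlds": 100, "collapse": 110}

# Solomonoff-style weighting: prior mass proportional to 2**(-length).
weights = {h: 2.0 ** -n for h, n in lengths_bits.items()}
total = sum(weights.values())
prior = {h: w / total for h, w in weights.items()}

# A 10-bit-shorter description gets 2**10 = 1024x the prior weight.
ratio = prior["many_worlds"] / prior["collapse"]
```

The point of the sketch is only that under this formalization, “simplest” cashes out as “shortest program”, and prior weight falls off exponentially in description length.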
Yes, but that was not what Grognor wrote… He professed a bunch of beliefs and complained that he felt in the minority for having them.
He even explicitly discouraged discussing individual beliefs:
If you also stand by the sequences, feel free to say that. If you don’t, feel free to say that too, but please don’t substantiate it. I don’t want this thread to be a low-level rehash of tired debates, though it will surely have some of that in spite of my sincerest wishes.
In other words, he prefers to discuss high-level concerns like whether you are with him or against him over low-level nuts-and-bolts details.
Edit: I see that Grognor has added a statement of regret at the end of his post. I’m willing to give him some of his credibility back.
He professed a bunch of beliefs and complained that he felt in the minority for having them.
I don’t like your tone. Anyway, this is wrong; I suspected I was part of a silent majority. Judging by the voting patterns (every comment indicating disagreement is above 15, every comment indicating agreement is below 15, and half are even negative) and by the replies themselves, I was wrong, and the silence is because this actually is a minority position.
In other words, he prefers to discuss high-level concerns like whether you are with him or against him over low-level nuts-and-bolts details.
No! Just in this thread! There are all the other threads on the entire website for debating at the object level. I am tempted to say that fifteen more times, if you do not believe it.
I’m willing to give him some of his credibility back.
O frabjous day, JMIV does not consider me to be completely ridiculous anymore. Could you be more patronizing?
Edit, in response to reply: In retrospect, a poll would have been better than what I ended up doing. But doing nothing would have been better still. At least we agree on that.
Hm. Perhaps you could have created an anonymous poll if you wished to measure opinions? Anonymity means people are less likely to form affiliations.
Just one thread devoted to politics is probably okay, but I would prefer zero.
(every comment indicating disagreement is above 15, every comment indicating agreement is below 15 and half are even negative)
This may not indicate what you think it indicates. In particular, I (and, I suspect, other people) try to vote up comments that make interesting points even if we disagree with them. In this context, some upvoting may be due to the interestingness of the remarks, which in some contexts is inversely correlated with agreement. I don’t think this accounts for the entire disparity, but it likely accounts for some of it. This, combined with a deliberate desire to be non-cultish in voting patterns, may account for much of the difference.
See Cultish Countercultishness, unless of course you disagree with that too.