if someone spoke for something smaller than LW, e.g. Bayesian Epistemology, that’d be fine. CR and Objectivism, for example, can be questioned and have people who will answer (unlike science itself).
and if someone wanted to take responsibility for gjm-LW or lumifer-LW or some other body of ideas which is theirs alone, that’d be fine too. but people aren’t doing this as a group or individually!
The fact that Objectivism has cultists who want to defend the Objectivist way isn't a quality worth emulating. If CR copies the same groupthink structures, that's no argument in its favor either.
I like Ayn Rand’s writing, not whatever you think is a “cult”. See e.g. http://curi.us/1930-harry-binswanger-refuses-to-think
If you have an argument about Ayn Rand’s ideas, that would be important.
Regardless, you can get correct answers to tons of common questions about Objectivism at a variety of places online (including both pro-ARI and anti-ARI places). That's good. And Binswanger, linked negatively above, has engaged with Popperian criticism more than anyone at LW has. He has also combined seriously writing ideas down with discussing them, whereas LW people seem to do much of only one or the other, which I think is a big problem.
Speaking for "objectivism" instead of for someone's personal opinions implies structures that get people to think alike in a cultish way.
You can, of course, go and bother Eliezer. I doubt he would be inclined to listen to you, though.
Eliezer has already indicated [1] he'd rather take administrative action to prevent discussion than speak to the issues. No Paths Forward there!
[1] http://lesswrong.com/lw/56m/the_conjunction_fallacy_does_not_exist/3wf5
That’s … not a very accurate way of describing what happened. Not because there’s literally no way to understand it that makes it factually correct, but because it gives entirely the wrong impression.
Here’s a more complete description of what happened.
curi came here in early April 2011 (well, he actually first appeared earlier, but before then he made a total of three comments ever) and posted five lengthy top-level posts in five days. They were increasingly badly received by the community, getting scores of −1, −1, −1, −22, −38. The last one was entitled “The conjunction fallacy does not exist” and what it attempted to refute was a completely wrong statement of what the conjunction fallacy is about, namely the claim (which no one believes) that “people attribute higher probability to X&Y than to Y” for all X and Y.
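(To make the probability point concrete: for any events X and Y, P(X∧Y) ≤ P(Y), so nobody claims people always rank a conjunction above one of its conjuncts; the conjunction fallacy as actually studied, e.g. in the Linda problem, is that people sometimes judge a specific vivid conjunction more probable than one of its parts. Here is a minimal sketch of the uncontroversial inequality, using a toy distribution whose numbers are invented purely for illustration:

```python
# Toy joint distribution over two binary events X and Y.
# All numbers are invented for illustration; they just need to sum to 1.
joint = {
    (True, True): 0.10,   # X and Y both happen
    (True, False): 0.25,  # only X
    (False, True): 0.30,  # only Y
    (False, False): 0.35, # neither
}

p_y = sum(p for (x, y), p in joint.items() if y)  # P(Y)
p_x_and_y = joint[(True, True)]                   # P(X ∧ Y)

# This holds for any distribution whatsoever, because every outcome
# counted in P(X ∧ Y) is also counted in P(Y).
assert p_x_and_y <= p_y
print(f"P(X∧Y) = {p_x_and_y:.2f} <= P(Y) = {p_y:.2f}")
```

The empirical fallacy is about human judgments deviating from this inequality in particular cases, not about the inequality itself being in doubt.)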
As this was happening, more and more of the comments on curi’s posts were along the general lines of this one saying, in essence: This is not productive, you are just repeating the same wrong things without listening to criticism, so please stop.
It was suggested that there was some reason to think curi was using sockpuppets to undo others’ downvotes and keep enough karma to carry on posting.
And then, in that context, curi’s fifth post—which attempted to refute the conjunction fallacy but which completely misunderstood what the conjunction fallacy is, and which was sitting on −38 points—was removed.
Now, maybe that’s because Eliezer was afraid of curi’s ideas and wanted to close down discussion or something of the sort. But a more plausible explanation is that he thought further discussion was likely to be a waste of time for the same reason as several commenters.
I don’t think removing the post was a good decision, and generally I think Eliezer’s moderation has been too heavy-handed on multiple occasions. But I don’t think the kind of explanation curi is offering for this is at all likely to be correct.
On the other hand, if curi is merely saying that Eliezer is unlikely to be interested if curi contacts him and asks for a debate on Bayes versus CR, then I think he’s clearly right about that.
Yep, sounds like Eliezer. No surprises.
Well, both Lumifer and I have (mostly in different venues) been answering a lot of questions and criticisms you’ve posed. But no, I don’t think either of us feels “responsibility” in the specific (and, I think, entirely non-standard) sense you’re using here, where to “take responsibility” for a set of ideas is to incur a limitless obligation to answer any and all questions and criticisms made of those ideas.
there are methods for doing Paths Forward with limited resource use. you just don’t want to learn/discuss/use them.
The total of what your “paths forward” page says about limited resources: (1) instead of writing your own answers to every criticism, you can point critics to already-written things that address their criticisms; (2) if you have a suitable forum with like-thinking other people there, they may address the criticisms for you.
Perhaps it seems to you that these make it reasonable to have a policy of addressing every criticism and question despite limited resources. It doesn’t seem so to me.
I have read your document, I am not convinced by your arguments that we should attempt to address every single criticism and question, I am not convinced by your arguments that we can realistically do so, and I think the main practical effects of embracing your principles on this point would be (1) to favour obsessive cranks who have nothing else to do with their time than argue about their pet theories, (2) to encourage obsessive-crank-like behaviour, and (3) to make those who embrace them spend more time arguing on the internet. I can’t speak for others, but I don’t want to give advantages to obsessive cranks, I don’t want to become more obsessive and cranky myself, and I think it much more likely that I spend too much time arguing on the internet rather than too little.
I see nothing to suggest that further investigation of “paths forward” is likely to be a productive use of my time.
So: no, I don’t want to spend more time learning, discussing, or using “paths forward”. I think it would be a suboptimal way to use that time.
By Jove, I think you got it!
:-D