Criticism of this article was found at a talk page at RationalWiki.
The Sequences do not contain unique ideas, and they present the ideas they do contain in misleading ways using parochial language. The “Law of Conservation of Expected Confidence” essay, for instance, covers ideas that are often covered in introductory philosophical methods or critical thinking courses. There is novelty neither in the idea that your expected future credence must match your current credence (otherwise, why not update your credence now?), nor in the idea that if E is evidence for H, then ~E is evidence for ~H (though E and ~E may have very different evidential strength), and Yudkowsky’s treatment is imprecise and, in combining multiple points, muddles things more than it illuminates them. Besides that, the former notion has sparked substantial controversy in epistemology, owing to cases wherein people apparently can reasonably expect to have their minds changed without changing them right now. While popular, Bayesianism is not univocally accepted by epistemologists, and it’s not because they’re irrational.
I found this article because someone linked it to the all roads lead to Rome (ARLR) fallacy [https://www.reddit.com/r/Destiny/comments/1dyuxtm/the_all_road_lead_to_rome_fallacy_is_better/], where there’s no evidence that would reduce someone’s confidence but there is evidence that would increase it. This commonly occurs with conspiracy theorists, for whom an absence of evidence is evidence of a cover-up. I think the article was a good pointer even if the point is kind of trivial. I still had to sit down and formally prove some of the author’s conclusions, however. For example, P(H|E) > P(H) --> P(H|~E) < P(H), which more directly shows that a Bayesian reasoner can’t fall for the ARLR fallacy. So I think the author could have said more with less. As a matter of fact, the quote you gave actually clarified some points for me.
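For anyone who wants to check that implication without doing the algebra, here’s a quick numerical sketch (the function name and the example numbers are mine) using only the law of total probability:

```python
# Conservation of expected evidence: if E raises the probability of H,
# then ~E must lower it, because the prior is a weighted average of the
# two posteriors: P(H) = P(H|E)P(E) + P(H|~E)P(~E).

def posterior_given_not_e(p_h, p_e, p_h_given_e):
    """Solve the law of total probability for P(H|~E)."""
    return (p_h - p_h_given_e * p_e) / (1 - p_e)

# Example: prior P(H) = 0.3, P(E) = 0.4, and E confirms H: P(H|E) = 0.6.
p_h, p_e, p_h_given_e = 0.3, 0.4, 0.6
p_h_given_not_e = posterior_given_not_e(p_h, p_e, p_h_given_e)

assert p_h_given_e > p_h          # E confirms H ...
assert p_h_given_not_e < p_h      # ... so ~E must disconfirm H

# And the expected posterior equals the prior (the "conservation" claim):
expected_posterior = p_h_given_e * p_e + p_h_given_not_e * (1 - p_e)
assert abs(expected_posterior - p_h) < 1e-12
```

The ARLR connection is immediate: a reasoner who treats E as confirming but refuses to let ~E disconfirm is asserting two posteriors whose weighted average exceeds their own prior, which is inconsistent.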
I’d like to add that this article isn’t clear about its normative commitments, such as strong Bayesianism [that all valid inferences are Bayesian ones]. Given this article’s place in The Sequences, it could prime readers to think, among other things, that strong Bayesianism is uncontroversial. On a personal note, I’m in the camp that we have limited control over our beliefs [they are largely determined by the information we consume], and this supports my applied position that we should deliberately expose ourselves to confirmatory information until our credences are at a prescribed level. This is to make the point that the author offers a certain theory of mind when they say “this realization can take quite a load off your mind”. So you definitely aren’t getting the same amount of neutrality as you would get from, say, SEP. However, I don’t think this is as bad as quote guy makes it out to be, if it’s even bad at all.
All that being said, I think quote guy is kind of missing the point. I don’t think any blog about the basics of reasoning is going to introduce anything new that wouldn’t be covered in class [that’s just beyond its scope]. So I think that criticism isn’t entirely fair, since I see way more value in stating the obvious. However, the front-page vibe of The Sequences feels like it’s supposed to be “deep”, and it seems to support an epistemology that I’m heavily opposed to [I’ll affectionately call it optimistic rationalism]. And because I’m one to think the truth tends to be boring, that kind of marketing is always going to give me pause.
Really, this article is better for its pure philosophy, and I don’t think there are any unique applications for it. After all, witch hunters aren’t Bayesian reasoners.
What do you guys think?
https://rationalwiki.org/wiki/Talk:LessWrong#EA_orgs_praising_AI_pseudoscience_charity._Is_it_useful.3F