I found this article because someone linked it in a discussion of the all roads lead to Rome (ARLR) fallacy [https://www.reddit.com/r/Destiny/comments/1dyuxtm/the_all_road_lead_to_rome_fallacy_is_better/], where there’s no evidence that would reduce someone’s confidence but there is evidence that would increase it. This commonly occurs with conspiracy theorists, for whom an absence of evidence is evidence of a cover-up. I think the article was a good pointer even if the point is kind of trivial. I still had to sit down and formally prove some of the author’s conclusions, however. For example, P(H|E) > P(H) --> P(H|~E) < P(H), which more directly shows that a Bayesian reasoner can’t fall for the ARLR fallacy. So I think the author could have said more with less. As a matter of fact, the quote you gave actually clarified some points for me.
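For anyone who wants it, here’s my sketch of that derivation, using nothing beyond the law of total probability: P(H) = P(H|E)P(E) + P(H|~E)P(~E), so whenever 0 < P(E) < 1, P(H) is a weighted average of P(H|E) and P(H|~E). If P(H|E) > P(H), the average can only come out to P(H) if P(H|~E) < P(H). In other words, if observing E would raise your credence in H, then observing ~E must lower it; there is no such thing as evidence that can only point one way.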
I’d like to add that this article isn’t clear about its normative commitments, such as strong Bayesianism [the view that all valid inferences are Bayesian ones]. Given this article’s place in The Sequences, it could prime readers to think, among other things, that strong Bayesianism is uncontroversial. On a personal note, I’m in the camp that we have limited control over our beliefs [they are largely determined by the information we consume], and this supports my applied position that we should deliberately expose ourselves to confirmatory information until our credences are at a prescribed level. My point is that the author is offering a particular theory of mind when they say “this realization can take quite a load off your mind”. So you definitely aren’t getting the same level of neutrality you would get from, say, the SEP. However, I don’t think this is as bad as quote guy makes it out to be, if it’s even bad at all.
All that being said, I think quote guy is kind of missing the point. I don’t think any blog post about the basics of reasoning is going to introduce anything new that wouldn’t be covered in class [that’s just beyond its scope]. So that criticism isn’t entirely fair, since I see a lot of value in stating the obvious. However, the front-page vibe of The Sequences feels like it’s supposed to be “deep”, and it seems to support an epistemology I’m heavily opposed to [I’ll affectionately call it optimistic rationalism]. And because I’m one to think the truth tends to be boring, that kind of marketing is always going to give me pause.
Really, this article is better as pure philosophy, and I don’t think there are any unique applications for it. After all, witch hunters aren’t Bayesian reasoners.