Yes, this is a semantic issue of what counts as “checking”, but that is exactly the issue at hand. Of course it’s possible to check claims against memory, intuition, mental calculation, the Internet, etc, but every such check has only limited reliability.
That is correct, but as Isaac Asimov pointed out in The Relativity of Wrong, there is a big difference between saying, “Every such check has limited reliability,” and “Checking is the same as not checking.” If someone came to me tomorrow and said, “You’re completely wrong, quanticle, in fact Australia has a larger land mass than Asia,” I would be skeptical, and I would point out the massive preponderance of evidence in my favor. But if they managed to produce the extraordinary evidence required for me to update my beliefs, I would. However, they would have to actually produce that evidence. Simply saying, “I intuitively believe it to be true with high probability,” is not evidence.
To go back to the original claim you took issue with:
Couldn’t it also be the case that the claim is already known through intuition
In this case I did mean “intuition” to include some checks, e.g. compatibility with memory, analogy with similar cases, etc. Brains already run such checks when processing thoughts (some thoughts register as surprising and some don’t). But the point is that these checks are insufficient to convince a skeptical audience. Which is why “I intuitively believe this” is not an argument, even if it is Bayesian evidence to the intuition-haver. (And intuitions can trivially be Bayesian evidence in cases where they are correlated with reality, e.g. due to mental architecture, and such correlations can be evaluated against a historical track record.)
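The point that an intuition can be Bayesian evidence to the extent it correlates with reality can be made concrete with a small sketch. The numbers below are purely hypothetical: they stand in for a historical track record of how often someone’s intuition fires on claims that turned out true versus false, and Bayes’ rule then gives the posterior credence the intuition-haver (but not a skeptical audience) is entitled to.

```python
def posterior(prior, p_fire_given_true, p_fire_given_false):
    """P(claim is true | intuition fired), via Bayes' rule.

    prior: prior probability that the claim is true
    p_fire_given_true: historical rate at which the intuition
        fired on claims that turned out true
    p_fire_given_false: historical rate at which it fired on
        claims that turned out false
    """
    numerator = p_fire_given_true * prior
    denominator = numerator + p_fire_given_false * (1 - prior)
    return numerator / denominator

# Hypothetical track record: the intuition fires on 80% of true
# claims but also on 30% of false ones; prior credence is 50%.
print(posterior(0.5, 0.8, 0.3))  # ≈ 0.727

# An intuition with no correlation to reality moves nothing:
print(posterior(0.5, 0.5, 0.5))  # = 0.5
```

Note that this update is only available to someone who actually has the track-record data; to everyone else, “I intuitively believe this” carries no such likelihood ratio, which is the asymmetry being discussed here.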
There seem to be some semantic disagreements here about what constitutes “evidence”, “intuition”, “checking”, etc, which I’m not that enthusiastic about resolving in this discussion, but are worth noting anyway.
But these checks are insufficient to convince a skeptical audience, is the point.
Yes, I see that as a feature, whereas you seem to see it as something of a bug. Given our propensity for self-deception and the limits of our brains, we should gather evidence even when our intuition is very strong, and we should be suspicious of others who have strong intuitions but don’t seem to have any sort of analytical evidence to back their claims up.
I don’t see any risk to hiding the origins of one’s ideas, if one has experimental evidence confirming them. Similarly, I don’t see the benefit of disclosing the sources of unconfirmed ideas. Where the idea comes from (a dream, an intuitive leap, an LSD trip, a reasoned inference from a literature review) is far less important than actually doing the work to confirm or disprove the idea.