Let’s say I want to know whether it’s safe for my friend to bike to work. My own memories are truth indicative, but so are my friends’ and neighbors’ [and online surveys]… The trouble is my own memories arrive in my head with extreme salience, and move my automatic anticipations a lot; while my friends’ have less automatic impact, and those of the surveyed neighbors still less... our automatic cognition tends not to weigh the evidence evenly at all.
I sometimes wonder, though, whether giving one’s own experiences greater weight in situations like these (but not in the thermometer situation) is rational:
People lie (especially in online surveys); firsthand evidence should be worth more than evidence whose validity is in question (see the sketch after this list).
There are a large number of unknown and unanalyzed factors, some of which vary with the individual (I’m more or less coordinated and accident-prone than average, I’m on better or worse terms with the rough crowd in the neighborhood, etc.). This information may not be obvious enough to consider consciously.
If I have a sneezing fit every single time I encounter a bullfrog, and no one’s ever heard of a bullfrog allergy, and medical science doesn’t support the notion, it’s still going to be difficult (and, I think, possibly irrational) to arrive at the pond without a Kleenex. It seems to me that in gray-area situations with strong personal evidence, admitting you don’t know why is at least as rational as concluding you’re wrong.
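One way to make the “discount questionable evidence” idea from the first point concrete: treat each source’s reports as observations in a beta-binomial model of per-commute accident risk, and let a trust weight shrink how many effective observations a source contributes. This is a minimal sketch; the model, the trust parameter, and every number in it are illustrative assumptions of mine, not anything established in the discussion.

```python
from dataclasses import dataclass

@dataclass
class EvidenceSource:
    crashes: int   # reported bad outcomes (accidents, close calls)
    safe: int      # reported uneventful commutes
    trust: float   # 0..1 -- how much we believe the reports (1 = firsthand)

def posterior_mean_risk(sources, prior_a=1.0, prior_b=1.0):
    """Beta-binomial estimate of per-commute accident risk.

    Each report is discounted by its source's trust weight, so a
    response from a possibly-lying survey taker counts as only a
    fraction of one firsthand observation.
    """
    a, b = prior_a, prior_b
    for s in sources:
        a += s.trust * s.crashes
        b += s.trust * s.safe
    return a / (a + b)

# Illustrative numbers: firsthand experience at full weight, friends
# slightly discounted, an anonymous online survey heavily discounted.
sources = [
    EvidenceSource(crashes=1,  safe=199,  trust=1.0),  # my own commutes
    EvidenceSource(crashes=2,  safe=598,  trust=0.8),  # friends' reports
    EvidenceSource(crashes=50, safe=4950, trust=0.3),  # online survey
]
print(f"estimated risk per commute: {posterior_mean_risk(sources):.4f}")
```

Note that trust discounting changes how much each source moves the estimate, not what the source says — the “weight” framing, rather than a binary accept/reject of the survey.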
The relevant question, I believe, is how much weight you should give the evidence from different sources. You should not assume that the amount of weight we intuitively give evidence from our own experience is optimal, and this permits a reversal test: if you wouldn’t deliberately choose your current weighting, ask whether shifting weight toward or away from your own experience would improve your predictions.
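One way to make the reversal test concrete, under the simplifying (and assumed) premise that every source gives an unbiased estimate of your personal accident rate plus independent noise: the minimum-variance combination uses inverse-variance weights, which you can then compare against the weights your salience-driven intuition actually applies. All numbers below are invented for illustration; individual-varying factors like those in the second point above can be folded in as extra variance on other people’s estimates.

```python
def optimal_weights(variances):
    """Inverse-variance weights: the precision-weighted average is the
    minimum-variance combination of independent, unbiased estimates
    of the same quantity."""
    precisions = [1.0 / v for v in variances]
    total = sum(precisions)
    return [p / total for p in precisions]

# Invented numbers: my own estimate is noisy (few commutes to go on);
# the survey is precise but partly about other people's circumstances,
# which we model here as inflating its variance somewhat.
estimates = [0.005, 0.010, 0.012]    # me, friends, online survey
variances = [0.004, 0.002, 0.0008]   # noisier -> less optimal weight

weights = optimal_weights(variances)
combined = sum(w * x for w, x in zip(weights, estimates))
print("optimal weights:", [round(w, 3) for w in weights])
print(f"combined risk estimate: {combined:.4f}")
```

If the weight your gut puts on your own experience is far above its inverse-variance weight, the reversal test asks whether you would endorse deliberately moving weight toward the other sources; if not, you need a reason — such as the individual-varying factors above — why others’ data is a biased estimator for your own situation.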