The title says that sufficiently sincere confirmation bias is indistinguishable from real science. But I don’t see how what’s described here differs much from real science (the attitude of the NYU people versus that of scientists).
You say:

What made this work? I think what happened is that they took their own beliefs literally. They actually believed that people hated Hillary because she was a woman, and so their idea of something that they were confident would show this clearly was a fair test.
I’m a little confused. Isn’t this just saying that these people held real beliefs, rather than, say, belief-in-belief? So when contrary evidence appeared, they were able to change their minds?
I dunno; I’m not super convinced that it’s confirmation bias which causes this sort of good epistemic behavior. (As in, I wouldn’t expect this sort of thing to happen much in this kind of situation, and maybe this case is unique?)
It’s sincerity that causes this sort of behavior.
I’m not sure I have a good internal picture of what “sincerity” is pointing at. Does being sincere differ much from “truly, actually, super-duper, very much so” believing in something?
I think I mean the same thing you mean by “real beliefs, rather than, say, belief-in-belief”. So, I’m saying, it’s not confirmation bias that causes the good thing, it’s sincerity that makes the confirmation bias comparatively harmless.
Gotcha, thanks.
Real belief is actually moderately rare. People don’t generally believe stuff that they might get laughed at about anymore. Find one person who believes something they didn’t read on Wikipedia and it’s a weird week.
I grant that most people may not hold too many real beliefs, in the normal sense of the word, but is this also generally true of scientists who are conducting studies? It feels like you’d need to believe that X was true in order to run a study in the first place...?
Or are we assuming that most scientists are running off weakly held beliefs and just “doing things”?
(I really don’t know much about what the field might be like.)