This bit of the article jumped out at me:

But, when the researchers took a closer look, they found that the only people who had changed their views were those who were ideologically predisposed to disbelieve the fact in question. If someone held a contrary attitude, the correction not only didn’t work—it made the subject more distrustful of the source. [...] If information doesn’t square with someone’s prior beliefs, he discards the beliefs if they’re weak and discards the information if the beliefs are strong.
As unfortunate as this may be, even perfect Bayesians would reason similarly; Bayes’s rule essentially quantifies the trade-off between discarding new information and discarding your prior when the two conflict. (Which is one way in which Bayesianism is a theory of consistency rather than simple correctness.)
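To make the trade-off concrete, here is the odds form of Bayes's rule; the numbers below are made up purely for illustration, not taken from the article:

$$
\frac{P(H \mid E)}{P(\neg H \mid E)} \;=\; \frac{P(H)}{P(\neg H)} \times \frac{P(E \mid H)}{P(E \mid \neg H)}
$$

If your prior odds on H are 99:1 and the evidence favors ¬H ten to one (a likelihood ratio of 1/10), the posterior odds are still 9.9:1, so the belief survives nearly intact. If your prior odds are only 3:2, the same evidence leaves posterior odds of 3:20, and the belief gets discarded. Both agents apply exactly the same rule; only their priors differ, which is one sense in which the rule enforces consistency rather than correctness.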
[Link] why do people persist in believing things that just aren’t true
The square brackets are greedy. What you want to do is this:
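Something along these lines, escaping the brackets you want shown literally and with `url` standing in for the article's address:

```markdown
\[Link\]: [Why do people persist in believing things that just aren't true?](url)
```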
which looks like:
[Link]: Why do people persist in believing things that just aren’t true?
fixed. Thanks.