“Not if they change their minds when confronted with the evidence.”
“Would you do that?”
“Yeah.”
This is where I think the chain of logic makes a misstep. The argument assumes that you will be able to distinguish evidence that should change your mind from evidence that is insufficient to change it. But doing so is not trivial. In complicated fields especially, merely understanding new evidence well enough to update on it can require significant education.
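To put the difficulty in Bayesian terms (my framing, not the dialogue's): updating on a piece of evidence E means judging how likely E would be under each competing hypothesis, and that likelihood ratio is exactly the quantity a non-expert is in no position to estimate.

\[
\frac{P(H_1 \mid E)}{P(H_2 \mid E)} \;=\; \frac{P(E \mid H_1)}{P(E \mid H_2)} \cdot \frac{P(H_1)}{P(H_2)}
\]

The prior odds are yours to hold, but the likelihood ratio is a claim about what the evidence actually implies, and assessing it takes domain knowledge. Without that, the direction and size of any update are guesswork.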
I would not encourage a layperson to have an opinion on the quantization of gravity, regardless of how willing they might be to update based on new evidence, because they won't be able to understand that evidence. And that's assuming they can even understand the issue well enough to hold a coherent opinion at all. My work is fairly adjacent to quantum gravity, and I barely understand the issue well enough to grasp the different positions. I wouldn't trust myself to meaningfully update based on new papers (beyond how the papers' authors tell me to update), let alone trust a layperson to.
The capacity to change a wrong belief takes more than the will to do so. And where one cannot reliably interpret the data well enough to reject wrong beliefs, it is far better not to hold beliefs at all. Instead, cultivate good criteria for trusting relevant authorities, or, lacking trusted authorities, simply acknowledge your ignorance and accept that any decision you make will rest on loose guesswork.