To apply it to a more manageable example, my beliefs about psychological sex differences in humans have changed considerably over both long and short timescales, to the point where I actively anticipate having different beliefs about them in the near future. In spite of this, I have no way of knowing which of those beliefs I’m going to demote or reject in future, because if I had such information it would be factored into the beliefs themselves.
Beliefs about facts that have been extensively studied probably won’t change, unless I expect new observations to be made that resolve some significant uncertainty. For example, my beliefs about special relativity and the population of the USA in 2007 will stay about the same, while my belief about the USD:EUR exchange rate in 2011 will change in 2011, updating with actual observations. I don’t see any problem with distinguishing such cases; it always comes down to whether I expect new observations or inferences to be made.
Your second paragraph still sounds to me as if you continue to make the mistake I pointed out. You can’t know how your beliefs will change (become stronger or become weaker), but you can know that certain beliefs will probably change (in one of these directions). So, you can’t know which belief you’ll accept in the future, but you can know that the level of certainty in a given belief will probably shift.
I got the sense that the question is asking you to look for beliefs you predict will change for the worse. So, you can’t predict which direction your beliefs will change in, but if you have an inkling that one will go in the direction of “false”, then that is some sort of warning sign:
You haven’t thought the belief through fully, so you are semi-aware there might be contradictions down the line you haven’t encountered yet, or
You haven’t considered all the evidence fully, so you are semi-aware that there might be a small amount of very strong evidence against the belief, or
You have privileged your hypothesis, and you are semi-aware there might be explanations that fit the evidence better, or
You are semi-aware that you have done one of these things, but don’t know which because you haven’t thought about it.
In any case, your motivated cognition has let you hold the belief, but motivated cognition doesn’t feel precisely like exhaustive double-checking, and a question like this tries to find that feeling.
I got the sense that the question is asking you to look for beliefs you predict will change for the worse.
Er, no, I more meant beliefs that you’ll change for the better. For example, some people find themselves flip-flopping from one fad or intellectual community to the next, each time being very enthusiastic about the new set of ideas. In such cases, their friends can often predict that later on their beliefs will move back toward their normal beliefs, and so the individual probably can too.
This was sort of what I was aiming for. Evidence saying you’re going to change your mind about something should be the same as evidence for changing your mind about something.
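The claim above is the principle of conservation of expected evidence: your current credence is already a probability-weighted average of the posteriors you expect to end up with, so you cannot expect, on net, to update in any particular direction. A minimal sketch (the hypothesis and all numbers here are arbitrary assumptions for illustration) shows the expected posterior recovering the prior exactly:

```python
# Conservation of expected evidence, illustrated with Bayes' rule.
# If you could predict the direction of a future update, you should
# already have updated; the expected posterior equals the prior.
# All numbers below are arbitrary, illustrative assumptions.

prior = 0.6                # current credence in some hypothesis H
p_e_given_h = 0.9          # P(observation | H)
p_e_given_not_h = 0.2      # P(observation | not H)

# Total probability of making the observation
p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)

# Posterior credence in H after seeing the observation...
post_e = p_e_given_h * prior / p_e
# ...and after failing to see it
post_not_e = (1 - p_e_given_h) * prior / (1 - p_e)

# Average the two posteriors, weighted by how likely each outcome is.
expected_posterior = p_e * post_e + (1 - p_e) * post_not_e

print(expected_posterior)  # matches the prior: 0.6 (up to float rounding)
```

The update after each possible observation can be large in either direction; it is only the expectation over outcomes that is pinned to the prior.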
I think the question has implied acceptance of this.
Then, could you describe your idea in more detail?
Well, how would you answer the question?
Your second paragraph still sounds to me as if you continue to make the mistake I pointed out.
I don’t think I’m making a mistake. I think we’re agreeing.
I don’t have an understanding of that, but don’t think it’s worth pursuing further.