No, you are talking about a different property of beliefs: lack of stability under new information. I claim that because you lack a reflective understanding of the origins of the belief, you currently shouldn’t be certain, even in the absence of additional object-level arguments that point out specific problems or argue for an incompatible position.
I don’t quite understand. I am not currently certain, in the way I use the term. The way I think about moral questions is by imagining some extrapolated version of myself who has thought for long enough to arrive at stable beliefs. My confidence in a moral assertion is synonymous with my confidence that it is also held by this extrapolated version of myself. Then I am certain of a view precisely when my view is stable.
In what other way can I be certain or uncertain?
You can, for example, come to different conclusions depending on future observations. In that case further reflection would not move your level of certainty, the belief would be stable, and yet you’d remain uncertain. Consider your belief about the outcome of a future coin toss: it is stable under reflection, but it doesn’t amount to certainty.
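To sketch the coin-toss point slightly more formally (the notation here is mine, introduced only for illustration): write p for your current credence that the coin lands heads. Stability under reflection says that further thinking is not expected to move it, E[p_later | p_now] = p_now, while certainty would mean p is close to 1 or to 0. For a fair coin, p = 1/2 satisfies the first condition, since nothing short of observing the toss changes it, and plainly fails the second.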
Generally, there are many ways in which you can (or should) make decisions or come to conclusions: your whole decision problem, all the heuristics that make up your mind, can have a hand in deciding how any given detail of your mind should be.
(Also, being certain merely because you don’t expect to change your mind sounds like a bad idea: it could license arbitrary beliefs, since the future process of potentially changing your mind that you’re thinking about could be making the same calculation, locking in a belief with no justification other than itself. The only reason it doesn’t obviously work this way is that you retain other, healthy reasons for coming to conclusions, so this particular wrong ritual washes out.)