Is your purpose to describe the underlying moral state, or what the different states feel like?
For instance:
I feel morally certain but in reality would change my view if strong evidence were presented.
I feel morally certain and won’t change my mind.
I feel descriptive uncertainty, and if I just read the right moral formulation I would agree that it described me perfectly.
I feel descriptive uncertainty, but actually have a deep internal conflict.
I feel deep internal conflict, and am right about it.
I feel deep internal conflict, but more epistemic information would resolve the issue.
I think the issue may be that you’re trying to fit “how it feels now” and “my underlying morality, to which I have limited access” into the same categorization system. Maybe two sets of categories are needed? For instance, the top-level system can experience descriptive uncertainty, but the underlying reality cannot.
ETA: Here’s my attempt at extending the categories to cover both conscious feelings and the underlying reality.
Conscious moral states:
Moral certainty—I feel like I know the answer with no serious reservations
Basic moral uncertainty—I feel like I don’t know how to tackle the problem at all
Descriptive moral uncertainty—I feel like I know this, but can’t come up with a good description
Epistemic moral uncertainty—I feel like I need more information to figure it out
Conflicted moral uncertainty—I feel like there are two values competing
Subconscious (real? territory?) moral states:
Moral certainty—I have a clear algorithm for this problem
Basic moral uncertainty—I just don’t have an answer at all
Conflicted moral uncertainty—I have two or more systems which give different answers
Epistemic moral uncertainty—I need more information to give a confident answer
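To make the two-level structure concrete, here is a minimal Python sketch. This is purely my own illustration of the taxonomy above (the enum names just transcribe the categories, and the pairing at the end is an assumed way of representing a person's situation, not anything claimed in the discussion). The key structural point is that descriptive uncertainty exists only in the conscious-level type:

```python
from enum import Enum, auto

class ConsciousMoralState(Enum):
    """How a moral question feels from the inside (the map)."""
    CERTAINTY = auto()                # I feel like I know the answer
    BASIC_UNCERTAINTY = auto()        # I don't know how to tackle it at all
    DESCRIPTIVE_UNCERTAINTY = auto()  # I know it, but can't describe it
    EPISTEMIC_UNCERTAINTY = auto()    # I need more information
    CONFLICTED_UNCERTAINTY = auto()   # two values competing

class UnderlyingMoralState(Enum):
    """The actual state of the underlying machinery (the territory).
    Deliberately has no DESCRIPTIVE_UNCERTAINTY member: the underlying
    reality either has an algorithm or it doesn't; only the conscious
    level can fail to describe it."""
    CERTAINTY = auto()                # a clear algorithm for this problem
    BASIC_UNCERTAINTY = auto()        # no answer at all
    CONFLICTED_UNCERTAINTY = auto()   # two or more systems give different answers
    EPISTEMIC_UNCERTAINTY = auto()    # more information needed for a confident answer

# A person's overall situation is then a pair (felt state, real state);
# e.g. the "feel certain but would change my view given evidence" case:
situation = (ConsciousMoralState.CERTAINTY,
             UnderlyingMoralState.EPISTEMIC_UNCERTAINTY)
```

Representing the two levels as separate types makes the mismatch cases from the list at the top (feeling certain while the underlying state is epistemically uncertain, feeling descriptive uncertainty while actually conflicted, and so on) expressible as simple pairs.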
Interesting. Your extended categorization seems like it could well be useful; I’ll have to think about it some more.