fin
New research group on AI. Previously Longview Philanthropy; FHI. I do a podcast called Hear This Idea. finmoorhouse.com.
Hmm. Moral uncertainty definitely doesn’t assume moral realism. You could just have some credence in the possibility that there are no moral facts.
If instead by ‘essentialism’ you mean moral cognitivism (the view that moral claims can be true or false), then you’re right that moral uncertainty makes most sense under cognitivism. But non-cognitivist versions (where moral judgements are just expressions of preference, approval, or desire) also seem workable. I’m not sure what any of this has to do with ‘non-physical essences’, though. I think I know what you mean by that, but maybe you could clarify?
Interesting point about moral uncertainty favouring elegant theories. I’m not sure it’s necessarily true, though: again, I could just have some credence in the possibility that a messy version of folk morality is correct.