No, it’s not meaningful to put a prior probability on it, unless you seriously think something like absolute morality exists. Having said that, the prior for “killing animals is wrong” is still higher than the prior for the God of Abraham existing.
Note that Bayesian probability is not absolute, so it's not appropriate to demand absolute morality in order to put probabilities on moral claims. You just need a meaningful (subjective) concept of morality. This holds for any concept one can consider: any statement can be assigned a subjective probability, and morality isn't an exceptional special case.
If morality is a fixed computation, you can place probabilities on possible outputs of that computation (or more concretely, on possible outputs of an extrapolation of your or humanity’s volition).
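To make the "fixed computation" framing concrete, here is a minimal sketch of what such probabilities could look like, assuming we treat morality as a deterministic but unknown computation and keep a subjective credence over its possible outputs. The hypotheses, prior weights, and likelihood numbers are hypothetical illustrations, not anything established in this thread.

```python
# A minimal sketch: morality as a fixed but unknown computation M.
# We hold a subjective prior over M's output on one question and
# update it with Bayes' rule. All numbers below are made up.

# Possible outputs of M("is killing animals wrong?")
hypotheses = ["wrong", "permissible"]

# Subjective prior over those outputs (sums to 1).
prior = {"wrong": 0.6, "permissible": 0.4}

# Probability of observing a strong "this is wrong" intuition,
# given each hypothesis about M's true output (hypothetical numbers).
likelihood = {"wrong": 0.9, "permissible": 0.3}

# Standard Bayesian update: posterior is proportional to prior * likelihood.
unnormalized = {h: prior[h] * likelihood[h] for h in hypotheses}
total = sum(unnormalized.values())
posterior = {h: p / total for h, p in unnormalized.items()}

print(posterior)  # {'wrong': 0.8181..., 'permissible': 0.1818...}
```

Nothing in this machinery requires the computation's output to be "absolute": the probabilities are ordinary subjective credences about what the fixed computation returns.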