Hmm. I tend to frame deontological moral/decision frameworks as heuristics evolved out of consequentialism, with lost purposes as to how they came about. That costs some future-optimization flexibility, but it also brings serious advantages: it reduces computational paralysis AND the motivated cognition that justifies worse behaviors. So, not “correct”, but “performs better than correct for many parts of normal life”.
The recent discussion of acausal human decisions (which I think is incorrect) has made me wonder—is deontology a form of acausal thinking? “Do this because it’s right”, as distinct from “do this because it makes the world better according to your values”, is a pretty clear denial of causality.
I think this works if you use the adjudicator framing of acausal coordination, where it’s shared ideas that coordinate people, but people don’t themselves coordinate each other. Deontological principles and norms are such shared ideas, though I think mere words/concepts also count.
This becomes a form of consequentialism when the shared ideas are themselves agents, doing consequentialist decision making. Here, asking “What is right?” becomes asking what the updateless policy of the Ideal of Right (as an agent that acts as an adjudicator between people) would say about responding to your circumstance, taking into account the willingness and ability of people in various circumstances to listen to what the Ideal of Right has to say, as they jointly channel its will.
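To make the adjudicator picture slightly more concrete, here is a minimal toy sketch (entirely my own illustration, not anything from the thread): the “Ideal of Right” is modeled as a single fixed policy over circumstances, scored across everyone who might consult it, weighted by how likely each person is to actually follow its recommendation. All names and numbers are made up, and the genuinely acausal part (correlations between different people’s circumstances) is left out, so this only shows the bare “one policy, evaluated updatelessly across all who channel it” shape.

```python
# Toy sketch, not a real decision theory implementation.
# An "adjudicator" picks one policy -- a map from circumstance to
# recommended action -- chosen updatelessly, i.e. scored across all
# circumstances at once, weighted by how likely the person in each
# circumstance is to actually listen to the recommendation.

from itertools import product

# Hypothetical setup: circumstances, actions, compliance rates, and
# payoffs are all invented for illustration.
circumstances = ["found_wallet", "asked_for_favor"]
actions = ["cooperate", "defect"]

# Probability that the person in this circumstance listens to the adjudicator.
compliance = {"found_wallet": 0.9, "asked_for_favor": 0.6}

# Value (by the shared standard) of each action in each circumstance.
payoff = {
    ("found_wallet", "cooperate"): 10, ("found_wallet", "defect"): 2,
    ("asked_for_favor", "cooperate"): 5, ("asked_for_favor", "defect"): 3,
}
default_action = "defect"  # what people do when they ignore the Ideal

def score(policy):
    """Expected value of one fixed policy, summed over all circumstances."""
    total = 0.0
    for c in circumstances:
        p = compliance[c]
        total += p * payoff[(c, policy[c])] + (1 - p) * payoff[(c, default_action)]
    return total

# Updateless choice: pick the single best policy over all circumstances,
# rather than re-optimizing after seeing which circumstance you are in.
best = max(
    (dict(zip(circumstances, choice))
     for choice in product(actions, repeat=len(circumstances))),
    key=score,
)
print(best, score(best))
```

“What is right in my circumstance?” then reads off `best[my_circumstance]`: the answer was fixed by the whole-policy optimization, not recomputed causally from where you happen to stand.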