But maybe not. If you don’t see a moral system which leads one to regard oneself as morally flawed as having an inherent punitive element, I’m going to question whether you experience morality, or whether you just think about it.
Wow. Now I’m curious whether your moral framework applies only to yourself, or to all people, or to all people who “experience morality” similarly to you.
I do primarily use system 2 for moral evaluations—my gut reactions tend to be fairly short-term and selfish compared to my reasoned preferences. And I do recognize that I (and every known instance of a moral agent) am flawed—I sometimes do things that I don’t think are best, and my reasons for those failures aren’t compelling to me.
Wow. Now I’m curious whether your moral framework applies only to yourself, or to all people, or to all people who “experience morality” similarly to you.
Mu? It applies to whoever thinks it useful.
I think morality is an experience, which people have greater or lesser access to; I don’t think it is actually meaningful to judge other people’s morality. Insofar as you judge other people immoral, I think you’re missing the substantive nature of morality in favor of a question of whether or not other people sufficiently maximize your values.
If this seems like an outlandish claim, consider whether a hurricane is moral or immoral. Okay, the hurricane isn’t making decisions—really, morality is about how we make decisions. Well, separating it from decision theory—that is, assuming morality is in fact distinct from decision theory—it is not about how decisions are arrived at. So what is it about?
Consider an unspecified animal. Is it a moral agent? Okay, what if I specify that the animal can experience guilt?
I’d say morality is a cluster of concepts related to a specific set of experiences we have in making decisions. These experiences are called “moral intuition”; morality is the exercise of figuring out the common elements, the common values, which give rise to these experiences (such that we can, for example, feel guilt), and of figuring out a way of living in harmony with those values, such that we improve our personal well-being with respect to those experiences.
If your moral system leads to a reduced personal well-being with respect to those experiences in spite of doing your relative best—that is, if your moral system makes you feel fundamentally flawed in an unfixable way—then I think your moral system is faulty. It’s making you miserable for no reason.
I think morality is an experience, which people have greater or lesser access to;
Interesting. I’ll have to think on that. My previous conception of the topic was that it’s a focal topic for a subset of decision theory—a lens for looking at which predictions and payouts should be considered for their impact on other people.
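To make that lens concrete, here’s a minimal, purely illustrative sketch (every name and number in it is hypothetical, not taken from the discussion above) of decision theory with a “moral” term: the agent’s expected value for an action includes a weighted sum of that action’s payouts to other people, so the choice of weight is where the moral question lives.

```python
# Illustrative toy only: an expected-value chooser where the "moral"
# part of decision theory is an extra, weighted term for how each
# action pays out for other people. All names and numbers are invented.

def expected_value(action, self_weight=1.0, others_weight=1.0):
    """Combine the agent's own payout with a weighted sum of payouts to others."""
    return (self_weight * action["self_payout"]
            + others_weight * sum(action["others_payouts"]))

actions = [
    {"name": "keep",  "self_payout": 10, "others_payouts": [0, 0]},
    {"name": "share", "self_payout": 4,  "others_payouts": [5, 5]},
]

# With others_weight > 0, the "moral lens" can change which action wins.
best = max(actions, key=expected_value)
print(best["name"])  # -> "share" under these toy numbers
```

Under these toy numbers, any others_weight above 0.6 flips the choice from “keep” to “share”; on this conception, the moral framework is just whatever fixes that weight.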