I think many of us “rationalists” here would agree that rationality is a tool for assessing and manipulating reality. I would say much the same about morality. There’s not really a dichotomy between morality being “grounded on evolved behavioral patterns” and having “a computational basis implemented somewhere in the brain and accessed through the conscious mind as an intuition”. Rather, the moral intuitions we have are computed in our brains, and the form of that computation is determined both by the selection pressures of evolution and the ways that our evolved brain structures interact with our various environments.
So what is our highest priority here? It’s neither Rationality nor Truth, but Morality in the broad sense—the somewhat arbitrary and largely incoherent set of states of reality that our moral intuition prefers. I say arbitrary because our moral intuition does not aim entirely at the optimization target of the evolutionary process that generated it—propagating our genes. Call that moral relativism if you want to.
There’s not really a dichotomy between morality being “grounded on evolved behavioral patterns” and having “a computational basis implemented somewhere in the brain and accessed through the conscious mind as an intuition”.
There is a difference. Computing a moral axiom is not the same as encoding it. With computation, the moral value would be an intrinsic property of some kind of mathematical structure. An encoding, on the other hand, is an implementation of an environmental adaptation as behavior, shaped by selection pressure. It carries no implicit rational justification, but it is objective in the sense of being adapted to an external reality.
Moral value is not an “intrinsic property” of a mathematical structure—aliens couldn’t look at this mathematical structure and tell that it was morally important. And yet, whenever we compute something, there is a corresponding abstract structure. And when we reason about morality, we say that what is right wouldn’t change if you gave us brain surgery, so by morality we don’t mean “whatever we happen to think,” we mean that abstract structure.
Meanwhile, we are actual evolved mammals, and the reason we think what we do about morality is because of evolution, culture, and chance, in that order. I’m not sure what the point is of calling this objective or not, but it definitely has reasons for being how it is. But maybe you can see how this evolved morality can also be talked about as an abstract structure, and therefore both of these paragraphs can be true at the same time.
It seems like you were looking for things with “intrinsic properties” and “objective”-ness that we don’t much care about, and maybe this is why the things you were thinking of were incompatible, but the things we’re thinking of are compatible.
Meanwhile, we are actual evolved mammals, and the reason we think what we do about morality is because of evolution, culture, and chance, in that order. I’m not sure what the point is of calling this objective or not, but it definitely has reasons for being how it is.
It seems like you were looking for things with “intrinsic properties” and “objective”-ness that we don’t much care about…
I was making a comment on the specific points of dogiv, but the discussion is about trying to discover whether morality 1) has an objective basis or is completely relative, and 2) has a rational/computational basis or not. Is it that you don’t care about approaching truth on this matter, or that you believe you already know the answer?
In any case, my main point is that Jordan Peterson’s perspective is (in my opinion) the most rational, cohesive, and evidence-supported one available, and I would love to see the community take the time to study it, understand it, and try to dispute it properly.
Nevertheless, I know not everyone has time for that, so if you expand on your perspective on this ‘abstract structure’ and its basis, we can debate :)