You see morality as something beyond just a choice? Yes, once you set down some principles, you can rationally derive the morality of various situations from them, but where does the laying down of the original principle come from? It is just declared, is it not? “I hold this truth to be self-evident, that X is a moral principle.”
If I have that wrong, I would LOVE to know it!
If I’m understanding what you’re saying here correctly (which is far from certain), I agree with you that which moral systems I endorse depends on what I assign value to. If that’s all you meant by morality not being fundamentally rational, then I don’t think we disagree on anything here.
Cool! I assume what I assign value to is largely determined by evolution: that the humans with very different inborn value systems didn’t make it (and so are not really my ancestors), and that the values I have are the ones that produced a coherent, effective, organized cadre of humans who could then, as a group, outcompete the other humans for control over resources.
To me it seems irrational to assign much weight to these inborn values. I can’t say I know what my alternative would be. But an example of the kind of irrationality I see is cryonics, and what seems to me the deification of the individual life. I suppose evolution built into each of us a fear of death and a drive to survive; our potential ancestors who didn’t have that as strongly lost out to our actual ancestors, and that much makes sense. But why would I just “internalize” that value? I can see it came about as a direction, not a true end. All my ancestors, no matter how strong their drive to survive, have died. Indeed, without their dying, evolution would have stopped, or slowed down gigantically. So one might also say my candidate-ancestors that didn’t die as readily are not my true ancestors either; my true ancestors are the ones who wanted to live but didn’t, and so evolved a bit faster and beat the slower-evolving, longer-lived groups.
The world “wants” brilliant and active minds. It is not at all clear that it benefits more, or even as much, from an old frozen mind as from a new mind being filled from the start with new things and having the peculiar plasticities that newer minds have.
It is clear to me that the reason I am afraid of death is because I was bred to be afraid of death.
It is in THIS sense that I say our values are irrational: they are the result of evolution. Further, a lot of those values evolved when our neocortexes were a lot less effective. I believe what we have are mammalian and primate values; the values part of our brain has been evolving over the millions of years we have been social animals, not just the hundreds of thousands of years that our neocortex has been so bitchin’.
So to me, just using rationality to advance my inborn values would be like using modern materials science to build a beautiful temple to Neptune in order to improve our seafaring commerce.
That values, regardless of their source, may be the only motivational game in town is not evidence that it makes sense to exalt them any more than they already assert themselves. Rather the opposite, I would imagine: it makes it likely to be valuable to question them, to reverse-engineer nature’s purposes in giving them to us.
Agreed that our inborn values are the result of our evolutionary heritage.
Of course, so is the system that we use to decide whether to optimize for those values or some other set.
If I reject what I model as my evolution-dictated value system (hereafter EDV) in favor of some other value set, I don’t thereby somehow separate myself from my evolutionary heritage. It’s not clear to me that there’s any way to do that, or that there’s any particular reason to do so if there were.
I happen to value consistency, and it’s easier to get at least a superficial consistency if I reject certain subsets of EDV, but if I go too far in that direction I end up with a moral system that’s inconsistent with my actual behaviors. So I try to straddle that line of maximum consistency. But that’s me.
Agreed that it’s not clear that my continued existence provides more value to the world than various theoretically possible alternatives. Then again, it’s also not clear that my continued existence is actually in competition with those alternatives. And if world A has everything else I want + me remaining alive, and world B has everything else I want to the same extent + me not alive, I see no reason to choose B over A.
Agreed that it’s valuable to understand the mechanisms (both evolutionary and cognitive) whereby we come to hold the values we hold.