how exactly would you distinguish the universe we live in from a universe in which human moral change was determined by something like a random walk through value space? Now, naturally, a random walk through value space doesn't sound like something to which you'd be willing to outsource future moral and value development. But then why is unknown process X, which happens to make you feel sort of good because you like what it's done so far, something that inspires so much confidence that you'd like a godlike AI to emulate its output quite closely?
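To make the thought experiment concrete, here is a toy sketch (my own construction, not anything from the comment above): model "value space" as a point in a few dimensions and "moral change" as an unbiased random walk. The dimensions and step sizes are arbitrary assumptions; the point is only that a trajectory generated with no direction at all can still look, judged from wherever it happens to end up, like steady movement "toward us."

```python
import random

random.seed(0)

N_DIMENSIONS = 5      # hypothetical independent value axes
N_GENERATIONS = 200   # steps of undirected cultural drift

# Start at the origin and take unbiased unit steps along random axes.
values = [0.0] * N_DIMENSIONS
trajectory = [list(values)]
for _ in range(N_GENERATIONS):
    axis = random.randrange(N_DIMENSIONS)
    values[axis] += random.choice([-1.0, 1.0])  # small undirected shift
    trajectory.append(list(values))

# Hindsight bias: judge every step by whether it moved toward the endpoint.
final = trajectory[-1]

def distance_to_final(point):
    return sum((a - b) ** 2 for a, b in zip(point, final)) ** 0.5

closer = sum(
    distance_to_final(trajectory[i + 1]) < distance_to_final(trajectory[i])
    for i in range(N_GENERATIONS)
)
print(f"{closer}/{N_GENERATIONS} steps reduced the distance to the final values")
```

If a good fraction of steps count as "progress" when scored against the endpoint, that is exactly the worry: an observer at the end of a purely random walk can mistake it for a directed process, which is why "it has felt like progress so far" is weak evidence about the mechanism.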
What do you mean by 'value space' (any human values or desires?) and 'moral change' (generally desirable changes in human values?)? Also, inserting a godlike AI into your moral calculus is like playing with infinities: a godlike AI leading to horrible consequences by your morality doesn't show that your morality was deeply flawed (maybe there was a tiny loophole no human would try, or be capable of, exploiting). And I think you discount the possibility that there are many moral solutions, just as there are many solutions to chess (this is especially important when noting the impact of culture on values).
This is a possible case of a currently ongoing mild value change in the US:
What I am proposing here is that for most Americans multi-generational living is a means toward maintaining the lifestyle and values which they hold dear, but the shift itself may change that lifestyle and those values in deep and fundamental ways. The initial trigger here is economic, with the first-order causal effects sociological. But the downstream effects may also be economic, as Americans become less mobile and more familialist. What can we expect? Look abroad, and look to the past.
Is this “generally desirable human values”? That depends on the values you already hold. I certainly think radically undesirable moral change might occur; looking at the value sets of past humans, we see it would almost certainly not be a freak occurrence.
My key argument is that we have very little idea about the mechanics of moral change in real human societies. Future moral change is therefore fundamentally unsafe from the standpoint of our current value set, and we do not have the tools to do anything about it. Yet. Once we get those tools, unless we discover that the process we are currently chained to has some remarkable properties we like, we will probably do away with it.
If moral progress is a salvageable concept, then we shall see it for the first time in the history of mankind. If not, we will finally do away with the tragedy of being doomed to an alien future devoid of all we value.
Of course “we” is misleading. The society that eventually gets these tools might be one that has values that are quite worthless or even horrifying from our perspective.