This seems to imply that the relativists are right. Of course there’s no right way to sort pebbles, but if there really is an absolute morality that AIs are smart enough to find, then they’ll find it and rule us with it.
Of course, there could be an absolute morality that AIs aren’t smart enough to find either. Then we’d take pot luck. That might not be so good. Many humans believe that there is an absolute morality that governs their treatment of other human beings, but that no morality is required when dealing with lower animals, who lack souls and full intelligence and language etc. I would not find it implausible if AIs decided their morality demanded careful consideration of other AIs but had nothing to do with their treatment of humans, who after all are slow and stupid and might lack things AIs would have that we can’t even imagine.
And yet, attempts to limit AIs would surely give bad results. If you tell somebody “I want you to be smart, but you can only think about these topics and in these particular ways, and your smart thinking must only get these results and not those results” what’s the chance he’ll wind up stupid? When you tell people there are thoughts they must not think, how can they be sure not to think them except by trying to avoid thinking at all? When you think a new thought you can’t be sure where it will lead.
It’s a problem.
Seeing as the universe itself, at its most fundamental level, seems to lack any absolutes (i.e., it is purely a question of locality), and the only constants seem to be the ones embedded in the laws of physics, I am having trouble believing in absolute morality.
Like, of the “I am confused by this” variety.
To paraphrase: "there is no term for fairness in the equations of general relativity." You cannot derive morality from the absolute laws of the universe. You probably cannot even derive it from mathematical truth.
You might want to read Least Convenient Possible World.
I don’t think that they would tell the AIs not to think certain things, when to them piling pebbles is all one should ever want to do. It’s life to them, so if you were super smart, you would want to devote yourself to the only point in life.