Is there a name for the intuition/fallacy that an advanced AI or alien race must also be morally superior?
I think you can refer the person to the orthogonality thesis.
Seems like an appeal to (false?) authority. It may not be a fallacy, because there’s a demonstrable trend between technological superiority and moral superiority, at least on Earth. Assuming that trend extends to other civilizations off Earth? I’m sure there’s something fallacious about that; maybe it’s too geocentric.
It might generally be Moral Realism (anti-moral-relativism): the notion that morality is some universal, objective truth that we gradually uncover more of as we grow wiser. That’s how those people usually conceive of it.
I sometimes call it anti-orthogonalism.
I want to explain my downvoting this post. I think you are attacking a massive strawman by equating moral realism with [disagreeing with the orthogonality thesis].
Moral realism says that moral questions have objective answers. I’m almost certain this is true. The relevant form of the orthogonality thesis says that there exist minds such that intelligence is independent of goals. I’m almost certain this is true.
It does not say that intelligence is orthogonal to goals for all agents. Relevant quote from EY:

I mean, I would potentially object a little bit to the way that Nick Bostrom took the word “orthogonality” for that thesis. I think, for example, that if you have humans and you make the human smarter, this is not orthogonal to the humans’ values. It is certainly possible to have agents such that as they get smarter, what they would report as their utility functions will change. A paperclip maximizer is not one of those agents, but humans are.
And the wiki page Filipe Marchesini linked to also gets this right.
Good comment, but… Have you read Three Worlds Collide? If you were in a situation similar to what it describes, would you still be calling your position moral realism?
I am not out to attack the position that humans fundamentally, generally align with humans. I don’t yet agree with it; its claim that “every moral question has a single true answer” might turn out to be a confused paraphrasing of “every war has a victor”, but I’m open to the possibility that it’s meaningfully true as well.
Yes and yes. I got very emotional when reading that. I thought rejecting the happiness… surgery, or whatever it was that the advanced alien species prescribed, was blatantly insane.