If moral realism were true, then if something became super smart, so too would it realize that some things were worth pursuing.
The genie knows, but doesn’t care.
I agree that the AI would not, merely by virtue of being smart enough to understand what humans really had in mind, start pursuing that. But fortunately, this is not my argument—I am not Steven Pinker.
That’s not the entirety of the problem—the AI wouldn’t start pursuing objective morality merely by virtue of knowing what it is.
That is a claim, not a fact.