If there is anything that anyone should in fact do, then I would say that meets the standards of “realism.”
Does “anyone” refer to any human, or any possible being?
Sorry, I should have been clearer. I mean to say: “If there exists at least one entity, such that the entity should do something, then that meets the standards of ‘realism.’”
I understand “moral realism” as a claim that there is a sequence of clever words that would convince the superintelligent spider that reducing human suffering is a good thing.
I don’t think I’m aware of anyone who identifies as a “moral realist” who believes this. At least, it’s not part of a normal definition of “moral realism.”
The term “moral realism” is used differently by different people, but typically it’s either used roughly synonymously with “normative realism” (as I’ve defined it in this post) or to pick out a slightly more specific position: that normative realism is true and that people should do things besides just try to fulfill their own preferences.
Some people seem to believe that about artificial intelligence, which will likely be more different from us than spiders are.