@G: “if ethics were all about avoiding ‘getting caught’, then the very idea that there could be an ethical ‘right thing to do’ as opposed to what society wants one to do would be incoherent.”
Well, I don’t think anyone here actually asserted that the basis of ethics was avoiding getting caught, or even the fear of getting caught. It seems to me that Eliezer posited an innate moral sense that inhibits risk-taking in the moral domain, and in my opinion this is more a reflection of the environment of his early childhood development than of any innate moral emotion such as pride or disgust. Even though I think Eliezer was working from the wrong basis, I think he’s offered a valid observation on the apparent benefit of “deep wisdom” when it comes to avoiding “black swans.”
But there seems to be an even more direct problem with your query: it’s strictly impractical in terms of the information model it would entail, namely that individual agents would somehow be equipped with the same model of “right” as the necessarily larger model supported by society as a whole.
Apologies in advance, but I’m going to bow out of this discussion now due to diminishing returns and sensitivity to our host.