If importance were objective, then a Clippie could realise that paperclips are unimportant. The OT then comes down to intrinsic moral motivation, i.e. whether a Clippie could realise the importance without being motivated to act on it.
OT implies the possibility of oracle AI, but the falsehood of OT does not imply the falsehood of oracle AI. If OT is false, then only some combinations of goals and intelligence are possible. Oracle AI could still fall within that limited set of combinations.
OT does not imply that MR is false, any more than MR implies that OT is false. The intuitiveness of oracle AI does not support the OT, for the reasons given above.
Moral realists are not obviously in need of a definition of objective truth, any more than physicalists are. They may be in need of an epistemology to explain how moral truth is arrived at, i.e. its justification.
It is uncontentious that physical and mathematical facts do not compel all minds. Objective truth is not unconditional compulsion.
Moral realists do not have to, and often do not, claim that there is anything special about the truth or justification of their claims: at the least, you bear the burden of justifying the claim that moral realists employ a special notion of truth.
The fact that some people are more dogmatic about their moral beliefs than proper epistemology would allow is no argument against MR. Dogmatism and confirmation bias are widespread. Much of what has been believed by scientists and rationalists has been wrong. If you had been born 4000 years ago, you would have had little evidence of objective mathematical or physical truth.
Your steelmanning of MR is fair enough. (It would have helped to emphasise that high-level principles, such as "don't annoy people", are more defensible than fine-grained stuff like "don't scrape your fingernails across a blackboard".) It is not as fair as reading and commenting on an actual moral realist. (Lesswrong is Lesswrong.)
You are possibly the first person in the world to think that morality has something to do with your copies. (By definition, you cannot interact with your MWI counterparts.)
Reducing the huge set of possibilities is not so far away from the Guru's CEV; nor is it so far away from utilitarianism. I don't think that either is obviously true, and I don't think either is obviously false. It's an open question.