Humans also have a good enough grasp of the real world to invent condoms and porn, circumventing the natural hard-wired goal.
That’s influencing the real world, though. Using condoms can be fulfilling the agent’s goal, period; no cheating involved. It’s the donkey learning to take the carrot without trudging up the mountain. Certainly, there are evolutionary reasons why sex became incentivized, but an individual human does not need to have the goal to procreate or care about that evolutionary background, and isn’t wireheading simply by using a condom.
Presumably, in a Clippy-type agent, the goal of maximizing the number of paperclips wouldn’t be part of the agent’s historical influences (the way procreation was for humans; and even for humans it is not necessarily a “hard-wired goal”, see childfree folks), but an actual, explicitly encoded/incentivized goal.
(Also, what is this “porn”? My parents told me it’s a codeword for computer viruses, so I always avoided those sites.)
The issue is that arguments ad Clippy have a weakness: you assume that such a goal is realisable in order to argue that there is no absolute morality, since that goal won’t converge onto anything else. This does nothing to address the question of whether Clippy can be constructed at all; if moral realism is true, Clippy either can’t be constructed or can’t be arbitrarily intelligent (in which case it is no more interesting than a thermostat that has the goal of keeping a constant temperature and won’t adopt any morality).