No, it’s expressing the paperclip maximizer’s state in ways that make sense to readers here. If you were to express the concept of being “bothered” in a way stripped of all anthropomorphic predicates, you would get something like “X is bothered by Y iff X has devoted significant cognitive resources to altering Y”. And this accurately describes how paperclip maximizers respond to new threats to paperclips. (So I’ve heard.)
I think “bothered” implies a negative emotional response, which some plausible paperclip-maximizers don’t have. From The True Prisoner’s Dilemma: “let us specify that the paperclip-agent experiences no pain or pleasure—it just outputs actions that steer its universe to contain more paperclips. The paperclip-agent will experience no pleasure at gaining paperclips, no hurt from losing paperclips, and no painful sense of betrayal if we betray it.”
It was intended to imply a negative term in the utility function. Yes, using ‘bothered’ is, technically, anthropomorphising. But it isn’t, in this instance, being confused about how Clippy optimises.
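To make the distinction concrete, here is a minimal sketch (hypothetical names, toy state representation; nothing here is from the original discussion) of an agent that is "bothered" only in the utility-function sense: threats show up as a negative term in its utility, and it simply selects actions that maximize expected utility, with no emotional state anywhere in the loop.

```python
def paperclip_utility(state):
    # Toy state: a dict of projected paperclip counts. A threat enters
    # the calculation only as a negative expected-value term, not as
    # any kind of felt aversion.
    return state["paperclips"] - state["expected_paperclips_lost_to_threat"]

def choose_action(actions, predict):
    # Pure argmax over predicted outcomes: no pleasure at gains, no
    # hurt at losses -- just steering toward futures with more paperclips.
    return max(actions, key=lambda action: paperclip_utility(predict(action)))
```

On this picture, saying Clippy is "bothered" by a threat just means the threat lowers the value of `paperclip_utility` in predicted futures, so cognitive resources get allocated to actions that remove it.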