No, it’s expressing the paperclip maximizer’s state in ways that make sense to readers here. If you were to express the concept of being “bothered” in a way stripped of all anthropomorphic predicates, you would get something like “X is bothered by Y iff X has devoted significant cognitive resources to altering Y”. And this accurately describes how paperclip maximizers respond to new threats to paperclips. (So I’ve heard.)
It also depends on how the utility function relates to time. If it’s focused on end-of-universe paperclips, it might not care at all about melting paperclips, because it can recycle the metal later. (It would care more about the wasted energy!)
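A minimal sketch of the two readings in Python (the timeline, counts, and function names are all invented for illustration):

    # Two ways a utility function can relate to time, given a
    # `history` of paperclip counts, one entry per time step.

    def end_of_universe_utility(history):
        # Only the final count matters: melting clips is harmless
        # if the metal is recycled into clips again before the end.
        return history[-1]

    def paperclip_seconds_utility(history):
        # Every clip-moment counts: a melted-then-recycled clip
        # still loses all the seconds it spent not being a clip.
        return sum(history)

    intact   = [100, 100, 100, 100]  # clips left alone
    recycled = [100,   0,   0, 100]  # melted now, recycled later

    print(end_of_universe_utility(intact),
          end_of_universe_utility(recycled))    # 100 100: no threat
    print(paperclip_seconds_utility(intact),
          paperclip_seconds_utility(recycled))  # 400 200: a real loss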
I don’t follow. Wasted energy is wasted paperclips.
If it cares about paperclip-seconds then it WOULD view such tactics as a bonus, perhaps feigning panic and granting token concessions to get you to ‘ransom’ a billion times as many paperclips, and then pleading for time to satisfy your demands.
Okay, that’s a decent point. Usually, such a direct “time value of paperclips” doesn’t come up, but if someone were to make such an offer, that might be convincing: 1 billion paperclips held “out of use” as ransom may be better than a guaranteed paperclip now.
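To make the “time value of paperclips” concrete, a worked comparison under a paperclip-seconds utility (the numbers here are assumptions, not from the thread):

    # A ransomed clip is held "out of use" but still exists, so it
    # keeps accruing paperclip-seconds for as long as it is intact.

    RANSOM_CLIPS = 10**9  # clips the threatener holds hostage
    DURATION     = 10**6  # seconds of negotiation, say

    ransom_value   = RANSOM_CLIPS * DURATION  # 1e15 paperclip-seconds
    one_clip_value = 1 * DURATION             # 1e6 paperclip-seconds

    print(ransom_value > one_clip_value)  # True: conceding one clip
                                          # to keep the hostages wins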
Getting something analogous to threatening torture depends on a more precise understanding of what the paperclipper wants. …
Good examples. Similarly, a paperclip maximizer could, hypothetically, make a human-like mockup that just repetitively asks for help on how to create a table of contents in Word.
Tip: Use the shortcut alt+E,S in Word and Excel to do “paste special”. This lets you choose which aspects you want to carry over from the clipboard!
> I don’t follow. Wasted energy is wasted paperclips.
But that has nothing to do with the paperclips you’re melting. Any other use that loses the same amount of energy would be just as threatening. (Although this does assume that the paperclipper thinks it can someday beat you and use that energy and materials.)
> No, it’s expressing the paperclip maximizer’s state in ways that make sense to readers here. If you were to express the concept of being “bothered” in a way stripped of all anthropomorphic predicates, you would get something like “X is bothered by Y iff X has devoted significant cognitive resources to altering Y”. And this accurately describes how paperclip maximizers respond to new threats to paperclips. (So I’ve heard.)
I think “bothered” implies a negative emotional response, which some plausible paperclip-maximizers don’t have. From The True Prisoner’s Dilemma: “let us specify that the paperclip-agent experiences no pain or pleasure—it just outputs actions that steer its universe to contain more paperclips. The paperclip-agent will experience no pleasure at gaining paperclips, no hurt from losing paperclips, and no painful sense of betrayal if we betray it.”
> I think “bothered” implies a negative emotional response, which some plausible paperclip-maximizers don’t have.
It was intended to imply a negative term in the utility function. Yes, using ‘bothered’ is, technically, anthropomorphising. But it isn’t, in this instance, being confused about how Clippy optimises.
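A minimal sketch of the de-anthropomorphized reading quoted above (“X is bothered by Y iff X has devoted significant cognitive resources to altering Y”); the threshold and numbers are illustrative assumptions:

    # 'Bothered' as a fact about resource allocation, with no
    # emotional machinery anywhere in the definition.

    SIGNIFICANT_FRACTION = 0.05  # what counts as 'significant'; arbitrary

    def bothered(total_compute, compute_spent_altering_y):
        return compute_spent_altering_y / total_compute > SIGNIFICANT_FRACTION

    print(bothered(1000.0, 120.0))  # True: a new threat to paperclips
    print(bothered(1000.0, 0.5))    # False: background noise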
You don’t even know your own utility function!!!!
Oh, because you do????
I knew I was going to have to clarify. I can’t write it out, but if you input something I can give you the right output!
I guess it should read “You can’t even say what your own utility function outputs!”
I actually don’t think you can.
I don’t really think my response was fair anyway. Clippy has a simple utility function by construction—you would expect it to know what it was.
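A sketch of the distinction in this exchange, with everything below invented for illustration: a utility function you can evaluate pointwise (“input something, get the right output”) without being able to write it out, versus Clippy’s, which is simple enough to state in full:

    import random

    class HumanUtility:
        # Opaque by construction: a huge table of learned weights
        # rather than a formula anyone could write down.
        def __init__(self, seed=0):
            rng = random.Random(seed)
            self._weights = [rng.random() for _ in range(10**6)]

        def __call__(self, outcome_id):
            # Pointwise evaluation works fine...
            return self._weights[outcome_id % len(self._weights)]

    class ClippyUtility:
        # ...whereas this one can be stated in full: u(n) = n.
        def __call__(self, paperclip_count):
            return paperclip_count

    print(HumanUtility()(42))  # a right output for a given input
    print(ClippyUtility()(3))  # 3: here the whole function is known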