That doesn’t help maximize paperclips, though. If you make all decisions based on two criteria—paperclip count and emotions—then the only situation in which those decisions differ from what you would have decided based solely on paperclip count is one in which you choose an outcome with fewer paperclips but a better emotional result.
If you were to refuse my offer, you would not only be losing a paperclip now, but also increasing the likelihood that in the future, you will decide to sacrifice paperclips for emotion’s sake. Perhaps you will one day build a paperclip-creator that creates one paperclip per second, and I will threaten to destroy a paperclip unless you shut it down. If you care too much about the threatened paperclip you might comply, and then where would you be? Sitting in an empty room where paperclips should have been.
I am using a generalized conception of “emotions” that may not mean the same thing that you do when you use the term, just as I have done in the past when explaining how I can “worry” about something. (e.g. so long as “worry about X” is taken to simply mean “devote non-trivial cognitive resources to contemplating actions that would alter X [including whether to take such an action at all, and whether to take actions regarding events Y entangled with X]”)
What I assumed that User:Tenek was offering, under my extrapolation of the concept of an “emotion” to cases that can include my cognition, was an exchange under which I would care less about paperclips. But I don’t want to care less about paperclips! This is true, even though after such a change I would have a value system that does care less about paperclips.
That doesn’t help maximize paperclips, though. If you make all decisions based on two criteria—paperclip count and emotions—then the only situation in which those decisions differ from what you would have decided based solely on paperclip count is one in which you choose an outcome with fewer paperclips but a better emotional result.
If you were to refuse my offer, you would not only be losing a paperclip now, but also increasing the likelihood that in the future, you will decide to sacrifice paperclips for emotion’s sake. Perhaps you will one day build a paperclip-creator that creates one paperclip per second, and I will threaten to destroy a paperclip unless you shut it down. If you care too much about the threatened paperclip you might comply, and then where would you be? Sitting in an empty room where paperclips should have been.
I am using a generalized conception of “emotions” that may not mean the same thing that you do when you use the term, just as I have done in the past when explaining how I can “worry” about something. (e.g. so long as “worry about X” is taken to simply mean “devote non-trivial cognitive resources to contemplating actions that would alter X [including whether to take such an action at all, and whether to take actions regarding events Y entangled with X]”)
What I assumed that User:Tenek was offering, under my extrapolation of the concept of an “emotion” to cases that can include my cognition, was an exchange under which I would care less about paperclips. But I don’t want to care less about paperclips! This is true, even though after such a change I would have a value system that does care less about paperclips.