This sounds to me like a word game. It depends on what the initial intention behind 'pleasure' is. If you say the device gives 'maximal pleasure', meaning to point at a whole cloud of good-stuffs, and then later substitute a more precise meaning of 'pleasure' that is an incomplete model of those good-stuffs, you are talking about two different things.
The meaningful thought experiment for me is whether I would use a box that maximized pleasure/wanting/desire/happiness/whatever-is-going-on-at-the-best-moments-of-life while completely separating me, as an actor or participant, from the rest of the universe as I currently know it. On that version of the experiment, you aren't allowed to say 'no' on the grounds of how you might feel after the machine is turned on, because then the machine is by definition failing. The argument has to be that pre-machine-you does not want to become post-machine-you, even while post-machine-you thinks the choice was obvious.