I think it likely that the people Luke spoke with were intelligent people who knew that hypotheticals are supposed to test your values and priorities, and responded in the spirit of the question.
I suspect that, with these much safer-sounding provisions, most people would opt to have access to the machine rather than not, and would eventually use it 100% of the time.
Many people become addicted to drugs and end up using them nearly 100% of the time. That doesn’t mean it’s what they really want; it just means they don’t have enough willpower to resist.
How humans would behave if they encountered a pleasure machine is not a reliable guide to how they would want to behave when encountering it, in the same way that how humans behave when they encounter heroin is not a reliable guide to how they would want to behave when encountering heroin. There are lots of regretful heroin users.
Personal example: The greatest feeling of bliss I have experienced is dozing off in a naturally doped-up state after highly satisfying sex. This state is utterly passive, but so thoroughly pleasant that I could see myself opting for an eternity in it.
Wouldn’t it be even better to constantly feel this bliss, but also still be mentally able to pursue non-pleasure-related goals? I might not mind engineering the human race to feel pleasure more easily, as long as we were still able to pursue other worthwhile goals.
Sorry for the late reply; I haven’t checked this in a while.
Please don’t fight the hypothetical.
Most components of our thought processes are subconscious. The hypothetical question you posed presses a LOT of subconscious buttons. It is largely impossible for most people, even intelligent ones, to take a hypothetical question at face value without being influenced by the subconscious effects of the way it’s phrased.
You can’t fix a bad hypothetical question by asking people to not fight the hypothetical.
For example, who wants to spend an eternity isolated in space? That must be one of the worst fears for many people. How do you disentangle that from the question? That’s like asking a kid if he wants candy while you’re dressed up as a monster from his nightmares.
There are lots of regretful heroin users.
Because not all components of the heroin experience are pleasant.
Wouldn’t it be even better to constantly feel this bliss, but also still be mentally able to pursue non-pleasure-related goals?
I suppose, yes. Valuable X + valuable Y is strictly better than just valuable X.
For example, who wants to spend an eternity isolated in space? That must be one of the worst fears for many people.
When I heard that hypothetical, I took the whole “launching you into space” thing as another way of saying “Assume for the sake of the argument that no outside force or person will ever break into the pleasure machine and kill you.” I took the specific methodology (launching into space) to be just a way to add a little color to the thought experiment and make it a little more grounded in reality. To me, if a different method of preventing interference with the machine had been specified, such as burying the machine underground or establishing a trust fund that hired security guards to protect it for the rest of your life, my answer wouldn’t be any different.
I suppose you are right that someone other than me might give the “space launch” details much more salience. As you yourself pointed out in your original post, modifying the experiment’s parameters might change the results. That said, what I read in this thread makes me think that people might not gradually choose to use the machine all the time after all.
Because not all components of the heroin experience are pleasant.
Much of that regret probably comes from things like heroin preventing them from finding steady work, or the risk of jail time. But I think a lot of people also regret not accomplishing the goals that heroin distracts them from. Many drug users, for instance, regret neglecting their friends and family.
I suppose, yes. Valuable X + valuable Y is strictly better than just valuable X.
I agree. I would think it terrific if people in the future were able to modify themselves to feel more intense and positive emotions and sensations, as long as doing so did not rob them of their will and desire to do things and pursue non-pleasure-related values. I don’t see doing that as any different from taking an antidepressant, which is something I myself have done. There’s no reason to think our default mood settings are optimal. I just think it would be bad if increasing our pleasure made it harder to achieve our other values.
I think you also imply here, if I am reading you correctly, that a form of wireheading that did not exclude non-pleasure experiences would be vastly superior to one with just pleasure and nothing else.
In order to be happy (using present-me’s definition of “happy”) I need to interact with other people. So there’s no way for a holodeck to make me happy unless it includes other people.
I agree. Interacting with other people is one of the “non-pleasure-related values” that I was talking about (obviously interacting with other people brings me pleasure, but I’d still want to interact with others even if I had a drug that gave me the same amount of pleasure). So I wouldn’t spend my life in a holodeck unless it was multiplayer. I think that at some point during my discussion with denisbider the conversation shifted from “holodeck” to “wireheading.”
I think that present-you’s definition of “happy” is closer to present-me’s definition of “satisfaction.” I generally think of happiness as an emotion one feels, and satisfaction as the state where a large number of your preferences are satisfied.
Yes. (I think the standard way of distinguishing them is to call yours hedonic happiness and mine eudaimonic happiness, or something like that.)