“You explained how pleasure from our natural environment “caps out” past a certain threshold—I can’t eat infinity sugar and derive infinity pleasure. So, obviously, my instinctive evaluation is that if I get wire-headed, I’ll eventually get sick of it and want something else!”
I think you’re lumping the concepts of wireheading and the “experience machine” together here. Wireheading basically consists of pressing a button because you want to, not because you like it. It would feel like being a heroin junkie, except you’d be pressing a button instead of using a needle.
The experience machine, on the other hand, is a completely immersive virtual reality that satisfies all your desires in whatever way is necessary to make you happy. It’s not required that you be in an orgasmic state all the time… as you said yourself, you might get bored with that (I think of that poor woman with a disorder that makes her orgasm every few minutes; apparently she doesn’t like it at all). In the experience machine scenario, you would never get bored—if you desire some form of variety in your “perfect experience” and would be unhappy without it, then the machine would do whatever it takes to make you happy nonetheless.
The point of the machine is that it gives you whatever you desire in just the right amounts to max you out on pleasure and happiness, by whatever means necessary and regardless of how convoluted and complex those means may have to be. So if you’re hooked up to the machine, you feel happy no matter what. The point of the thought experiment is that your pleasure doesn’t build on achievements in the real world, and that there may be other meaningful things you desire apart from pleasure.
As we’ve seen from Luke, there appear to be at least two other human desires next to pleasure—namely “wanting” and “learning”. But if the machine is capable of conjuring up any means of making me happy, then perhaps it would have to throw a bit of wanting and learning into the mix to make me as happy as possible (because these three things seem to be intricately connected, and you may need the other two to max out on pleasure). But at the end of the day, the experience machine is simply a poor thought experiment as I see it.
If you say I can be in a machine that always makes me happy, and then say I’m somehow not happy because I’m still missing important ingredient X, that is not a good argument—you’ve simply lied to me about your premise, namely that the machine would make me happy no matter what.
However, it doesn’t really have anything to do with wireheading, as in the example with the dying lab rats. That’s just artificial addiction, not a happiness machine.
By the way, I’m beginning to think that the experience machine would be a really sweet deal, and I might take it if it were offered to me.
Sure, my happiness wouldn’t be justified by my real-world achievements, but so what? What’s so special about “real” achievements? Feeling momentarily happier because I gain money, social status, and get laid… sure, there’s some pride and appeal in knowing I’ve earned these things through whatever qualities of mine, but in what kind of transcendent way are these things really achievements, or meaningful? My answer would be that they aren’t meaningful in any important way; they are simply primitive behaviors based on my animalistic nature and the fact that my genes fell out of the treetops yesterday.
I struggle to see any worthwhile meaning in these real “achievements”. They can make you feel good and they can make you feel miserable, but at the end of the day they are perfectly transparent apeish behaviors based on reproductive urges, which I simply can’t outgrow because of my hardwired nature. The only meaningful activity that would be worth leaving my experience machine for would be tackling existential risks… just so that I could get back to my virtual world and enjoy it “indefinitely”.
Personally though, I have the feeling that it would still be a lot cleverer to redesign my own brain from the ground up to make it impervious to any kind of emotional trauma or feelings of hurt, and to make it run entirely on a streamlined and perfectly rational “pleasure priority hierarchy”. No pain, all fun, and still living in the real world—perhaps with occasional trips into virtual reality to spice things up.
But I find it really hard to imagine how I could still value human life if I measured everything on a scale of happiness and entirely lacked the dimension of pain. Can one still feel the equivalent of compassion without pain? It’s hard to see myself having fun at the funeral of my parents.
Less fun than if they were still alive, of course, but it would still be fun if I lacked the dimension of pain… hell, that would be weird.
“But I find it really hard to imagine how I could still value human life if I measured everything on a scale of happiness and entirely lacked the dimension of pain. Can one still feel the equivalent of compassion without pain? It’s hard to see myself having fun at the funeral of my parents.”
Well, I think you could still feel compassion, or something like it (without the sympathy, maybe; just concern). Even while happy, I wouldn’t want someone else to be unhappy. But on the other hand, it does seem like there’s a connection, just because of how our brains are wired. You need to be able to at least imagine unhappiness for empathy, I suppose.
I read an article about a man with brain damage, and it seems relevant to this situation. Apparently, an accident left him with damage to a certain part of his brain, and it resulted in the loss of unhappy emotions. He would constantly experience mild euphoria. It seems like a good deal, but his mother told a story about visiting him in the hospital; his sister had died in the meantime, and when she told him, he paused for a second, said something along the lines of “oh” or “shame”… then went back to cracking jokes. She was quoted as saying he “didn’t seem like her son any more.”
I’ve always felt the same way that you do, however. I would very much like to redesign myself to be pain-free and pleasure-maximized. One of the first objections I hear to this is “but pain is useful, because it lets you know when you’re being damaged.” Okay—then we’ll simply have a “damage indicator”, and leave the “pull back from hot object” reflex alone. Similarly, I think concerns about compassion could be solved (or at least mitigated) by equipping ourselves with an “off” switch for the happiness—at the funeral, we allow ourselves sadness… then when the grief becomes unbearable, it’s back to euphoria.
Very good real-world example about the guy with brain damage! Interesting case; any chance of finding this story online? A quick and dirty Google search on my part didn’t turn up anything.
Also, nice idea with the switch. I fully acknowledge that there are some situations in which I somehow have the need to feel pain—funerals being one occasion. Your idea with the switch would be brilliantly simple. Unfortunately, my spider-senses tell me the redesigning part itself will be anything but.
Case studies of brain damage are pure gold when it comes to figuring out “what would happen to me if I remove/augment my brain in such and such a way”.
I was about to come back (actually on my way to the computer) and regretfully inform you that I had no idea where I had seen it… but then a key phrase came back to me, and voila! (I had the story a little wrong: it was a stroke that caused the damage, and it was a leukemia relapse the sister had.)
The page has a lot of other interesting case studies involving the brain, as well. I need to give the whole site a re-browse… it’s been quite a while since I’ve looked at it. I seem to remember it being like an atheism-oriented LessWrong.
Thank you very much for going to the trouble of finding all these case studies! :)
(For anyone else interested, I should remark that these aren’t the actual studies, but quick summaries within an atheistic context concerned with disproving the notion of a soul—but there are references to all the books in which these symptoms are described.)
Alien Hand Syndrome is always good for some serious head-scratching indeed.
“In the experience machine scenario, you would never get bored.”
Exactly! My intuition was wrong; it’s trained on an ancestral environment where that isn’t true, so it irrationally rejects the experience machine as “obviously” suffering from the same flaw. Now that I’m aware of that irrationality, I can route around it and say that the experience machine actually sounds like a really sweet deal :)