It seems we have the values we do because we don’t live in the least (or, in this case, most) convenient possible world.
In other words, imagine that you’re stuck on some empty planet in the middle of a huge volume of known-life-free space. In this case a pleasant virtual world probably sounds like a much better deal. Even then you still have to worry about asteroids and supernovas and whatnot.
My point is that I’m not convinced that people’s objection to wireheading is genuinely because of a fundamental preference for the “real” world (even at enormous hedonic cost), rather than because of inescapable practical concerns and their associated feelings.
edit:
A related question might be: how bad would the real world have to be before you’d prefer the Matrix? If you’d prefer to “advanced wirehead” over a lifetime of torture, then clearly you’re thinking about cost-benefit trade-offs, not some preference for the real world that overrides everything else. In that case, a rejection of advanced wireheading may simply reflect a failure to imagine just how good it could be.
People usually seem so intent on thinking up reasons why it might not be so great that I’m having a really hard time getting a read on what folks think of the core premise.
My life/corner of the world is what I think most people would call very good, but I’d pick the Matrix in a heartbeat. But note that I am taking the Matrix at face value, rather than wondering whether it’s a trick of advertising. I can’t even begin to imagine myself objecting to a happy, low-stress Matrix.
I agree. I think the original post is accurate about how people respond to the suggestion in the abstract, but the actual implementation would undoubtedly hook vast swathes of the population. We live in a world where people already become addicted to vastly inferior simulations such as WoW.
I disagree. I think that even the average long-term tortured prisoner would balk and resist if you walked up to him with this machine. In fact, I think fewer people would accept in real life than those who claim they would, in conversations like these.
The resistance may in fact reveal an inability to properly conceptualize the machine working, or it may not. As others have said, maybe you don’t want to do something you think is wrong (like abandoning your relatives or being unproductive) even if later you’re guaranteed to forget all about it and live in bliss. What if the machine ran on tortured animals? Or tortured humans you don’t know? If all that matters is how you feel once you’re hooked up, that shouldn’t bother you any more than if it didn’t.
We have some present-day corollaries. What about a lobotomy, or suicide? Even if these can be shown to be a guaranteed escape from unhappiness or neuroses, most people aren’t interested, including some really unhappy people.
I think that even the average long-term tortured prisoner would balk and resist if you walked up to him with this machine.
I think the average long-term tortured prisoner would be desperate for any option that’s not “get tortured more”, considering that real torture victims will confess to crimes that carry the death penalty if they think this will make the torturer stop. Or, for that matter, crimes that carry the torture penalty, IIRC.
Yes, I agree that while not the first objection a person makes, this could be close to the ‘true rejection’. Simulated happiness is fine—unless it isn’t really stable and dependable (because it wasn’t real) and you’re rudely awakened to discover the whole world has gone to pot and you’ve got a lot of work to do. Then you’ll regret having wasted time ‘feeling good’.
If you’d prefer to “advanced wirehead” over a lifetime of torture, then clearly you’re thinking about cost-benefit trade-offs, not some preference for the real world that overrides everything else.
Whatever your meta-level goals, unless they are “be tortured for the rest of my life,” there’s simply no way to accomplish them while being tortured indefinitely. That said, suppose you had some neurological condition that caused you to live in constant excruciating pain, but otherwise in no way incapacitated you—now, you could still accomplish meta-level goals, but you might still prefer the pain-free simulator. I doubt there’s anyone who sincerely places zero value on hedons, but no one ever claimed such people existed.