Most people say they wouldn’t choose the pleasure machine.
Possibly because the word “machine” is sneaking in connotations that lead to the observed conclusion: we picture something like a morphine pump, or something perhaps only slightly less primitive.
What if we interpret “machine” to mean “a very large computer running a polis under a fun-theoretically optimal set of rules” and “hook up your brain” to mean “upload”?
Then you’re talking about Friendly AI with the prior restriction that you have to live alone. Many¹ people will still run the “I would be subjected to a machine” cached thought, will still disbelieve that a Machine™ could ever understand our so-complex-it’s-holy psyche, will still insist that even if it could, the result would automatically be horrible, and that the whole concept is absurd anyway.
In that case they wouldn’t reject the possibility because they don’t want to live alone and happy, but because they positively believe FAI is impossible. My solution is then simply to propose that they live a guaranteed happy life, but alone. For people who still refuse to answer on the grounds of impossibility, invoking the supernatural may help.
1: I derive that “many” from one example alone, but I suspect it extends to most enlightened people who treat philosophy as closer to literature than to science (wanting to read the sources, and treating questions like “was Nietzsche/Kant/Spinoza plain wrong on such-and-such a point” as ill-typed: there are no truths or fallacies, only schools of thought). Michel Onfray appears to say that’s typically European.
This machine, if it were to give you maximal pleasure, should be able to make you feel as if you are not alone.
The only way I can see this machine actually making good on its promise is to be a Matrix-quality reality engine, but with you in the king’s seat.
I would take it.
Of course it would. My question is, to what extent would you mind being alone? Not feeling alone, not even believing you are alone, just being alone.
Of course, once I’m plugged into my Personal Matrix, I would not mind any more, for I would neither feel nor believe that I am alone. But right now I do mind. Whatever the real reasons behind it, being cut off from the rest of the world just feels wrong. Basically, I believe I want Multiplayer Fun badly enough to sacrifice some Personal Fun.
Now, I probably wouldn’t want to sacrifice much Personal Fun, so given the choice between maximum Personal Fun and my present life (no third alternative allowed), I would probably take the blue pill. Though it would really bother me if everyone else weren’t given the same choice.
Now to get back on topic, I suspect Luke did want to talk about a primitive system that would turn you into an Orgasmium: something that would even sacrifice Boredom to maximize subjective pleasure and happiness. (By the way, I suspect that the “Eternal Bliss” promised by some belief systems is just as primitive.) Such a primitive system would serve his point exactly: do you want only happiness and pleasure? Would you sacrifice everything else to get them?
If this is indeed Luke’s intended offer, then I believe it to be a lie. Without the ability to introduce varied pleasure, an Orgasmium would fail to deliver on its promise of “maximal pleasure.”
For the offer to be true, it would need to be a Personal Matrix.
Some people think that extended periods of euphoria involve no diminishing marginal pleasure. I haven’t found that to be the case, but perhaps if we take away any sense of time passing it would work.
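A minimal sketch of that diminishing-returns worry, under my own assumption (not anything stated in the thread) that marginal pleasure habituates exponentially, with $p_0$ the initial pleasure rate and $\lambda$ a habituation rate:

$$
p'(t) = p_0\, e^{-\lambda t}, \qquad
P(T) = \int_0^T p_0\, e^{-\lambda t}\, dt = \frac{p_0}{\lambda}\bigl(1 - e^{-\lambda T}\bigr) \le \frac{p_0}{\lambda}.
$$

On this toy model, any habituation rate $\lambda > 0$ keeps total pleasure $P(T)$ bounded no matter how long the bliss lasts, which is why an unvarying Orgasmium couldn’t honestly promise “maximal pleasure.” Erasing the sense of time passing would amount to taking $\lambda \to 0$, where $P(T) \to p_0 T$ and grows without bound.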