Person A: If you became orgasmium, you would feel more pleasure than you otherwise would.
Person B: But I don’t want to become orgasmium.
Person A: But if you want to feel as much pleasure as possible, then you should become orgasmium!
Person B: But… I don’t want to become orgasmium.
I see Person B’s position as being the final word on the matter (especially if, as you say, we’re ignoring external consequences). Person A may be entirely right — but so what? Why should that affect Person B’s judgments? Why should the mathematical requirements behind Person A’s framework have any relevance to Person B’s decisions? In other words, why should we be hedonistic utilitarians, if we don’t want to be?
The difficulty here, of course, is that Person B is using a cached heuristic that outputs “no” for “become orgasmium”; and we cannot be certain that this heuristic is correct in this case. Just as Person A is using the (almost certainly flawed) heuristic “feel as much pleasure as possible”, which outputs “yes” for “become orgasmium”.
The difficulty here, of course, is that Person B is using a cached heuristic that outputs “no” for “become orgasmium”
Why do you think so?
we cannot be certain that this heuristic is correct in this case.
What do you mean by “correct”?
Edit: I think it would be useful for any participants in discussions like this to read Eliezer’s Three Worlds Collide. Not as fictional evidence, but as an examination of the issues, which I think it handles quite well. A relevant quote, from chapter 4, “Interlude with the Confessor”:
A sigh came from that hood. “Well… would you prefer a life entirely free of pain and sorrow, having sex all day long?”
“Not… really,” Akon said.
The shoulders of the robe shrugged. “You have judged. What else is there?”
I give a decent probability to the optimal order of things containing absolutely zero pleasure. I assign a lower, but still significant, probability to it containing an infinite amount of pain in any given subjective interval.
… why? Humans definitely appear to want to avoid pain and enjoy pleasure. I suppose I can see pleasure being replaced with “better” emotions, but I’m really baffled regarding the pain. Is it to do with punishment? Challenge? Something I haven’t thought of?
Agreed, pretty much. I said significant probability, not big. I’m not good at translating anticipations into numbers, but no more than 5%. Mostly based on extreme outside view, as in “something I haven’t thought of”.
Why do you think so?
Humans are not perfect reasoners.
[Edited for clarity.]
I give a decent probability to the optimal order of things containing absolutely zero pleasure.
Is this intended as a reply to my comment?
Reply to the entire thread, really.
Fair enough.
I said significant probability, not big.
Oh, right. “Significance” is subjective, I guess. I assumed it meant, I don’t know, >10% or whatever.