2) Honestly, I would have been happy with the aliens’ deal (even before it was implemented), and I think there is a ~60% chance that Eliezer agrees.
I’m of the opinion that pain is a bad thing, except insofar as it prevents you from damaging yourself. People argue that pain is necessary to provide contrast to happiness, and that pleasure wouldn’t be meaningful without pain, but I would say that boredom and slight discomfort provide more than enough contrast.
However, this future society disagrees. The idea that “pain is important” is ingrained in these people’s minds, in much the same way that “rape is bad” is ingrained in ours. I think one of the main points Eliezer is trying to make is that we would disagree with future humans almost as much as we would disagree with the baby-eaters or superhappies.
(Edit 1.5 years later: I was exaggerating in that second paragraph. I suspect I was trying too hard to sound insightful. The claims may or may not have merit, but I would no longer word them as forcefully.)
“I think one of the main points Eliezer is trying to make is that we would disagree with future humans almost as much as we would disagree with the baby-eaters or superhappies.”
I never had this impression; if anything, I thought that all the things Eliezer mentioned in any detail (the changes in gender and sexuality, the arcane libertarian framework that replaces the state, and generally all the differences that seem important by the measure of our own history) are intended to underscore that humanity still operates on a scale recognizable to its past. The aliens are simply unrelated, and that’s why dialogue fails.
When faced with such a catastrophic choice, we humans argue over whether to use consequentialist or non-consequentialist ethics, whether a utilitarian model should value billions of lives more than the permanent extinction of some of our deepest emotions, and so on. To the Superhappies, all of this is simply an incomprehensible, hellish nightmare; if a human asked them to go ahead with the transformation but leave a fork of the species as a control group (as in Joe Haldeman’s Forever War), it would sound to them like a Holocaust survivor asking us to set up a new camp with intensified torture so that it can “fix” something that has been wrong with the human condition.
(This does not imply some deep xenophobia in me; indeed, after thorough thinking, I say that I would’ve risked waiting the full eight hours for the evacuation, not because I like the alien mode of being so much, but because I find myself unable to form any judgment about it strong enough to outweigh the cost in lives. My utility function here simply runs into a boundary: 16 billion is roughly infinity up to some factor X, and I don’t quite understand what value to assign to X.)
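(A minimal sketch of the kind of boundary I mean, assuming a bounded, saturating utility over lives; the exponential shape, the scale, and the numbers are my own illustrative choices, not anything from the story:)

```python
import math

# Toy bounded ("saturating") utility over number of lives: as n grows, the value
# approaches a ceiling of 1, so very large stakes become nearly indistinguishable.
# The exponential shape and the scale parameter are illustrative assumptions.
def utility_of_lives(n, scale=1e9):
    return 1 - math.exp(-n / scale)

print(utility_of_lives(16e9))   # ~0.9999999 -- already pressed against the ceiling
print(utility_of_lives(1e15))   # ~1.0       -- "infinity" looks almost the same
# Once the stakes saturate like this, the decision hinges entirely on the unknown
# weight X assigned to the alien transformation itself, which is the indeterminacy
# described above.
```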
I think that point would make more sense than the point he is apparently actually making… which is that we must keep negative aspects of ourselves (such as pain) in order to remain “human” (as defined by current specimens, I suppose), and that this is apparently something important. Either that or, as you say, Yudkowsky believes that suffering is required to appreciate happiness.
I too would have been happy to take the SH deal; or, if not happy, at least happier than with any of the alternatives.
You should read this: http://www.nickbostrom.com/fable/dragon.html
It makes your point well. This is also touched on in HPMOR.