How did it not go far enough? What would you like me to add?
even if they were just sitting in a box for eternity.
They could be super fulfilled doing other things as well. Some people (I think EY is included in this group) wouldn’t want to just sit in a box for eternity. However, they could still be super fulfilled by altering their hedonic set-point digitally.
According to that view, making a FAI would be a HUGE missed opportunity, since it wouldn’t do that.
There were too many pronouns for me to understand what you were talking about. Which view? And what wouldn’t the FAI do? Generally when I hear “missed opportunity” I think of something that you didn’t do but should have done. So I don’t really understand how making an FAI is a missed opportunity.
even if they were just sitting in a box for eternity.
They could be super fulfilled doing other things as well.
Right, but if they can be just as fulfilled in the box, and it allows more humans to be in this simulated utopia, why not stack ’em in boxes?
If this sounds bad, it’s because it’s horrible from the standpoint of your own utility function. Utility maximizers use their own standards when evaluating the future, not the standards of the beings living at that time. And so when you hear “stack ’em in boxes,” the utility-maximizing part of you goes “that sounds like an awful life.” But that’s pretty provincial: according to the people in the boxes, they’re having the most wonderful life possible.
According to that view, making a FAI would be a HUGE missed opportunity, since it wouldn’t do that.
There were too many pronouns for me to understand what you were talking about.
If an AI filled the universe with computers simulating the happiest possible people in boxes, I’d label it as not an FAI. But from the standpoint of cosmopolitan, “whatever floats your boat” utilitarianism, it would be a smashing success, maybe the greatest success possible. And so the greatest success comes from not making an FAI.
True, and it is a great possible explanation of the Fermi paradox as well. All the advanced alien civilizations could just be stacked in boxes with no desire to expand. On the other hand, it seems more conducive to survival to want to expand through the universe first and then stack yourselves in boxes. Also, it hardly seems objective to say that we should maximize the number of ems possible. That sounds like the Kantian ethic of valuing people as ends in themselves. Even if you agreed with Kant (I personally Kant stand him :D), you might not value uploads as ends in themselves.
Or just increasing the number of people if they have positive utility.