If you are a utilitarian, and you believe in shut-up-and-multiply, then the correct thing for the FAI to do is to use all available resources to maximize the number of beings, and then induce a state of permanent, ultimate enjoyment in every one of them. This enjoyment could be of any type: explorative, creative, or hedonic enjoyment as we know it. The most energy-efficient way to create any kind of enjoyment, however, is to stimulate the brain-equivalent directly. Therefore, the greatest utility will be achieved by wireheading. Everything else falls short of it.
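To spell out the energy argument (my own formalization; the symbols are illustrative, not from the original exchange): given a fixed energy budget E and a per-being running cost e, the FAI can sustain roughly N = E/e beings, so the total hedonic sum is U = (E/e) · u, where u is the enjoyment per being. For any fixed u, U grows without bound as e shrinks, which is exactly the pressure toward ever cheaper substrates described next.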
That’s why utilitarianism is a bad idea, especially if you are allowed to modify the agents. Think about it: humans would be a terrible waste of energy if their only purpose were to have their hedonic pleasure maximized. Mice would be more efficient. Or monocellular organisms. Or registers inside the memory of a computer, incremented as fast as possible.
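To make the reductio concrete, here is a toy sketch (my own illustration, not from the original discussion; the function name and figures are made up) of the endpoint once the agents have been optimized away, with utility reduced to a bare counter:

```python
def wirehead_universe(ticks: int) -> int:
    """Toy reductio: the cheapest possible 'utility maximizer' is a bare counter."""
    utility = 0
    for _ in range(ticks):
        utility += 1  # no experience, no memory, no agent: just a register going up
    return utility

print(wirehead_universe(10**6))  # a million "utils", no mind anywhere in sight
```

Nothing in a pure hedonic-sum objective distinguishes this loop from a universe of blissful minds, which is the point of the joke.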
What I don’t quite understand is why everyone thinks this would be such a horrible outcome. As far as I can tell, these reactions are cached emotions suited to our world, but not to the world of FAI. In our world, we truly do need to constantly explore and create, or else we suffer the consequences of failing to master our environment.
You have it backwards. Why do we need to avoid the consequences of not mastering our environment in the first place?
In a world where FAI exists, there is no longer any point in, nor even any possibility of, mastering our environment ourselves. The FAI masters our environment for us, and there is no longer a reason to avoid hedonic pleasure. It is no longer a trap.
There is no longer any “us” in your hedonic “heaven”. It is a world populated by minimalistic agents, all identical to one another, with no memories, no sense of personal identity, no conscious experience. Life and death would be meaningless concepts to such beings, like any other concept, since they would be incapable of anything close to what we call thinking.
Is that what you want the world to become?
Since the FAI can sustain us in safety until the universe goes poof, there is no reason for everyone not to experience ultimate enjoyment in the meantime. In fact, I can hardly tell this apart from the concept of a Christian Heaven, which appears to be a place where Christians very much want to go.
Yes, and in fact the Christian Heaven is not a coherent concept. There can’t be happiness without pain. No satisfaction without unquenched desire. If you give an agent anything it can possibly ever want, it stops being an agent.
These parallels reinforce my belief that Singularitarianism is just a thinly veiled version of Christianity.