The utility-maximizing complexity of conscious beings

An AI that runs conscious beings has finite resources. Complex conscious beings consume more of these resources than simpler ones do. Suppose, for the purposes of discussion, that a friendly AI would maximize the aggregate utility of the beings it runs by modulating their level of complexity.
The utility-maximizing level of complexity would depend on the natural laws that govern the relation between the complexity of a being and the magnitude of the utility that it can experience.
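One way to make that dependence concrete (an assumption for this discussion, not an established result): suppose the AI has total resources R and runs beings of a single complexity c, each costing c units of resource and each experiencing utility U(c). It can then run R/c such beings, for an aggregate utility of (R/c) · U(c), so maximizing aggregate utility amounts to maximizing U(c)/c, the utility extracted per unit of resource. Everything turns on the shape of U.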
When you are uploaded to the AI, then, one of three things will happen, depending on where that utility-maximizing level of complexity falls relative to your own (a toy model after the list illustrates each case):
1. You may wake up to discover that most of your mind, memory, and identity have disappeared, because most of your computational resources were stolen to create a set of new, simple beings who, in total, enjoy more utility than you thereby lost. You may even be much less happy than you are now.
2. You may not wake up at all, because the totality of your computational resources was stolen and given to a superbeing that, thanks to some economy of scale in converting resources into utility, can extract more utility from them than you could.
3. You may wake up to discover that you are much smarter and happier than you are now, because the utility-maximizing level of complexity is greater than your own, yet not so great that, if everyone were to upload, it would exhaust the AI's resources and force rationing.
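To make the trichotomy concrete, here is a toy sketch, not anything the AI would literally run. It assumes the model above: total resources R, a running cost of c resource units for a being of complexity c, and a hypothetical function utility(c) for the utility such a being experiences. R, HUMAN_COMPLEXITY, and the three utility laws below are all illustrative assumptions; each outcome then corresponds to a different shape of the law:

```python
import math

# Toy model: the AI splits total resources R into beings of equal
# complexity c, each costing c resource units, so it runs R / c beings
# and aggregate utility is (R / c) * utility(c).

R = 1_000_000.0           # total computational resources (illustrative)
HUMAN_COMPLEXITY = 100.0  # your complexity on the same scale (illustrative)

def optimal_complexity(utility, lo=1.0, hi=R):
    """Grid-search the complexity c maximizing (R / c) * utility(c).
    A coarse geometric grid is enough to show which regime each
    hypothetical utility law falls into."""
    grid = [lo * (hi / lo) ** (i / 1000) for i in range(1001)]
    return max(grid, key=lambda c: (R / c) * utility(c))

def sublinear(c):
    # Outcome 1: diminishing returns, so utility per unit of resource,
    # utility(c) / c, is highest for the simplest beings.
    return math.sqrt(c)

def superlinear(c):
    # Outcome 2: economies of scale, so utility(c) / c grows without
    # bound and one maximal superbeing beats any crowd.
    return c ** 1.5

def peaked(c):
    # Outcome 3: utility(c) / c = c * exp(-c / 10_000) peaks at a
    # finite complexity (10,000 here) above the human level.
    return c ** 2 * math.exp(-c / 10_000)

for law in (sublinear, superlinear, peaked):
    c_star = optimal_complexity(law)
    relation = "above" if c_star > HUMAN_COMPLEXITY else "at or below"
    print(f"{law.__name__:>11}: optimal complexity ~ {c_star:,.0f} ({relation} human level)")
```

The pattern is the one derived above: whichever complexity maximizes utility per unit of resource, utility(c)/c, wins. A sublinear law hands everything to crowds of simple minds (outcome 1), a superlinear law to a single superbeing (outcome 2), and a law whose per-resource utility peaks at a finite point above the human level yields outcome 3.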