If a machine superoptimizer’s goal system is programmed to maximize pleasure, it might not literally tile the local universe with tiny digital minds running continuous loops of a single, maximally pleasurable experience, but we expect it would do something similarly undesirable.
Step 1, for many minds without too short a horizon, is to conquer the galaxy to make sure there are no aliens around that might threaten their entire value system. Tiling the local universe with tiny happy digital minds could easily turn out to be a recipe for long-term disaster, resulting in a universe whose happiness levels are dictated by others.