If you haven’t read CEV, I strongly recommend doing so. It resolved some of my confusions about utopia that were unresolved even after reading the Fun Theory sequence.
Specifically, I had an aversion to the idea of being in a utopia because “what’s the point, you’ll have everything you want”. The concrete pictures that Eliezer gestures at in the CEV document engage with this confusion, suggesting that we can have a utopia where the AI does not simply make things easy for us, but perhaps just puts guardrails onto our reality: we don’t die, for example, but we still have the option to struggle to do things by ourselves.
Yes, the Fun Theory sequence tries to communicate this point, but it didn’t make sense to me until I could conceive of an ASI singleton that could simply choose not to help us.