A possible problem with a fun universe—it seems to me that a good many people get their sense of their own value by doing things to make the world better, with "better" largely framed as making it less bad for other people, or making it good in ways that prevent badness. That is, you feed a baby both because the baby is happy being fed and because the baby will be miserable if it isn't fed.
This is called "wanting to be needed". What makes this desire go away? It's possible that people would stop feeling it if they were sure they didn't need to prove their value, or it might be that they'd feel adrift and pointless in a universe where there's nothing important left for them to do.
As for a fast upgrade: I think being intelligent is fun, and I assume (perhaps wrongly) that being more intelligent would be more fun. A fast upgrade sounds good to me, if it's safe—though I don't think I'd be a fast adopter. I'd be waking up in a world incomprehensible to me as I am now, but presumably manageable for me as I would be then, or at least no worse than being a baby in this world.
Fun Theory, in my imagination, would cover "wanting to be needed". I'd bet that's part of why you wouldn't want an FAI to instantly make everything as good as possible.