Give me a good argument for why an FAI shouldn't devote all its resources to trying to leave the universe rather than supporting a galactic civilization for a few years?
Now this looks like the wrong kind of question to consider in this context. The amount of fun your human existence delivers, in connection with what you abstractly believe is the better course of action, is relevant; but the details of how an FAI would manage the future are not your human existence's explicit problem, unless you are working on FAI design.
If it's better for an FAI to spend the next 3^^^3 multiverse millennia planning the future, why should that be reflected in your psychological outlook? That's an obscure technical question. What matters is whether it's better, not whether it has a certain individual surface feature.