[Question] If AI starts to end the world, is suicide a good idea?

For a while I’ve thought to myself that if AI starts to obviously end the world, I would just commit suicide, mainly to avoid any potential s-risks. But I’ve recently become far less certain that that would be a good idea.

Between the possibility of resurrection, quantum immortality, and weird acausal shenanigans, I’m not sure what my plan is for if the nano-factories start popping up and whatnot.

This uncertainty about what to do in such a situation causes me more discomfort than having a plan would, even a grim one.

What do you think is the right thing to do if it becomes obvious that the world is ending due to AI?