I think you may be missing some context here. The meaninglessness comes from the expectation that such a super-intelligence will take over the world and kill all humans once created. Discovering a massive asteroid hurtling towards Earth would have much the same effect on meaning. If someone could build a friendly super-intelligence that didn’t want to kill anyone, then life would still be fully meaningful and everything would be hunky-dory.