If you believe that the existence of a superintelligence smarter than you makes your continued existence and work meaningless, what does that say about your beliefs about people who are not as smart as you?
I think you may be missing some context here. The meaninglessness comes from the expectation that such a super-intelligence will take over the world and kill all humans once created. Discovering a massive asteroid hurtling towards Earth would have much the same effect on meaning. If someone could build a friendly super-intelligence that didn’t want to kill anyone, then life would still be fully meaningful and everything would be hunky-dory.
The meaninglessness comes from an idea along the lines of "why bother with anything if AGI will destroy everything anyway?"
Read the Feynman quote from the beginning. It describes his feelings about the atom bomb, which are relevant to how some people feel about AGI.