I presume this is downvoted due to some inferential gap… How does one get from no AGI to no humans? Or, conversely, why does the existence of humans imply AGI?
I hope they all downvoted it because the OP asked about a story idea without calling it plausible in our world.
I downvoted mainly because Eliezer is being rude. Dude didn’t even link http://lesswrong.com/lw/ql/my_childhood_role_model/ or anything.