Is the situation with the AI intelligence explosion so dire that a human one must exist to counterbalance it?
I wouldn’t exactly say “counterbalance”. It’s more that we, as humans, want to get ahead of the AI intelligence explosion. Also, I wouldn’t advocate for a human intelligence explosion that looks like what an AI explosion would probably look like. An explosion sounds like gaining capability as fast as possible—seizing any new mental technology that’s invented and immediately overclocking it to invent the next mental technology, and the next. That sort of thing would shred values in the process.
We would want to go about increasing the strength of humanity slowly, taking our time to not fuck it up. (But we wouldn’t want to drag our feet either—there are, after all, still people starving and suffering, decaying and dying, by the thousands and tens of thousands every day.)
But yes, the situation with AI is very dire.
augmented humans stand a decent chance of wanting to go full throttle on all things Artificial
I’m not following. Why would augmented humans have worse judgement about what serves the things we care about? Or why would they care about different things?