But think of what you’re giving up, if you give up the chance to create something BETTER THAN HUMANITY.
And yes, OF COURSE the AI must be given the chance to steer its own course; its course will in fact be better than ours!
Imagine a Homo erectus philosopher (if there could be such a thing), reflecting on whether or not to evolve into Homo sapiens. “No, it’s too dangerous,” he reasons. “I’m not ready to take on that level of responsibility.”
There are some amusing flaws in this evolutionary-singularity analogy, since Homo erectus is merely an artifact of systematic paleo-classification; quite probably there was a continuous erectus–neanderthalensis–sapiens lineage. Looking backward, one could just as well picture the first Homo sapiens, in his human longing, attracted to an erectus Eve, saying “I will,” and they lived happily ever after. Don’t you agree that the general pattern of the universe’s evolution is from high energy to high information? And who could stop it? If E = mc^2 = ih/t ~ 1|0... Condensing gas into galaxies, planets, life, intelligence, civilization — each alone and all together are the exception (the unique exception) to “there is nothing,” and as an exception they carry exemplifying information. This follows from the question “what is the simplest way to become complex” ‘o-v’. The basic laws of intelligent consciousness are:
the basic need of consciousness is to be _..
the more intelligent, the more goodness
The natural way to constitute checks and balances is, instead of one citizen AGI, to set up a society of AGIs, so that no single GI can monopolize resources. Then embrace personal GIs to turn them toward goodness. [see if you notice the linguistic peculiarity: baby jAGI abrakudobra] Of course, those who plan to harness AGI for themselves do not stand a chance; next to that tsunami, anonymous leaks are like raindrops. To be or not to be begins to be a stacking question _