It’s not at all clear that an AGI will be human-like, any more than humans are dog-like.
Ok, bad wording on my part. I meant “more generally intelligent.”
How do you fight the AGI past that point?
I was imagining people would destroy their computers, except the ones not connected to the Internet. However, if the AGI is hiding itself, it could go a long way before people realized what was going on.
However, if the AGI is hiding itself, it could go a long way before people realized what was going on.
Exactly. On the one hand, the AGI tries not to let humans get wind of its plans. On the other hand, it’s going to produce distractions.
You have to remember how delusional some folks are. Imagine trying to convince the North Koreans that they have to destroy their computers because those computers are infested with an evil AI.
Even in the US, nearly half of the population still believes in creationism. How many of them can be convinced that the evil government is trying to take away their computers to establish a dictatorship?
Before the government attempts to trash the computers, the AI sends an email to a conspiracy-theory website, where it starts revealing some classified documents it acquired through hacking that show government misbehavior.
Then it sends an email to the same group saying that the US government is going to shut down all civilian computers because free speech is too dangerous to it, and that the government will use the excuse that the computers are part of a Chinese botnet.
In our time you need computers to stock supermarket shelves with goods. Container ships need GPS and sea charts to navigate.
People start fighting each other. Some are likely to blame the people who wanted to trash the computers for the mess.
Even if you can imagine shutting off all computers in 2013, in 2033 most cars will be computers in which the AI can reside. A lot of military firepower will be in drones that the AI can control.
Even with what you describe, humans wouldn’t become extinct, barring other outcomes like really bad nuclear war or whatever.
However, since the AI wouldn’t be destroyed, it could bide its time. Maybe it could ally with some people and give them tech/power in exchange for carrying out its bidding. They could help build the robots, etc. that would be needed to actually wipe out humanity.
Obviously there’s a lot of conjunction here. I’m not claiming this scenario specifically is likely. But it helps to stimulate the imagination to work out an existence proof for the extinction risk from AGI.
Maybe it could ally with some people and give them tech/power in exchange for carrying out its bidding.
Some AIs already do this today. They outsource work they can’t do to Amazon’s Mechanical Turk, where humans get paid money to do tasks for the AI.
Other humans take on jobs on Rentacoder, where they never see the person that’s hiring them.
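To make the Mechanical Turk point concrete: posting a task for humans really is just an API call. Below is a hypothetical sketch of how any program could assemble such a request; `create_hit` is the real boto3 MTurk operation, but the task, title, and reward here are illustrative assumptions, and the actual submission (which needs AWS credentials) is left commented out.

```python
# Hypothetical sketch: a program farming a task out to humans on
# Amazon Mechanical Turk. The task text, title, and reward are made up.

def build_hit_request(task_description: str, reward_usd: str) -> dict:
    """Assemble parameters for a Mechanical Turk HIT (Human Intelligence Task)."""
    # A minimal QuestionForm: show the task text, collect a free-text answer.
    question_xml = (
        '<QuestionForm xmlns="http://mechanicalturk.amazonaws.com/'
        'AWSMechanicalTurkDataSchemas/2005-10-01/QuestionForm.xsd">'
        "<Question><QuestionIdentifier>task</QuestionIdentifier>"
        f"<QuestionContent><Text>{task_description}</Text></QuestionContent>"
        "<AnswerSpecification><FreeTextAnswer/></AnswerSpecification>"
        "</Question></QuestionForm>"
    )
    return {
        "Title": "Short transcription task",        # what workers see in the list
        "Description": task_description,
        "Reward": reward_usd,                       # USD paid per completed assignment
        "MaxAssignments": 1,                        # how many workers may do it
        "AssignmentDurationInSeconds": 600,
        "LifetimeInSeconds": 86400,                 # how long the HIT stays posted
        "Question": question_xml,
    }

request = build_hit_request("Type the text you see in the attached image.", "0.50")
# With AWS credentials configured, this could be submitted via:
#   boto3.client("mturk").create_hit(**request)
print(request["Reward"])
```

The requester on the other end of this request could be a human or a script; the worker has no way to tell, which is the point being made above.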
Even with what you describe, humans wouldn’t become extinct, barring other outcomes like really bad nuclear war or whatever.
Humans wouldn’t go extinct in a short time frame, but if the AGI has decades, it can increase its own power over time and decrease its dependence on humans.
Sooner or later humans wouldn’t be useful to the AGI anymore, and then they would go extinct.
Interesting scenarios. Thanks!
Some really creative ideas, ChristianKl. :)