The fact that we are breathing, that we simply exist, is proof that we will not die from this. Because to insist we all die from this means we'd be the first in the whole universe to unleash killer AI on the rest of the universe.
You can't conveniently kneecap the potential of AI so that it kills us all but then somehow slows down before discovering interstellar travel and retrofitting factories to make a trillion spaceships to go to every corner of the universe and kill all life.
To accept the AI Armageddon argument, you basically have to also own the belief that we are alone in the universe or are the most advanced civilization in the universe and there are no aliens, Roswell never happened, etc.
We're literally the first to cook up killer AI. Unless there are a million other killer AIs from a million other galaxies on the other side of the universe, and they just haven't spread here yet in the millions of years they've had time to.
Are we really going to be so arrogant as to say there's no way any civilization in this galaxy or a nearby one is more advanced than us? Even just 100 years more advanced? Because that's probably how little time it would take, post-singularity, for killer AI to conceive of advanced forms of interstellar travel we could never dream of and dispatch killer AI to our solar system.
And I don't even want to hazard a guess at what a super AGI would cook up to replace the earliest forms of interstellar travel, 1,000 years after a civilization first started heading out beyond its solar system.
Even if we've only got a 10% chance of AI killing us all, that's the same math where 1 out of every 10 alien civilizations is knowingly or unknowingly unleashing killer AI on the rest of the universe. And yet we're still standing.
It's not happening. Either because of divine intervention, because some otherworldly entities intervene in the technology of civilizations before it gets to the point of endangering the rest of the universe, or because we are just discounting the potential for AI to align itself.
I might be able to accept the premise of AI Armageddon if I didn’t also have to accept the bad math of us being alone in the universe or being the most advanced civilization out there.
You might be interested in Dissolving the Fermi Paradox by Sandberg, Drexler and Ord, who IIRC take into account the uncertainties in various parameters in the Drake equation and conclude that it is very plausible for us to be alone in the Universe.
There is also the “grabby aliens” model proposed by Robin Hanson, which (together with an anthropic principle?) is supposed to resolve the Fermi paradox while allowing for alien civilizations that expand close to the speed of light.
This is a surprisingly good point against an “AI moratorium” of even 5 minutes. Because if Eliezer's beliefs are correct, and AGIs are inherently uncontrollable and will seek to malevolently optimize the entire universe, killing their creators for 0.0000000000000000001 percent more usable matter, then where is everyone? Why do we exist at all?
Maybe Eliezer is wrong and AI systems saturate much sooner than he thinks.
Your suggestion that the AI would only get 1e-21 more usable matter by eliminating humans made me think about orders of magnitude a bit. According to the World Economic Forum, humans have made (hence presumably used) around 1.1e15 kg of matter. That’s around 2e-10 of the Earth’s mass of 5.9e24 kg. Now you could argue that what should be counted is the mass that can eventually be used by a super optimizer, but then we’d have to go into the weeds of how long the system would be slowed down by trying to keep humanity alive, figuring out what is needed for that, etc.
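For what it's worth, here's a quick sanity check of that ratio, plugging in the figures as quoted (the WEF estimate and a standard Earth mass):

```python
# Back-of-the-envelope check of the mass ratio quoted above.
anthropogenic_mass_kg = 1.1e15  # WEF estimate of human-made mass, as cited
earth_mass_kg = 5.9e24          # mass of the Earth

print(f"{anthropogenic_mass_kg / earth_mass_kg:.1e}")  # ~1.9e-10, i.e. roughly 2e-10
```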
Right, plus think on a solar-system or galaxy-level scale.
Now consider that properly keeping humans alive (in a way that's actually competent, not the sham life support humans offer now) involves separating their brains from their bodies and keeping them alive and in perfect health essentially forever, using nanotechnology to replace all other organ functions, etc. The humans would experience the world via VR or remote surrogates.
This would cost maybe 10 kg of matter per human with plausible limit-level tech. They can't breed, so it's 80 billion times 10 kg....
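Taking those figures at face value (the 10 kg per human and the 80 billion head count are this comment's own assumptions, not established numbers), the total comes out tiny relative to the Earth:

```python
# Rough total cost of the life-support scenario above, using the figures as stated.
kg_per_human = 10        # assumed matter cost per preserved human
humans = 80e9            # assumed head count from the comment
earth_mass_kg = 5.9e24

total_kg = kg_per_human * humans  # 8e11 kg
print(f"{total_kg:.0e} kg, about {total_kg / earth_mass_kg:.0e} of Earth's mass")
# ~8e11 kg, i.e. on the order of 1e-13 of the Earth's mass
```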