I’ve also been doing searches for topics related to the singularity and space travel (this thought came up after playing a bit of Mass Effect ^ _ ^). It would seem to me that biological restrictions on space travel wouldn’t apply to a sufficiently advanced AI. This AI could colonize other worlds using near-light-speed travel with a minimal physical payload, and harvest the raw materials of some new planet using algorithms programmed into small harvesting bots. If this is possible, then it seems to me that an unfriendly AI might not be that much of a threat, since it would have many more “habitable” worlds to harvest and live on (like Venus, Mars, comets, asteroids, or extra-solar planets).
Another thing: if this is possible, it sort of leads to a paradox. Why hasn’t it already happened with other intelligent life on other planets?
It would seem to me that biological restrictions on space travel wouldn’t apply to a sufficiently advanced AI. This AI could colonize other worlds using near-light-speed travel with a minimal physical payload, and harvest the raw materials of some new planet using algorithms programmed into small harvesting bots.
Pretty much right.
If this is possible, then it seems to me that an unfriendly AI might not be that much of a threat, since it would have many more “habitable” worlds to harvest and live on (like Venus, Mars, comets, asteroids, or extra-solar planets).
We would eventually like to inhabit the currently uninhabitable planets. Terraforming, self-modification, sealed colonies, or some combination of those will eventually make this feasible. At that time, we would rather that those planets not fight back.
Symmetrically, an unfriendly process will not be satisfied with taking merely Mercury, Venus, Mars, Jupiter, Saturn, Uranus, Neptune, and the rest of the universe; it will want to do its thing on Earth as well.
The choice between “kill the humans and take over Earth” and “don’t kill the humans and don’t take over Earth” is independent of the existence of other territory, so the extra territory doesn’t matter, and it will kill us.
(The short answer is that there is no “satisfied” or “enough” among nonhuman agents.)
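To make the independence point concrete, here is a toy sketch (my own illustration, not from the thread), assuming only that the agent’s utility is strictly increasing in the resources it controls. All names and numbers below are arbitrary placeholders.

```python
# Toy dominance argument: for any resource-monotone utility, taking Earth
# beats not taking it, no matter how much other territory already exists.
# All quantities here are made-up illustrative numbers.

def utility(resources: int) -> int:
    # Hypothetical strictly increasing utility; any such function
    # yields the same comparison.
    return resources

EARTH = 1  # stand-in for Earth's resources

for other_territory in (0, 10**3, 10**9, 10**30):  # "rest of the universe"
    with_earth = utility(other_territory + EARTH)
    without_earth = utility(other_territory)
    assert with_earth > without_earth  # the ordering never flips
    print(f"other territory = {other_territory:.1e}: take Earth? "
          f"{with_earth > without_earth}")
```

However large the rest of the universe gets, the comparison comes out the same way, which is the sense in which the choice about Earth is independent of other territory.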
Another thing: if this is possible, it sort of leads to a paradox. Why hasn’t it already happened with other intelligent life on other planets?
You mean the Fermi paradox? You’ll have to expand on that, but note that a singularity will expand at lightspeed (= we wouldn’t see it coming until it was here), and it will consume all resources (= if it had already been here, we wouldn’t exist).
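To put a rough number on “we wouldn’t see it until it was here,” here is a small back-of-the-envelope sketch (my own, with arbitrary example distances and speeds), assuming a colonization front that starts at some distance and moves toward us at a constant fraction of lightspeed. Light from the front’s start needs d years to cross d light-years, while the front itself needs d/v years, so the warning time is d/v - d years, which shrinks toward zero as v approaches c.

```python
# Warning time between first *seeing* an expansion front and its arrival,
# for a front starting distance_ly light-years away and moving toward us
# at v_fraction_of_c times lightspeed. Purely illustrative numbers.

def warning_time_years(distance_ly: float, v_fraction_of_c: float) -> float:
    # First light arrives after distance_ly years; the front arrives after
    # distance_ly / v_fraction_of_c years. The difference is the warning.
    return distance_ly / v_fraction_of_c - distance_ly

for frac in (0.5, 0.9, 0.99, 0.999):
    years = warning_time_years(1000.0, frac)  # front starting 1000 ly away
    print(f"front at {frac:.3f}c from 1000 ly: {years:7.1f} years of warning")
```

At exactly lightspeed the warning is zero: the first photons and the front arrive together.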