I love this article, but I disagree with the conclusion. You’re essentially saying that a post-singularity world would be too impatient to explore the stars. I grant you that thinking a million times faster would make someone very impatient, but living a million times longer seems likely to counterbalance that.
My case against outward expansion is not based on issues of patience; it’s an economic issue. I should have made this clearer in the article, and perhaps should strike the one sentence about how long interstellar travel will subjectively take for accelerated intelligences, since that isn’t really relevant.
Outward expansion is unimaginably expensive and risky, and would take massive amounts of time to reach a single doubling. Moore’s Law offers AGIs a much lower-risk route to doubling their population/intelligence/whatever, using a tiny fraction of the time and energy required to double through space travel. See my reply above to Mitchell Porter.
If you knew you could build a rocket and fly it to Mars or Alpha Centauri, and that it was 100% guaranteed to get there, and you’d have the mass and energy of an entire planet at your disposal once you did,
What’s the point? In the best-case scenario you can eventually double your population after hundreds or thousands of years. You could spend a tiny fraction of those resources and double your population thousands of times faster by riding Moore’s Law. Space travel only ever makes sense if Moore’s-Law-type growth ends completely.
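To make the economics concrete, here’s a minimal back-of-the-envelope sketch in Python. The doubling time and colonization timescale are illustrative assumptions of mine (neither the article nor this thread pins down exact figures); the point is only the ratio between the two growth routes.

```python
# Back-of-the-envelope comparison of two ways to double your resources:
# riding Moore's-Law-style growth vs. interstellar colonization.
# Both parameters are illustrative assumptions, not figures from the article.

MOORE_DOUBLING_YEARS = 2.0  # assumed Moore's-Law-style doubling time
COLONY_YEARS = 1_000.0      # assumed time for one colonization to double your resources

# Over one colonization timescale, the Moore's-Law route compounds:
moore_doublings = COLONY_YEARS / MOORE_DOUBLING_YEARS  # 500 doublings
moore_growth = 2.0 ** moore_doublings                  # ~3.3e150x

# The colonization route yields a single doubling over the same period:
colony_growth = 2.0 ** 1                               # 2x

print(f"Moore's-Law route: 2^{moore_doublings:.0f} ≈ {moore_growth:.1e}x")
print(f"Colonization route: {colony_growth:.0f}x")
```

Even if you stretch the assumed doubling time by an order of magnitude, the compounding route still dwarfs the one-shot doubling from a colony ship.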
There are also the serious risks of losing the craft on the way, and even of discovering that Alpha Centauri is already occupied.
The latter point is in tension with the rest of your argument. “No one colonizes the vast resources of space: they’re too crowded” doesn’t work as a Fermi Paradox explanation. Uncertainty about one’s prospects for successfully colonizing first could modestly diminish expected resource gain, but the more this argument seems persuasive, the more it indicates that potential rivals won’t beat you to the punch.
If older, powerful alien civilizations are already around, then colonization may not even be an option for us at all. It’s an option for that lucky first civilization, but for nobody else.
IIRC, one of the concerns about AIs grabbing as much territory and as many resources as possible is that they want to improve the odds that nothing else can be a threat to their core mission.
Why WOULDN’T Moore’s-Law-type growth end completely? Are you saying the speed of light is unbreakable but the Planck limit isn’t?