Talking about whether an AI would or would not want to expand indefinitely is sort of missing the point. Barring a completely dominant singleton, someone is going to expand beyond Earth with overwhelming probability. The legacy of humans will be completely dominated by those who didn’t stay on Earth. It doesn’t matter whether the social impulse is generally towards expansion.
Edit: To be more precise, arguments that “most possible minds wouldn’t want to expand” must be incredibly strong in order to have any bearing whatsoever on the long-term likelihood of expansion. I don’t really buy your argument at all (I would be happy to create new worlds inhabited by like-minded people even if there were a long communication delay between us...), but it seems like your argument isn’t even claiming to be strong enough to matter.
Some other notes: you can’t really expand inwards very much. You can only fit so much data into a small space (unless our understanding of relativity is wrong, in which case the discussion is irrelevant). Of course, you hit a much earlier limit if you aren’t willing to send something to the stars to harvest resources. Maybe these limits seem distant to us, but to an intelligence thinking a billion times faster, we are almost up against them already.
The difference between old communication and new communication is not just speed; there is also a difference in availability and bandwidth. One month of latency is hardly even a relevant drawback compared to the incredible expense and limited bandwidth of sending human messengers. Although our current understanding of relativity suggests speeds will never improve that much, the cost per byte can drop an astronomical amount before running into physical limitations. If you want to draw an analogy to prehistoric times, you have to amend the situation by introducing armies of humans who live only to carry messages across continents.
Your conception of “unified minds” as opposed to groups of cooperating minds seems unlikely to remain relevant into the distant future. At least, I can think of no particular reason why our current understanding should remain applicable, so I would be incredibly surprised if it did. Similarly for your other strong predictions (which are supposed to hold trillions of years of human thought into the future, a truly ridiculous distance); saying, for example, that strong singletons are impossible really presumes a great deal about the nature of singletons and their reliance on communication.