For that path, you need AI that’s capable enough for all industrial (and non-industrial) tasks. But you also need all the physical plant (both the factories and the compute power to allocate across those tasks) that the AI uses to perform them.
I think it’s closer to 20 years than 5 before the capabilities are developed, and possibly longer before the knowledge and techniques behind the necessary manufacturing variants can be adapted to non-human production. And it’s easy to underestimate how long it takes just to build stuff, even when it’s automated.
It’s not clear it’s POSSIBLE to convert enough stuff without breaking humanity badly enough that people revolt and destroy most things. Whether that kills everyone, reverts the world to the Bronze Age, or actually wrests control of the AI is deeply hard to predict. It does seem clear that converting that much matter won’t be quick.
It’s exponential. You’re correct about the first years, but badly off near the end.
THAT is a crux. Whether any component of it is exponential or logistic is VERY hard to know until you get close to the inflection. Absent “sufficiently advanced technology” like general-purpose nanotech (able to mine and refine, or to convert existing materials into robots and factories, in very short order), there is a limit to how parallel the building of the AI-friendly world can be, and a limit to how fast it can convert.
How severe do you think the logistic growth penalties are? I kinda picture a world where all desert and similar land is covered in solar. Mines deeper than humans normally dig supply the minerals for further production, and many mines are underwater. The limit at that point is environmental: you have exhausted the available land for further energy acquisition and are limited in what you can do safely without damaging the biosphere.
Somewhere around that point you shift to lunar factories, which are in an exponential growth phase until the lunar surface is covered.
Basically, I don’t see the penalties being relevant. There’s enough production to break geopolitical power deadlocks, and enough for a world of “everyone gets their needs and most luxury wants met”, assuming approximately 10 billion humans. The fact that further expansion may slow down isn’t relevant on a human scale.
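For scale, here’s a rough back-of-envelope (toy numbers of mine, not sourced figures: ~30 million km² of desert-like land, ~250 W/m² average insolation, 20 percent panel efficiency, and a generous 10 kW continuous per person):

```python
# Back-of-envelope: desert solar supply vs. demand for 10 billion humans.
# Every input here is a rough assumption, not a sourced figure.
desert_area_m2 = 30e6 * 1e6    # ~30 million km^2 of desert-like land
insolation_w_m2 = 250          # 24-hour average solar flux on deserts
panel_efficiency = 0.20        # sunlight-to-electricity conversion
per_capita_w = 10_000          # generous 10 kW continuous per person
population = 10e9

supply_w = desert_area_m2 * insolation_w_m2 * panel_efficiency
demand_w = per_capita_w * population
print(f"supply ~{supply_w / 1e12:.0f} TW, demand ~{demand_w / 1e12:.0f} TW, "
      f"headroom ~{supply_w / demand_w:.0f}x")   # ~1500 TW vs ~100 TW, ~15x
```

Even if storage, transmission, and unusable terrain eat most of that, there’s still an order of magnitude of headroom, which is why the slowdown penalties don’t look binding at human-relevant scales.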
Do you mean “when can we distinguish an exponential from a logistic curve”? I dunno, but I do know that many things which look exponential turn out to slow down after a finite (and small) number of doublings.
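To illustrate, here’s a minimal sketch (my toy numbers) comparing pure exponential growth against a standard logistic with the same initial rate r and a carrying capacity K ten doublings above the starting point:

```python
import math

r, N0, K = 0.7, 1.0, 1024.0   # r ~ one doubling per time unit; K = 2**10 * N0

for t in range(13):
    exp_n = N0 * math.exp(r * t)                       # pure exponential
    log_n = K / (1 + (K / N0 - 1) * math.exp(-r * t))  # standard logistic
    print(f"t={t:2d}  exp={exp_n:7.1f}  logistic={log_n:6.1f}  "
          f"ratio={log_n / exp_n:.2f}")
```

The two curves track within about 20 percent until roughly two doublings before the cap, then diverge sharply, so the early data genuinely can’t tell them apart.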
No, I mean what I typed. Try my toy model: factories driven by AGI expanding across the Earth or the Moon. A logistic growth curve explicitly applies a penalty that scales with scale. When do you think that matters, and by how much?
If, say, the penalty at 50 percent lunar coverage is only 10 percent, you have basically exponential growth.
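To pin down what that assumes, a sketch: the textbook logistic dN/dt = r·N·(1 − N/K) applies a penalty equal to the coverage fraction itself, and I’ll use a generalized exponent ν (my assumption, not anything established above) for the milder-penalty case:

```python
import math

# Coverage fraction x = N / K. Growth rate relative to pure exponential:
#   standard logistic:    1 - x      -> penalty(x) = x
#   generalized logistic: 1 - x**nu  -> penalty(x) = x**nu, milder when nu > 1
x = 0.5                                   # 50 percent lunar coverage
print(f"standard logistic penalty at 50% coverage: {x:.0%}")  # 50%, not 10%

nu = math.log(0.1) / math.log(0.5)        # exponent giving a 10% penalty there
print(f"required nu: {nu:.2f}, check: {x**nu:.0%}")           # nu ~ 3.32 -> 10%
```

So a plain logistic would already be growing at half speed at 50 percent coverage; the 10 percent figure corresponds to a much flatter penalty curve (ν ≈ 3.3) that stays near-exponential far longer.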
I mean, that sounds like it would already absolutely fuck up most ecosystems and thus life support.