So, I might be misunderstanding your question, but here’s an example of what shminux is saying.
(Note: Something this large isn’t necessary for the point to hold, but nuance is the enemy of a clear explanation)
Imagine an AI pops up tomorrow and says "human beings will not get hurt any more". You no longer need to worry about food, shelter, protection from others, or many other things you needed money for. You'd also expect much of old Earth to change radically once governments can no longer use the threats they previously relied on to control their slice of the world.
If the AI has already done this, there’s nothing specific it needs your business for.
This makes sense to me, but it seems like a better fit for describing a post-AGI world. I am asking about the period before AGI has arrived (like now) but when the probability of it arriving within the early lifetime of a business is high enough to merit specific consideration (also like now, I claim).
There has to be a transitional period before AGI is actually in a position to do any of these things, even in the fast takeoff scenario; there are too many atoms to move around for it to be otherwise, it seems to me.