long time in the subjective future [...] subjective decades [...] subjective centuries
What is subjective time? Is the idea that human-imitating AI will be faithful enough to what humans would do that, if the AI does something humans would have taken ten years to do, we say it happened in a “subjective decade” (which could be much shorter in sidereal time, i.e., in the actual subjective time of existing biological humans)?
This argument implicitly measures developments by calendar time—how many years elapsed between the development of AI and the development of destructive physical technology? If we haven’t gotten our house in order by 2045, goes the argument, then what chance do we have of getting our house in order by 2047?
But in the worlds where AI radically increases the pace of technological progress, this is the wrong way to measure. In those worlds science isn’t being done by humans; it is being done by a complex ecology of interacting machines moving an order of magnitude faster than modern society. Probably it’s not just science: everything is getting done by that ecology of machines, at unprecedented speed.
If we want to ask about “how much stuff will happen”, or “how much change we will see”, it is more appropriate to think about subjective time: how much thinking and acting actually got done? It doesn’t really matter how many times the earth went around the sun.
… ah, I see you address this in the linked post on “Handling Destructive Technology”:
I’m not thinking of AI that is faithful to what humans would do, just AI that represents human interests well enough that “the AI had 100 years to think” is meaningful. If you don’t have such an AI, then (i) we aren’t in the competitive AI alignment world, and (ii) you are probably dead anyway.
If you think in terms of calendar time, then yes, everything happens incredibly quickly. It’s weird to me that Rob is even talking about “5 years” (though I have no idea what AGI means, so maybe?). I would usually guess that 5 calendar years after TAI is probably post-singularity, i.e., many subjective millennia, so the world is unlikely to closely resemble our world (at least with respect to the governance of new technologies).
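The “5 calendar years ≈ many subjective millennia” bookkeeping can be made concrete with a toy calculation. This is my own sketch, not anything from the thread: assume the machine research ecology’s speedup over baseline human society grows exponentially after TAI, with an invented one-year doubling time, and define subjective time as the integral of that speedup over calendar time.

```python
# Toy model (illustrative assumptions only): after TAI, suppose the machine
# research ecology runs 2**(t / doubling_time) times faster than baseline
# human society at calendar time t. Subjective time elapsed by calendar
# time T is then the integral of that speedup from 0 to T.

def subjective_years(calendar_years: float, doubling_time: float = 1.0,
                     steps_per_year: int = 1000) -> float:
    """Left-Riemann-sum integral of speedup(t) = 2**(t / doubling_time)."""
    dt = 1.0 / steps_per_year
    n = int(calendar_years * steps_per_year)
    return sum(2 ** (i * dt / doubling_time) for i in range(n)) * dt

# Under these (made-up) parameters, 5 calendar years already pack in roughly
# 45 subjective years, and a subjective millennium arrives before calendar
# year 10 -- the running total is always dominated by the latest doubling.
```

The closed form is (2^T − 1)/ln 2 for a one-year doubling time, so the exact numbers depend entirely on the assumed speedup curve; the point is only that under any sustained exponential, subjective time decouples from trips around the sun very quickly.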