Rereading this post some years later, I found that a number of things were explained:
“Localization” usually has to do with being spatially adjacent. This post apparently means something quite different by the term—something to do with not doing “sharing of cognitive content between diverse AIs”.
That apparently explains some of the confusing comments in the RSI post—in particular this:
I haven’t yet touched on the issue of localization (though the basic issue is obvious: the initial recursive cascade of an intelligence explosion can’t race through human brains because human brains are not modifiable until the AI is already superintelligent).
...which just seems like crazy talk under the usual meaning of the term “localization”.
A more conventional vision involves development being spread across labs and data centres distributed throughout the planet. That is not terribly “local” in ordinary 3D space—but may be “local” in whatever space is being talked about in this post.
The post contains this:
I can imagine there being an exception to this for non-diverse agents that are deliberately designed to carry out this kind of trading within their code-clade.
Exactly. So: not “local” at all.