i observe that processes seem to have a tendency towards what i’ll call “surreal equilibria”. [status: trying to put words to a latent concept. may not be legible, feel free to skip. partly ‘writing as if i know the reader will understand’ so i can write about this at all. maybe it will interest some.]
progressively smaller-scale examples:
it’s probably easiest to imagine this with AI neural nets, procedurally following some adapted policy even as the context changes from the one they grew in. if these systems occupy an influential, hard-to-dismantle role, then they themselves become the rules governing the progression of the system for whatever arises next, ceasing to be the actors or components they originally were; yet, being “intelligent”, they still emit words as if the old world were true; they become simulacra, automatons that keep moving as they did. this is surreal. going out with a whimper.
i don’t mean this to be about AI in particular; the A in AI is not fundamental.
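(a minimal toy sketch in python of the framing above. everything in it, the payoff numbers, the `adapt_policy` helper, the two “eras”, is invented purely for illustration, not anything from a real system: agent A adapts to world v1; the world shifts to v2 but A’s policy is frozen and influential; agent B then adapts to a world whose rules effectively include A’s stale behavior.)

```python
import random

def adapt_policy(payoffs, steps=2000):
    """crude epsilon-greedy bandit: settle on the action with the best running average."""
    totals = {a: 0.0 for a in payoffs}
    counts = {a: 1e-9 for a in payoffs}
    for _ in range(steps):
        if random.random() < 0.1:                      # explore occasionally
            a = random.choice(list(payoffs))
        else:                                          # otherwise exploit best-so-far
            a = max(totals, key=lambda x: totals[x] / counts[x])
        totals[a] += payoffs[a]()
        counts[a] += 1
    return max(totals, key=lambda x: totals[x] / counts[x])

# era 1: A adapts to world v1, where "trade" is genuinely the good move.
world_v1 = {"trade": lambda: random.gauss(1.0, 0.1),
            "hoard": lambda: random.gauss(0.2, 0.1)}
policy_A = adapt_policy(world_v1)                      # A learns "trade"

# era 2: the world changes, but A's policy is frozen and hard to dismantle.
# payoffs now depend on how you respond to A, who keeps trading regardless;
# A has stopped being an actor and become a rule of the world.
def world_v2_payoff(action_B):
    action_A = policy_A                                # frozen: still emits the old behavior
    if action_B == "exploit_A" and action_A == "trade":
        return random.gauss(1.5, 0.1)                  # B profits off A's stale habit
    if action_B == "cooperate":
        return random.gauss(0.5, 0.1)
    return random.gauss(0.1, 0.1)

world_v2 = {"cooperate": lambda: world_v2_payoff("cooperate"),
            "exploit_A": lambda: world_v2_payoff("exploit_A")}
policy_B = adapt_policy(world_v2)                      # B adapts within the constraint A has become

print("A's frozen policy:", policy_A)                  # 'trade', optimal for a world that's gone
print("B's adapted policy:", policy_B)                 # 'exploit_A', optimal given A-as-constraint
```

the point of the sketch: in era 2, policy_A is no longer usefully an actor; it is just part of world_v2’s dynamics, the fixed thing policy_B optimizes against.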
early → late-stage capitalism. early → late-stage democracy.
structures which became ingrained as rules of the world. note the difference between “these systems have Naturally Changed from an early to late form” and “these systems became persistent constraints, and new adapted optimizers sprouted within them”.
it looks like i’m trying to describe an iterative pattern: established patterns become constraints that bear a permanent resemblance to what they were, new things sprout up within the new, constrained world, and eventually those become constraints themselves.[1]
i also had in mind smaller scale examples.
a community forms around some goal and decides to moderate and curate itself in some consistent way, hoping this will lead to good outcomes; eventually the community is no longer the thing it set out to be; the original principles became the constraints. (? - not sure how much this really fits)
a group of internet friends agrees to regularly play a forum game but eventually they’re each just ‘going along with it’, no longer passionate about the game itself. “continuing to meet to do the thing” was a policy and stable meta-pattern that continued beyond its original context. albeit in this case it was an easily disrupted pattern. but for a time it led to a kind of deadness in behavior, me and those friends became surreal?
this is possibly a stretch from what i was originally describing. i’m just sampling words from my mind, here, and hoping they correlate to the latent which i wanted to put words to.
this feels related to goodhart, but where goodhart is framed more individually, and this is more like “a learned policy and its original purpose coming apart as a tendency of reality”.
[1] tangential: in this frame, physics can be called the ‘first constraint’.
The most likely/frequent outcome of “trying to build something that will last” is failure. You tried to build an AI, but it doesn’t work. You tried to convince people that trade is better than violence, they cooked you for dinner. You tried to found a community, no one was interested. A group of friends couldn’t decide when and where to meet.
But if you do succeed in… creating a pattern that keeps going on… then the thing you describe is the second most likely outcome. It turns out that your initial creation had parts that were easier or harder to replicate; the easier ones keep going and growing, and the harder ones gradually disappear. The fluffy animal died, but its skeleton keeps walking.
It’s like casting an animation spell on a thing, and finding out that the spell only affects certain parts of the thing, if any.
I would distinguish two variants of this. There’s just plain inertia, like if you have a big pile of legacy code that accumulated from a lot of work, then it takes a commensurate amount of work to change it. And then there’s security, like a society needs rules to maintain itself against hostile forces. The former is sort of accidentally surreal, whereas the latter is somewhat intentionally so, in that a tendency to re-adapt would be a vulnerability.