I haven’t fully digested this comment, but:
In some sense there’s probably no option other than that, since creating a synapse should count as a computational operation. But there’d be different options for what the computations would be.
The simplest might just be storing pairwise relationships. That will add size, even if the storage is sparse.
I agree that LLMs do that too, but I’m skeptical of claims that LLMs are near human-level ability. It’s not that I’m confident they aren’t; it just seems hard to say. (I do think they now have surface-level language ability similar to humans’, but they still struggle with deeper understanding, and I don’t know how much improvement is needed to fix that weakness.)
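To make the storage-cost point concrete, here is a minimal sketch (names hypothetical, not anyone’s actual proposal) of pairwise relationships kept in a sparse map. Even though almost all possible pairs go unstored, every relationship that *is* recorded still costs an entry, so memory grows with the number of stored pairs rather than staying constant:

```python
# Sketch: sparse storage of pairwise relationships as a map
# from (i, j) pairs to association strengths. "store_pair" is
# a hypothetical helper, just for illustration.

def store_pair(memory, i, j, strength):
    """Record an association between items i and j."""
    memory[(i, j)] = strength

memory = {}
# Record a few associations among, say, 1000 items; storage
# grows with pairs actually stored, not with 1000**2.
store_pair(memory, 3, 7, 0.9)
store_pair(memory, 3, 42, 0.5)
store_pair(memory, 7, 42, 0.1)

print(len(memory))  # 3 entries: one per stored relationship
```

So sparsity caps the constant, not the growth: each new relationship is one more entry.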