Takeaway: there can’t be that many possible semantic targets for words. The set of semantic targets for words (in humans) is at least exponentially smaller than the set of random variables in an agent’s world model.
I don’t think this follows. The set of semantic targets could be immense, but children and adults could share sufficiently similar priors that children land on concepts adequately close to the ones adults are trying to communicate, even with very little data.
Think of it like a modified Schelling-point game in which some communication is possible, but sending information is expensive. Alice is trying to find Bob somewhere in the galaxy, and Bob has only been able to send a little information for her to go on. However, Alice and Bob are both from Earth, so they share a lot of context: Bob can say “the moon” and Alice knows which moon he probably means, and also knows that there is only one habitable moon-base there to check.
Bob could find a way to single out any point in the galaxy for Alice, but he probably won’t need to. So the set of possibilities looks small from the perspective of someone who only sees a few rounds of this game.
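To put toy numbers on that intuition, here’s a quick sketch (my own illustration, nothing precise: the space size, the Zipf-ish shared prior, and the 99.9%/0.1% split are all made up, and it assumes NumPy). The point is just that if Bob and Alice share a concentrated prior, the expected number of bits Bob has to send is roughly the entropy of that prior, which can be tiny compared to the log of the number of possible targets.

```python
# Toy sketch of the Schelling-point game (illustrative numbers only).
# Bob wants to single out one target in an astronomically large space. Because he
# and Alice share a concentrated prior over which targets are salient, the expected
# number of bits he needs to send is roughly the entropy of that prior, rather than
# log2(size of the space).
import numpy as np

SPACE_SIZE = 10**12      # nominal number of possible targets ("points in the galaxy")
NUM_SALIENT = 1_000      # targets the shared prior treats as at all salient

# Shared prior: 99.9% of the mass on a small, Zipf-distributed salient set,
# the remaining 0.1% spread uniformly over everything else.
salient_weights = 1.0 / np.arange(1, NUM_SALIENT + 1)
salient_probs = 0.999 * salient_weights / salient_weights.sum()
leftover_mass = 0.001

entropy_salient = -(salient_probs * np.log2(salient_probs)).sum()
entropy_leftover = leftover_mass * np.log2((SPACE_SIZE - NUM_SALIENT) / leftover_mass)
shared_prior_entropy = entropy_salient + entropy_leftover

print(f"log2(|target space|)    ~ {np.log2(SPACE_SIZE):.1f} bits")   # about 40 bits
print(f"entropy of shared prior ~ {shared_prior_entropy:.1f} bits")  # far fewer bits
```

With an optimal code matched to the shared prior, Bob’s expected message length is about the prior’s entropy, so naming “the moon” costs him a handful of bits even though naming an arbitrary target would cost about forty.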
So really, rather than “the set of semantic targets is small”, I should say something like “the set of semantic targets with significant prior probability is small”. It’s unclear exactly what the right operationalization is there, but I think I buy the basic point.
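For what it’s worth, here is one hedged sketch of a possible operationalization, purely as an illustration rather than a claim that it’s the right one: measure how many targets it takes to cover, say, 99% of the shared prior’s mass. The prior below is the same made-up concentrated one as above, and the 99% threshold is arbitrary.

```python
# One possible (hypothetical) way to cash out "the set of semantic targets with
# significant prior probability is small": count how many targets are needed to
# cover 99% of the shared prior's mass.
import numpy as np

SPACE_SIZE = 10**12
NUM_SALIENT = 1_000

weights = 1.0 / np.arange(1, NUM_SALIENT + 1)
probs = 0.999 * weights / weights.sum()   # the other 0.1% is spread over ~10^12 targets

sorted_probs = np.sort(probs)[::-1]                        # most probable targets first
coverage = np.cumsum(sorted_probs)                         # cumulative prior mass
targets_for_99_percent = int(np.searchsorted(coverage, 0.99)) + 1

print(f"total targets in the space:          {SPACE_SIZE:.0e}")
print(f"targets covering 99% of prior mass:  {targets_for_99_percent}")
# Under these made-up numbers the 99%-coverage set has on the order of 10^3 members,
# while the raw space has ~10^12: "small" in the relevant sense, even though nothing
# stops a speaker from pointing at an arbitrary target when they need to.
```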