If I port this type of idea over to AI, I would get things like "the definition of human pain is whether the typical sufferer desires to scream or not". Those definitions can be massively gamed, of course; but it does hint that if we define a critical mass of concepts correctly (typical, desires...) we can ground some undefined concepts in those ones. It probably falls apart the more we move away from standard human society (eg will your definition of porn work for tenth-generation uploads?).

So in total, if we manage to keep human society relatively static, and we have defined a whole host of concepts, we may be able to ground extra ambiguous concepts using what we've already defined. The challenge seems to be keeping human society (and humans!) relatively static.