A superintelligent mind with a reasonable amount of working memory could process generic statements all day long and never whine about dangling concepts. (I feel like the really smart people on LessWrong and Math Overflow also exhibit this behavior to some degree.) But as humans with tragically limited short-term memories, we need all the help we can get. We need our authors and teachers to give us mind-hangers.
I think we can do substantially better than four items in working memory, but not have a working memory with thousands of slots. That is because working memory is the state in which all items in memory are related to each other in one (or two) particular ways, e.g. social relationships between different people or variables in a mathematical system, and that setup creates a combinatorial explosion. Sometimes such variables can be partitioned so that the combinatorial explosion isn’t as big of a problem, because the partitions can be dealt with independently (cf. companies as such Coasean partitions vs. the global optimization in a planned economy), but then it’s several different working memories (or a hierarchy of those). Say a main-sequence star makes available (at the Landauer limit) ≈10^47 bit erasures per second. If we want to track all pairwise relationships between WM elements once per second, that gives us ≈10^23 elements. If we want to track all subsets and their relations, the ceiling is far lower, at ~157 elements.
(I’m playing a bit fast and loose here with the exact numbers, but in the spirit of speed…)
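The back-of-envelope arithmetic above can be checked in a few lines. This sketch assumes the ~10^47 figure for a main-sequence star at the Landauer limit and treats "once per second" updates as one bit erasure per tracked relation, which is the loosest possible reading:

```python
import math

# Assumed budget: ~1e47 bit erasures per second for a main-sequence
# star at the Landauer limit (the figure used in the text).
BUDGET = 1e47

# Pairwise tracking: updating all ~n^2/2 relations once per second
# costs on the order of n^2 erasures, so n scales as sqrt(budget).
n_pairwise = math.isqrt(int(BUDGET))  # on the order of 1e23

# Subset tracking: all 2^n subsets once per second costs ~2^n
# erasures, so n scales as log2(budget).
n_subsets = math.log2(BUDGET)  # ~156-157

print(f"pairwise tracking supports ~10^{round(math.log10(n_pairwise))} elements")
print(f"subset tracking supports ~{math.floor(n_subsets)} elements")
```

The exponential cost of subset tracking is what collapses the element count from ~10^23 down to roughly 157, regardless of how the constant factors shake out.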
I think that superintelligences are probably going to come up with very clever ways of partitioning elements in reasoning tasks, but for very gnarly and inter-related problems working memory slots might become scarce.