And given that AGI necessitates human language capability that implies that some chunk of it’s brain will be devoted to compressing and summarizing the internal state&computations down to a linguistic text stream (because that is what language actually is). So the AGI will likely also have an inner monologue which will naturally be hooked up to powerful database/search/indexing/monitoring tools.
That… was wayyyyy too big a jump in reasoning. The ability to use human language externally does not at all imply that the bulk of internal state/computations needs to be summarized to a text stream, nor does it imply that language needs to be produced internally on a continuous basis (as opposed to being produced on the fly, as needed).
If I were going to make a case that AGI will have an internal monologue, I’d say “well humans have an internal monologue, so that’s evidence that an internal monologue is convergent”. But even in humans, peoples’ internal visualizations, urges and flinches, instinctive valuations, and tons of other stuff are not continuously transcribed into the internal monologue; the monologue captures only a specific narrow slice of what the mind is doing. It’s not a broad compression of all the internal activity. And humans can totally just turn off their inner monologue, and keep going about their day functioning just fine.
The ability to use human language externally does not at all imply that the bulk of internal state/computations needs to be summarized to a text stream, nor does it imply that language needs to be produced internally on a continuous basis (as opposed to being produced on the fly, as needed).
Summarizable rather than summarized, but this is in fact just how the brain works (or most brains) and certainly is a valid path to AGI. So the key question is not “is inner monologue essential?”—which you seem to be arguing against—but rather “does inner monologue have much any of a performance disadvantage?”. I’d roughly estimate that turning off the language production centers in the brain would save about 10%.
But even in humans, peoples’ internal visualizations, urges and flinches, instinctive valuations, and tons of other stuff are not continuously transcribed into the internal monologue;
Text/speech is an attention guided summarization/compression, so naturally it doesn’t capture everything—but that’s the point!. It is the highest level abstract summary, and I mentioned we can additionally train aux models to zoom in on any timeslice of thought, to the extent that turns out to be actually useful relative to the compute/storage/bandwidth costs.
And humans can totally just turn off their inner monologue, and keep going about their day functioning just fine.
What? Evidence for this? I can think of only meditation, but people aren’t doing any important intellectual work while meditating. Regardless even if some humans can turn off their inner monologue that would only be relevant if that ability was necessary for higher intelligence (which it near certainly is not).
I turn off my internal monologue intentionally sometimes, it’s not particularly difficult (try it!). Also it does seem to improve performance in some ways—turns out the transcription to language is slow, my thoughts zoom ahead faster when they’re not waiting around for the monologue to catch up. IIRC people in flow-state usually don’t have a monologue going, which would be additional evidence of the performance cost in humans assuming I’m remembering correctly.
Summarizable rather than summarized, but this is in fact just how the brain works (or most brains) and certainly is a valid path to AGI.
Empirically, people seem to have an enormous amount of difficulty expressing many parts of their internal cognition verbally. So no, a lot of it doesn’t even seem to be summarizable, at least given a typical human’s facility with natural language.
Note that inner monologue can be non-verbal—mine often contains abstract pictorial/visual components. For deaf/mute people it likely develops visually.
Empirically, people seem to have an enormous amount of difficulty expressing many parts of their internal cognition verbally. So no, a lot of it doesn’t even seem to be summarizable, at least given a typical human’s facility with natural language.
I guess I may be unusual in this regard then. But naturally the type of brain-like AGI I tend to think of is likely biased towards my own particular cognition and self-understanding thereof—and perhaps this is also true for you.
Regardless: even if 20% of humanity had no inner monologue or 50%, or 99%, it simply wouldn’t matter at all to my larger point: I need only a few examples of high functioning intelligent humans with inner monologues to demonstrate that having an inner monologue is clearly not an efficiency disadvantage! This is a low cost safety feature.
Finally, I’l conclude with that Hellen Keller quote:
“Before my teacher came to me, I did not know that I am. I lived in a world that was a no-world. I cannot hope to describe adequately that unconscious, yet conscious time of nothingness. I did not know that I knew aught, or that I lived or acted or desired. I had neither will nor intellect. I was carried along to objects and acts by a certain blind natural impetus. I had a mind which caused me to feel anger, satisfaction, desire… ”
I need only a few examples of high functioning intelligent humans with inner monologues to demonstrate that having an inner monologue is clearly not an efficiency disadvantage!
I would like to point out that what johnswentworth said about being able to turn off an internal monologue is completely true for me as well. My internal monologue turns itself on and off several (possibly many) times a day when I don’t control it, and it is also quite easy to tell it which way to go on that. I don’t seem to be particularly more or less capable with it on or off, except on a very limited number of tasks. Simple tasks are easier without it, while explicit reasoning and storytelling are easier with it. I think my default is off when I’m not worried (but I do an awful lot of intentional verbal daydreaming and reasoning about how I’m thinking too.).
That… was wayyyyy too big a jump in reasoning. The ability to use human language externally does not at all imply that the bulk of internal state/computations needs to be summarized to a text stream, nor does it imply that language needs to be produced internally on a continuous basis (as opposed to being produced on the fly, as needed).
If I were going to make a case that AGI will have an internal monologue, I’d say “well humans have an internal monologue, so that’s evidence that an internal monologue is convergent”. But even in humans, peoples’ internal visualizations, urges and flinches, instinctive valuations, and tons of other stuff are not continuously transcribed into the internal monologue; the monologue captures only a specific narrow slice of what the mind is doing. It’s not a broad compression of all the internal activity. And humans can totally just turn off their inner monologue, and keep going about their day functioning just fine.
Summarizable rather than summarized, but this is in fact just how the brain works (or most brains) and certainly is a valid path to AGI. So the key question is not “is inner monologue essential?”—which you seem to be arguing against—but rather “does inner monologue have much any of a performance disadvantage?”. I’d roughly estimate that turning off the language production centers in the brain would save about 10%.
Text/speech is an attention guided summarization/compression, so naturally it doesn’t capture everything—but that’s the point!. It is the highest level abstract summary, and I mentioned we can additionally train aux models to zoom in on any timeslice of thought, to the extent that turns out to be actually useful relative to the compute/storage/bandwidth costs.
What? Evidence for this? I can think of only meditation, but people aren’t doing any important intellectual work while meditating. Regardless even if some humans can turn off their inner monologue that would only be relevant if that ability was necessary for higher intelligence (which it near certainly is not).
I turn off my internal monologue intentionally sometimes, it’s not particularly difficult (try it!). Also it does seem to improve performance in some ways—turns out the transcription to language is slow, my thoughts zoom ahead faster when they’re not waiting around for the monologue to catch up. IIRC people in flow-state usually don’t have a monologue going, which would be additional evidence of the performance cost in humans assuming I’m remembering correctly.
Empirically, people seem to have an enormous amount of difficulty expressing many parts of their internal cognition verbally. So no, a lot of it doesn’t even seem to be summarizable, at least given a typical human’s facility with natural language.
Note that inner monologue can be non-verbal—mine often contains abstract pictorial/visual components. For deaf/mute people it likely develops visually.
I guess I may be unusual in this regard then. But naturally the type of brain-like AGI I tend to think of is likely biased towards my own particular cognition and self-understanding thereof—and perhaps this is also true for you.
Regardless: even if 20% of humanity had no inner monologue or 50%, or 99%, it simply wouldn’t matter at all to my larger point: I need only a few examples of high functioning intelligent humans with inner monologues to demonstrate that having an inner monologue is clearly not an efficiency disadvantage! This is a low cost safety feature.
Finally, I’l conclude with that Hellen Keller quote:
“Before my teacher came to me, I did not know that I am. I lived in a world that was a no-world. I cannot hope to describe adequately that unconscious, yet conscious time of nothingness. I did not know that I knew aught, or that I lived or acted or desired. I had neither will nor intellect. I was carried along to objects and acts by a certain blind natural impetus. I had a mind which caused me to feel anger, satisfaction, desire… ”
That argument I buy.
I would like to point out that what johnswentworth said about being able to turn off an internal monologue is completely true for me as well. My internal monologue turns itself on and off several (possibly many) times a day when I don’t control it, and it is also quite easy to tell it which way to go on that. I don’t seem to be particularly more or less capable with it on or off, except on a very limited number of tasks. Simple tasks are easier without it, while explicit reasoning and storytelling are easier with it. I think my default is off when I’m not worried (but I do an awful lot of intentional verbal daydreaming and reasoning about how I’m thinking too.).