I agree with the general points and especially “It should be obvious by now that AGI will necessarily be brain-like (in the same ways that DL is brain-like but a bit moreso), and necessarily conscious in human-like ways, as that is simply what intelligence demands”.
However, there is a lot of interesting grey area once we start comparing the consciousness of humans with specific types of brain damage to that of current large transformer AIs.
Transformers (even in their more agentic forms) are missing many components of the brain, but their largest deficit is the lack of strong recurrence and medium-term memory capacity. A transformer LLM like GPT-3 has the rough conscious-experience equivalent of waking up from scratch, reading around a thousand tokens (in parallel), thinking about them for just a few hundred steps (equivalent to a dozen seconds or so of human thought), and then resetting and repeating. Such models have reasonably extensive short-term memory (attention) and very long-term memory (the standard weights), but not much in between, and they completely lack brain-style full RNN recurrence.
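The distinction can be made concrete with a toy sketch (illustrative only, not any real model's API): a transformer-style call is stateless between invocations, with its only memories being the prompt it attends over and its frozen weights, while an RNN threads a hidden state through time.

```python
# Toy contrast between a stateless transformer-style call and a stateful RNN.
# All names (transformer_step, rnn_step, W, U) are hypothetical, for illustration.
import numpy as np

rng = np.random.default_rng(0)
d = 8  # toy hidden size

W = rng.standard_normal((d, d)) * 0.1  # frozen weights: long-term memory
U = rng.standard_normal((d, d)) * 0.1  # recurrent weights (RNN only)

def transformer_step(tokens):
    """Stateless: each call starts 'from scratch'. Its only memories are
    the context it can attend over (short-term) and the frozen weights W
    (long-term). Nothing persists between calls."""
    h = tokens  # process the whole context in parallel
    for _ in range(4):  # a fixed, small number of layers/steps
        h = np.tanh(h @ W)
    return h

def rnn_step(token, state):
    """Recurrent: the hidden state persists across calls, providing the
    medium-term memory that sits between activations and weights."""
    return np.tanh(token @ W + state @ U)

# Transformer: two calls share nothing except the frozen weights.
out1 = transformer_step(rng.standard_normal((5, d)))
out2 = transformer_step(rng.standard_normal((5, d)))  # total reset in between

# RNN: state threads through every step, accumulating history indefinitely.
state = np.zeros(d)
for t in rng.standard_normal((5, d)):
    state = rnn_step(t, state)  # medium-term memory lives here
```

The point of the sketch is that `transformer_step` has no variable that survives between calls, whereas the RNN's `state` does, which is the "full spectrum of memory" gap described above.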
In DL terminology, brains are massive RNNs with a full spectrum of memory: roughly 10GB-equivalent in activations, and then a truly massive capacity of 10TB-equivalent or more in synapses, covering a full spectrum of timescales.
But some humans do have impairments to their medium-term memory systems that are perhaps more comparable to transformer LLMs: humans with missing or damaged hippocampus/EC regions, like the famous patient HM. Still conscious, but not in the same way.