Hi, thanks a ton for your interesting summary; even as a STEM non-literate reader I believe I managed to grasp a few interesting bits!
I wondered: what's your intuition about the software/operating-system side, i.e. the way data is handled and organized at the subconscious level? Wasteful or efficient? Isn't that a key point that could make neuromorphic hardware unnecessary, and perhaps suggest that AGI could run on current PCs, as one commenter posited?
For instance, if I'm asking myself where I was yesterday, on a computer-style cognitive architecture I'd just need to follow a few pointers and do a couple of dictionary lookups, and I'd get a working list of places in a minimal number of cycles. My human mind, by contrast, seems to run a 100%-CPU search with visual memories flooding back. I didn't ask for images or emotions or details of the scene; I just needed a pointer to those places so I could pronounce their names.
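To make the contrast concrete, here's a toy sketch of what I mean by "a few pointers and a dictionary lookup"; the data model is entirely invented, just for illustration:

```python
# Purely illustrative sketch (invented data model): a computer answering
# "where was I yesterday?" could keep a lightweight index keyed by date and
# return only the place names, never touching the bulky sensory records.

episodic_index = {
    "2023-05-01": ["office", "bakery", "park"],  # cheap "pointers": just names
}

full_episodes = {
    # heavy payload that a human recall seems to drag along involuntarily
    ("2023-05-01", "park"): {"images": "...", "emotions": "...", "details": "..."},
}

def where_was_i(date: str) -> list[str]:
    # A single hash lookup: no images or emotions are loaded to answer the question.
    return episodic_index.get(date, [])

print(where_was_i("2023-05-01"))  # ['office', 'bakery', 'park']
```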
Isn't the human mind extremely limited by its wavering attention span? By its small working memory? By the way information is probably intricately tied to sensory experience? How much compression/abstraction is taking place?
Think about asking a computer to design a house. As a human I'd never even be able to hold the whole design in memory; I'd need pen and paper, which would slow me down considerably, and getting all the details right would take even more time. A computer could probably produce a correct, complete design in a fraction of a second. Mental arithmetic is the same story.
Yet the 2,000 TB sometimes quoted for the brain's storage capacity seems like a small number. How does the brain do so much with so little?
There's also the question of people functioning normally after having half of their brain removed. What if 1/4 were enough? 1/8? Since a factor of 8 is close to an order of magnitude, that could mean at least 1 OOM of inefficiency in brains relative to what normal human intelligence actually requires.