Alright, fair warning, this is an out-there kind of comment. But I think there’s some kind of there there, so I’ll make it anyway.
Although I don’t have much of anything new to say about it lately, I spent several years really diving into developmental psychology, and my take on most of it is that it’s an attempt to map changes in the order of complexity of the structures thoughts can take on. I view the stages of human psychological development as building up the mental infrastructure to hold up to three levels of fully-formed structure (yes, this is kind of handwavy about what a fully-formed structure is) in your mind simultaneously without effort (i.e. your System 1 can do this). My most recent post exploring this idea in detail is here.
This fact about how humans think and develop seems like an important puzzle piece for, among other things, your questions about understanding what other minds understand.
For example, as people move through different phases of psychological development, one of the key skills they gain is better cognitive empathy. I think this comes from being able to hold more complex structures in their minds and thus model other minds more richly. An interesting question I don’t know the answer to is if you get more cognitive empathy past the end of where human psychological development seems to stop. Like, if an AI could hold 4 or 5 levels simultaneously instead of just 3, would it understand more than us, or just be faster? I might compare it to a stack-based computer. A 3-register stack is enough to evaluate arbitrarily nested expressions so long as you’re willing to stash intermediate results, but if you’ve ever used an RPN calculator you know that having 4 or more registers sure makes life easier, even when you know you could get by with just 3.
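To make the register analogy concrete, here’s a minimal sketch in Python (a toy I made up for this comment, not modeled on any real calculator’s instruction set; the expressions and the max_depth parameter are just for illustration): an RPN evaluator with a bounded stack. A product of two sums peaks at 3 levels; nest one more pair and the straightforward keystroke order needs a 4th.

```python
# Toy RPN evaluator with a bounded stack (illustration only, not any
# real calculator's semantics).

def eval_rpn(tokens, max_depth=4):
    """Evaluate a space-separated RPN expression using at most max_depth stack levels."""
    stack = []
    ops = {"+": lambda a, b: a + b, "-": lambda a, b: a - b,
           "*": lambda a, b: a * b, "/": lambda a, b: a / b}
    for tok in tokens.split():
        if tok in ops:
            # Binary operators consume the top two levels and push one result.
            b, a = stack.pop(), stack.pop()
            stack.append(ops[tok](a, b))
        else:
            # Pushing a new operand is where a shallow stack runs out of room.
            if len(stack) == max_depth:
                raise RuntimeError(f"needed more than {max_depth} stack levels")
            stack.append(float(tok))
    return stack.pop()

# (1+2)*(3+4) peaks at 3 stack levels:
print(eval_rpn("1 2 + 3 4 + *", max_depth=3))                  # 21.0

# ((1+2)*(3+4)) + ((5+6)*(7+8)) peaks at 4, so it fits in 4 levels:
print(eval_rpn("1 2 + 3 4 + * 5 6 + 7 8 + * +", max_depth=4))  # 186.0
```

Run that second expression with max_depth=3 and it raises; you’d have to stash an intermediate result somewhere else first, which is exactly the extra bookkeeping the 4th level saves you.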
I don’t know that I really have a lot of answers here, but hopefully these are somewhat useful puzzle pieces you can work on fitting together with other things you’re looking at.
An interesting question I don’t know the answer to is if you get more cognitive empathy past the end of where human psychological development seems to stop.
Why isn’t the answer obviously “yes”? What would it look like for this not to be the case? (I’m generally somewhat skeptical of descriptions like “just faster” when the speedup is multiple orders of magnitude and sure seems to result from new ideas rather than just a bigger computer.)
So there are different notions of “more” here.
There’s more in the sense I’m thinking of, where it’s not clear additional levels of abstraction enable deeper understanding given enough time. If 3 really is all the levels you need, because that’s how many it takes to think about any number of levels of depth (again, by swapping levels in and out of your “abstraction registers”; there’s a sketch of this at the end of this comment), then additional levels end up being in the same category.
And then there’s more in the sense of doing things faster, which makes them cheaper. I’m perhaps more skeptical of scaling than you are, but I do agree that many things become cheap at scale that are too expensive to do otherwise, and that does produce a real difference.
It’s the former kind of more I’m doubtful about in my comment. The latter kind seems quite likely.
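To show what I mean by swapping levels in and out of the registers, here’s the toy calculator from my comment above, extended with a single STO/RCL-style memory slot (again, a made-up sketch, not any real calculator’s semantics). The expression that needed 4 stack levels there goes through fine on 3 once you park an intermediate result:

```python
# The toy RPN evaluator again, now with one memory slot so a 3-level
# stack can handle expressions that would otherwise need a 4th level.

def eval_rpn_with_memory(tokens, max_depth=3):
    """RPN with a bounded stack plus a single STO/RCL-style memory slot."""
    stack, memory = [], None
    ops = {"+": lambda a, b: a + b, "*": lambda a, b: a * b}
    for tok in tokens.split():
        if tok == "STO":            # park the top of the stack in memory
            memory = stack.pop()
        elif tok == "RCL":          # bring it back once there's room again
            stack.append(memory)
        elif tok in ops:
            b, a = stack.pop(), stack.pop()
            stack.append(ops[tok](a, b))
        else:
            if len(stack) == max_depth:
                raise RuntimeError(f"needed more than {max_depth} stack levels")
            stack.append(float(tok))
    return stack.pop()

# ((1+2)*(3+4)) + ((5+6)*(7+8)) on only 3 levels: compute the first
# product, park it, compute the second, recall, add.
print(eval_rpn_with_memory("1 2 + 3 4 + * STO 5 6 + 7 8 + * RCL +"))  # 186.0
```

That’s the flavor of the claim: more registers never let you reach an expression you couldn’t reach before, they just cut down on the parking and recalling.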