I’m also generally skeptical of the sentiment “build an intelligence which mimics a human as closely as possible.” It competes with the principle “build things you understand.”
Do you really think that one can build an AGI without first getting a good understanding of human intelligence, to the degree where one can be reproduced (but possibly shouldn’t be)?
It was possible to achieve heavier-than-air flight without reproducing the flexible wings of birds.
A good understanding of the design principles may be enough, or of the organisation into cortical columns and the like. The rest is partly a mess of evolutionary hacks, such as “let’s put the primary visual cortex at the back of the brain” (excuse the personification), and probably not integral to a sufficient understanding. So I guess my question would be what granularity of “understanding” you’re referring to. ‘So that it can be reproduced’ seems too low a bar: suppose we found some alien technology that we could reproduce strictly by copying it, without having any idea how it actually worked.
Do you ‘understand’ large RNNs that exhibit strange behavior because you understand the underlying mechanisms and could use them to create other RNNs?
There is a sort of trade-off: you can’t go too basic and still consider yourself to understand the higher-level abstractions in a meaningful way, just as the physical layer of the TCP/IP stack in principle encapsulates all necessary information, but is still … user-unfriendly. Otherwise we could say we understand a human brain perfectly just because we know the laws that govern it on a physical level.
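A toy illustration of that layering (the circuit is a standard textbook construction, not anything from the discussion above): we can simulate a circuit perfectly at the “physical” gate level, while the higher-level fact “this thing adds numbers” is a separate piece of understanding that the gate-level laws alone don’t hand us.

```python
def nand(a, b):
    # The entire "physics" of this world: one gate law.
    return 1 - (a & b)

def full_adder(a, b, carry):
    # Nine NANDs wired into a standard full adder; knowing each
    # gate's behavior is the easy part.
    t1 = nand(a, b)
    t2 = nand(a, t1)
    t3 = nand(b, t1)
    axb = nand(t2, t3)            # a XOR b
    t4 = nand(axb, carry)
    t5 = nand(axb, t4)
    t6 = nand(carry, t4)
    s = nand(t5, t6)              # sum bit
    c = nand(t4, t1)              # carry out
    return s, c

def add(x, y, bits=8):
    # Ripple the carry through `bits` full adders.
    carry, out = 0, 0
    for i in range(bits):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        out |= s << i
    return out

print(add(23, 42))  # → 65; "addition" lives above the gate level
```

Nothing in the gate law mentions arithmetic; the claim “this circuit adds” is knowledge about the abstraction, which is exactly the granularity question above.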
I shouldn’t comment when sleep deprived … ignore at your leisure.
Right, an excellent point. Biology can be unnecessarily messy.