I’m not very bullish on HMI. Humanity’s progress in understanding the brain is extremely slow, and because brain research is so hard to do, I don’t expect us to get much faster.
Basically, I expect humanity to build AGI way before we are even close to understanding the brain.
Well, narrow superintelligent AIs might help us understand the brain before then.
Well, maybe. I still think it’s easier to build AGI than to understand the brain, so even the smartest narrow AIs might not be able to build a consistent theory of the brain before someone else builds AGI.