Why do I never see anyone suggesting we use brain-computer interfaces (BCIs) to wire human brains together to enable more efficient coordination and learning? I think that if we survive as a species, it will be by increasing our natural capacity to act as hive minds. We don’t need to create AGI at all; we can collectively become superintelligent. And if the only AIs involved are narrow ones (for translating information between the brain structures of different humans, a mapping that would almost certainly have to be learned), I don’t see how this could be dangerous, since humans are already (presumably!) aligned, on average, with human values.