Human neurons fire at roughly 100 hertz; computers run at a few gigahertz. An AGI would be millions of times faster by default.
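The arithmetic behind the "millions of times faster" claim is just a clock-rate ratio. A quick sketch, where the specific numbers (100 Hz firing rate, 3 GHz clock) are illustrative assumptions, and the comparison itself is exactly what the reply below disputes:

```python
# Naive clock-speed comparison behind the "millions of times faster" claim.
# Both numbers are rough illustrative assumptions, not measurements.
neuron_hz = 100   # typical cortical neuron firing rate, ~100 Hz
cpu_hz = 3e9      # "a few gigahertz"

ratio = cpu_hz / neuron_hz
print(f"{ratio:.0e}")  # ~3e+07, i.e. tens of millions
```

Note this only compares raw event rates; it says nothing about how much useful reasoning happens per event, which is the crux of the disagreement that follows.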
By that logic, we should already have an AI that's millions of times faster than a human, merely by virtue of its being implemented on silicon hardware.
There are two inputs to intelligence: software and hardware. Combined, they produce a system that reasons. When the artificial system that reasons is brought to a point where it is as “smart” as the world’s AI researchers, it will produce AI insights at the same speed they do. I don’t see how the combination of software/hardware inputs that produced that result is especially important.
This seems wrong to me; can you justify it?
Which part? The part where the AI makes progress at the rate I said it would in the previous paragraph? In that case, your issue is with the previous paragraph, as I'm just restating what I already said. The only new thing is that an AI that's 10% smarter would find insights 10% faster… I agree that I'm being kinda handwavey here, but hopefully the waving is vaguely plausible.
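One way to make the handwave concrete, under the strong (and here merely assumed) premise that the rate of finding insights scales linearly with current capability, is a compounding-growth sketch:

```python
# Toy model of "10% smarter finds insights 10% faster": if each round of
# self-improvement boosts capability in proportion to current capability,
# growth compounds exponentially. The linear-scaling premise is my own
# illustrative assumption, not established in the dialogue.
def capability_after(steps: int, rate: float = 0.10, start: float = 1.0) -> float:
    """Capability grows by `rate` times its current value each step."""
    c = start
    for _ in range(steps):
        c += rate * c
    return c

print(round(capability_after(10), 3))  # (1.1)**10 ≈ 2.594
```

The takeaway is only that *if* the linear premise holds, improvement compounds; whether insight rate really scales that way with "smartness" is the handwavey part being conceded.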