As I previously pointed out, an AGI that was just as “smart” as all the world’s AI researchers combined would make AI progress at the same slow rate they do, with no explosion.
Having that AI make itself 10% “smarter” (which would take a long time, since it’s only as smart as the world’s AI researchers) would only result in self-improvement progress that was 10% faster.
Human neurons run at ~100 hertz; computers run at a few gigahertz. An AGI would be millions of times faster by default.
I don’t think that’s a reasonable comparison. Neurons do a bunch of calculations. It’s easy to imagine an AI that would need a second to get from one mental state to the next.
By that logic, we should already have an AI that’s millions of times faster than a human, merely by virtue of being implemented on silicon hardware.
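To make the numbers in this exchange concrete, here is a minimal back-of-the-envelope sketch in Python. The 100 Hz and 3 GHz figures are round illustrative values, and the cycles-per-mental-step count is a pure assumption standing in for “needs a second to get from one mental state to the next”, not a measurement:

```python
# Back-of-the-envelope check of the clock-rate comparison above.
# 100 Hz and 3 GHz are round illustrative figures; CYCLES_PER_MENTAL_STEP
# is a pure assumption (it just encodes "one mental state per second").

NEURON_RATE_HZ = 100       # rough human neuron firing rate
CPU_CLOCK_HZ = 3e9         # "a few gigahertz"

# Naive reading: compare clock rates directly.
raw_ratio = CPU_CLOCK_HZ / NEURON_RATE_HZ
print(f"raw clock-rate ratio: {raw_ratio:,.0f}x")  # ~30,000,000x

# Counterpoint: an artificial reasoner may need a huge number of clock
# cycles to get from one mental state to the next, in which case the raw
# ratio says nothing about thinking speed.
CYCLES_PER_MENTAL_STEP = 3e9   # assumed: roughly a second of compute per step
steps_per_second = CPU_CLOCK_HZ / CYCLES_PER_MENTAL_STEP
print(f"mental steps per second under that assumption: {steps_per_second:.0f}")
```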
There are two inputs to intelligence: software and hardware. Together they produce a system that reasons. Once that artificial system is as “smart” as the world’s AI researchers, it will produce AI insights at the same speed they do. I don’t see why the particular mix of software and hardware that produced that result should matter.
This seems wrong to me; can you justify it?
Which part? The part where the AI makes progress at the rate I said it would in the previous paragraph? In that case, your issue is with the previous paragraph, since I’m just restating what I already said. The only new thing is that an AI that’s 10% smarter would find insights 10% faster… I agree that I’m being kinda handwavey here, but hopefully the waving is vaguely plausible.
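To pin down that handwave, here is a minimal sketch of the proportionality assumption being made: the rate of AI-research progress is taken to scale linearly with “smartness”, with both quantities normalized so that the world’s AI research community sits at 1.0. The linear form is an assumption of the argument above, not an empirical law:

```python
# Sketch of the proportionality assumption: progress rate scales linearly
# with "smartness". Both quantities are normalized so that the combined
# world AI-research community is 1.0. The linear form is an assumption of
# the argument above, not an empirical law.

def progress_rate(smartness: float) -> float:
    """AI-research insights per unit time, relative to today's researchers."""
    return smartness  # as "smart" as the field -> same rate; 10% smarter -> 10% faster

baseline = progress_rate(1.0)   # AGI exactly as "smart" as the field
improved = progress_rate(1.1)   # after making itself 10% "smarter"

print(f"baseline rate: {baseline:.2f}x the field's")
print(f"after a 10% smartness gain: {improved:.2f}x ({improved / baseline - 1:.0%} faster)")
```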