Maybe it would be useful to define ‘mild superintelligence.’ Would this be human baseline, or just a really strong AGI? Also, if AI fears spread to the general public as the tech improves, isn’t it possible that it would take much longer to develop even a mild superintelligence, because there would be regulations and norms in place to prevent it?
I hope your predictions are right. It could turn out that it’s relatively easy to build a ‘mild superintelligence’ but much more difficult to go all the way.
Roughly, I’m talking something like 10x a human’s intelligence, though in practice it’s likely closer to 2-4x assuming it uses the same energy as a human brain.
But in this scenario, scaling up superintelligence is actually surprisingly easy: adding more energy buys more intelligence, so the cost of greater capability is simply greater power consumption.
Also, this is still a world that would see vast changes very quickly.
I don’t believe we will go extinct or face a catastrophe, due to my beliefs around alignment, but this would still represent a catastrophic, potentially existential threat if the AGIs/mild ASIs wanted to cause one.
Remember, that would allow a personal phone or device to host a mild superintelligence that is 2-10x more intelligent than humans.
That’s a huge deal in itself!
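To make the energy-scaling intuition above a bit more concrete, here’s a toy back-of-the-envelope sketch. The roughly linear power-to-intelligence scaling and the ~20 W figure for a human brain are illustrative assumptions on my part, not precise claims:

```python
# Toy back-of-the-envelope for the "more energy -> more intelligence" scaling idea.
# Assumptions (for illustration only): intelligence scales roughly linearly with
# power, and a human brain runs on about 20 W.

BRAIN_POWER_W = 20.0  # rough figure commonly cited for an adult human brain

def power_needed(intelligence_multiple: float) -> float:
    """Watts needed for an AI at `intelligence_multiple` times human level,
    under the naive linear-scaling assumption above."""
    return BRAIN_POWER_W * intelligence_multiple

for x in (2, 4, 10):
    print(f"{x}x human intelligence ~ {power_needed(x):.0f} W "
          f"(for comparison, a gaming PC draws roughly 300-600 W)")
```

Under those (very rough) assumptions, even a 10x system lands in the power range of ordinary consumer hardware, which is the sense in which scaling up looks cheap in this scenario.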