Honestly, I think the strongest criticism will come from someone arguing that there isn't enough leverage in our world for a superintelligence to be much more powerful than we are, for good or ill. People who argue that ASI is absolutely necessary because it will make us immortal and let us colonise the stars, but that we needn't worry about it directing that same vast power toward less desirable goals, are simply unserious. There is also, obviously, the possibility that AGI is still far off, but that bears only on whether the danger is imminent, not on whether it exists.