There have been some strong criticisms of this statement, notably by Jeremy Howard et al. I’ve written a detailed response to those criticisms here:
https://www.soroushjp.com/2023/06/01/yes-avoiding-extinction-from-ai-is-an-urgent-priority-a-response-to-seth-lazar-jeremy-howard-and-arvind-narayanan/
Please feel free to share it with others who may find it valuable (e.g. skeptics of AGI x-risk).