It is not a fully general counterargument, because suppressing open dissemination of some AGI information is a good idea only if the FAI approach is right.
That isn’t true. It would be a good idea to suppress some AGI information if the FAI approach is futile and any creation of AGI would turn out to be terrible.