That’s definitely true, but it would be dishonest if SIAI wasn’t up front about its views on the Singularity.
There’s enough alternative terminology (intelligence explosion, hard takeoff, etc.) that they could (and often do, e.g. in the AI Risk paper) manage to talk about the Singularity just fine without talking about the “Singularity”. I don’t think anyone’s suggesting that they not be upfront about their actual positions on those issues; some people would just prefer to avoid the “Singularity” terminology now that it’s turned into a bloated mutant futurist meme complex. (So really, the name change suggestions are about being less accidentally misleading; they wouldn’t have to spend as much time explaining that they aren’t Ray Kurzweil. And hey, if they abandon the “artificial intelligence” terminology, maybe they won’t have to spend so much time explaining that they aren’t building Skynet or Terminators or HAL or the Matrix, either! …but probably not.)