Thinking that FAI is extremely difficult or unlikely isn’t obviously crazy, but Stross isn’t just saying “don’t bother trying FAI” but rather “don’t bother trying anything with the aim of making a good Singularity more likely”. The first sentence of his answer, which I neglected to quote, is “Forget it.”
Pretty much how I read it. It should acknowledge the attempts being made at FAI, but it seems like a reasonable pessimistic opinion that FAI is too difficult to ever be pulled off successfully before strong AI in general arrives.
Seems like a sensible default stance to me. Since humans exist, we know that a general intelligence can be built out of atoms, and since humans have many obvious flaws as physical computation systems, we know that any successful AGI is likely to end up at least weakly superhuman. There isn’t a similarly strong reason to assume an FAI can be built; the argument for attempting one seems to be more along the lines that things are likely to go pretty weird and bad for humans if an FAI can’t be built but an AGI can.
That quote could also be interpreted as saying that UFAI is far more likely than FAI.