I was just going to quote your comment on Overcoming Bias to emphasise this.
AFAIK, all SIAI personnel think, and AFAIK have always thought, that UFAI cannot possibly explain the Great Filter; the possibility of an intelligence explosion, Friendly or unFriendly or global-economic-based or what-have-you, resembles the prospect of molecular nanotechnology in that it makes the Great Filter more puzzling, not less. I don’t view this as a particularly strong critique of UFAI or intelligence explosion, because the Great Filter is still very puzzling even without that; it’s already very mysterious.
I think some people may be misinterpreting you as believing this because many people understand your advocacy as implying “UFAI is the biggest baddest existential risk we need to deal with”. Assuming a late filter not explained by UFAI suggests there is an unidentified risk in our future that is much likelier than an uncontrolled intelligence explosion.
I think some people may be misinterpreting you as believing this because many people understand your advocacy as implying “UFAI is the biggest baddest existential risk we need to deal with”.
It is; I don’t particularly think the answer to the Great Filter is a Bigger Threat that comes after this. There’s a possibility that most species like ours happen to be inside the volume of some earlier species’s “F”AI’s enforced Prime Directive with a restriction threshold (species are allowed to get as far as ours, but are not allowed to colonize galaxies), but if so, I’m not sure what our own civilization ought to do about that. I suspect, and certainly hope, that there’s actually a hidden rarity factor.
But I do think some fallacy of the form, “This argument would make UFAI more threatening—therefore UFAI-fearers must endorse it—but the argument is wrong, ha ha!” might have occurred.
But I do think some fallacy of the form, “This argument would make UFAI more threatening—therefore UFAI-fearers must endorse it—but the argument is wrong, ha ha!” might have occurred.
I think this is it. However, there are at least a few enthusiasts, even if they are relatively peripheral, who do tend to engage in such indiscriminate argument. Sort of like internet skeptics who, in the course of comment-thread combat, confabulate wrong arguments for true skeptical conclusions, arguments that the scientists they are citing would not endorse.
Assuming a late filter not explained by UFAI suggests there is an unidentified risk in our future that is much likelier than an uncontrolled intelligence explosion.
What has prevented local living systems from colonising the universe so far has been delays—not risks.
Assuming a late filter not explained by UFAI suggests there is an unidentified risk in our future that is much likelier than an uncontrolled intelligence explosion.
That’s a big assumption, both uncertain and, if made, decisive.