Substrate Needs Convergence (SNC) is the theory that Artificial General Intelligence (AGI) will gradually change under strong evolutionary pressure toward expanding itself, converging over the long term on making the Earth uninhabitable for biological life. SNC focuses on AI systems that are comprehensive enough to form fully self-sufficient machine ecosystems that persist over time. The theory contends that it is impossible in principle for such a system to remain compatible with biological life. By implication, the only way to be safe from AGI is to not build it.
This sequence is an attempt to explain the underlying intuitions of SNC and to bridge the inferential gap for those more accustomed to thinking about Artificial Superintelligence (ASI) in terms of the alignment problem as described by Yudkowsky and others.
Recommended reading order: start with “What if Alignment is not enough” for a high-level summary. If the argument seems implausible, note your objections, move on to the subsequent intuition-forming posts, and then return to the initial summary. If your objections remain, please comment on where you see a crux.