I have been (and I am not the only one) very put off by the trend over the last months and years of doomerism pervading LW, with claims like “we have to get AGI right on the first try or we all die” repeated constantly as dogma.
To someone who is very skeptical of the classical doomist position (aka AGI will make nanofactories and will kill everyone at once), this post is very persuasive and compelling. This is something I could see happening. This post serves as an excellent example for those seeking effective ways to convince skeptics.
Yes, this is a slow-takeoff scenario that it is realistic to be worried about.