I agree, but there’s a certain standard story we tend to tell, not so much because we’re certain it’s the initial trajectory as because it helps make the risks more concrete and vivid. To cite the most recent instance of this meme:
> The notion of a ‘superintelligence’ is not that it sits around in Goldman Sachs’s basement trading stocks for its corporate masters. The concrete illustration I often use is that a superintelligence asks itself what the fastest possible route is to increasing its real-world power, and then, rather than bothering with the digital counters that humans call money, the superintelligence solves the protein structure prediction problem, emails some DNA sequences to online peptide synthesis labs, and gets back a batch of proteins which it can mix together to create an acoustically controlled equivalent of an artificial ribosome, which it can use to make second-stage nanotechnology, which manufactures third-stage nanotechnology, which manufactures diamondoid molecular nanotechnology, and then… well, it doesn’t really matter from our perspective what comes after that, because from a human perspective any technology more advanced than molecular nanotech is just overkill. A superintelligence with molecular nanotech does not wait for you to buy things from it in order for it to acquire money. It just moves atoms around into whatever molecular structures or large-scale structures it wants. [...]
>
> And then with respect to very advanced AI, the sort that might be produced by AI self-improving and going FOOM, asking about the effect of machine superintelligence on the conventional human labor market is like asking how US-Chinese trade patterns would be affected by the Moon crashing into the Earth. There would indeed be effects, but you’d be missing the point.
What I was looking for is just this standard story, or something similarly plausible and specific, fleshed out and briefly defended in its own post, as a way of using narrative to shift people’s default envisioned scenario to something fast and brutal.