“Create bacteria that can quickly decompose any metal in any environment, including alloys and including metal that has been painted, and which also are competitive in those environments, and will retain all of those properties under all of the diverse selection pressures they will be under worldwide” is a much harder problem than “create bacteria that can decompose one specific type of metal in one specific environment”, which in turn is harder than “identify specific methanobacteria which can corrode exposed steel by a small fraction of a millimeter per year, and find ways to improve that to a large fraction of a millimeter per year.”
Also it seems the mechanism is “cause industrial society to collapse without killing literally all humans”—I think “drop a sufficiently large but not too large rock on the earth” would also work to achieve that goal, you don’t have to do anything galaxy-brained.
Because of the difficulty of predicting a “safe upper bound” on the size of rock below which the risk of human extinction stays within acceptable limits, I prefer the idea of destroying all the leading-edge fabs in the world, or reducing the worldwide supply of electricity to levels so low that the AI labs cannot compete for electricity with municipalities that need some electricity just to maintain social order, and cannot compete with the basic infrastructure required just to keep most people alive. As if achieving either of those 2 outcomes weren’t hard enough, we would also have to maintain such an improved state of affairs (i.e., no leading-edge fab capability or severely degraded electricity-generation capability) long enough (i.e., probably at least a century by my estimation) for some other, less drastic way of protecting against reckless AI development to come into being.
Neither OP’s metal-eating bacteria, nor the large rock from space, nor either of the 2 interventions I just described is feasible enough to be worth thinking about much, IMO (and again, the large rock from space carries a much higher extinction risk than the other 3).
Thank you for your answer!
Not only do you have to maintain such a state, you also have to induce it in the first place. Neither the “correctly” sized rock nor the electricity reduction is something you can just do without a government consensus that would enable you to prescribe and enforce a moratorium, which would be the preferred and most humane solution anyway.
Another humane idea I had, though likely just as unrealistic, would be to simply buy up most of the GPUs or the resources used to produce them, making it economically unviable to build large GPU clusters. I doubt that all the EA and LessWrong-Doomer money in the world would suffice for that, though.
buy up most of the GPUs or the resources used to produce them
That would backfire IMHO. Specifically, GPUs would become more expensive, but that would last only as long as it takes for the GPU producers to ramp up production (which is very unlikely to take more than 5 years) after which GPU prices would go lower than they would’ve gone if we hadn’t started buying them up (because of better economies of scale).
GPUs, and the products and services needed to produce GPUs, are not like the commodity silver, where if you buy up most of the silver the economy probably cannot respond promptly by producing a lot more silver. If, in contrast, you could make leading-edge fabs blow up, that would make GPUs more expensive permanently (by reducing investment in fabs), or at least it would if you could convince investors that leading-edge fabs are likely to continue to blow up.
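A minimal toy sketch of that economies-of-scale point, assuming Wright’s-law learning (every number below is invented, and it deliberately ignores the temporary price spike): a one-time buyout just adds to cumulative production, which drags unit cost down the learning curve faster, so post-ramp-up prices end up below the no-buyout counterfactual.

```python
# Toy Wright's-law sketch (all parameters made up) of why a GPU buyout could backfire:
# extra purchases raise cumulative production, which lowers unit cost faster, so the
# long-run price ends up *below* the counterfactual without the buyout.

def simulate(years=10, baseline_demand=1.0, buyout_extra=0.5, buyout_years=2,
             initial_cost=1.0, learning_exponent=0.3):
    cumulative = 1.0              # cumulative GPUs ever produced (arbitrary units)
    prices = []
    for year in range(years):
        extra = buyout_extra if year < buyout_years else 0.0  # the buyout campaign
        cumulative += baseline_demand + extra                  # producers meet all demand once ramped up
        # Wright's law: unit cost (and hence price) falls as cumulative production grows
        prices.append(initial_cost * cumulative ** (-learning_exponent))
    return prices

with_buyout = simulate(buyout_extra=0.5)
without_buyout = simulate(buyout_extra=0.0)
print("year-10 price with buyout:   ", round(with_buyout[-1], 3))
print("year-10 price without buyout:", round(without_buyout[-1], 3))
# With these made-up numbers the with-buyout price ends lower: the buyout effectively
# subsidized the learning curve instead of constraining compute.
```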
Yes, this kind of “preemptive teardown” is the underlying mechanism; however, “taking the semiconductors out of the equation” seemed to be a more indefinite (longer-lasting) approach compared to the things you outlined, as well as to most of the other “non-galaxy-brained” things I have thought about.