Because of the difficulty of predicting a “safe upper bound” on the size of a rock below which the risk of human extinction stays within acceptable limits, I prefer the idea of destroying all the leading-edge fabs in the world, or of reducing the worldwide supply of electricity to levels so low that AI labs cannot compete for it with municipalities, which need some electricity just to maintain social order, or with the basic infrastructure required just to keep most people alive. As if either of those two outcomes weren’t hard enough to achieve, we would then have to maintain such an improved state of affairs (i.e., no leading-edge fab capability, or severely degraded electricity-generation capability) long enough (probably at least a century, in my estimation) for some other, less drastic way of protecting against reckless AI development to come into being.
Neither OP’s metal-eating bacteria, the large rock from space, nor either of the two interventions I just described is feasible enough to be worth thinking about much, IMO (and again, the large rock from space carries a much higher extinction risk than the other three).
Not only would you have to maintain such a state, you would have to induce it in the first place. Neither the “correctly” sized rock nor the electricity reduction is something you can just do without a governmental consensus that would enable you to prescribe and enforce a moratorium, which would be the preferred and most humane solution anyway.
Another humane idea I had, though it is probably just as unrealistic, would be to buy up most of the GPUs, or the resources used to produce them, making it economically unviable to build large GPU clusters. I doubt that all the EA and LessWrong-doomer money in the world would suffice for that, though.
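A quick back-of-envelope check makes the doubt concrete. The figures below are illustrative placeholders, not sourced estimates; the point is only that if yearly AI-hardware spending is of the same order as (or larger than) the total pool of philanthropic money, a buy-up campaign could not be sustained for long:

```python
# Back-of-envelope sanity check. Both figures are HYPOTHETICAL round
# numbers chosen for illustration, not sourced market data.

annual_gpu_capex = 100e9   # hypothetical yearly spend on AI accelerators (USD)
philanthropic_pool = 50e9  # hypothetical total funds available for a buy-up (USD)

# How many years of GPU demand could the pool absorb at current prices?
years_coverable = philanthropic_pool / annual_gpu_capex
print(years_coverable)  # 0.5 -- under these assumptions, less than one year
```

And this ignores the fact that the buying pressure itself would raise prices, making each successive year of the campaign more expensive.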
buy up most of the GPUs or the resources used to produce them
That would backfire, IMHO. Specifically, GPUs would become more expensive, but only for as long as it takes GPU producers to ramp up production (very unlikely to be more than 5 years), after which GPU prices would fall lower than they would have gone if we hadn’t started buying them up (because of better economies of scale).
GPUs, and the products and services needed to produce GPUs, are not like a commodity such as silver, where if you buy up most of the supply, the economy probably cannot respond promptly by producing a lot more. If, in contrast, you could make leading-edge fabs blow up, that would make GPUs more expensive permanently (by reducing investment in fabs), or at least it would if you could convince investors that leading-edge fabs are likely to keep blowing up.
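The backfire dynamic can be sketched with a toy simulation. Everything here is a made-up illustration of the argument, not an economic model: the capacity-adjustment rate, the learning-curve exponent, and all starting values are arbitrary. The only point is to show that a demand shock plus capacity expansion plus a learning curve yields a short-run price spike followed by a long-run price *below* the no-buy-up baseline:

```python
# Toy model (illustrative only) of GPU prices under a buy-up campaign.
# ALL parameters are hypothetical; this sketches the argument that extra
# demand induces capacity growth and learning effects that eventually
# push prices below where they would otherwise have been.

def simulate(buyup_demand: float, years: int = 10) -> list[float]:
    capacity = 100.0      # units producible per year (arbitrary starting value)
    unit_cost = 1.0       # production cost per unit; falls with cumulative output
    cumulative = 0.0      # total units ever produced
    base_demand = 100.0   # demand without any buy-up campaign
    prices = []
    for _ in range(years):
        demand = base_demand + buyup_demand
        # Price spikes when demand outstrips capacity; otherwise sits near cost.
        price = unit_cost * max(1.0, demand / capacity)
        prices.append(price)
        # Producers expand capacity toward demand, with a lag.
        capacity += 0.5 * (demand - capacity)
        # Learning curve: unit cost falls as cumulative output grows.
        cumulative += capacity
        unit_cost = (1 + cumulative / 100.0) ** -0.15
    return prices

baseline = simulate(buyup_demand=0.0)
with_buyup = simulate(buyup_demand=50.0)

print(with_buyup[0] > baseline[0])    # True: short-run price spike
print(with_buyup[-1] < baseline[-1])  # True: long-run price below baseline
```

Under these (arbitrary) parameters the buy-up raises prices for the first few simulated years, then the extra cumulative production drives costs, and hence prices, below the baseline, which is the backfire being claimed.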