Thank you for your answer!
There might be more obvious ways, but the ones I can think of are either only temporary, local disruptions or cannot be executed by a small coordinated group.
I understand that LW and the AI Safety community do not want to be associated with terrorism and ending civilisation; however, Eliezer has talked about blowing up AI labs, so I am uncertain where the line would be drawn here.
I accept that “preemptively destroying civilization” might be excluded from the definition of PWAs, but is that something that is discussed at all on LW or in the AI Safety community? It seems to me that if you believe with 99.99% confidence that AGI will kill us, or worse, torture us, then it should be on the table.
doomyeser
Thank you for your answer!
Not only do you have to maintain such a state, you also have to induce it in the first place. Neither the “correctly” sized rock nor the electricity reduction is something that you can just do without a government consensus that would enable you to prescribe and enforce a moratorium, which would be the preferred and most humane solution anyway.

Another humane idea I had, though likely just as unrealistic, would be to buy up most of the GPUs, or the resources used to produce them, making it economically unviable to build large GPU clusters. I doubt that all the EA and LessWrong-Doomer money in the world would suffice for that, though.
Yes, this kind of “preemptive teardown” is the underlying mechanism; however, “taking the semiconductor out of the equation” seemed to be a more permanent approach, as opposed to the things you outlined as well as most of the other “non-galaxy-brained” things I have thought about.
Thank you for your answer!
As I said, I don’t know much about microbiology and chemistry, so I cannot challenge anything you said. It seems consistent with the other answers as well, which tell me that there is no way of doing it.
I was queasy about it anyway, since teleporting our technology back to the Stone Age and condemning 99.9% of humanity to death is quite extreme.