[Question] Is this a Pivotal Weak Act? Creating bacteria that decompose metal

This has been haunting my mind for a while and I would appreciate feedback on it!

In his infamous article “AGI Ruin: A List of Lethalities”, Eliezer defines a “pivotal weak act” and gives a heuristic proof that no such thing can exist.

TLDR: I think his proof is wrong and there is a counterexample. I believe creating bacteria that decompose metal and silicon (or any other superset of the materials GPUs consist of) would constitute a pivotal weak act.

Long Version:
In his article, Eliezer outlines several hopes held by people who claim AGI won’t be that bad, or won’t be a problem at all, and then cruelly squashes them. One of those hopes is the possibility of executing a “pivotal weak act”: the idea that a small group of people executes some action X that prevents AGI from being built. For example, a group that is privy to the dangers of AGI might command a friendly AGI to “burn all GPUs”, and then we are good. Eliezer argues that any system powerful enough (pivotal) to prevent, or at least indefinitely postpone, unaligned AGI must itself be so powerful that it needs to be aligned (i.e., it is not weak), which we don’t know how to do. I believe his proof is false.

Definition:
A Pivotal Weak Act would be some action or event A such that
1. A happening or being executed prevents or delays the advent of an unaligned AGI indefinitely, or at least for a very long time (Pivotal)
2. A does not itself pose a significant X-risk for humanity as a whole (Weak)
3. A is realistically achievable with technology attainable in the coming decades (Realism)

Furthermore, it is not required that
- A is in any way related to or facilitated by an AI system
- A has no collateral damage
- A is moral, legal or anywhere near the Overton Window
- A is achievable today with current technology
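
To make the three criteria a bit more explicit, here is a rough formalization; the predicate names (PreventsUnalignedAGI, XRisk, Feasible) are my own shorthand, not anything from Eliezer’s article:

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Rough formalization of the definition above; the predicate names are my own.
% A is a Pivotal Weak Act iff it is pivotal, weak, and realistic; nothing more is required
% (in particular, no requirement that A involves an AI system at all).
\[
\mathrm{PWA}(A) \iff
\underbrace{\mathrm{PreventsUnalignedAGI}(A)}_{\text{1. Pivotal}} \;\wedge\;
\underbrace{\neg\,\mathrm{XRisk}(A)}_{\text{2. Weak}} \;\wedge\;
\underbrace{\mathrm{Feasible}(A)}_{\text{3. Realism}}
\]
\end{document}
```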


I think that the following is an example of a Pivotal Weak Act.

Creating bacteria that decompose metal (and spreading them worldwide)

This is pivotal, since it is a special case of “Burning all GPUs”.
It is (likely) weak, since there are uncontacted tribes in the Amazon that live entirely without (forged) metal and would barely even notice, so humanity as a whole would survive.
I am not sure how realistic it is, because I have no knowledge of microbiology, but it does not seem to be SciFi, as there already seems to be something like it going on:

https://www.bam.de/Content/EN/Standard-Articles/Topics/Environment/Biocorrosion/mic-microbiologically-influenced-corrosion.html

Caveats:
- There is a Weakness vs. Pivotality tradeoff: bacteria don’t spread as quickly as viruses, so they would have to be specially engineered, and that can be risky. The more natural the bacteria, the weaker but also the less pivotal the act.
- This is not advocacy, since it would likely kill a lot of people, but I hope that there is renewed interest in discussing the idea of Pivotal Weak Acts, and I am sure that some smart people out there can come up with better ideas and scenarios.

But didn’t Eliezer prove that there are no Pivotal Weak Acts?
I believe Eliezer has made an error in his proof. I will restate the proof as I understood it and highlight the error.

Eliezer takes a look at the act of “Burning all GPUs” (BAG) and states that this is a slight overestimation of the complexity needed for an act to be pivotal. I agree so far. In order to prevent AGI, it is necessary to keep large GPU clusters from running potentially dangerous training algorithms. To achieve that, there are these scenarios, in ascending complexity:
1. Let the clusters be assembled but make sure nobody runs dangerous algorithms on them (unrealistic)
2. Have GPUs exist, but prevent them from being assembled into large enough clusters (I guess this is the current policy goal)
3. Have GPUs not exist.

Then he states that “A GPU-burner is also a system powerful enough to, and purportedly authorized to, build nanotechnology, so it requires operating in a dangerous domain at a dangerous level of intelligence and capability”. Thus, since the BAG scenario is basically the minimum complexity necessary, and a system executing it is already difficult to align, there can be no Pivotal Weak Acts.

But, as is clear, he incorrectly assumes that the pivotal act has to be carried out by some sort of AI system. He also seems to read the goal of the GPU-burner as “Burn only GPUs” (BOG), while the task BAG is only a slight overestimate when it is taken to mean “Burning (at least) all GPUs”, as opposed to “Burning (only) all GPUs”.

I believe that this switch in what is meant by “Burning all GPUs”, together with the assumption that an AI system needs to execute it, is what incorrectly leads him to conclude that there is no Pivotal Weak Act. The gap that opens up between “Burning only the GPUs” and “Burning any superset of the GPUs that does not contain all of humanity” is where the possibilities lie.
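
To spell out that gap in set terms (this is my own sketch, not notation from the original article): write G for the set of all GPUs, H for humanity as a whole, and T for everything a candidate act destroys. The act then only needs to satisfy the two conditions below.

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% My own set-notation sketch of the "gap" described above (not Eliezer's formulation).
% G = all GPUs, H = humanity as a whole, T = everything destroyed by the candidate act.
\[
\underbrace{G \subseteq T}_{\text{pivotal: at least all GPUs are destroyed}}
\qquad\text{and}\qquad
\underbrace{H \not\subseteq T}_{\text{weak: humanity as a whole survives}}
\]
% "Burning (only) all GPUs" forces T = G exactly; any larger T that still excludes H,
% e.g. "all forged metal and silicon", also satisfies both conditions.
\end{document}
```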