No, that is not realistic. The bacteria described in the article don't really eat iron; they just produce corrosive chemicals as metabolic waste, and they rely on other sources of energy (sulfates or organic compounds). Metal-eating bacteria (those that derive energy by reducing metals) do exist, but they require metals dissolved in water; eating solid metal doesn't work chemically.
Generally, I think Eliezer's definition of a weak pivotal act doesn't include civilization collapse, because there are multiple obvious ways that don't kill all humans.
Thank you for your answer!
There might be more obvious ways, but the ones I can think of, at least, are either only temporary and local disruptions or cannot be executed by a small coordinated group.
I understand that LW and the AI safety community do not want to be associated with terrorism or ending civilization; however, Eliezer has talked about blowing up AI labs, so I am uncertain where the line would be.
I accept that "preemptively destroying civilization" might be excluded from the definition of a weak pivotal act, but is that something that is discussed at all on LW or in the AI safety community? It seems to me that if you believe with 99.99% confidence that AGI will kill us or, worse, torture us, then it should be on the table.