I agree that “the worst that can happen is...” suggests an unreasonably low estimate of risk, and technically implies either zero threat or zero risk of human error.
That said, I think it’s worth distinguishing the story “we will be able to see the threat, and we will stop” from the story “there is no threat.” The first story makes it clearer that there is in fact broad support for measurement to detect risk, and for institutional structures that can slow down if the risk turns out to be large.
It also feels like the key disagreement isn’t about corporate law or arguments for risk; it’s about how much warning we get in advance, and how reliably institutions like Meta would stop building AI if they couldn’t figure out a good way to make it safe. Both questions are interesting, but the “how much warning” disagreement is probably the more important one for technical experts to debate. My rough sense is that the broader intellectual world already isn’t really on Yann’s page when he says “we’d definitely stop if this was unsafe, nothing to worry about.”