if safety consciousness should become endemic in the AGI community, there is a real risk that someone else might produce FAI before Eliezer.
That makes no sense. If safety consciousness means that the AGI community is likely to produce FAI before Eliezer, then without safety consciousness, the AGI community is even more likely to produce UFAI before Eliezer produces FAI. Either way, Eliezer gets scooped; but in the second case, we’re very dead.