The SI folk say that they are deliberately not releasing the work they’ve done that relates directly to AGI. Doing so would speed up the development of an AGI without necessarily speeding up the development of an FAI, and would therefore increase existential risk.
ETA: Retracted, couldn’t find my source for this.