The TDT paper contains some mathematical vocabulary, but the hard formal work is mostly not done.
Eliezer is deliberately working on developing the basics of FAI theory rather than producing code, but even so, either he spends little time writing it down or he’s not making much progress.
The SI folk say that they are deliberately withholding the work they’ve done that’s directly related to AGI: releasing it would speed up the development of AGI without necessarily speeding up the development of FAI, and would therefore increase existential risk.
ETA: Retracted, couldn’t find my source for this.