It’s true that publishing the material can hasten the arrival of unfriendly AI, but it can also give the world a chance where it had none. If the problem of Friendliness is hard enough that SingInst folks can’t generate all the required insights by themselves before unfriendly AI arrives, then secrecy has negative expected utility. Judging by the apparent difficulty of the problem and SingInst’s apparent productivity over the 10 years of its existence, that seems to me to be the case. Eliezer believes the solution is just a handful of insights away, but I don’t see why.