Because I want to read the material, I want to build upon it, and I want to see other people build upon it. For example, Wei Dai is not on the list of new associates. Do you agree that hiding the material from him will likely slow down progress?
Sure, but to make a good decision you need to weigh upsides against downsides.
It’s true that publishing the material can hasten the arrival of unfriendly AI, but it can also give the world a chance where it had none. If the problem of Friendliness is hard enough that SingInst folks can’t generate all the required insights by themselves before unfriendly AI arrives, then secrecy has negative expected utility. Judging by the apparent difficulty of the problem and SingInst’s apparent productivity over its 10 years of existence, that seems to me to be the case. Eliezer believes the solution is just a handful of insights away, but I don’t see why.