Well, there’s the AI box experiment. The transcripts could be kept hidden because private, personal things were said, but if that were the reason, the people who ran the experiment could simply tell us so. Eliezer seems to believe that if the transcripts were revealed, humanity would be more likely to build a boxed AI and be harmed by it.
This is a hard question to answer without crossing the lines that shouldn’t be crossed, though. You can really only get answers about ideas that are unspeakable only under certain conditions, or ones that can be referred to in a more general sense (like the AI box).