To add another question to the list: Would you keep internal discussions on “what should we do” secret?
(Right now it appears as if there is little disagreement within the SIAI “inner circle” regarding how best to achieve a positive Singularity, but is it because there is actually little disagreement, or because they think such disagreements should be kept from public view for PR reasons?)
Also, would any SIAI people like to chime in and say whether they see outside discussions like this one as being productive or counter-productive?
I’d make everything public except the actual technical details of the AI work, and I’d also publish any technical details that couldn’t be used by others to create an unfriendly AI. Information hoarding is always counterproductive in the medium to long term.