From what I know of security, any system requiring secrecy is already implicitly flawed.
(Naturally, if this doesn’t apply and you backchanneled your idea for some legitimate meta-reason, I withdraw my objection.)
I think secrecy is rarely a long-term solution, because it's fragile, but it can definitely have short-term uses. For example, some insights into AI can advance both alignment and capabilities; if you have such an insight, you might want to share it privately with alignment researchers while avoiding sharing it publicly, because you'd rather Facebook AI not enhance its capabilities. So the secrecy doesn't have to be a permanent load-bearing part of a system; it's just that every day the secrecy holds up is one more day you get to pull ahead of Facebook.