I will say: unfortunately, we are in a tight situation. But Eliezer's approach to communicating is a bit ... much. It is true that humanity has to respond quickly, and I think we can. Just don't think you're worried all on your own, or that you need to be paralyzed; Yudkowsky's approach to communicating tends to cause that. As I'm sure you know from your background in fighting, sometimes there are conflicts on earth. This conflict is between [humans and mostly-friendly ai] and [unfriendly ai], and there is in fact reason to believe that security resistance to unfriendly ai is meaningfully weak. I unfortunately do not intend to fully reassure you, but we can survive this; let's figure out how. I think the insights about how to make computers and biology defensibly secure on short notice do exist, but they aren't trivial to find and use.