What good would a Snowden do? The research would continue.
Yes, but fear of a Snowden would make project leaders distrustful of their own staff.
And if many top researchers in the field were known to be publicly opposed to any unsafe project that the agencies are likely to create, it would shrink their recruiting pool.
The idea is to create a moral norm in the community. The norm can be violated, but it would put a crimp in the projects as compared to a situation where there is no such moral norm.
This presupposes that the AGI community is, on average, homogeneous across the world and would behave accordingly. What if political climate, traditions, and culture make certain (powerful) countries less likely to be fearful, given their own AGI talent pool?
In other words, if country A distrusts its staff more than country B does, for political, economic, or cultural reasons, country A would fall behind in the AGI arms race, which would encourage the “even if I hold onto my morals, we’re still heading into the abyss” attitude. I could see organizations or governments rationalizing against the community’s moral pledge in this way, by highlighting the futility of slowing down the research.
The AGI community is tiny today. As it grows, its future composition will be determined by the characteristics of the tiny seed from which it expands.
I won’t claim that the future AGI community will be homogeneous, but it may be possible to establish norms starting today.
Indeed. Just imagine the fear of the next Snowden in the NSA, and trying to work out how many past Snowdens they’ve had who took their secrets to the enemy rather than the public.
Yes, exactly.
You’ve made my point clearly—and perhaps I didn’t make it clearly enough in my post. I was focusing not on a leak in itself, but on what suspicion can do to an organization. As I described it, the suspicion would “cast a shadow” and “hover over” the project.
At this point, the NSA may well be screening for anyone who has expressed hacker/cypherpunk/copyfighter sentiments. Not that such sentiments need disqualify someone from serving in the NSA, but the agency is probably pretty suspicious of them by now.
I would like to agree with you, but experience says otherwise. Tyrants have always been able to find enough professionals with dubious morals to further their plans.
In World War I, German Jewish scientists contributed to the German war effort. In World War II, refugee scientists, many of them driven out by the very tyrants they then worked against, contributed to the Allied war effort. Tyrants can shoot themselves in the foot quite effectively.
A few top physicists remained in Germany, including Heisenberg, but they were not enough to move the German bomb project forward, and it is suspected that Heisenberg may have deliberately sabotaged it.
But you have a point: so long as AGI is at the cutting edge, only a handful of top people can move it forward. Once Moore’s Law of Mad Science takes effect, “ordinary” scientists will be enough.
(And to make it clear, I am not suggesting that the US government is tyrannical.)
There are plenty of cases where a government puts a bunch of incompetent people on a project and the project fails.
If the project does not take safety into account, we want exactly this, so long as it doesn’t get close enough to success that failure involves paper-clipping the world.