Organizations concerned about future Snowdens will be less likely to hire someone who takes such a pledge. Indeed, since I would expect these organizations to be putting mechanisms in place to identify future Snowdens, I would expect them to be the biggest supporters of getting lots of people to signal their likelihood of becoming another Snowden.
Instead, how about people (such as myself) who will never have the technical skills to help create an AGI take a pledge that we will provide financial support to anyone who suffers great personal loss because they exposed AGI development risks.
Your commitment idea is good.
Eliezer’s view, as I understand it, is that only the smartest of the smart have a chance of creating AGI for now.
If these outliers declare their principles, then this will exclude dangerous AGI projects from getting the researchers who can make them work (until Moore’s Law of Mad Science has its effect). And perhaps lesser minds will follow these leaders in choosing the norms for their community.
If we compare to the canonical example, the Manhattan Project, we see that the smartest people did not refuse to join. World War II was on, and their viewpoint is at least understandable. A few passed information to the Soviet Union; I don’t know if we can analogize to that. A few people left the project (Rotblat) or turned into anti-nuclear crusaders afterward. A few leading minds just continued developing newer and nastier bombs.
But Szilard understood the tremendous danger from A-bombs before they were built. Einstein was a pacifist, and many physicists had strong ethical convictions. If they had all spoken loudly and publicly about the evils of A-bombs, and made that a norm in their community in the 1930s, that might have slowed down the Manhattan Project.
The Manhattan Project was a strategic failure because it greatly helped the Soviets build atomic weapons. The U.S. military would have been far better off if it had chosen more carefully who could work on nuclear weapons development, even if this had added several years to the time it took the U.S. to get atomic weapons.
So, analogizing to a future AGI project, you’re saying that having more “ideologically incorrect” people in the research community can indeed harm a potentially dangerous project.
If the leaders of the unsafe project exclude more “ideologically incorrect” people, then this will add to the time required for development.
On the other hand, if there are more people with the “incorrect,” leak-prone ideology and these are not excluded (possibly because they never made their ideology public), then a potentially beneficial leak is more likely.
Yes
The role of espionage in the Soviet nuclear weapons program has been greatly exaggerated. While spies did accelerate their progress, it’s pretty clear that they could have developed nuclear weapons entirely on their own. I don’t think things would have been vastly different if the only information they had was that the US had dropped nuclear weapons on Hiroshima and Nagasaki.