You are missing a major problem. Not “secrecy will kill progress”; that is, in this context, the lesser problem. The major problem is that scientific secrecy would eventually kill the planet.
In a context of ongoing research and use of any discipline, dangerous techniques must be published, or they will be rediscovered over and over again until they cause major damage. If the toxicity of dimethylmercury were a secret, chemists and entire college campuses dying slow, horrific, painful deaths would be a regular occurrence. No scientific work is done without a context, so every discovery will happen again. If you do not flag the landmines you spot, someone not-quite-as-sharp will eventually reach the same territory and step on them. If you find a technique you consider a threat to the world, it is now your problem to deal with, and secrecy is never a sufficient response; it is merely an abdication of moral responsibility onto the next person to get there.
My impression of this post was not that it made a focused argument in favor of secrecy specifically.
It’s a recitation of arguments and anecdotes in favor of secrecy, so of course it’s an argument in that direction. If that weren’t the intention, there would also have been anti-secrecy arguments and anecdotes.
See this comment.
Also, I said focused argument.
This is an extremely important point. Historically it might take a long time, if ever, for someone else to arrive at a discovery you just made. Take Leonardo’s submarines, for example. But that was when only a tiny fraction of humanity devoted time to experiments. His decision to hide his invention kicked the can of secret attacks by submarines many years down the road, and may have saved many lives. (I’m not so sure; leaders who wanted wars surely found other secret plots and stratagems, but at least he exercised his agency so as not to be the father of them.)
But things are different now. You can be practically guaranteed that if you are working on something, someone else in the world is working on it too, or soon will be. Being at a certain place and time in your industry puts you in a position to see the possible next steps, and you aren’t the only one standing there.
If you see something dangerous that others don’t, the best bet is to talk about it. Many minds thinking and talking about it from different perspectives have the best chance of solving it.
Communication is a key to survival. I think we had it during the Cold War, when the U.S. and the Soviets didn’t annihilate the world even under the U.S. policy of Mutual Assured Destruction. And I think we lacked it in the U.S. Civil War and in WWI, when combat technology had raced ahead of the knowledge and training of the generals, leading to shocking massacres unintended by either side.
An example other than unfriendly AI is asteroid mining, and serious space travel in general. Right now we face the danger of natural asteroid impacts. But the ability to controllably move mass in orbit would inevitably become one of the most powerful weapons ever seen, unless people make a conscious choice not to use it that way. Although I’ve wanted to write fiction about it and to work on it, I’ve actually hesitated, for the simple reason that I think it is inevitable that it will become a weapon.
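To put a rough number on that claim, here is a back-of-envelope sketch. The asteroid size and impact speed are illustrative assumptions of mine, not figures from the discussion:

```python
import math

# Back-of-envelope: impact energy of a small, deliberately redirected
# asteroid. All input figures are illustrative assumptions.

density = 3000.0      # kg/m^3, typical for a rocky asteroid
radius = 50.0         # m, small enough to be easy to miss in surveys
velocity = 20_000.0   # m/s, a plausible Earth-impact speed

mass = density * (4.0 / 3.0) * math.pi * radius**3  # kg
energy = 0.5 * mass * velocity**2                   # joules

# One kiloton of TNT is defined as 4.184e12 J.
kilotons = energy / 4.184e12
print(f"mass   = {mass:.2e} kg")
print(f"energy = {energy:.2e} J ({kilotons / 1000:,.0f} megatons of TNT)")
```

With these assumptions the result is on the order of 75 megatons, larger than the yield of any nuclear device ever tested, from a rock only 100 meters across.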
This post makes me confident that the action most likely to lead to humanity’s growth and survival is to talk about it openly: first, because we’re already vulnerable to asteroids and currently can’t do anything about it; and second, because talking about it raises awareness of the problem, so that more people can focus on solving it.
I really think that avoiding nuclear war is an example. When I was a teenager, everyone just assumed we’d all die in a nuclear war someday; eventually, through a deliberate war, an accident, or a Skynet-style Terminator incident, civilization as a whole would be gone. And eventually that fear just evaporated. I think that’s because we as a culture kept talking about it, rather than leaving it up to only a few monarchic leaders.
So I’m changing my outlook and plans based on this post and this comment. I plan to talk about and promote asteroid mining, and to write short stories about terrorists dropping asteroids on cities. Talking about it is better in the long run.
This distinction doesn’t seem important.
I have given some thought to this specific problem: not just asteroids, but the fact that any spaceship is potentially a weapon, and, as working conditions go, extended isolation does not have the best of records on the mental-stability front.
Likely solutions: full automation, with one-time-pad-locked command and control. That renders it a weapon as well controlled as a nuclear arsenal, except with longer lead times on any strike, so it is even safer from a MAD perspective (and no fully private actor ever gets to run them). Or, if full automation is not workable, a good deal of effort expended on maintaining crew sanity: psych/political officers, called something nice, fluffy, and utterly anodyne to make people forget just how much authority they have, backed up with a remote-controlled self-destruct, and again a one-time-pad com lock. It’s not going to be a libertarian free-for-all as industries go; more a case of “extremely well paid, to make up for the conditions and the sword that will take your head if you crack under the pressure.” Good story potential in that, though.
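As an aside, the one-time-pad lock described above is easy to sketch. This is a minimal illustration under my own assumptions (pad size, message format, offset bookkeeping), not a real command-and-control design; note that a bare one-time pad hides a command but does not authenticate it, so a real system would also spend pad bytes on a message-authentication check:

```python
import secrets

# One-time-pad command lock, minimal sketch. Ground control and the
# ship share a pad of random bytes generated once before launch; each
# command consumes a fresh, never-reused slice of the pad.

PAD_SIZE = 1_000_000  # bytes of shared key material (illustrative)

def make_pad(size: int = PAD_SIZE) -> bytes:
    """Generate the shared pad. Done once, copied to both parties."""
    return secrets.token_bytes(size)

def xor_with_pad(pad: bytes, offset: int, data: bytes) -> bytes:
    """XOR data against an unused slice of the pad. Since XOR is its
    own inverse, the same call both encrypts and decrypts."""
    key = pad[offset:offset + len(data)]
    if len(key) < len(data):
        raise ValueError("pad exhausted: no further commands possible")
    return bytes(a ^ b for a, b in zip(data, key))

# Usage: ground station encrypts, ship decrypts; both sides advance
# the offset in lockstep so that no pad bytes are ever reused.
pad = make_pad()
offset = 0

command = b"ADJUST ORBIT +2 M/S PROGRADE"
wire = xor_with_pad(pad, offset, command)
assert xor_with_pad(pad, offset, wire) == command
offset += len(command)
```

The detail that matters for the argument is that no one without physical access to the pad at launch can ever issue a command, which is exactly what makes the control tight and the lead times long.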
I think we’re heading off-topic with this one, and I’d like to continue the discussion but focus it on space, rather than just on whether to reveal or keep secrets.
So I started this thread: http://lesswrong.com/r/discussion/lw/gsv/asteroids_and_spaceships_are_kinetic_bombs_and/