I wrote, “Wouldn’t it just be easier to convince the public to accept a certain amount of risk, to accept debates about trade-offs?”
Zubon replied:
How?
Keeping secrets is a known technology. Overcoming widespread biases is the reason we are here. If you have a way to sway the public on these issues, please, share.
“Keeping secrets” is a vague description of Eliezer’s proposal. “Keeping secrets” might be known technology, but so is “convincing the public to accept risks.” (E.g., they accept automobile fatality rates.) Which of these “technologies” would be easier to deploy in this case? That depends on the particular secrets to be kept and the particular risks to be accepted.
Since Eliezer talked about keeping projects “classified”, I assume that he’s talking about government-funded research. So, as I read him, he wants the government to fund basic, nonmilitary research that carries existential risks, but he wants the projects and the reports on the existential risks to be kept classified.
In a democracy, that means that the public, or their elected representatives, need to be convinced to spend their tax dollars on research, even while they know that they will not be told of the risks, or even of the nature of the specific research projects being funded. That is routine for military research, but there the public believes that the secrecy is protecting them from a greater existential threat. Eliezer is talking about basic research that does not obviously protect us from an existential threat.
The point is really this: To convince the public to fund research of this nature, you will need to convince them to accept risks anyway, since they need to vote for all this funding to go into some black box marked “Research that poses a potential existential threat, so you can’t know about it.” So Eliezer’s plan already requires convincing the public to accept risks. Then, on top of that, he needs to keep the secrets. That’s why his plan seems to me to be strictly harder than mine, which requires only convincing the public to accept risks, without the secrecy.