If that happens, it means the person who made the breakthrough released
it to the public. That would be a huge mistake, because it would greatly
increase the chances of an unfriendly AI being built before a friendly one.
You are so concerned about the possibility of failure that you want to slow down research, publication and progress in the field—in order to promote research into safety?
Do you think all progress should be slowed down—or just progress in this area?
The costs of stupidity are a million road deaths a year, and goodness knows how many deaths in hospitals. Intelligence would have to be pretty damaging to outweigh that.
There is an obvious good associated with publication: the more the knowledge about intelligent machines is concentrated in one place, the greater the wealth inequality that is likely to result, and the harder it would be for the rest of society to deal with a dominant organisation. Spreading knowledge helps spread out the power, which reduces the chance of any one group of people becoming badly impoverished. Such altruistic measures may help to prevent a bloody revolution from occurring.