I don’t see why I should be hesitant to discuss this matter nowadays here on LessWrong; there are probably a hundred other discussions about the creative ways in which self-improving AGI may end us. (Although, admittedly, I am not aware of any that openly ask whether self-improving AGI development should happen in secrecy.)
In the stupendously unlikely scenario that this article inspires some kind of “pulling the AGI-stuff out of the public sphere” a decade from now, it would have more than made up for its presence; and if not, then it’s just another drop in the bucket for all to see and a worthwhile discussion to be had.
I’m serious: self-improving AGI is at least on the same threat level as nuclear warheads, and it would be quite foolish to assume that 30-50 years from now people like Eliezer or Ben Goertzel could actually build one and somehow remain “unmolested” by governments or public outrage.
You don’t hesitate to discuss the possibility of secrecy exactly because you don’t expect secrecy to have huge benefits that will be spoiled by others’ expecting it.
My level of concern over this post is also nearly zero.
I think this is about effects far in the future (even so: it may be worth thinking about now) that depend on decisions that will be made far in the future (so: it is safe to postpone thinking about).