That’s just… wow. That’s frighteningly stupid. That’s about as bad as someone saying they aren’t worried about a nuclear reactor undergoing a meltdown because they had their local clergy bless it, except that the potential negative payoff here is orders of magnitude higher. I don’t put a high probability on an AI-triggered singularity, but this is just… wow. One thing seems pretty clear: if an AGI does do a hard takeoff to control its light cone, the result is likely to be really bad, simply because so many people are being stupid about it.
Part of me worries that the fact that the SIAI people are thinking much more carefully about some of these issues should maybe suggest that a recursively self-improving AI is much more likely than I estimate.
To me, the frightening thing isn’t the original mistake (though it is egregious), it’s the fact that the response to having it pointed out was “You should not assume such a poor implementation of my idea that it cannot make discriminations that are trivial to current humans” rather than “OOPS!”
Well, he sort of did, actually.