Right, it’s not only noise that can alter the value of copying/propagating source code, since we can at least imagine that future improvements will also be more stable in that regard: there’s also the possibility that the moral landscape of an AI could be fractal, so that even a small modification might turn friendliness into unfriendliness or outright hostility.
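Just to make the "fractal landscape" worry concrete, here's a toy sketch (a Python illustration I'm adding, not a claim about how real AI designs behave): if the "friendly" region of design space had a fractal boundary, then two designs could be arbitrarily close together and still fall on opposite sides of it. The Mandelbrot set just stands in for the hypothetical friendly region.

```python
def looks_friendly(c: complex, max_iter: int = 200) -> bool:
    """Toy membership test for the 'friendly' region (here: the Mandelbrot set)."""
    z = 0j
    for _ in range(max_iter):
        z = z * z + c
        if abs(z) > 2:          # orbit escapes -> outside the set
            return False
    return True

# Start with one design inside the region and one outside, then repeatedly
# halve the distance between them while keeping them on opposite sides.
inside, outside = complex(-0.75, 0.0), complex(-0.75, 0.3)
for _ in range(30):
    mid = (inside + outside) / 2
    if looks_friendly(mid):
        inside = mid
    else:
        outside = mid

print(f"gap between the two designs: {abs(outside - inside):.2e}")
print(f"one still classifies as friendly:   {looks_friendly(inside)}")
print(f"the other as unfriendly:            {not looks_friendly(outside)}")
```

After thirty halvings the two points differ by less than a billionth, yet they still land on opposite sides of the boundary, which is the sense in which a "small modification" could flip the outcome.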