Huh? Strong evidence for that would be us all being dead. Or did you just mean that some people in the field agree with him?
I want to insist that “it’s unreasonable to strongly update about technological risks until we’re all dead” is not a great heuristic for evaluating GCRs.
The latter has come to be true, in no small part as a result of his writing. This implies that there was indeed something academics were missing about alignment.
Only a minority agree with him. Any number of (contradictory!) ideas will “seem to be right” if the criterion is only that some people agree with them.
A sizable shift has occurred because of him, which is different from your interpretation of my position. If you're convincing Stuart Russell, who in turn is convincing Turing Award winners like Yoshua Bengio and Judea Pearl, then there was something that hadn't been considered.