Frank: I do not suggest “completely re-evaluating” a belief when it is contradicted. And it is true that if you have a considered opinion on some matter, even though you know that many people disagree with it, you are unlikely to completely change your mind when you hear that some particular person disagrees with it.
However, if you are surprised to hear that some particular person disagrees with it, then you should update your opinion in such a way that there is a greater probability (than before) that the person holds an unreasonable opinion on the matter. But you should also update your opinion in such a way that there is a greater probability than before that you are wrong.
For example, since Eliezer was surprised to hear of Dennett’s opinion, he should assign a greater probability than before to the possibility that human-level AI will not be developed within the foreseeable future. Likewise, to take the more extreme case, assuming that he was surprised by Aumann’s religion, he should assign a greater probability to the truth of the Jewish religion, even if only to a slight degree.
Of course this would be less necessary to the degree that you are unsurprised by the other person’s disagreement; if you had already fully taken it into account when you originally formed your opinion, you would not need to update at all.
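To make the point concrete, here is a minimal sketch of the update being described, with purely illustrative numbers of my own (none of them come from the original discussion): if a surprising disagreement from an informed person is more likely in the worlds where you are wrong than in the worlds where you are right, then Bayes’ rule forces your confidence down, even if only slightly.

```python
def posterior(prior, p_disagree_if_right, p_disagree_if_wrong):
    """P(you are right | a surprising informed person disagrees), by Bayes' rule."""
    numerator = p_disagree_if_right * prior
    evidence = numerator + p_disagree_if_wrong * (1 - prior)
    return numerator / evidence

# Hypothetical numbers: you start 90% confident, and an informed person's
# disagreement is three times as likely if you are wrong as if you are right.
p = posterior(prior=0.90, p_disagree_if_right=0.10, p_disagree_if_wrong=0.30)
print(round(p, 3))  # 0.75 -- still probably right, but less confident than before
```

The size of the shift depends entirely on how surprising the disagreement is: if disagreement were just as likely either way, the posterior would equal the prior and no update would be required, which is exactly the unsurprised case above.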