Robin, I agree that everyone’s opinion on this matter is certainly relevant information which you should incorporate. And arguably Eliezer’s position is an outlier. But you are not actively interacting this week with the AI expert community (I assume). You are interacting with Eliezer. He is the one who knows your opinion and knows you have not changed it despite knowing his opinion, and so on ad infinitum. It is the two of you who are interacting and the two of you who must agree.
Everyone else’s opinions are relevant data that will influence both of your initial estimates, to be sure. But that is the extent of their influence. The fact that everyone else holds a different position from one of you does not prevent the two of you from coming to agreement. Assuming Eliezer is aware of how his view differs from the other experts’, the fact that he still holds to it ought to persuade you all the more that he has good grounds for his position. And if somehow you thought Eliezer was unaware that his position is an outlier, your own stubbornness would itself have communicated to him that you have persuasive private information that he is wrong. In that case, the fact that he has not changed his mind despite your persistence must again imply that his reasons for his belief are robust and powerful.
I can understand why you might be puzzled as to how Eliezer can reach such a different view from the AI expert community’s, and with such confidence, but fundamentally that is irrelevant. The fact remains that regardless of how reliable you think Eliezer’s conclusions are a priori, once you interact with him and he remains set on his position, as long as you see each other as rational (and meta-rational), you must come to agree.
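For anyone who wants to see the mechanism behind "you must come to agree": the claim is essentially Aumann's agreement theorem, and Geanakoplos and Polemarchakis (1982) showed constructively that two Bayesians with a common prior who alternately announce their posteriors for an event reach agreement in finitely many steps. Below is a minimal Python sketch of that announcement dialogue; the state space, partitions, event, and true state are toy assumptions of mine chosen purely for illustration, not anything from the discussion above.

```python
from fractions import Fraction

# A minimal sketch of the Geanakoplos-Polemarchakis agreement dialogue:
# two Bayesians with a common (uniform) prior alternately announce their
# posterior for an event; each announcement refines what is common
# knowledge until the posteriors coincide. All specifics below are
# illustrative assumptions.

STATES = range(9)                            # finite state space, uniform prior
EVENT = {0, 2, 4, 6, 8}                      # the proposition both agents estimate
PART_A = [{0, 1, 2}, {3, 4, 5}, {6, 7, 8}]   # agent A's private partition (rows)
PART_B = [{0, 3, 6}, {1, 4, 7}, {2, 5, 8}]   # agent B's private partition (columns)
TRUE_STATE = 4

def cell(partition, state):
    """Return the partition cell containing `state`."""
    return next(c for c in partition if state in c)

def posterior(info):
    """P(EVENT | info) under the uniform common prior."""
    return Fraction(len(EVENT & info), len(info))

def dialogue(true_state, max_rounds=20):
    common = set(STATES)                     # states consistent with all announcements so far
    speakers = [PART_A, PART_B]
    last = {}
    for r in range(max_rounds):
        part = speakers[r % 2]
        info = cell(part, true_state) & common
        p = posterior(info)
        print(f"round {r}: agent {'AB'[r % 2]} announces P = {p}")
        # Everyone learns the announcement: keep only the states at which
        # this speaker would have announced the same posterior.
        common = {w for w in common
                  if posterior(cell(part, w) & common) == p}
        last['AB'[r % 2]] = p
        if len(last) == 2 and last['A'] == last['B']:
            return last['A']
    raise RuntimeError("did not converge (finite partitions guarantee it should)")

print("agreed posterior:", dialogue(TRUE_STATE))
```

In this toy run the announcements go 1/3, 1, 1: A's first announcement reveals which row the state is in, B's reply pins the state down exactly, and both end up assigning probability 1 to the event. The persistence dynamic described above is visible in miniature: each refusal to match the other's number transmits private information, and the process provably cannot disagree forever.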