Specifically with regard to the apparent persistent disagreement between you and Robin, none of those things explain it. You guys could just take turns doing nothing but calling out your estimates on the issue in question (for example, the probability of a hard takeoff AI this century), and you should reach agreement within a few rounds. The actual reasoning behind your opinions has no bearing whatsoever on your ability to reach agreement (or more precisely, on your inability to maintain disagreement).
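To make the "just call out estimates" process concrete, here is a toy sketch of the dialogue that underlies the disagreement theorems: two agents with a common prior take turns announcing their posteriors for an event, each treating the other's announcement as evidence, until nothing new is conveyed and the announcements must coincide. The state space, partitions, and event below are invented purely for illustration; nothing about them models the actual AI question or anyone's real reasoning.

    from fractions import Fraction

    def posterior(event, info, prior):
        """P(event | info) under the common prior; info is a set of states."""
        p_info = sum(prior[w] for w in info)
        p_joint = sum(prior[w] for w in info if w in event)
        return p_joint / p_info

    def cell(partition, state):
        """The block of the partition containing the given state."""
        return next(block for block in partition if state in block)

    def exchange_estimates(prior, event, partitions, true_state, max_rounds=20):
        public = set(prior)  # states not yet ruled out by public announcements
        for rnd in range(max_rounds):
            start = set(public)
            announced = []
            for i, part in enumerate(partitions):
                q = posterior(event, cell(part, true_state) & public, prior)
                announced.append(q)
                print(f"round {rnd}, agent {i + 1} announces P(E) = {q}")
                # The announcement is itself public evidence: the true state must
                # be one at which this agent, given the current public event,
                # would have announced exactly q.
                public = {w for w in public
                          if posterior(event, cell(part, w) & public, prior) == q}
            if public == start:
                # No announcement conveyed new information, so both posteriors
                # are common knowledge and therefore equal.
                print(f"agreement reached at P(E) = {announced[0]}")
                return announced[0]
        return None

    # Example: four equally likely states, E = {2, 3}, true state 2.
    # Agent 1 can tell {1,2} from {3,4}; agent 2 can tell {1,2,3} from {4}.
    prior = {w: Fraction(1, 4) for w in (1, 2, 3, 4)}
    exchange_estimates(prior, event={2, 3},
                       partitions=[[{1, 2}, {3, 4}], [{1, 2, 3}, {4}]],
                       true_state=2)

In this made-up example the agents start at 1/2 and 2/3 and settle at 1/2 within a couple of rounds, without ever exchanging their underlying reasoning, which is all the point above requires.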
Now, this is assuming that you both are honest and rational, and view each other that way. Exposing your reasoning may give one or the other of you grounds to question these attributes, and that may explain a persistent disagreement.
It is also useful to discuss your reasoning in case your goal is not simply to reach agreement, but to get the right answer. It is possible that this is the real explanation behind your apparent disagreement. You might be able to reach agreement relatively quickly by fiat, but one or both of you would still be left puzzled about how things could be so different from what your seemingly very valid reasoning led you to expect. You would still want to hash over the issues and talk things out.
Robin earlier posted, “since this topic is important, my focus here is on gaining a better understanding of it”. I read this as suggesting that his goal is not merely to resolve the disagreement, and perhaps not to particularly pursue the disagreement aspects at all. He also pointed out, “you do not know that I have not changed my opinion since learning of Eliezer’s opinion, and I do not assume that he has not changed his opinion.” This is consistent with the possibility that there is no disagreement at all, and that Robin and possibly Eliezer have changed their views enough that they substantially agree.
Robin has also argued that there is no reason for agreement to limit vigorous dissension and debate about competing views. In effect, people would act as devil’s advocates, advancing ideas and positions that they thought were probably wrong, but which still deserved a hearing. It’s possible that he has come to share Eliezer’s position yet continues to challenge it along the lines proposed in that posting.
One thing that bothers me, as an observer who is more interested in the nature of disagreement than the future course of humanity and its descendants, is that Robin and Eliezer have not taken more opportunity to clarify these matters and to lay out the time course of their disagreement more clearly. It would help too for them to periodically estimate how likely they think the other is to be behaving as a rational, honest, "Bayesian wannabe". For two of the most notable wannabes around, both very familiar with the disagreement theorems, both highly intelligent and rational, this is a terrible missed opportunity. I understand that Robin's goal may be as stated, to air the issues, but I don't see why they can't simultaneously serve the community by shedding light on the nature of this disagreement.