I’m the person affiliated with CFAR who has done the most work on Double Crux in the past year. I both teach the unit (and its new accompanying class, “Finding Cruxes”) at workshops, and semi-frequently run full- or half-day Double Crux development-and-test sessions on weekends. (However, I am technically a contractor, not an employee of CFAR.)
In the process of running test sessions, I’ve developed several CFAR units’ worth of support material and prerequisites for doing Double Crux well. We haven’t yet solved all of the blockers, but attendees of those full-day workshops are much more skilled at applying the technique successfully (according to my subjective impression, and by the count of “successfully resolved” conversations).
This new content is currently unpublished, but I expect that I’ll put it up on LessWrong in some form (see the last few bullet points below), sometime in the next year.
I broadly agree with this post. Some of my current thoughts:
I’m fully aware that Double Crux is hard to use successfully, which is what motivated me to work on improving the usability of the technique in the first place.
Despite those usability issues, I have seen it work effectively to the point of completely resolving a disagreement. (Notably, most of the instances I can recall were Double Cruxes between CFAR staff, who have a very high level of familiarity with Double Crux as a concept.)
The specific algorithm that we teach at workshops has undergone iteration. The steps we teach now are quite different than those of a year ago.
Most of the value of Double Crux, it seems to me, comes not from formal application of the framework, but rather from using conversational moves from Double Crux in “regular” conversations. TAPs to “operationalize” or to “ask what would change your own mind” are very useful. (Indeed, about half of the Double Crux support content is explicitly for training those TAPs, individually.) This is, I think, what you’re pointing to with the difference between “the actual double crux technique” and “the overall pattern of behaviour surrounding this Official Double Crux technique”.
In particular, I think that the greatest value of having the Double Crux class at workshops is the propagation of the jargon “crux”. It is useful for the CFAR alumni community to have a distinct concept for “a thing that would cause you to change your mind”, because that concept can then be invoked in conversation.
I think the full stack of habits, TAPs, concepts, and mindsets that lead to resolution of apparently intractable disagreement, is the interesting thing, and what we should be pursuing, regardless of if that stack “is Double Crux.” (This is in fact what I’m working on.)
Currently, I am unconvinced that Double Crux is the best or “correct” framework for resolving disagreements. Personally, I am more interested in other (nearby) conversational frameworks.
In particular, I expect non-symmetrical methods for grokking another person’s intuitions, as Thrasymachus suggests, to be fruitful. I, personally, currently use an asymmetrical framework much more frequently than I use a symmetric Double Crux framework. (In part because this doesn’t require my interlocutor to do anything in particular or have knowledge of any particular conversational frame.)
I broadly agree with the section on asymmetry of cruxes (and it is an open curriculum-development consideration). One frequently does not find a Double Crux, and furthermore doesn’t need to find one to make progress: single cruxes are very useful. (The current CFAR unit says as much.)
There are some non-obvious advantages to finding a Double Crux though, namely that (if successful), you don’t just agree about the top-level proposition, but also share the same underlying model. (Double Crux is not, however, the only framework that produces this result.)
I have a few points of disagreement, however. Most notably, how common cruxes are.
I suggest the main problem facing the ‘double crux technique’ is that disagreements like Xenia’s and Yevgeny’s, which can be eventually traced to a single underlying consideration, are the exception rather than the rule.
My empirical experience is that disputes can be traced down to a single underlying consideration more frequently than one might naively think, particularly in on-the-fly disagreements about “what we should do” between two people with similar goals (which, I believe, is Double Crux’s ideal use case.)
For many recondite topics I think about, my credence in it arises from the balance of a variety of considerations pointing in either direction.
While this is usually true (at least for sophisticated reasoners), it sometimes doesn’t bear on the possibility of finding a (single) crux.
For instance, as a very toy example, I have lots of reasons to believe that acceleration due to gravity is about 9.806 m/s^2: the textbooks I’ve read, the experiments I did in high school, my credence in the edifice of science, etc.
But, if I were to find out that I were currently on the moon, this would render all of those factors irrelevant. It isn’t that some huge event changed my credence about all of the above factors. It’s that all of those factors flow into a single higher-level node, and if you break the connection between that node and the top-level proposition, your view can change drastically, because those factors are no longer important. In one sense it’s a massive update, but in another sense, it’s only a single bit flipped.
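The structure of the gravity example above can be sketched in a few lines of code. This is just a toy model I’ve written to make the point concrete (it is not CFAR material): every piece of evidence feeds a single intermediate node, and the top-level belief depends only on that node, so flipping the node changes the conclusion without touching any evidence.

```python
# Toy sketch of the "single higher-level node" structure: the evidence
# factors all support one intermediate node ("I am on Earth"), and the
# top-level proposition about g depends on that node, not on the factors.

evidence = {
    "textbooks": True,
    "high_school_experiments": True,
    "trust_in_science": True,
}

def local_gravity(on_earth: bool) -> float:
    """Top-level proposition: what is g here, in m/s^2?"""
    return 9.806 if on_earth else 1.62  # lunar surface gravity ~1.62 m/s^2

# Normally, the evidence supports the intermediate node...
on_earth = all(evidence.values())
assert local_gravity(on_earth) == 9.806

# ...but learning "actually, you're on the moon" flips ONLY that node.
# None of the evidence changes, yet the top-level belief changes a lot:
# a massive update that is, structurally, a single bit flipped.
on_earth = False
assert local_gravity(on_earth) == 1.62
```

The point of the sketch is that disagreement about `local_gravity` need not involve any disagreement about the entries in `evidence`; the crux lives entirely in the intermediate node.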
I think that many real-life, seemingly intractable disagreements, particularly when each party has a strong intuition contrary to the other’s, have this characteristic. It’s not that you disagree about the evidence in question; you disagree about which evidence matters.
Because I think we’re on the moon, and you think we’re on earth.
But this is often hard to notice, because that sort of background content is something we both take for granted.
Next I’ll try to give some realer-to-life examples. (Fully real-life examples are hard to convey, because they tend to be more subtle or require more context. Very simplified anecdotes will have to do for now.)
You can notice something like this happening when...
1) You are surprised or taken aback at some piece of information that the other person thinks is true:
“Wait. You think if we had open borders almost the same number of people would immigrate as under the current US immigration policy?!” [This was a full Double Crux from a real conversation, resolved with recourse to available stats and a Fermi estimate.]
Whether or not more people will immigrate could very well change your mind about open borders.
2) You have (according to you) knock-down arguments against their case that they seem to concede quickly, that they don’t seem very interested in, or that don’t change their overall view much.
You’re talking with a person who doesn’t think that decreasing carbon emissions is important. You give them a bunch of evidence about the havoc that global warming will wreak, and they agree with it all. It turns out (though they didn’t quite realize it themselves) that they’re expecting that things are so bad that geo-engineering will be necessary, and it’s not worth doing anything short of geoengineering. [Fictionalized real example.]
The possibility and/or necessity of geoengineering could easily be a crux for someone in favor of carbon-minimizing interventions.
3) They keep talking about considerations that, to you, seem minor (this also happens between people who agree):
A colleague tells you that something you did was “rude” and seems very upset. It seems to you that it was a little abrasive, but that it is important that actions of that sort be allowed in the social space. Your colleague declares that it is unacceptable to be “rude.” It becomes clear that she is operating from a model whereby being “rude” is so chilling to the discourse that it effectively makes discussion impossible. [Real, heavily simplified, example.]
If this were true it might very well cause you to reassess your sense of what is socially acceptable.
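As an aside on the mechanics of the first example above: it was reportedly resolved with available stats and a Fermi estimate. Here is a minimal sketch of what such an estimate might look like. All numbers below are placeholders I’ve invented for illustration; they are not the figures from the actual conversation.

```python
# Illustrative Fermi estimate for the open-borders crux: would "almost the
# same number" of people immigrate as under current policy? Every number
# here is an invented placeholder, not a stat from the real conversation.

current_annual_immigration = 1e6   # placeholder: rough current US intake/year
want_to_move = 150e6               # placeholder: survey-style "would move" count
would_actually_act = 0.3           # placeholder: intention-vs-action discount
years_to_spread_over = 10          # placeholder: moves spread over a decade

open_borders_annual = want_to_move * would_actually_act / years_to_spread_over
ratio = open_borders_annual / current_annual_immigration

# Even with aggressive discounting, the estimate comes out several times
# the status quo, which is enough to settle "almost the same number?"
print(f"open-borders annual estimate: {open_borders_annual:.2g} "
      f"({ratio:.1f}x current)")
```

The value of the exercise is not the particular numbers but that it converts a vague intuition (“almost the same”) into a quantity the two parties can check against available stats.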
Additionally, here are some more examples of cruxes that, on the face of it, seem too shallow to be useful, but can actually move the conversation forward:
If there were complete nuclear disarmament, more people would die violently. [Real example from a CFAR workshop, though clouded memory.]
If everyone were bisexual, people would have more sex. [I’m not sure if I’ve actually seen this one, but it seems like a plausible disagreement from watching people Double Crux on a nearby topic.]
CFAR wants to reach as many people as possible. [Real example]
For each of these, we might tend to take the proposition (or its opposite!) as given, but rather frequently, two people disagree about the truth value.
I claim that there is crux-structure hiding in each of these instances, and that instances like these are surprisingly common (acknowledging that they could seem frequent only because I’m looking for them, and the key feature of some other conversational paradigm is at least as common.)
More specifically, I claim that on hard questions and in situations that call for intuitive judgement, it is frequently the case that the two parties are paying attention to different considerations, and some of the time, the consideration that the other person is tracking, if borne out, is sufficient to change your view substantially.
. . .
I was hoping to respond to more points here, but this is already long, and, I fear, a bit rambly. As I said, I’ll write up my full thoughts at some point.
I’m curious if I could operationalize a bet with Thrasymachus about how similar the next (or final, or 5 years out, or whatever) iteration of disagreement resolution social-technology will be to Double Crux. I don’t think I would take 1-to-1 odds, but I might take something like 3-to-1, depending on the operationalization.
Completely aside from the content, I’m glad to have posts like this one, critiquing CFAR’s content.
I didn’t think your comment was too long, nor would it even if it was twice as long. Nor did I find it rambly. Please consider writing up portions of your thoughts whenever you can if doing so is much easier than writing up your full thoughts.
Thanks. : ) I’ll take this into consideration.
Some updates on what I think about Double Crux these days are here.