I really liked this sequence. I agree that specificity is important, and think this sequence does a great job of illustrating many scenarios in which it might be useful.
However, I believe that there are a couple implicit frames that permeate the entire sequence, alongside the call for specificity. I believe that these frames together can create a “valley of bad rationality” in which calls for specificity can actually make you worse at reasoning than the default.
------------------------------------
The first of these frames is not just that being specific can be useful, but that it’s ALWAYS the right thing, and that being non-specific is a sign of sloppy thinking or expression.
Again, nowhere in the sequence is this outright stated. Rather, it's implied through word choice, tone, and focus. Here are a few passages from this article that I believe showcase this implicit belief:
Nooooo, this is not the enlightening conversation we were hoping for. You can sense that I haven’t made much progress “pinning him down”.
By sloshing around his mental ball pit and flinging smart-sounding assertions about “capitalism” and “exploitation”, he just might win over a neutral audience of our peers.
by making him flesh out a specific example of his claim, I’ve now pulled him out of his ball pit of loosely-associated concepts.
It sounds meaningful, doesn’t it? But notice that it’s generically-worded and lacks any specific examples. This is a red flag.
All of these frame non-specificity and genericness as almost certainly wrong or bad, and even as potentially dumb or low-status. So why do I believe this framing is problematic? For a number of reasons.
Firstly, I believe that not all knowledge is propositional. Meaning, much knowledge is encoded in the brain as a series of loose associations leading to intuitions honed through lots of tight feedback loops.
Holding this sort of knowledge is often a sign of deep expertise rather than sloppy thinking. It's simply expertise that came through lots of repetitions rather than through reading, discussing, or explicit logical thinking.
When someone has this type of knowledge, the right move is often NOT to get more specific (which I discovered over the past year through 20 or so interviews with people who have this type of knowledge).
Often, the right move is to ask them to get in touch with some of these intuitions and just see what comes up. To free associate, or to express things in vague ways.
For instance, when working with Malcolm Ocean on his implicit understanding of design, rather than first asking him to list the aspects of good design, we worked on an "expressive sentence": "Design is not just for unlocking doors, but for creating new doorways." This is still a very vague sentence, but it was able to evoke many aspects of his intuitions. Only once we had a few of these vague sentences, each coming at the problem from a different angle, did we use them to get specific about the different elements of good design.
The second reason I think the belief that "being non-specific is sloppy thinking" is bad is that there are several types of rationality. "Predictive Rationality" (that is, instrumental rationality that focuses on whether something will happen or whether something is true) will often benefit from specificity.
However, there's also "Generative Rationality", that is, the rationality of generating useful ideas. As alkjash notes in his Babble and Prune sequence, Babbling (generating useful thoughts) often benefits from not having to make ideas very specific in the early stages, and may in fact be stymied by that. If you have the knee-jerk reaction of always asking for more specificity, it may lead to worse ideas in yourself or your culture.
In addition to "Generative Rationality" (which often comes before predictive rationality), you can also talk about "Effectuative Rationality": getting people to take action on their ideas (which often comes after predictive rationality).
Here too, specificity can often be harmful. Much of the management research finds that to create proactive employees, you want to be quite specific about what you want but very vague about "how", in order to let your underlings feel ownership and initiative. If you hold the belief that you must always be specific, you may screw this up.
----------------------------------------
The second implicit belief is one of "default to Combat Culture".
Once again, it's not explicitly stated. However, here are some passages that I believe point to this implicit belief:
The Power to Demolish Bad Arguments
Want to see what a 3D Chess argument looks like? Behold the conversation I had the other day with my friend “Steve”:
This is a preschool-level standard that your average arguer can’t pass.
Before you think about winning the argument, just start by drilling down into whether their point is coherent.
All of these frame things in terms of winning, arguing, and zero-sum dynamics.
Just to be clear, I think that Combat Culture can be quite effective for learning. As idle speculation, I’ve seen this culture work very effectively in Israel, and as a child of Israeli immigrants it’s possible that this is where the author learned this culture.
However, I think that a Combat frame is much more brittle in a nurture culture, and accidentally adopting it without realizing that’s what you’re doing could be quite damaging to your epistemics.
Here's why: Combat Culture puts a lot of faith in the person you're arguing with. If they're eloquent, good at operationalizing their beliefs, and willing to engage with you when attacked, then Combat-style discussion is a very fast way to get to the meat of an argument.
However, if the person you're talking with is not as good at operationalizing their beliefs, or gets defensive, or needs time to think, then what's likely to happen is that you'll get an argument that's not their true crux, or they'll just give up and say "I don't know." You'll think you "won", but in fact all you got was more unjustified confidence in your own beliefs.
On the other hand, if you take the more nurturing approach of a detective, trying to find out whether there are any true things in their beliefs that could bolster your own arguments, this will work equally well whether the other person is coming from a combat or a nurture frame. You're putting the onus on yourself to help the person state their beliefs in a way that's useful to you, rather than challenging them to do it all on their own.
------------------------------------------
Because these frames are implicit rather than explicitly stated, I believe repeated reading makes it easy to unconsciously internalize them without ever examining them. For inclusion in the final book, I'd like to see a version of whichever post is included that either:
Made these implicit frames explicit, and made the case for them; or
Removed the tone and word choice that express these implicit frames.
Thanks for the feedback. I agree that the tone of the post has been undermining its content. I’m currently working on editing this post to blast away the gratuitously bad-tone parts :)
Update: I’ve edited the post to remove a lot of parts that I recognized as gratuitous yuckiness.
I think the post reads much better now, and I think the second point I made about Combat Culture specifically is addressed. Regarding the first point, I made some specific points here about when specificity might and might not be useful that I feel still aren't fully addressed. I also recognize that this review came really late and you didn't really have an opportunity to digest and incorporate the feedback, so I'm not blaming you; I'm just pointing out where I think the post may still need some work, and my potential reasons for not voting for it or others in the sequence.
Glad to hear you feel I’ve addressed the Combat Culture issues. I think those were the lowest-hanging fruits that everyone agreed on, including me :)
As for the first point, I guess this is the same thing we had a long comment thread about last year, and I’m not sure how much our views diverge at this point...
Let’s take this paragraph you quoted: “It sounds meaningful, doesn’t it? But notice that it’s generically-worded and lacks any specific examples. This is a red flag.” Do you not agree with my point that Seibel should have endeavored to be more clear in his public statement?