In fact, this is one of the major problems I have with—forgive me for saying so!—your own posts. They are very nuanced! But this makes them difficult, sometimes almost impossible, to understand (not to mention very long); “bane of clarity” seems exactly right to me. (Indeed, I have noticed this tendency in the writing of several members of the LW team as well, and a few others.)
You say:
I think there’s a compelling argument to be made that much or even the majority of intellectual progress lies in the cumulative ability to make ever-finer distinctions, i.e. increasing our capacity for nuance.
There is certainly something to this view. But the counterpoint is that as you make ever finer distinctions, two trends emerge:
The distinctions come to matter less and less—and yet, they impose at least constant, and often increasing, cognitive costs. But this is surely perverse! Cognitive resource expenditures should be proportional to importance/impact, otherwise you end up wasting said resources—talking, and thinking, more and more, about things that matter less and less…
The likelihood that the distinctions you are making, and the patterns you are seeing, are perceived inaccurately, or even are entirely imaginary, increases dramatically. We might analogize this to attempting to observe increasingly tiny (or more distant) physical objects—there comes a point where the noise inherent in our means of observation (our instruments, etc.) dominates our observations.
I think that both of these trends may be seen in discussions taking place on Less Wrong, and that they are responsible for a good share of the epistemic degradation we can see.
I disagree with 1 entirely (both parts), and while 2 is sort of logically necessary, that doesn’t mean the effect is as large as you imply with “increases dramatically,” nor that it can’t be overcome. Cf. it’s not what it looks like.
(Reply more curt than usual for brevity’s sake. =P)
I think of robustness/redundancy as the opposite of nuance for the purposes of this thread. It’s not the kind of redundancy where you set up a lot of context to gesture at an idea from different sides, specify the leg/trunk/tail to hopefully indicate the elephant. It’s the kind of redundancy where saying this once in the first sentence should already be enough, the second sentence makes it inevitable, and the third sentence preempts an unreasonable misinterpretation that’s probably logically impossible.
(But then maybe you add a second paragraph, and later write a fictional dialogue where characters discuss the same idea, and record a lecture where you present this yet again on a whiteboard. There’s a lot of nuance, it adds depth by incising the grooves in the same pattern, and none of it is essential. Perhaps there are multiple levels of detail, but then there must be levels with little detail that make sense out of context, on their own, and the levels with a lot of detail must decompose into smaller self-contained points. I don’t think I’m saying anything that’s not tiresomely banal.)
I agree with Vladimir, FWIW.