The deferrer will copy beliefs mistakenly imputed to the deferred-to that would have explained the deferred-to’s externally visible behavior. This pushes in the direction opposite to science because science is the way of making beliefs come apart from their pre-theoretical pragmatic implications.
Clarification request: does this mean that, in addition to the stuff that the deferred-to opines, learners will take as advice stuff the author didn’t mean to be opining?
I don’t know whether the high-mindedness magisteria matters. I question whether that activity is actually philosophy rather than science (I guess there is a link through “natural philosophy”). Seems I don’t know da way.
What I mean is, suppose the deferred-to has some belief X. This X is a refined, theoretically consilient belief to some extent, and to some extent it isn’t, but is instead pre-theoretic: intuitive, pragmatic, unreliable, and potentially inconsistent with other beliefs. What happens when the deferred-to takes practical, externally visible action that is somehow related to X? Many of zer other beliefs will also play a role in that action, and many of those beliefs will be to a large extent pre-theoretical. Pre-theoreticalness is contagious in action: theoretical refinement, to be expressed in action, asks for a rethinking of previously used protocols, so the easiest way to act on X is to use what is functionally a more pre-theoretical version of X.
So if the deferrer is imputing beliefs based on action, they’ll in general impute a more pre-theoretical belief; and they’ll place extra drag on their own processes of theoretical refinement. Like, when they notice contradictions, instead of rethinking their concepts and assumptions, they’ll avoid doing so, because that would contradict the apparent belief implied by the deferred-to’s behavior.
(Sorry this isn’t clearer or more concrete. I think the history of phlogiston is an example of some of this, where two theories are nearly identical in terms of pre-theoretic behavioral implications / expectations (e.g., both theories say that fire will be snuffed out by being in an enclosed space); but then, by drawing out more implications, the threat of inconsistency forces one theory to become more and more complicated.)
I was about to request clarification on this too. I don’t get it.
And I would like to get it.
(See my response to the parent comment.)