I guess it depends what you mean by ‘depending significantly on the world outside our heads’. If they mean it in the trivial sense, then the fractions in all schools should be so close to 1 that you shouldn’t be able to get a significant difference in correlation (a covariance, I suppose) out of them. Since there was significant variation, I took them to mean something else. If so, that would be the thing likely to mess us up first.
By ‘depend’ I don’t primarily mean causal dependence. One heuristic: If you’re an internalist, you’re likely to think that a brain in a vat could have the same mental states as you. If you’re an externalist, you’re likely to think that a brain in a vat couldn’t have the same mental states as you even if its physical state and introspective semblance were exactly alike, because the brain in a vat’s environment and history constitutively (and not just causally) alter which mental states it counts as having.
Perhaps the clearest example of this trend is disjunctivism, which is in the Externalism cluster. Disjunctivists think that a hallucination as of an apple, and a veridical perception of an apple, have nothing really in common; they may introspectively seem the same, and they may have a lot of neurological details in common, but any class that groups those two things (and only those two things) together will be a fairly arbitrary, gerrymandered collection. The representational, causal, historical, etc. links between my perception and the external world play a fundamental role in individuating that mental state, and you can’t abstract away from those contextual facts and preserve a sensible picture of minds/brains.
Thanks.
So yeah, Externalism isn’t particularly close to an LW norm.