They think the content of our mental lives in general (.66) and perception in particular (.55), and the justification for our beliefs (.64), all depend significantly on the world outside our heads. They also think that you can fully understand a moral imperative without being at all motivated to obey it (.5).
Is there a cluster that has more than 1 position in common with LW norms? None of these fit more than a little.
We should give the same survey to LW.
The problem with that is that people here aren’t familiar with many of the concepts. For example, I like Hume’s work on the philosophy of science, but I’m not a philosopher and I have no idea what it means for a position to be Humean or non-Humean. I think more people would answer without really understanding what they are answering than would take the time to figure out the questions.
I would argue that this was a problem for the professional philosophers who took this survey as well. A moral philosopher may have a passing knowledge of the philosophy of time, but not enough to defend the particular position she reports in the survey.
Yes. It would be important to at least have respondents provide some self-assessment of how well they understand each question.
I agree. I think it would make more sense to just have discussions about whichever of the topics interested people, rather than having a fixed poll. If there were such a poll, it should be one designed to encourage ‘other’ views and frequent revisions of one’s view.
I might make something like this at some point, if only as a pedagogical tool or conversation-starter. At the moment, I have good introductions and links explaining all the PhilPapers questions up here.
http://lesswrong.com/lw/56q/how_would_you_respond_to_the_philpapers_what_are/
Glancing at the survey, it looks like it contains a large amount of jargon that, while very likely accessible to professional philosophers, would leave most people here (myself included) not knowing what most of the questions are asking, so I don’t think it would be practical to do this survey as-is among LW.
Right, but I meant in an accessible way that would let us analyze the data—e.g. a google survey.
All these seem to be vaguely LW-like.
I guess it depends what you mean by ‘depending significantly on the world outside our heads’. If they mean it in the trivial sense, then the fractions in all schools should be so close to 1 that you shouldn’t be able to get significant differences in correlation out (a covariance, I suppose). Since there was significant variation, I took them to mean something else. If so, that would be likely to mess us up first.
By ‘depend’ I don’t primarily mean causal dependence. One heuristic: If you’re an internalist, you’re likely to think that a brain in a vat could have the same mental states as you. If you’re an externalist, you’re likely to think that a brain in a vat couldn’t have the same mental states as you even if its physical state and introspective semblance were exactly alike, because the brain in a vat’s environment and history constitutively (and not just causally) alter which mental states it counts as having.
Perhaps the clearest example of this trend is disjunctivism, which is in the Externalism cluster. Disjunctivists think that a hallucination as of an apple, and a veridical perception of an apple, have nothing really in common; they may introspectively seem the same, and they may have a lot of neurological details in common, but any class that groups those two things (and only those two things) together will be a fairly arbitrary, gerrymandered collection. The representational, causal, historical, etc. links between my perception and the external world play a fundamental role in individuating that mental state, and you can’t abstract away from those contextual facts and preserve a sensible picture of minds/brains.
Thanks.
So yeah, Externalism isn’t particularly close to an LW norm.