For this discussion, the one of principal relevance is the one on the use of words, especially the mind projection fallacy
It would have been helpful to say how it is relevant.
(including the non-existence of mental or supernatural entities).
That mental entities don’t exist at all is a very bold claim: much bolder than the claim about the supernatural that you bracket it with, and one that many physicalists would disagree with. Moreover, neither claim follows from the very general consideration (which in itself I do not contest) that there is a “mind projection fallacy”.
Reductionism would be useful as well, and quantum physics.
What do you mean by “reductionism would be useful”? If there were a generally accepted reduction of qualia, there would be no problem of qualia. There isn’t such a reduction. So are you talking about promissory reduction (we have to believe it will arrive one day)....or what?
The quantum physics part is particularly helpful for disabusing oneself of many naive intuitions about object identity that otherwise lead to belief in things like souls, or consciousness as something separate from bodies, or the idea that an exact duplicate of you wouldn’t actually be you.
I have studied quantum physics, and I don’t think my ideas about qualia are based on naive ideas about identity. I think they are based on what I have said they are based on. If you have a criticism that is relevant to something I have said, I will be glad to hear it. I would rather you did not guess at my motivations.
If you don’t get at least that much about the basics of physics, then it’s way too easy to believe in fairy tales when they have words like “consciousness” and “qualia” attached.
Calling something a “fairy tale” is not an argument. I am still waiting for an argument relevant to something I said.
In other words, human beings are born with various intuitions (hardwired into the brain, as has been shown by experiments on babies who can’t even talk yet) that, without sufficient education, we use as the basis for reasoning about minds and reality. Huge amounts of philosophy and “common sense” reasoning are then based on these false premises.
That argument is a non sequitur. The fact that an intuition is hardwired does not make it false.
Of course, this makes most philosophical discussions equivalent to nothing but hot air: reasoning based on false premises.
That’s another non sequitur based on the previous one: you haven’t shown that philosophical arguments are mostly based on intuitions. Moreover, you are in danger of throwing out the arguments of your fellow qualiaphobes, such as Dennett.
which is why I keep pointing to the Sequences. They contain the necessary information to refute the premises that support the vast majority of philosophical and supernatural nonsense.
They contain a bunch of stuff about logic and language that most philosophers (in the anglosphere at least) are very familiar with. I have read arguments for and against qualia, and found them both to be based on reason. I think it is possible for reasonable people to disagree deeply.
And the equation between philosophy and the supernatural remains uninformed, to say the least.
(Such as some of Chalmers’s and Searle’s, for example.)
OK. Someone doesn’t like Chalmers’s Zombie argument. Guess what? I don’t like it either. I know it is possible to have qualiaphilia without p-zombies because my own qualiaphilic arguments work that way. I never mentioned zombies in the first place...
That mental entities don’t exist at all is a very bold claim: much bolder than the claim about the supernatural that you bracket it with, and one that many physicalists would disagree with. Moreover, neither claim follows from the very general consideration (which in itself I do not contest) that there is a “mind projection fallacy”.
The mind projection fallacy (or more specifically, the Less Wrong sequence on it) is more than sufficient as an explanation for how mental and supernatural entities are perceived; what “many physicalists” may or may not believe is not really relevant here.
are you talking about promissory reduction (we have to believe it will arrive one day)....or what?
I’m saying the Less Wrong sequences on reductionism and quantum physics will be useful in dissolving your confusion about qualia.
I have read arguments for and against qualia, and found them both to be based on reason.
But not Bayesian evidence, which is what’s relevant on LessWrong.com. This is a community devoted to furthering the practice of Bayesian rationalism, not the discussion of philosophy in general, or what philosophers consider to be reasonable or not reasonable. This is a community that considers dissolution of the confusion about “free will” to be a basic exercise in rationality, rather than an abstruse philosophical question requiring years of argument, or something that’s still considered an unsettled open question, subject to disagreement.
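[For reference, the “Bayesian evidence” standard being invoked above is just the ordinary probability-update rule; a minimal statement in standard notation, where H stands for any hypothesis and E for an observation (both placeholder symbols, nothing specific to the qualia dispute):]

\[
P(H \mid E) \;=\; \frac{P(E \mid H)\,P(H)}{P(E \mid H)\,P(H) + P(E \mid \neg H)\,P(\neg H)}
\]

[On this usage, E counts as evidence for H exactly when P(E | H) > P(E | ¬H); whether a question like qualia can be put into that form at all is what the two sides go on to dispute below.]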
I think it is possible for reasonable people to disagree deeply.
...and on LessWrong, we agree that’s true… IF and only if one or more of these conditions apply:
The reasonable people have different information,
The reasonable people are using different methods of evaluating the same information (due to e.g. different values/desires), or
One or both of the “reasonable” people aren’t really reasonable at all
If you believe that there is some other way for reasonable people to disagree, then it’s a good indication that we’re not on the same page enough to bother talking about this at all.
OK. Someone doesn’t like Chalmers’s Zombie argument.
If you think that’s just an opinion, you don’t get Bayesianism yet; that’s why I suggested the Sequences to you, in case you’re genuinely interested in being able to settle philosophical arguments once and for all, instead of just having philosophical arguments. ;-)
I never mentioned zombies in the first place...
You didn’t need to. Any argument for epiphenomenalism reduces in roughly the same way: if it has an effect, then the effect is phenomenal and reducible. If it doesn’t have an effect (i.e. produces no difference in our predicted observations), why do we care?
Ontologically fundamental mental entities of any sort require one to think of the mind as a supernatural entity, rather than a physical one. But it’s very hard to notice that you’re doing this, because it’s implicit in how we think about thinking by default.
Ontologically fundamental mental entities of any sort require one to think of the mind as a supernatural entity, rather than a physical one.
What’s a “supernatural entity”? The word ‘supernatural’ is ill-defined: if something exists in the real world, then it is natural by definition.
For the record, I don’t think minds are ontologically fundamental per se, because minds are far too complex and they’re explained already by physical brains. But it may be that some precursor of subjective experience is fundamental.
The word ‘supernatural’ is ill-defined: if something exists in the real world, then it is natural by definition.
Yeah, just like the word ‘metaphysics’ is ill-defined. If something exists in the real world, then it is physical by definition.
Or to be even more snarky but at least more explanatory: I doubt that ‘exists’, ‘physical’, ‘meta-’, ‘super-’ or ‘natural’ are sufficiently well-defined in these contexts for your accusation of ill-definition to hold any weight. If I try to interpret what you’re saying in roughly the same manner in which it seems to me that you’re interpreting what most folk mean by ‘supernatural’, except instead of being uncharitable in the direction of being snobbishly literal I reverse it and am uncharitable in the direction of not paying attention to your explicit message, it looks something like this: “People who use the word ‘supernatural’ tend to be wrong in obvious ways and I like to point this out in a mildly intellectual fashion so that I can feel superior to them; also since I just denounced the enemy tribe you should like me more”. But that would be no more accurate a characterization of what you meant than your characterization of what is typically meant by ‘supernatural’, and nobody on either side would learn anything from such analysis.
(This comment is not really a reply to User:bogus so much as an expression of annoyance at certain traditional rationalist memes. Sorry you got caught in the crossfire, User:bogus.)
The naive impression of “mind” in general philosophical discussion is a good example of a supernatural entity—the concept of mind separated from a specific human brain, some almost spirit-like entity.
In order to commit the mind-projection fallacy, you have to forget (really: not notice) that your brain actually exists and is not an objective observer of fact, but only an opinion-generating machine. Thus, discussions of consciousness and “qualia” are hugely hampered by forgetting that the mind is not an abstraction, it’s a specific physical thing, and that the various properties being attributed to it in these discussions exist only in the brain of the beholder, rather than in the thing being discussed. (As a natural consequence of physics not having layers or levels.)
The word ‘supernatural’ is ill-defined: if something exists in the real world, then it is natural by definition.
Exactly.
Well, I don’t have a naive conception of the mind, and I do remember my brain exists, so I am not committing the MPF. Hurrah!
The mind projection fallacy (or more specifically, the Less Wrong sequence on it) is more than sufficient as an explanation for how mental and supernatural entities are perceived;
I disagree. I don’t see the specific application at all.
I’m saying the Less Wrong sequences on reductionism and quantum physics will be useful in dissolving your confusion about qualia.
OK. I’m saying I already know quite a lot about both subjects, and I don’t see the application. You need to stop assuming that I am ignorant, and start putting forward relevant arguments. Repetition of “you are confused” won’t cut it.
I have read arguments for and against qualia, and found them both to be based on reason.
But not Bayesian evidence, which is what’s relevant on LessWrong.com.
I don’t see the relevance of Bayes. The topic is at the level of clarifying concepts, not of making computations on datasets.
This is a community devoted to furthering the practice of Bayesian rationalism, not the discussion of philosophy in general,
To say that qualia don’t exist, as you have been, is philosophy in general. To say that philosophy as a whole is wrong-headed, as you have been, is metaphilosophy. Your position is inconsistent. You say both that philosophy is wrong-headed and that a certain philosophical problem is (dis)solved in the Sequences (in a typically philosophical way, dismissed as a verbal/conceptual confusion).
or what philosophers consider to be reasonable or not reasonable. This is a community that considers dissolution of the confusion about “free will” to be a basic exercise in rationality, rather than an abstruse philosophical question requiring years of argument, or something that’s still considered an unsettled open question, subject to disagreement.
If it is a community based on reason, it will be open to reasoned objections.
that’s why I suggested the Sequences to you, in case you’re genuinely interested in being able to settle philosophical arguments once and for all, instead of just having philosophical arguments. ;-)
That seems laughably naive to me. You don’t have an algorithm for settling philosophical arguments, because they do depend on evaluations, and other stumbling blocks you haven’t thought of. You think it is just obvious that we should ditch the idea of qualia to retain physicalism and avoid epiphenomenalism. But that isn’t an obvious objective fact which other people are too stupid to understand: that is you de-valuing qualia and subjective experience.
You didn’t need to. Any argument for epiphenomenalism
I didn’t mention epiphenomenalism either, and I don’t believe in it... or, rather, I value theories that avoid it.
reduces in roughly the same way: if it has an effect, then the effect is phenomenal and reducible. If it doesn’t have an effect (i.e. produces no difference in our predicted observations), why do we care?
Ontologically fundamental mental entities of any sort require
I haven’t said qualia are fundamental, and they are not defined that way.
If you don’t think that these arguments can be settled, there is no point in continuing this discussion.
And if you don’t think that Bayes matters to updating your beliefs, then you are not a Bayesian rationalist.
The reason I asked about the sequences was to find out whether you were someone trying to learn an application of Bayesian rationalism, or someone who’s just trying to have a philosophical argument.
Apparently, you fall in the latter category, which means I have no interest in continuing the discussion.
If it is a community based on reason, it will be open to reasoned objections.
What is considered “reasoning” by philosophy doesn’t reach the level of rigor that is required here… as was amply demonstrated by statements of yours such as:
You don’t have an algorithm for settling philosophical arguments, because they do depend on evaluations
They only depend on evaluations if you’re interested in having an argument, as opposed to finding the truth (with or without a capital T) of a situation. Here, we expect arguments to be supported (or at least not opposed) by physics and cognitive science, in order to be considered “reasonable”, and we expect that hypotheses not be privileged.
I don’t think they have been settled. And I think there is value in reversing the Dunning-Kruger effect: getting someone to realise how difficult something really is.
I didn’t claim to be a Bayesian or not to be one. I am comparing Bayes to Popper and various other things at the moment. What I did say, and stand by, is that the formal part of Bayes is only applicable to problem areas that have already been marshalled into a form that is less ambiguous and less non-linear than typical philosophical problems.
You can say you have some wonderfully high level of reasoning, but I don’t have to believe you. I can judge from the examples supplied. You have not applied Bayesian reasoning as a formalism to any problem, and the material you directed me to in the Sequences didn’t either. It is all typical philosophical reasoning, neither particularly good nor particularly bad.
They only depend on evaluations if you’re interested in having an argument, as opposed to finding the truth (with or without a capital T) of a situation. Here, we expect arguments to be supported (or at least not opposed) by physics and cognitive science, in order to be considered “reasonable”, and we expect that hypotheses not be privileged.
I.e., you value science.
But the idea that just by basing your philosophical arguments on science, you can Avoid Arguments and Find Truth is very naive. Most English-speaking philosophy is science-based, and is full of disagreements. Why don’t you know that? Oh yeah: the Dunning-Kruger effect means that the less someone knows about a subject, the more they over-estimate their own abilities at it...
Zombies?! I never said a word about zombies...