External only in that wetware is modelling something outside of the skull, rather than its internal state. The intent was to state that merely because you perceive reality along certain ontological lines does not imply that reality has the same ontology.
This should be particularly obvious when your internal sense fails to correspond to reality; if conscious states are an imperfect guide to external states then why should the apparent ontology of consciousness be accurate?
In this regard I have observed a number of positions taken.
None of which you refute here or in the OP, especially those who deny that “blueness” is a veridical property of reality.
All these things (color, time, meaning, unity) exist in consciousness, which means that they exist in at least one part of reality.
No; it means that something referencing them exists in some part of reality (your skull): an equivalence relation, an internal tag marking this object as blue.
To counter the realism, consider mathematicians, who consciously deal in infinite sets, or in the set of all theorems provable under some axioms (model theory). Just because something appears plainly to you does not mean it exists. Kant says it better than I can.
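To make the “referencing” point concrete, here is a toy sketch (all names invented; this is nothing like a real brain or a real formal system): a reference to blueness, or to an infinite set, can exist inside a system without the referent existing as a thing in the world.

```python
from itertools import count, islice

# Toy illustration: only the *references* below exist (in memory);
# neither "blueness" nor a completed infinite set has to exist
# anywhere outside the system that holds the reference.

# An internal tag marking an object as blue. Nothing blue need exist
# outside the system for this entry to exist and to drive behaviour.
percept = {"object_id": 42, "tag": "blue"}

# A mathematician's "infinite set", represented finitely as a rule.
# The generator exists; the completed infinity it denotes does not.
even_numbers = (n for n in count() if n % 2 == 0)

print(percept["tag"])                 # 'blue' -- just a label
print(list(islice(even_numbers, 5)))  # [0, 2, 4, 6, 8]
```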
Do you agree that color, time, meaning, unity exist in consciousness?
Not if you mean more than perception by consciousness. Even in perception, they’re just the ontology imposed by our neurology, and have neural correlates that suffice.
Consciousness isn’t prior to perception or action; it’s after it. There isn’t a homunculus in there for experience to “appear to”. If anything, there’s a compressed model of your own behaviour into which experience is fed; that’s the “you” in the primate: a model of that same primate, used for planning and counterfactual reasoning.
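A deliberately crude way to picture that claim (invented names; no claim whatsoever about actual neural organisation): the “self” is a small summary model that receives experience after the fact and is consulted for planning.

```python
# Deliberately crude sketch: the "you" as a compressed model of the
# same agent, updated after perception and consulted for planning.

class Agent:
    def __init__(self):
        # The self-model is a small summary, not the full machinery.
        self.self_model = {"prefers": "bananas", "last_seen": None}

    def perceive(self, percept):
        # Experience is fed into the self-model after the fact;
        # there is no homunculus watching it arrive.
        self.self_model["last_seen"] = percept

    def plan(self, options):
        # Counterfactual reasoning runs over the compressed model,
        # not over the underlying machinery itself.
        prefer = self.self_model["prefers"]
        return prefer if prefer in options else options[0]

agent = Agent()
agent.perceive("yellow banana")
print(agent.plan(["apples", "bananas"]))  # 'bananas'
```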
Do you agree that color, time, meaning, unity exist in consciousness?
Not if you mean more than perception by consciousness. Even in perception, they’re just the ontology imposed by our neurology, and have neural correlates that suffice.
Let’s suppose I have a hallucinatory perception of a banana. So, there’s no yellow object outside my skull—we can both agree on that. It seems we also agree that I’m having a yellow perception.
But we part ways on the meaning of that. Apparently you think that even my hallucination isn’t really yellow. Instead, there’s some neural thing happening which has been tagged as yellow—whatever that means.
I really wonder about how you interpret your own experience. I suppose you experience colors just like I do, but (when you think about it) you tell yourself that what naively seems to be a matter of seeing a yellow object is actually experiencing what it’s like to have a perception tagged as yellow. But how does that translate, subjectively? When you see yellow, do you tell yourself you’re seeing the tag? Do you just semi-visualize a bunch of neurons firing in a certain way?
I suppose you experience colors just like I do, but (when you think about it) you tell yourself that what naively seems to be a matter of seeing a yellow object is actually experiencing what it’s like to have a perception tagged as yellow. But how does that translate, subjectively? When you see yellow, do you tell yourself you’re seeing the tag? Do you just semi-visualize a bunch of neurons firing in a certain way?
We went over this issue a bit in the previous discussion. My response (following Drescher) was: “To experience [yellow] is to feel your cognitive architecture assigning a label to sensory data.”
As I elaborated:
… the phenomenal experience of blue is what it is like to be a program that has classified incoming data as being a certain kind of light, under the constraint of having to coherently represent all of its other data (other colors, other visual qualities, other senses, other combined extrapolations from multiple senses, etc) but with limited comparison abilities.
The point being: I can’t give a complete answer now, but I can tell you what the solution will look like. It will involve describing how a cognitive architecture works, then looking at the distinctions it has to make, then looking at what constraints these distinctions operate under (e.g. color being orthogonal to sound [unless you have synaesthesia], etc.), then identifying what parts of the process can access each other.
Out of all of that, only certain data representations are possible, and one of these (perhaps, hopefully, the only one) is the one with the same qualities as our perception of color. You know you’re at the solution when you say: “Aha! If I had to express what information I receive, under all those constraints, that is what qualities it would need to have.”
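As a toy for the “labeling under constraints” idea (my own invented structure, nowhere near a real cognitive architecture): a classifier assigns discrete labels to sensory data, and the colour channel is kept orthogonal to the sound channel.

```python
# Toy sketch of label assignment under constraints. The bins and
# channels are invented; the point is only the shape of the process.

COLOUR_BINS = {"blue": (450, 495), "yellow": (570, 590)}  # nm, rough

def classify_colour(wavelength_nm):
    # The label, not the light itself, is what downstream
    # processes receive.
    for label, (lo, hi) in COLOUR_BINS.items():
        if lo <= wavelength_nm <= hi:
            return label
    return "unlabelled"

def classify_sound(frequency_hz):
    # A separate channel with no access to the colour channel:
    # the orthogonality constraint (synaesthesia would break it).
    return "high" if frequency_hz > 1000 else "low"

print(classify_colour(580))  # 'yellow'
print(classify_sound(440))   # 'low'
```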
To that, you replied:

Though you object to the comparison, this is the same kind of error as demanding that there be a fundamental “chess thing” in Deep Blue. There is no fundamental color, just as there is no fundamental chess. There is only a regularity the system follows, compressible by reference to the concept of color or chess.
I suppose you experience colors just like I do, but (when you think about it) you tell yourself that what naively seems to be a matter of seeing a yellow object is actually experiencing what it’s like to have a perception tagged as yellow.
I am intrigued by your wording here. I suppose I experience colors just like you do, but—when I think about it—I tell myself that what is, in fact, seeing a yellow object is, in fact, the same thing as experiencing what it’s like to have a perception tagged as yellow. I believe these descriptions to be equivalent in the same sense that “breaking of hydrogen bonds between dihydrogen monoxide molecules, leading to those molecules traveling in near-independent trajectories outside the crystalline structure” is equivalent to “ice sublimating”.
But we part ways on the meaning of that. Apparently you think that even my hallucination isn’t really yellow. Instead, there’s some neural thing happening which has been tagged as yellow—whatever that means.
The relevant part of the visual cortex, the part which fires on yellow objects, has fired; the rest of your brain behaves as if there were a yellow banana out in front of it. “Tagging” seemed like the best high-level term for it: a collection of stimuli is being bound together as an atomic thing. There’s a neural thing happening, and part of that neural thing is normally caused by yellow things in the visual field.
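A minimal sketch of what I mean (invented function names, obviously not real neuroanatomy): downstream processes see only the tag, so a hallucinated yellow and a stimulus-driven yellow are indistinguishable from the inside.

```python
# Toy model of "tagging": the rest of the brain receives only the
# tagged percept, never the cause of the firing.

def visual_cortex(stimulus=None, hallucinate=False):
    # A real yellow stimulus and spontaneous firing produce the
    # same output: an atomic, tagged percept.
    if hallucinate or stimulus == "yellow light":
        return {"shape": "banana", "tag": "yellow"}
    return None

def rest_of_brain(percept):
    # Only the tag is accessible here; how it was produced is not.
    if percept and percept["tag"] == "yellow":
        return "act as if a yellow banana is in front of me"
    return "carry on"

print(rest_of_brain(visual_cortex(stimulus="yellow light")))
print(rest_of_brain(visual_cortex(hallucinate=True)))  # same output
```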
The most obvious point where it has subjective import is when things change[1]. I probably experience colours as you do; when I introspect on colour, or time, I cannot find good cause to distinguish it from “visualising” an infinite set or a function. The only apparent difference is that reality isn’t under conscious control. I don’t assume that the naive ontology that is presented to me is a true ontology.
[1] There is a pair of coloured mugs (one blue, one purple) that I can’t distinguish in my peripheral vision, for example. When I see one in my peripheral vision, it is coloured (blue, say); when I look at it directly, there is a period in which it is both blue and purple, as best I can describe, before definitively becoming purple. Head MRIs do this too.
Edit: The problem is that there isn’t an easy way to introspect on the processes leading to perceptions; they are presented ex nihilo. As best I can tell, there’s no good way to distinguish my senses from “experiencing what it’s like to have a perception tagged as yellow”.