Temperature is an average. All individual information about the particles is lost, so you can’t invert the mapping from exact microphysical state to thermodynamic state.
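The many-to-one point can be made concrete with a small Python sketch (the numbers are invented, and "temperature" here is just a stand-in mean, not the real kinetic-theory formula):

```python
# Two different microstates (lists of particle speeds) that produce
# the same macrostate ("temperature"): the average cannot tell them apart.
state_a = [1.0, 2.0, 3.0]
state_b = [2.0, 2.0, 2.0]

def temperature(speeds):
    # Stand-in for temperature: the mean of the speeds.
    return sum(speeds) / len(speeds)

assert temperature(state_a) == temperature(state_b) == 2.0
# Given only the value 2.0, no function can recover which microstate
# produced it: the microstate-to-macrostate mapping is many-to-one,
# so it has no inverse.
```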
So divide the particle velocities by temperature or whatever.
Most of the invertible functions you mention would reduce to one of a handful of non-redundant functions, obfuscated by redundant complexity.
How do you tell what’s redundant complexity and what’s ontologically fundamental? The position or momentum representation of quantum mechanics, for instance?
Now I’d add that the derived nature of macroscopic “causes” is also a problem, if you want to have the usual materialist ontology of mind and you also want to say that mental states are causes.
What bothers me about your viewpoint is that you are solving the problem that, in your view, some things are epiphenomenal by making an epiphenomenal declaration—the statement that they are not epiphenomenal, but rather, fundamental.
So I posit the existence of what Dennett calls a “Cartesian theater”, a place where the seeing actually happens and where consciousness is located; it’s the end of the sensory causal chain and the beginning of the motor causal chain. And I further posit that, in current physical language, this place is a “quantum system”, not just a classically distributed neural network; because this would allow me to avoid the problems of many-to-one mappings and of derived macroscopic causality. That way, the individual conscious mind can have genuine causal relations with other objects in the world (the simpler quantum systems that are its causal neighbors in the brain).
Is there anything about your or anyone else’s actions that provides evidence for this hypothesis?
“Genuine” causal relations is a much weaker notion than “ontologically fundamental” relations.
Do only pure qualia really exist? Do beliefs, desires, etc. also exist?
That’s way too hard, so I’ll just illustrate the original point: You can map a set of three donkeys onto a set of three dogs, one-to-one, but that doesn’t let you deduce that a dog is a donkey.
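The donkey/dog point can be sketched in Python (the animal names are invented for illustration): a one-to-one mapping between two sets shows they have the same structure as sets, but nothing more.

```python
# Two three-element sets and a bijection between them.
donkeys = {"Eeyore", "Benjamin", "Platero"}
dogs = {"Lassie", "Rex", "Snoopy"}

# Pair each donkey with exactly one dog.
mapping = dict(zip(sorted(donkeys), sorted(dogs)))

# The mapping is one-to-one and onto...
assert len(mapping) == len(donkeys) == len(dogs)
assert set(mapping.values()) == dogs

# ...but nothing follows about any donkey *being* a dog:
assert donkeys.isdisjoint(dogs)
```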
You can map a set of three quantum states onto a set of {, , }
This doesn’t mean ontological structure that has no causal relations; it means ontological structure that isn’t made of causality. A causal sequence is a structure that is made of causality. But if the individual elements of the sequence have internal structure, it’s going to be ontologically non-causal. A data structure might serve as an example of a non-causal structure. So would a spatially extended arrangement of particles. It’s a spatial structure, not a causal structure.
No, it means ontological structure—not structures of things, but the structure of a thing’s ontology—that doesn’t say anything about the things themselves, just about their ontology.
Could you revisit this point in the light of what I’ve now said? What sort of disconnection are you talking about?
A logical/probabilistic one. There is no evidence for a correlation between the statements “These beings have large-scale quantum entanglement” and “These beings think and talk about consciousness”
That’s clever, except that I would have to be saying that the world of experience is nothing but love, and that QM is nothing but the world of experience
You would have to be saying that to be exactly the same as your character. You’re contrasting two views here. One thinks the world is made up of nothing but STUFF, which follows the laws of quantum mechanics. The other thinks the world is made up of nothing but STUFF and EXPERIENCES. If you show them a quantum state, and tell the first guy “the stuff is in this arrangement” and the second guy “the stuff is in this arrangement, and the experiences are in that arrangement”, they agree exactly on what happens, except that the second guy thinks that some of the things that happen are not stuff, but experiences.
That doesn’t seem at all suspicious to you?
All that’s a digression, but the idea that QM could be the formal theory of any informal concept you like, tastes of a similar disregard for the prior meanings of words.
You are correct. “balloons” refers to balloons, not to quarks.
I guess what’s going on is that the guy is saying that’s what he believes balloons are.
But thinking about the meaning of words is clarifying.
It seems like the question is almost—“Is ‘experience’ a word like phlogiston or a word like elephant?”
More or less, whatever has been causing us to see all those elephants gets to be called an elephant. Elephants are reductionism-compatible. There are some extreme circumstances—the images of elephants I have seen are fabrications, the people who claim to have seen elephants are lying to me—that break this rule. Phlogiston, on the other hand, is a word we give up on much more readily. Heat is particles bouncing around, but the absence of oxygen is not phlogiston—it’s just the absence of oxygen.
You believe that “experience” is fundamentally incompatible with reduction. An experience, to exist at all, must be an ontologically fundamental experience. Thus saying “I see red” makes two claims—one, that the brain is in a certain class of its possible total configuration states, those in which the person is seeing red, and two, that the experience of seeing red is ontologically fundamental.
I see no way to ever get the physical event of people claiming that they experience color correlated with the ontological fundamentalness of their color experience, the way we can investigate the phlogiston hypothesis and stop using it if and only if it turns out to be a bad model.
What is a claim when it’s not correlated with its subject? The whole point of the words within it has been irrevocably lost. It is pure speculation.
I really, really don’t think that, when I say I see red, I’m just speculating.
It’s almost a month since we started this discussion, and it’s a bit of a struggle to remember what’s important and what’s incidental. So first, a back-to-basics statement from me.
Colors do exist, appearances do exist; that’s nonnegotiable. That they do not exist in an ontology of “nothing but particles in space” is also, fundamentally, nonnegotiable. I will engage in debates as to whether this is so, but only because people are so amazingly reluctant to see it, and to see the implication that their favorite materialist theories of mind actually involve property dualism, in which color (for example) is tied to a particular structure or behavior of particles in the brain but can’t be identified with it.
We aren’t like the ancient atomists, who only had an informal concept of the world as atoms in a void; we have mathematical theories of physics. So a logical further question is whether these mathematical theories can be interpreted so that some of the entities they posit can be identified with color, with “experiences”, and so on.
Here I’d say there are two further important facts. First, an experience is a whole and has to be tackled as a whole. Patches of color are just a part of a multi-sensory whole, which in turn is just the sensory aspect of an experience that also has a conceptual element, temporal flow, a cognitive frame locating current events in a larger context, and so on. Any fundamental theory of reality which purports to include consciousness has to include this whole; it can’t just talk about atomized sensory qualia.
Second, any theory which says that the elementary degrees of freedom in a conscious state correspond to averaged collective physical degrees of freedom will have to involve property dualism. That’s because it’s a many-to-one mapping (from physical states to conscious states), and a many-to-one mapping can’t be an identity.
All that is the starting point for my line of thought, which is an attempt to avoid property dualism. I want to have something in my mathematical theory of reality which simply is the bearer of conscious states, has the properties and structure of a conscious whole, and is appropriately located in the causal chain. Since the mathematics describing a configuration of particles in space seems very unpromising for such a reinterpretation; and since our physics is quantum mechanics anyway, and the formalism of quantum mechanics contains entangled wavefunctions that can’t be factorized into localized wavefunctions, it’s quite natural to look for these conscious wholes in some form of QM where entanglement is ontological. However, since consciousness is in the brain and causally relevant, this implies that there must be a functionally relevant brain subsystem that is in a quantum coherent state.
That is the argument which leads me from “consciousness is real” to “there’s large-scale quantum entanglement in the brain”. Given the physics we have, it’s the only way I see to avoid property dualism, and it’s still just a starting point, on every level: mathematically, ontologically, and of course neurobiologically. But that is the argument you should be scrutinizing. What’s at stake in some of our specific exchanges may be a little obscure, so I wanted to set down the main argument in one piece, in one place, so you could see what you’re dealing with.
I will lay down the main argument convincing me that you’re incorrect.
Consider the three statements:
1. “There’s large-scale quantum entanglement in the brain.”
2. “Consciousness is real.”
3. “Mitchell Porter says that consciousness is real.”
Your inference requires that 1 and 2 are correlated. It is non-negotiable that 2 and 3 are correlated. And there is no special connection between 1 and 3 that would screen them off from each other; so if 1 and 2 are correlated, then 1 and 3 should be correlated as well.
However, 1 and 3 are both clearly defined physical statements, and there is no physical mechanism that could correlate them. We conclude that 1 and 3 are uncorrelated, and therefore that 1 and 2 are uncorrelated.