Objection 1: many difficulties (Dust theory being one) are avoided if you simply do not use the term ‘subjective experience’. Don’t try to define it. Don’t assume something exists that should be called that.
What is the discussion of ‘subjective experience’ needed for? What is the problem with discarding the entire concept? (I’m aware there are some problems, but I’m interested in your take on it because I think most of them can be explained away.)
Objection 2, to your item 3: the mapping of a ‘mental state’ to the configuration of some physical system is purely a matter of interpretation. The problem here is that you ascribe to physical configurations some properties that are normally reserved for causal sequences of physical states, i.e. outright simulations.
Suppose I have a model of your ‘mentality’, that thing whose states are your mental states. Since it’s embodied in a physical system, I can enumerate all possible mental states. Suppose there are countably many states (I don’t know physics that well, but at the very least this is valid if you accept arbitrarily precise descriptions of the physical system as mapping to arbitrarily precise specifications of your corresponding mental state).
Now, I can write down a few (very large) numbers that map to some mental states of yours. Do you think my act of writing down these numbers literally brings into existence a subjective experience that did not otherwise exist?
(You may object because the actual numbers involved are so big they probably can’t be literally written down even on a Universe-sized piece of paper. But I can use any physical system, not much more complex than the one I’m modelling (that’s you), to encode the numbers. Pen and paper aren’t privileged media.)
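To make the encoding concrete, here is a toy sketch (my own illustration, not real physics: the component count, discretization base, and packing scheme are all made-up stand-ins). It just shows that any state of a finite, discretized system corresponds to a single natural number:

```python
# Toy illustration (made-up discretization, not real physics): any state of a
# finite, discretized system can be packed into a single natural number.

def encode_state(levels, base):
    """Pack per-component levels (each in range(base)) into one integer."""
    n = 0
    for level in levels:
        assert 0 <= level < base
        n = n * base + level
    return n

def decode_state(n, base, length):
    """Recover the per-component levels from the integer."""
    levels = []
    for _ in range(length):
        n, level = divmod(n, base)
        levels.append(level)
    return list(reversed(levels))

# A "system" of 10 components, each discretized to 256 levels:
state = [12, 200, 3, 77, 0, 255, 42, 42, 9, 128]
n = encode_state(state, base=256)
assert decode_state(n, base=256, length=10) == state
# A brain-sized system would need vastly more components, so n could not
# literally be written down -- hence the remark about non-privileged media.
```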
Suppose I find a number, or a series of numbers, that corresponds to a state of extreme suffering on your part. How many real-life resources would you invest to prevent me, an AI, from storing that number in my memory where no-one’s looking? (If your answer is ‘none at all’, then what else does this theory make you do differently, ever?) Would you react differently if I stored a sequence of very similar numbers, which correspond to almost-indistinguishable successive states of a real brain? (And that’s without taking into account the problems raised by rearranging the states in time.)
But that’s not the real problem. Remember that the mapping of mental states to numbers is purely arbitrary: any ordering of the natural numbers will do. What makes a given number invoke a given mental state? Is it just my own mental intention in using it? What if I build a non-sentient AI proxy to do it for me? What if I proclaim that I use the number 1 to encode a state of suffering in my simulation of you—will you try to stop everyone in the universe from writing down ‘1’? Will you counter-proclaim that no, the number 1 actually encodes your state of supreme happiness?
Objection 3, to your C2: your logic is invalid. Compare: “somewhere in the universe are mental states which correspond to someone mentally identical to yourself experiencing eternal torture. Therefore you will experience eternal torture, starting a moment from now.”
The problem is that you have not defined what ‘you’ means in this context. If there are many similar or identical states to “yours”, embodied at different points in the universe, which one are “you”? If there are several identical ones that diverge, and some experience eternal torture and some experience eternal happiness, which one is then “you”? If somewhere there is a sequence of mental states that starts out with the same memories you (Jack) have right now, but its actual experiences are of being on Mars, then do you expect to be on Mars?
This would be the Subjective Dust Theory, except that it’s wrong. It’s empirically wrong: my experiences have been highly ordered in the past and so I expect them to be ordered in the future, and not to jump randomly around the universe just because there exist embodiments of every possible future state I might experience. Of course you could say I just happen to be a state that remembers an ordered past—that’s the Boltzmann brain (timelessness) postulate—but you can’t really conclude anything based on this, so I think it’s better to assume our memories are real and we really live in an ordered universe.
It sounds like I should clarify that I don’t actually endorse the argument. I’m just trying to make the argument explicit so that we can stop all the hand-waving.
Objection 1: … What is the discussion of ‘subjective experience’ needed for? What is the problem with discarding the entire concept? (I’m aware there are some problems, but I’m interested in your take on it because I think most of them can be explained away.)
I successfully referred to something with the phrase. I know I did because your response wasn’t “Huh? What does that word mean?” I’m more than open to the suggestion that subjective experience is an illusion or an error—but it is the constitutive feature of our existence. Curious people aren’t going to just stop talking about it without a very good reason. The burden is on those who think it should not be discussed to explain why.
Objection 2, to your item 3: the mapping of a ‘mental state’ to the configuration of some physical system is purely a matter of interpretation. The problem here is that you ascribe to physical configurations some properties that are normally reserved for causal sequences of physical states, i.e. outright simulations.
Agreed. This is a good line of attack. Egan’s response in the FAQ is:
some people have suggested that a sequence of states could only experience consciousness if there was a genuine causal relationship between them. The whole point of the Dust Theory, though, is that there is nothing more to causality than the correlations between states.
I don’t really know where he is coming from. If that is “the point” of Dust theory I don’t see how he has made that argument. It looks to me like brains and genuine simulations are indeed causal but arbitrary patterns are not. That said, it isn’t obvious to me why causation would be necessary for consciousness. Say we simulate your brain and record the simulation. We then divide the recording into 100,000 pieces, scramble them, and put them back together. Then we play the recording. The Dust theory says that the recording will be conscious, just not proceeding along the arrow of time the way we are. Is the recording still a causal system?
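For concreteness, here is a minimal sketch of that scenario (the update rule is a toy stand-in for a brain simulation, and the sizes are shrunk; only the shape of the argument matters):

```python
# Minimal sketch of the scramble-and-replay scenario. 'step' is a toy
# stand-in for one tick of a brain simulation; the dynamics don't matter.
import random

def step(state):
    """Toy deterministic update rule (stand-in for the simulation)."""
    return (1103515245 * state + 12345) % 2**31

# 1. Run the simulation and record every state.
recording, state = [], 42
for _ in range(100_000):
    recording.append(state)
    state = step(state)

# 2. Divide the recording into pieces and scramble them.
pieces = [recording[i:i + 10] for i in range(0, len(recording), 10)]
random.shuffle(pieces)
scrambled = [s for piece in pieces for s in piece]

# 3. "Play" the scrambled recording: each state is read from storage rather
#    than computed from the state played just before it, so playback order
#    no longer tracks the causal order of generation.
for s in scrambled:
    pass  # e.g. render(s) -- playback only, no computation of successors
```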
Regarding the rest of the objection: First, obviously the argument is counter-intuitive. Second, as I understand the argument, your mental intention has nothing to do with it. What matters is that there be multiple structural patterns that relate to one another as sequential mental states do. That’s it. If happiness and suffering are symmetrical it might even be the case that every time you experience suffering there is another you that is happy, and vice versa. You definitely can’t just say of some set of patterns “These are Jack suffering” and make them that way. With patterns that we are sure are sufficiently complex to include a series of mental states, those patterns will also include many other mental states (infinitely many, I think). No one is going to be able to alter the structure such that it only represents mental states where I am suffering. With less complex patterns we will be less sure any minds are included. To be sure there is a mind and to specify a particular experience of me suffering I suspect you would have to actually simulate me suffering. The sentence “Jack is suffering” isn’t complex enough to support mental states. Neither is the simulation written out in a programming language. Does that make any sense?
Objection 3, to your C2: your logic is invalid. Compare: “somewhere in the universe are mental states which correspond to someone mentally identical to yourself experiencing eternal torture. Therefore you will experience eternal torture, starting a moment from now.”
My logic isn’t invalid. In addition to there being mental states with relation K to your final mental state, there are some such states where you are happy and others in which you experience eternal torture. That is also a consequence of the argument.
The problem is that you have not defined what ‘you’ means in this context.
My position on this is basically that our concept of personhood confuses types and tokens because it was developed in a world where every person had only one token. The fact that our concept of personal identity isn’t equipped to deal with the Dust argument isn’t really a point against it.
This would be the Subjective Dust Theory
Clarifying: Is the “Subjective Dust Theory” different from some other Dust theory as you understand it? I’m trying to describe Egan’s Dust theory.
Also, I agree. Based on our experiences we can conclude that we are not dust-minds.
That said, it isn’t obvious to me why causation would be necessary for consciousness.
How about “consciousness is computation, and causation is necessary for computation”?
Say we simulate your brain and record the simulation. We then divide the recording into 100,000 pieces, scramble them, and put them back together. Then we play the recording. The Dust theory says that the recording will be conscious, just not proceeding along the arrow of time the way we are. Is the recording still a causal system?
Yes. But who ever said that any old causation is sufficient for consciousness? Both computationalism and physicalism say consciousness is a particular kind of causal process. The causation of playing back a simulation is not the causation of generating it in the first place.
It’s empirically wrong: my experiences have been highly ordered in the past and so I expect them to be ordered in the future, and not to jump randomly around the universe just because there exist embodiments of every possible future state I might experience.
Just because the encodings of the different states are scattered about the universe doesn’t mean the conscious experience does not appear to be contiguous and linear to the observer; while they’d be in the minority in an infinite configuration space, it is impossible that there won’t be states with memories of contiguous experiences.
Also, I agree. Based on our experiences we can conclude that we are not dust-minds.
Could either of you explain how you would expect your current state of consciousness with its memories of experiences to be any different from how it is now if it were a dust-mind?
Of course it wouldn’t be different at all. But what matters is that my current state of consciousness would be extremely unlikely for a dust mind. This doesn’t totally rule out the possibility, but it basically puts it in the same category as every other skeptical thesis.
And actually it is probably worse than the other skeptical theses, since it includes some really weird assumptions about information and causation, as far as I can tell.
It is extremely unlikely, but in an unbounded configuration space it simply has to happen, and to happen many times.
I successfully referred to something with the phrase. I know I did because your response wasn’t “Huh? What does that word mean?”
That’s also true of things like the Christian Trinity and immortal souls and consciousness and acausal free will. All these words refer to things that are untestable and unobservable, or are described in internally inconsistent ways (logical impossibilities); those of them that could potentially exist, don’t exist as a matter of fact; and some of them are just meaningless strings of words.
The real referent in these cases is just the sum of everything people tend to say or feel about these supposed concepts.
I’m more than open to the suggestion that subjective experience is an illusion or an error—but it is the constitutive feature of our existence. Curious people aren’t going to just stop talking about it without a very good reason. The burden is on those who think it should not be discussed to explain why.
I certainly agree that experience exists—I know I have mine, everyone else says the same about themselves. But if we insist on treating it as purely subjective experience, then we’ll never be able to say anything about it, pretty much by definition. In my experience all those curious people are talking about badly-understood notions deriving from beliefs in body-mind duality. No matter how much we learn about objective experience, even if we can manipulate somebody’s experience in any way we like, people can still say they don’t understand subjective experience.
It’s easy to think that because we experience things (as a verb), there must be some subjective experience to talk about. But my position is that if we can’t formulate a question about subjective experience—a question that will make us behave differently depending on the answer—then there’s nothing to talk about. We can go searching for answers, but there’s no such thing as searching for a question. Are we supposed to one day think of a question and be enlightened? But that question exists in its own right, and can be answered if we ever think it’s important in the way that any question may be important. Meanwhile, if we have some kind of psychological drive to look for The Question, we might as well ignore that drive or look for a way to suppress it—just as we do with other unprofitable, unfulfillable drives.
That’s my position, anyway...
Agreed. This is a good line of attack. Egan’s response in the FAQ is:
some people have suggested that a sequence of states could only experience consciousness if there was a genuine causal relationship between them. The whole point of the Dust Theory, though, is that there is nothing more to causality than the correlations between states.
I don’t really know where he is coming from. If that is “the point” of Dust theory I don’t see how he has made that argument. It looks to me like brains and genuine simulations are indeed causal but arbitrary patterns are not.
In a timeless view, causality is just (regular) correlation in spacetime, as Egan says. I’m not sure what you are saying, though.
I also asked whether simulating one or just a few mental states, instead of the whole evolution of your mental state over time, creates some kind of subjective experience. In that case, would it be morally wrong to keep a highly detailed scan of your brain taken when you were feeling sad?
That said, it isn’t obvious to me why causation would be necessary for consciousness. Say we simulate your brain and record the simulation. We then divide the recording into 100,000 pieces, scramble them, and put them back together. Then we play the recording. The Dust theory says that the recording will be conscious, just not proceeding along the arrow of time the way we are. Is the recording still a causal system?
That’s entirely a matter of definition—the definitions of “consciousness” and of “causality”. You can define them any way you like, but what do you actually learn about reality from this?
This part of Dust theory strikes me as leading up to the conclusion that “there are many conscious states!” without defining what consciousness means, and so not actually saying anything.
Regarding the rest of the objection...
You don’t address my central claim: that the mapping of ‘mental states’ to the physical representation used is arbitrary. If I can build a physics-simulator as a state machine encoding complete rules of physical evolution, then by the principles of Turing equivalence, I can build some other machine that uses any mapping I like from physical states to numbers (each number representing the simulator’s state). Does it then generate subjective experience in “someone”, and is it morally significant, to feed that machine some given number—even the number 1 for instance—because I built the machine to represent torture with that number?
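To sketch what I mean (my own toy construction, nothing from the thread: ‘step’ stands in for the physics-simulator, and the bijection is deliberately arbitrary):

```python
# Toy construction for the relabeling point: compose any state machine with
# an arbitrary bijection on state labels and you get a step-for-step
# equivalent machine. Which number "is" which simulated state is fixed only
# by the mapping we happened to choose.

def make_relabeled(step, encode, decode):
    """Run the same machine on relabeled states: label -> encode(step(decode(label)))."""
    return lambda label: encode(step(decode(label)))

step = lambda s: (7 * s + 3) % 1000     # stand-in "physics simulator"
encode = lambda s: s ^ 0b101010101     # an arbitrary bijection (XOR mask)
decode = encode                        # XOR with a fixed mask is self-inverse

relabeled_step = make_relabeled(step, encode, decode)

# The two machines track each other exactly under the chosen mapping:
s = 1
label = encode(s)
for _ in range(10):
    s = step(s)
    label = relabeled_step(label)
    assert label == encode(s)

# Nothing in the construction stops us from choosing a bijection that sends
# the "torture" state to the number 1.
```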
You say of this,
You definitely can’t just say of some set of patterns “These are Jack suffering” and make them that way.
But why not? I can choose the mapping as I like.
To be sure there is a mind and to specify a particular experience of me suffering I suspect you would have to actually simulate me suffering.
Do you mean that it’s difficult to get the information needed to build a machine that correctly maps the number 1 to your suffering?
I can do this, for instance, by observing you in a normal state—recording what your brain looks like—then applying my understanding of how nervous systems signal pain to build a description of your brain suffering pain. It’s not difficult in principle.
Or do you mean I might simulate your brain suffering without thereby creating a suffering mind? Why? And why would this be anything more than a matter of arbitrary definition?
My logic isn’t invalid. In addition to there being mental states with relation K to your final mental state, there are some such states where you are happy and others in which you experience eternal torture. That is also a consequence of the argument.
That means there are (infinitely) many entities you are, with many different experiences. And all of them are you. That sounds like an… unorthodox use of the word “you” :-)
The problem is that you have not defined what ‘you’ means in this context.
My position on this is basically that our concept of personhood confuses types and tokens because it was developed in a world where every person had only one token. The fact that our concept of personal identity isn’t equipped to deal with the Dust argument isn’t really a point against it.
Why call this hypothetical collection of persons “you” (or indeed “me”) if it contains many different persons and doesn’t match our existing use of the word “you”?
Clarifying: Is the “Subjective Dust Theory” different from some other Dust theory as you understand it? I’m trying to describe Egan’s Dust theory.
I coined the term “subjective Dust theory” to mean Dust theory as applied to me and my subjective experience (producing conclusions such as “I’m necessarily immortal”), as opposed to Dust theory applied to other minds.
Also, I agree. Based on our experiences we can conclude that we are not dust-minds.
Obligatory question: given what observations would you assign high probability to the possibility that you are a dust-mind? Why would you privilege it over competing theories, which include:
You, but not all other possible experiences, are being simulated by someone for some reason.
Your memory or other parts of your mind are faulty, making you falsely remember or experience a chaotic universe (or whatever it is you observed).
Someone is deliberately messing / has messed with your mind, result as above.
You are a Boltzmann brain. (Unlike a dust-mind, you live normally over time, it’s just that you and your surroundings were created out of chaos by chance.)
Arguably, each of these theories is much more specific than the idea that you’re a dust-mind. That is, Dust theory predicts that there will be (infinitely many?) versions of you, some of which are being simulated, others have faulty memory, yet others are having their minds messed with by aliens, and still others are genuine Boltzmann brains. So in the absence of evidence to choose one of these, we should stick with the most general applicable theory—Dust theory.
On the other hand, the classes of all simulations and of all Boltzmann-brains also include all dust-minds… (You can simulate a universe containing a dust mind, and a dust mind’s states can come about by Boltzmann chance.) So it’s not conclusive.
I certainly agree that experience exists—I know I have mine, everyone else says the same about themselves.
I was actually going to remark in the original comment and my previous one that I thought “subjective experience” was redundant. I truly have no idea what non-subjective experience could possibly be. “Subjective experience” isn’t something that is contrasted with other kinds of experience. It isn’t my coinage; as far as I know it is a legacy term, but helpful in that it combines ‘the subject’ with ‘experiencing’. If that makes you uncomfortable by all means replace every instance of ‘subjective experience’ with ‘experience’. I think you can safely do the same with ‘consciousness’ or ‘qualia’ but I imagine you don’t like those terms either.
In a timeless view, causality is just (regular) correlation in spacetime, as Egan says. I’m not sure what you are saying, though.
The mercury in my barometer always drops before a thunderstorm. My barometer has never caused a thunderstorm. Thus, I prefer a counterfactual theory of causation. If you think Egan is right, then how is a dust mind different from “causal sequences of physical states, i.e. outright simulations”?
I also asked whether simulating one or just a few mental states, instead of the whole evolution of your mental state over time, creates some kind of subjective experience. In that case, would it be morally wrong to keep a highly detailed scan of your brain taken when you were feeling sad?
I think the argument requires that there be more than one mental state, though one can skip mental states. But let’s say you had three detailed scans from a period of sadness. Whether or not it is immoral would depend on whether or not we distinguish identical and simultaneous copies of persons in our utility function. But if you do care about such copies then yeah, it wouldn’t be the nicest thing to do.
That’s entirely a matter of definition—the definitions of “consciousness” and of “causality”. You can define them any way you like, but what do you actually learn about reality from this? This part of Dust theory strikes me as leading up to the conclusion that “there are many conscious states!” without defining what consciousness means, and so not actually saying anything.
The concepts of ‘consciousness’ and ‘causality’ describe features of the way we relate to the external world. I would like a coherent picture of this relation. Cause and effect, in particular, is a huge part of how we experience the world. How this concept relates to what is actually going on is a really interesting question. If a system needs to be the kind of system we recognize as a causal system in order to produce a subject that experiences the world, that would be something interesting to know. Getting a really precise definition of what consciousness is would be really cool. I know there are a lot of people working on it, but that isn’t me. I don’t at all think that one needs a really precise analytic definition of a concept in order to employ it or say meaningful things about it.
You don’t address my central claim: that the mapping of ‘mental states’ to the physical representation used is arbitrary.
I just meant that something can’t become “Jack suffering” just by being labelled that way. There has to actually be a set of patterns that represents it. The set of real numbers, for example, is sufficiently complex that if it is represented in the universe (say by an infinite number of particles) then according to the dust theory that representation includes a pattern that is Jack suffering. But it also includes a pattern that is Jack really happy. And it includes lots of other patterns. You saying “This is Jack suffering” doesn’t change that. The information is there even if you aren’t reading it that way.
Now what you might be able to do is build a really well-determined structure such that only the states of me suffering are represented. I don’t really know. If you can, though, I’m inclined to say that what you end up with will just be a run-of-the-mill simulation of me suffering, something we’d all recognize as bad. The only way to determine that you’ve created a pattern that is exactly as you want it to be is to run a regular old person-simulation—I think.
But I might not be responding to your concern, I’m still pretty confused about what that is.
That means there are (infinitely) many entities you are, with many different experiences. And all of them are you. That sounds like an… unorthodox use of the word “you” :-)
Yeah, I put it more delicately in the undergraduate thesis proposal I just turned in. But yeah it is unorthodox. :-) But the alternative is to give up a coherent account of personal identity altogether, as far as I’m concerned.
Why call this hypothetical collection of persons “you” (or indeed “me”) if it contains many different persons and doesn’t match our existing use of the word “you”?
Long answer, I’ll come back to it.
Obligatory question: given what observations would you assign high probability to the possibility that you are a dust-mind? Why would you privilege it over competing theories.
I’m not sure we can distinguish skeptical hypotheses by empirical evidence—I’m pretty sure we can’t, by definition. But we might find empirical evidence that alters our estimates of the premises of those positions (our understanding of entropy could change, our understanding of brain simulations will surely change once we actually succeed, etc.). Also we might be able to distinguish them according to a priori criteria like parsimony.