I certainly agree that experience exists: I know I have mine, and everyone else says the same about themselves.
I was actually going to remark in the original comment and my previous one that I thought “subjective experience” was redundant. I truly have no idea what non-subjective experience could possibly be; “subjective experience” isn’t something that is contrasted with other kinds of experience. It isn’t my coinage; as far as I know it is a legacy term, but a helpful one in that it combines ‘the subject’ with ‘experiencing’. If it makes you uncomfortable, by all means replace every instance of ‘subjective experience’ with ‘experience’. I think you can safely do the same with ‘consciousness’ or ‘qualia’, though I imagine you don’t like those terms either.
In a timeless view, causality is just (regular) correlation in spacetime, as Egan says. I’m not sure what you are saying, though.
The mercury in my barometer always drops before a thunderstorm, but my barometer has never caused a thunderstorm. That is why I prefer a counterfactual theory of causation. If you think Egan is right, then how is a dust mind different from “causal sequences of physical states, i.e. outright simulations”?
I also asked if simulating one or just a few mental states, instead of the whole evolution of your mental state over time, created some kind of subjective experience? In that case, would it be morally wrong to keep a highly detailed scan of your brain taken when you were feeling sad?
I think the argument requires that there be more than one mental state, though one can skip mental states. But let’s say you had three detailed scans from a period of sadness. Whether or not keeping them is immoral would depend on whether we distinguish identical and simultaneous copies of persons in our utility function. If you do care about such copies, then yeah, it wouldn’t be the nicest thing to do.
That’s entirely a matter of definition—the definitions of “consciousness” and of “causality”. You can define them any way you like, but what do you actually learn about reality from this? This part of Dust theory strikes me as leading up to the conclusion that “there are many conscious states!” without defining what consciousness means, and so not actually saying anything.
The concepts of ‘consciousness’ and ‘causality’ describe features of the way we relate to the external world, and I would like a coherent picture of this relation. Cause and effect, in particular, is a huge part of how we experience the world, and how this concept relates to what is actually going on is a really interesting question. If a system needs to be the kind of system we recognize as causal in order to produce a subject that experiences the world, that would be something interesting to know. Getting a really precise definition of what consciousness is would be really cool; I know there are a lot of people working on it, but that isn’t me. I don’t at all think that one needs a really precise analytic definition of a concept in order to employ it or say meaningful things about it.
You don’t address my central claim: that the mapping of ‘mental states’ to the physical representation used is arbitrary.
I just meant that something can’t just be labelled “Jack suffering”. There has to actually be a set of patterns that represent it. The set of real numbers, for example, is sufficiently complex that if it is represented in the universe (say by an infinite number of particles) then according to the dust theory that representation includes a pattern that is Jack suffering. But it also includes a pattern that is Jack really happy. And it includes lots of other patterns. You saying “This is Jack suffering” doesn’t change that. The information is there even if you aren’t reading it that way.
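A toy sketch of the point (the bit-encoding scheme here is just something I made up for illustration): in a sufficiently large random structure, short encodings of both states occur, regardless of which one anybody points at and names.

```python
import random

random.seed(42)
# A large "dust" structure: ten million random bits, fixed in advance.
dust = format(random.getrandbits(10**7), '0{}b'.format(10**7))

def encode(state: str) -> str:
    """Encode a state label as a bit-pattern (an arbitrary illustrative scheme)."""
    return ''.join(format(ord(c), '08b') for c in state)

# Any given 16-bit pattern occurs somewhere in 10^7 random bits with
# overwhelming probability, so encodings of *both* states are present
# in the same dust, whatever label we attach to the whole:
print(encode("ok") in dust)
print(encode(":(") in dust)
```

The labelling does no work: the same structure represents both patterns, which is the sense in which calling it “Jack suffering” doesn’t change what information is there.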
Now what you might be able to do is build a really well-determined structure such that only the states of me suffering are represented. I don’t really know. If you can, though, I’m inclined to say that what you end up with will just be a run-of-the-mill simulation of me suffering, something we’d all recognize as bad. The only way to determine that you’ve created a pattern that is exactly as you want it to be is to run a regular old person-simulation, I think.
But I might not be responding to your concern, I’m still pretty confused about what that is.
That means there are (infinitely) many entities you are, with many different experiences. And all of them are you. That sounds like an… unorthodox use of the word “you” :-)
Yeah, I put it more delicately in the undergraduate thesis proposal I just turned in. But yeah it is unorthodox. :-) But the alternative is to give up a coherent account of personal identity altogether, as far as I’m concerned.
Why call this hypothetical collection of persons “you” (or indeed “me”) if it contains many different persons and doesn’t match our existing use of the word “you”?
Long answer, I’ll come back to it.
Obligatory question: given what observations would you assign high probability to the possibility that you are a dust-mind? Why would you privilege it over competing theories?
I’m not sure we can distinguish skeptical hypotheses by empirical evidence; I’m pretty sure we can’t, by definition. But we might find empirical evidence that alters our estimates of the premises of those positions (our understanding of entropy could change, our understanding of brain simulations will surely change once we actually succeed, etc.). Also, we might be able to distinguish them according to a priori criteria like parsimony.