The experience of a subjective thread of consciousness—let’s call it STC—is pretty much the experience of experience, or qualia. So Chalmers’ experiment is apt.
The relevant fact is that STC is purely subjective. The reason I think you and everyone else have STCs is that you say you do, and it’s a simpler hypothesis than the one saying I’m different from everyone else. But the reason I think I have an STC is completely different: I know it from direct experience, which is pretty much by definition incontrovertible. (I may be mistaken about the real world, my experience may not match reality, I may even have false memories, but I can’t be literally mistaken about what I’m experiencing right now. Unless you suppose I constantly have false memories of the last half-second, and that the induction hypothesis is therefore inapplicable to subjective experience.)
So yes, I can imagine a universe without STCs. By construction, I won’t exist in that universe. This is exactly the same question as: can I rule out that no one but me in this universe has an STC, or qualia, or consciousness? No, I can’t; I can’t test the suggestion. But I also can’t accept the idea that I have no STC, because that would mean denying my actual moment-to-moment experience. It would be as pointless as saying I’m colorblind when I’m not.
So to recap: I have no idea how to solve the hard problem of consciousness. I have no idea how to explain subjective experience and connect it to the objective world, or how to test for subjective experience in somebody else. But I can’t pretend consciousness and experience don’t exist either, at least not for me. I hope someone comes along, solves these problems, and explains the solution to me (I don’t know whether that’s even possible, but I hope so), but the lack of a solution doesn’t mean I can ignore the problem.
The experience of a subjective thread of consciousness—let’s call it STC—is pretty much the experience of experience, or qualia. So Chalmers’ experiment is apt.
There’s a difference between the questions of STC and of qualia. We might live in a universe where you can have multiple causal descendants, each of them related to you-now in the same way that you-now are to you-one-year-ago, and none of them distinct from the others in a morally relevant way. Of course, each of these descendants would go on to experience qualia. The question of whether they have qualia is distinct from the question of whether one of them must be the unique inheritor of your STC.
So yes, I can imagine a universe without STCs.
Well, you can’t really. None of us can really imagine a universe. We can imagine a bundle of properties which don’t, so far as we can see, contradict each other. But the caveat “so far as we can see” is important.
The question of whether they have qualia is distinct from the question of whether one of them must be the unique inheritor of your STC.
Yes, you’re right. You’re describing a “branching thread of consciousness” model. Next, can there be a branching-and-merging model? Does it even make sense to ask the question?
If the STC is a purely subjective experience, unobservable from the outside, then we can’t really count STCs. We may not even be able to assert whether the number of STCs at time n+1 differs from the number at time n. In other words, we don’t really know whether STCs can branch and/or merge even in our actual universe.
So you knowingly execute an inference process that has no correlation with the truth, because the possible-universes where it’s wrong aren’t morally significant, so you don’t care what consequences the incorrect conclusions have here/there? (Is there a demonstrative pronoun that doesn’t specify whether the object is near or far?) (“You” refers to whatever replies to this comment, not any epiphenomenon that may or may not be associated with it.) In the absence of a causal relation, where did you get the idea that your morality has anything to do with STCs?
If you have a single STC, why do you hypothesize that the bridging law associates it with the copy of you that’s going to die rather than one of the ones that survives? Or would you decline in the thought-experiment, not due to certainty of death, but due to uncertainty?
What about consciousness that isn’t arranged in threads? For example, a branching structure, like the causal graph of the physics involved. If you frequently branch and rarely or never merge, then any given instance of you will still remember only a single history, so introspection (even epiphenomenal introspection, if you think there is such a thing, let alone introspection by a biological instantiation) can’t distinguish this from the STC theory.
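(To make that structural point concrete, here is a small toy sketch in Python. It is my own illustration; the Instance class and the copier scenario are invented for the example, not anything from the discussion. It just shows that when histories branch but never merge, each copy traces a unique chain of parents, so every instance remembers a single linear past even though the whole structure is a tree.)

```python
# Toy sketch of the branch-without-merge picture (illustration only; the
# Instance class and copier scenario are invented for this example).
# The point: in a tree of copies, each instance has a unique chain of
# parents, so every copy remembers a single linear history even though
# the overall structure branches.
from dataclasses import dataclass
from typing import Optional


@dataclass
class Instance:
    moment: str
    parent: Optional["Instance"] = None  # branching only: one parent, no merges

    def remembered_history(self) -> list:
        # Walk back through the unique parent chain; with branching but no
        # merging, this path is always linear, whatever the whole tree looks like.
        history, node = [], self
        while node is not None:
            history.append(node.moment)
            node = node.parent
        return list(reversed(history))


root = Instance("step into the copier")
copy_a = Instance("walk out of room A", parent=root)
copy_b = Instance("walk out of room B", parent=root)

print(copy_a.remembered_history())  # ['step into the copier', 'walk out of room A']
print(copy_b.remembered_history())  # ['step into the copier', 'walk out of room B']
```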
So you knowingly execute an inference process that has no correlation with the truth, because the possible-universes where it’s wrong aren’t morally significant, so you don’t care what consequences the incorrect conclusions have here/there?
I don’t see what morals have to do with it. I didn’t talk about morals.
If you have a single STC, why do you hypothesize that the bridging law associates it with the copy of you that’s going to die rather than one of the ones that survives? Or would you decline in the thought-experiment, not due to certainty of death, but due to uncertainty?
I am indeed uncertain, and so won’t risk death. However, I do hypothesize that it’s much more likely that, if a copy is created, my STC will remain with the original—insofar as an original is identifiable. And when it is not identifiable, I fear that my STC will die entirely and not possess any of the clones.
What about consciousness that isn’t arranged in threads? For example, a branching structure, like the causal graph of the physics involved.
This is perfectly possible. In fact it’s very likely. Because, if we create a clone, and if it necessarily has an STC of its own that is (in terms of memories and personality) a clone of the original’s STC, then it makes at least as much sense to say that the STC branched as to say we somehow “created” a new STC to specification.
In this case I would anticipate becoming one of the branches (clones). However, I do not yet know the probability weight of becoming each one, and AFAICS this can only be established empirically—and to test it at all I would have to risk the aforementioned chance of death.
In this scenario I wouldn’t want people to torture clones of me, in case I became one of them. But as long as a clearly identifiable original body remains intact, I very strongly expect my STC to remain associated with it, and not pass to a random clone, so as long as I’m not threatened I could mistreat my clones. And I certainly would never accept destruction of the original body, no matter how many clones were created in exchange.