I think you’ve explained your intuition well, but without examples it doesn’t feel like understanding to me. You’ve said some things that seem interesting, like “super-material information” or “one atom in the wrong place”. Maybe you could try making them as precise as possible?
Ok, but I said those two things you quoted only as part of a short argument for why I think individual consciousness is not true. That’s not required for anything relating to the theory. All I need there is that there are different ways consciousness could work, and that they can play a role for probability. I think that can be kept totally separate from a discussion about which of them is true.
So the argument I made was meant to illustrate that individual consciousness requires a mechanism by which the universe remembers that your conscious experience is anchored to your body in particular, and that it’s hard to see how such a mechanism could exist. People generally fear death not because they are afraid of losing the particular conscious experience of their mind, but because they are afraid of losing all conscious experience, period. This only makes sense if there is such a mechanism.
The reductio ad absurdum is making a perfect clone of someone. Either both versions are subjectively different people, so that if one of them died it wouldn’t be any consolation for her that the other one is still alive; or they are one person living in two different bodies, and either one would have to care about the other as much as about herself, even on a purely egoistical level. One of those two things has to be the case.
If it’s the former, that means the universe somehow knows which one is which even though they are identical on a material level. That’s what I meant by “super-material information”: there must be something not encoded in particles that the universe can use to tell them apart. I think many of us would agree that such a thing doesn’t exist.
If it’s the latter, then that raises the question of what happens if one copy is slightly imperfect. Is it a different person once you change one atom? Maybe not. But there must be some number such that if you change that many atoms, they are subjectively different people. And if there is such a number, there is also a smallest such number. It follows that if you change 342513 atoms they are subjectively the same person, but if you change 342514 they’re subjectively different. Or, alternatively, it could turn on which particular atoms you change?
Either way seems ridiculous, so my conclusion is that there most likely is no mechanism for conscious individuality, period. That means I rationally have no reason to care about my own well-being any more than about anyone else’s, because anyone else is really just another version of myself. I think most people find this super unintuitive, but it’s actually the simpler theory: it doesn’t give you any trouble with the cloning experiment, because now both clones are always the same person no matter how much you change, and it solves the problem of “what a surprise that I happened to be born instead of person-X-who-never-existed!”. It seems to be the far more plausible theory.
But again, you don’t need to agree that one theory of consciousness is more plausible for any of the probability stuff, you only need to agree that there are two different ways it could work.
So one of those ways will agree with SIA and the other will disagree, right? Let’s focus on the second one then. Can you give a procedural problem where the second way disagrees with SIA?
No; like I said, procedures tend to be repeatable. Maybe there is one, but I haven’t come up with one yet. What’s wrong with the presumptuous philosopher problem (about two possible universes) as an example?
Let’s say God flipped a logical coin to choose between creating a billion or a trillion observers in a single universe. Is that equivalent to your example?
Yes.
I’m not used to the concept of a logical coin, but yes, that’s equivalent.
You need the consciousness condition & that god only does this once. Then my theory outputs the SSA answer.
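As a sanity check, here is a small sketch (my own illustration, not from the thread) of how the two answers come apart numerically for the single-flip version, assuming equal priors on the two outcomes:

```python
# Posterior that the logical coin created the large universe,
# under SSA vs SIA, for the one-flip presumptuous philosopher setup.

small, large = 10**9, 10**12   # a billion vs a trillion observers
prior = 0.5                     # equal prior on each outcome

# SSA: only one universe actually exists, so finding yourself existing
# carries no information -- the posterior equals the prior.
p_large_ssa = prior

# SIA: weight each hypothesis by the number of observers it contains.
p_large_sia = prior * large / (prior * small + prior * large)

print(f"SSA: P(large) = {p_large_ssa}")        # 0.5
print(f"SIA: P(large) = {p_large_sia:.4f}")    # 0.9990 (odds 1000:1)
```

The SIA answer is just the observer-weighted prior, which is what makes the philosopher “presumptuous”; the SSA answer stays at the prior because there is a single universe and a single reference class.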
What if God does that many times, but you can distinguish between them? First flip a blue coin to decide between creating a billion or a trillion blue people. Then flip a purple coin to decide between creating a billion or a trillion purple people. And so on, for many different colors. You know your own color: green. What are your beliefs about the green coin?
1000:1 on tails (with tails → create large universe). It’s a very good question. My answer is late because it made me think about some stuff that confused me at first, and I wanted to make sure that everything I say now is coherent with everything I said in the post.
If god flipped enough logical coins for you to be able to make the approximation that half of them came up heads, you can update on the outcome of your coin based on the fact that your current version is green: being green is a thousand times as likely if the green coin came up tails than if it came up heads. You can’t do the same if god only created one universe.
If god created more than one but still only a few universes, let’s say two, then the chance that your coin came up heads is a bit more than a quarter: the quarter comes from the heads-heads case, and the small extra bit from the heads-tails case, which is possible but highly unlikely.
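The numbers above can be checked with a short sketch (my own, under the stated assumptions: each coin creates 10^9 observers on heads and 10^12 on tails, and you reason as a random observer drawn from the combined pool, as the theory in this thread describes):

```python
from fractions import Fraction
from itertools import product

small, large = 10**9, 10**12
n = {"H": small, "T": large}   # observers created per coin outcome

# Two-coin case: a green coin plus one other coin, all four joint
# outcomes equally likely a priori. Update on the fact that a randomly
# drawn observer turns out to be green.
num = Fraction(0)   # probability mass where the green coin is heads
den = Fraction(0)
for g, other in product("HT", repeat=2):
    p_outcome = Fraction(1, 4)
    p_green = Fraction(n[g], n[g] + n[other])  # chance a random observer is green
    den += p_outcome * p_green
    if g == "H":
        num += p_outcome * p_green

p_heads = num / den
print(float(p_heads))   # ~0.2505: a bit more than a quarter

# Many-coins limit: the likelihood ratio for tails vs heads on your own
# coin is simply large/small, giving posterior odds 1000:1 on tails.
print(large // small)   # 1000
```

The heads-heads term contributes exactly the quarter, and the heads-tails term contributes the small remainder, matching the description above.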
If it’s the former, that means the universe somehow knows which one is which even though they are identical on a material level.

Note that “the universe” is already keeping track of two identical bodies... which are, of course, in different places... which gives you a hint as to how the trick is pulled off.
Under dualism, there is a problem of how to match up 7 billion souls to 7 billion bodies. Under physicalism, the individual self just is the body-brain, there is no logical possibility of a mismatch, and whatever mechanism (i.e. different spatial location) allows the universe to have two identical but distinct bodies also allows it to have two identical but distinct consciousnesses.