Physical view: The maintenance of personal identity requires bodily continuity. So, for instance, one cannot preserve a person by downloading their psychological state into a computer.
Psychological view: The maintenance of personal identity requires continuity of psychological states. As long as there is a continuing stream of psychological states with the appropriate causal relations between them, the person persists.
I answered “psychological”, but I should perhaps note that I don’t understand “continuing” to imply “uninterrupted”. I have no problem with the idea of a personal identity that is shut down for a while before being booted back up (with its internal state saved), or one that is computed on a timesharing system.
Yeah, I should have clarified. “Continuity” here does not mean temporal continuity; it means causal continuity. Future states are appropriately causally related to past states. So if I disintegrate right now, and simultaneously, by some bizarre chance, an atom-for-atom duplicate of me is produced by a thermal fluctuation on Mars, that duplicate would not be me, since the appropriate causal connections between my psychological state and his are lacking.
My instinct is to say that I don’t require causal continuity either… e.g., that if I appeared on Mars I would consider myself to still be the person who used to exist on Earth, despite the lack of causal connection. That said, I don’t really take that instinct seriously, since I’m incapable of actually imagining this happening without positing a causal connection I just don’t happen to be aware of. The alternative is… well, unimaginable.
So, I dunno. Maybe I do require continuity in that sense; I’m just willing to posit it for any sufficiently complex system.
So where does the information to build the copy of you on Mars come from? It’s all well and good to say “thermal noise”, but if you allow brains to be built from thermal noise with any sort of frequency, you end up with the bigger philosophical problem of Boltzmann brains. Unless you’re proposing a mechanism by which the brain in question is your brain, in which case you’ve reintroduced causality.
I agree that Boltzmann brains are a philosophical problem, but they’re a problem precisely because our current best physical theories tell us that brains can fluctuate into existence. I don’t think the right way to deal with the problem is to say, “Boltzmann brains are problematic, so let’s just deny that they can exist.”
Yes, but our current best physical theories also imply that they probably fluctuate into existence considerably less often than brains form under normal circumstances (human brains, at least). A mind is a complex thing, so the amount of information it takes to replicate a mind is probably far higher than the amount of information it takes to specify an environment likely to give rise to a mind. If you discard the causal process that gives rise to minds in practice and postulate thermal noise as the cause instead, you end up postulating Boltzmann brains as well.
I didn’t mean that Boltzmann brains are a particularly big philosophical problem, just that they become one when you try to do philosophy by postulating very specific things occurring by “random chance”.
That’s not how I understood the question. Now it turns out my vote is wrong.
I should not have voted without fully understanding the question. But if I read the survey the questions are taken from, I will probably also learn what is considered the mainstream position there, which will bias my answers.